A/B test results: bounce improved, more pageviews - but fewer transactions? What does it mean?

Hey everyone,

I’m new to A/B testing and have begun running some small tests using Google Optimize.

I recently ran a test for a couple of weeks where I moved a button that was hidden below the fold up the page so it would be easier to find.

As expected, people seemed to find the button, and bounce rate and pageviews improved - but transactions did the exact opposite and decreased.

I’m not sure how to interpret this, so I’m not sure what to do next. Could it be that people who find the button are progressing too quickly through the site and missing valuable information? Is there a follow-up test or additional data I should look at?

Anyone been in a similar spot?

1 Like

This could be it. Why not run an exit survey on the users who clicked but did not convert, asking why they didn’t complete the form? It might shed light on the reasons they didn’t continue.
Ultimately, if the goal is conversions, then the A/B test has shown that you shouldn’t change the position. There’s no use forcing a result when the right decision is right in front of you.

2 Likes

Maybe people aren’t interested in the product or it’s too expensive.

How big is your sample data set? Is it big enough to be statistically significant? Could be anything though, as alluded to by uxdude - it could be seasonal or affected by marketing. Hard to pinpoint. Are you tracking people who clicked your button and then converted, or conversions overall?
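
If it helps, here’s a rough back-of-the-envelope sample-size check in Python. The 2.0% baseline and 2.5% target rates below are just placeholder assumptions, so swap in your own numbers:

```python
# Back-of-the-envelope sample-size check: how many visitors per variant
# are needed to reliably detect a lift from 2.0% to 2.5% conversion?
# (Baseline and target rates are placeholder assumptions.)
from math import ceil
from scipy.stats import norm

p1 = 0.020    # assumed baseline conversion rate
p2 = 0.025    # smallest lift you care about detecting
alpha = 0.05  # acceptable false-positive rate
power = 0.80  # chance of detecting the lift if it is real

z_alpha = norm.ppf(1 - alpha / 2)
z_beta = norm.ppf(power)

# Standard two-proportion sample-size approximation
n = ((z_alpha + z_beta) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))) / (p1 - p2) ** 2
print(f"~{ceil(n):,} visitors needed per variant")
```

With numbers like these you’d need on the order of ten thousand visitors per variant, which is why small conversion differences are so hard to call.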

That would have always been the case. The point here is that something happened to decrease conversions.

My thoughts are that A/B tests are unreliable.

I wouldn’t say A/B tests are unreliable. From the sound of it, users stuck around longer but weren’t convinced to buy the product, for other reasons.

That sounds like a possibility. My guess is that people who used to read down below the fold were more invested in the product than people who just like to click buttons.

1 Like

We’re tracking conversions overall on the A version vs. the B version.

If that’s the case, I’m not sure you can correlate the two events. Overall conversions could be affected by anything, not just the placement of this button. Tracking the number of conversions from people who actually clicked the button would be better. But I think the real problem is in the numbers: A/B testing only works if you have a large enough data set. Read into statistical significance and confidence intervals to find out more.
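
To make that concrete, a quick two-proportion z-test with a confidence interval might look like this. The conversion and visitor counts are made up for illustration; plug in your own A and B numbers:

```python
# Quick two-proportion z-test on overall conversions, A vs. B.
# The counts below are made-up placeholders; substitute your own numbers.
from math import sqrt
from scipy.stats import norm

conv_a, visitors_a = 120, 5000   # transactions / sessions in variant A
conv_b, visitors_b = 98, 5100    # transactions / sessions in variant B

rate_a = conv_a / visitors_a
rate_b = conv_b / visitors_b

# Pooled standard error for the z-test
pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
se_pooled = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (rate_a - rate_b) / se_pooled
p_value = 2 * (1 - norm.cdf(abs(z)))

# 95% confidence interval for the difference in rates (Wald interval)
se_diff = sqrt(rate_a * (1 - rate_a) / visitors_a + rate_b * (1 - rate_b) / visitors_b)
ci_low = (rate_a - rate_b) - 1.96 * se_diff
ci_high = (rate_a - rate_b) + 1.96 * se_diff

print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  p-value: {p_value:.3f}")
print(f"95% CI for (A - B): [{ci_low:.4%}, {ci_high:.4%}]")
```

If the p-value is large and the confidence interval comfortably spans zero, the drop in transactions may just be noise rather than something the button move caused.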