A/B Testing
In collaboration with my team for Liferay · 2019

1. Challenge
In early July 2019, the team started working on the new A/B testing functionality in coordination with the Analytics Cloud team. Two months later, the feature was almost dev-complete.
With Liferay DXP 7.2, marketers can create content pages with personalized experiences for different audiences. But how can they be sure the changes are for the better? For instance, will the experience’s content increase the number of subscriptions for that audience? With Liferay DXP 7.2 SP1 and Analytics Cloud, marketers can create A/B tests to evaluate which variant of a content page experience performs better for a given audience, and eventually publish it as the final experience for that audience.
2. Solution
Once your DXP instance is connected to Analytics Cloud, getting started with A/B testing in Liferay DXP is straightforward thanks to the new A/B test sidebar. First, you select the page experience you want to evaluate and create your test with a goal to optimize (clicks, time on page, max scroll depth, or bounce rate). Then you add some variants of the page experience to test against the existing one (the control). With the page editor, you can make any changes to the content or layout of the page. Finally, you set how visitor traffic will be distributed among the variants and the desired accuracy, and run the test. From that point, you just have to wait for Analytics Cloud to collect enough data and notify you of the results. After analyzing the results in Analytics Cloud, if a winning variant is found, you can choose whether or not to replace the previous experience with it.
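To make the setup more concrete, here is a minimal sketch of how such a test configuration could be modeled. It is only an illustration: the type names, fields, and values below are hypothetical and do not reflect Liferay's actual API.

```ts
// Minimal sketch of an A/B test configuration as described above.
// All names and fields here are hypothetical, not Liferay's actual API.

type Goal = "click" | "time-on-page" | "max-scroll-depth" | "bounce-rate";

interface Variant {
  name: string;
  experienceId: string;    // the page experience this variant is based on
  trafficSplit: number;    // fraction of visitor traffic, e.g. 0.5
  clickTargetId?: string;  // element to measure when the goal is "click"
}

interface AbTest {
  name: string;
  goal: Goal;
  confidenceLevel: number; // desired accuracy, e.g. 0.95
  control: Variant;        // the existing experience
  variants: Variant[];     // the variants tested against the control
}

// Example: test the current experience against one variant, splitting
// traffic evenly and requiring 95% confidence before declaring a winner.
const subscribeTest: AbTest = {
  name: "Subscription banner test",
  goal: "click",
  confidenceLevel: 0.95,
  control: {
    name: "Control",
    experienceId: "experience-a",
    trafficSplit: 0.5,
    clickTargetId: "subscribe-button",
  },
  variants: [
    {
      name: "Variant B",
      experienceId: "experience-b",
      trafficSplit: 0.5,
      clickTargetId: "subscribe-button",
    },
  ],
};
```

Treating the control as just another variant keeps the traffic split and goal settings uniform across everything being compared.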
While A/B test creation and variant editing take place on DXP, test management can be performed from both DXP and Analytics Cloud, thanks to the continuous synchronization of data between the two environments. Attached to this post you'll find some screenshots depicting the steps to create, run, and complete an A/B test on DXP. We'll have a better demo to share soon.
Completing a feature like A/B testing - which had been on Liferay's roadmap for years - in less than 3 months has been really challenging.
3. Test
After having the first design proposal, we decided to run a usability test to see if there were any pain points that could be fixed by design before the development phase. We started from some hypotheses we wanted to test, wrote tasks for users to perform, and observed their behavior and whether, in the end, they achieved what was asked. After running the tests with the selected participants, we drew some conclusions, and from those conclusions we proposed some changes:
- Enhance the A/B testing click goal element selection by highlighting the selectable elements more clearly.
- Help users know what they need to set before running an A/B test. We did this through a validation system that shows users what is missing: it warns them if the test has fewer than 2 variants and, if the click goal has been selected for the test, it reminds them to pick a target element to measure (see the sketch after this list). Note that if a variant does not have the target element set, that variant can never win the test.
- Help users know which variant is the winner. We used an alert with an action button, not just to make the winner clear, but also to let them publish it directly from the top of the sidebar.
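To illustrate the kind of checks this validation system performs, here is a minimal sketch that reuses the hypothetical AbTest and Variant types from the earlier example. The rules and messages below are assumptions drawn from the description above, not Liferay's actual implementation.

```ts
// Hypothetical pre-run checks for the validation system described above.
// Reuses the AbTest and Variant types from the earlier sketch;
// the rules and messages are illustrative, not Liferay's actual code.

function validateAbTest(test: AbTest): string[] {
  const warnings: string[] = [];

  // The test needs at least 2 variants: the control plus one alternative.
  if (test.variants.length < 1) {
    warnings.push("Add at least one variant to test against the control.");
  }

  // If the click goal was chosen for the whole test, every variant
  // (including the control) needs a target element to measure.
  if (test.goal === "click") {
    for (const variant of [test.control, ...test.variants]) {
      if (!variant.clickTargetId) {
        warnings.push(`Select a click target element for "${variant.name}".`);
      }
    }
  }

  // Traffic distribution should cover 100% of visitors.
  const totalTraffic = [test.control, ...test.variants].reduce(
    (sum, variant) => sum + variant.trafficSplit,
    0,
  );
  if (Math.abs(totalTraffic - 1) > 1e-6) {
    warnings.push("Traffic distribution must add up to 100%.");
  }

  return warnings; // an empty list means the test is ready to run
}
```

Flagging a missing click target per variant matters because, as noted above, a variant without the target element set could never win the test.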
Apart from these improvements to A/B testing, we also worked hard to include A/B test information inside the Experience Panel. Right now it's possible to see which experiences have A/B tests in progress.