If you are looking to change behaviour with your campaign, it’s crucial to design the most effective messages to influence people’s outlook and actions. One popular method for optimising these messages is A/B testing. In this episode, I’ll explain what A/B testing is, how to conduct it in behaviour change campaigns, and tips for maximising its effectiveness.
I cover what A/B testing is for comms pros, and seven tips on how to get started.
This episode was inspired by a ‘Deep Dive’ episode with behaviour change expert Shayoni Lynn of Lynn Global, whom it was a pleasure to host on the pod earlier in the year. I’ve added the link to the show notes today so you can listen to that whole episode.
Let’s dive in!
Links in this episode:
Shayoni Lynn on Behaviour Change:
Statistical Significance – Meaning:
Liked Listening today? What to do next:
Get my FREE roadmap to get more strategic with communication activity in your business.
Listen to more episodes, take some training, or download a resource: Find out more here.
Hire my expertise
Whether that’s support with a one-off comms project or an entire strategy for your business, drop me a line if you want to explore this further. You can also work with me 1:1 as a trainer and mentor – firstname.lastname@example.org
Work with me closely
If you’d like to work with me to develop and implement your communication strategy through 1:1 work, podcasts, workbooks, sharing ideas, and lots of accountability and up-skilling, then email me at email@example.com to register your interest for you or your entire team.
Leave me a voicemail on my Speakpipe page. I would love to hear your feedback on this episode and thoughts on any topics I could include in future ones too.
Full Transcript (Unedited)
What is A/B Testing?
A/B testing, also known as split-testing, is an experimental approach used to compare two or more variations of a message, ad, webpage, or indeed any piece of collateral. The goal is to determine which version generates the best response in terms of desired outcomes (e.g., clicks, conversions, sign-ups). By gathering data on user behaviour, you can make more informed decisions about your campaign messaging.
Shayoni and I discussed on the pod a while back how her company sometimes uses A/B testing on ads and visual pieces of communication, as well as messages, before launching campaigns.
Let’s just take messaging for now. How can you conduct A/B testing on messages?
Define your goal:
Before starting an A/B test, it’s important to clearly identify the specific behaviour change you want to achieve. This might involve increasing awareness about a certain issue, boosting donations or sign-ups for a cause, or motivating users to adopt a new habit.
Create message variations:
Develop two or more distinct versions of your message focused on the same goal. This could involve creating multiple headlines or changing key elements within the content such as images, call-to-actions (CTAs), or tone.
Split your audience:
Randomly assign your target audience into different groups, ensuring that each person only receives one version of the message. This control helps eliminate bias and ensures that any differences in outcomes can be attributed to the variations in messages.
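As a rough sketch of what that random assignment looks like in practice (this is an illustrative example, not a tool Shayoni mentioned; the subscriber names are hypothetical):

```python
import random

def split_audience(audience, n_groups=2, seed=42):
    """Randomly assign each person to exactly one message group.

    Shuffling before splitting removes ordering bias (e.g. people who
    signed up earliest all landing in group A), so any difference in
    outcomes can be attributed to the message variation itself.
    """
    pool = list(audience)
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    rng.shuffle(pool)
    # Deal people out round-robin so group sizes differ by at most one
    return [pool[i::n_groups] for i in range(n_groups)]

# Hypothetical audience of 10 subscribers split into groups A and B
group_a, group_b = split_audience([f"subscriber_{i}" for i in range(10)])
print(len(group_a), len(group_b))  # 5 5
```

Most email and ad platforms will do this split for you, but the principle is the same: one person, one version, assigned at random.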
Measure your results:
Track relevant metrics (e.g., click-through rates, conversions) for each message variation during a predetermined time frame. Make sure your sample size for each group is large enough to provide statistically significant results.
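If you want a feel for how big "large enough" is, the standard two-proportion sample-size formula gives a rough answer. A minimal sketch (the 10% and 13% click-through rates below are hypothetical, as is the function name):

```python
import math

def sample_size_per_group(p_base, p_target, alpha_z=1.96, power_z=0.84):
    """Rough sample size per group to detect a lift from p_base to
    p_target at the 5% significance level with 80% power, using the
    standard two-proportion approximation."""
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    n = (alpha_z + power_z) ** 2 * variance / (p_base - p_target) ** 2
    return math.ceil(n)

# Hypothetical: detect a lift from a 10% to a 13% click-through rate
print(sample_size_per_group(0.10, 0.13))  # 1770 people per group
```

The takeaway: small differences need surprisingly large audiences to detect reliably, which is why tiny lists rarely produce conclusive tests.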
Analyse the outcomes:
Compare the performance of each variation against your preset goals and determine which version was the most effective at driving the desired behaviour change.
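The comparison itself usually comes down to a statistical test. A minimal sketch using a two-proportion z-test (the click counts are made up for illustration):

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: is the difference in click-through rates
    between message A and message B statistically significant?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pooled rate under the null hypothesis that A and B perform the same
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical results: A got 120/1000 clicks, B got 90/1000
z = two_proportion_z(120, 1000, 90, 1000)
significant = abs(z) > 1.96  # 5% significance level, two-sided
print(round(z, 2), significant)
```

Here the z-score comes out above 1.96, so the difference would count as statistically significant at the 5% level, which is what the "statistical significance" link in the show notes unpacks in plain language.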
Refine and repeat:
Use the insights gained from your A/B test to make data-backed improvements to your messaging, then continue iterating and testing to further optimise your campaign.
Tips for Successful A/B Testing in Behaviour Change Campaigns:
- Test one variable at a time: When crafting message variations, only change one element at a time to ensure that any differences in performance can be attributed to that specific change.
- Prioritise high-impact variables: Focus on testing elements that are likely to have the biggest impact on your campaign’s success, such as headlines, CTAs, or personalisation features.
- Run tests simultaneously: To minimise external factors influencing the results (e.g., news events, seasonal trends), it’s important to run all test variations simultaneously.
- Allow adequate time for results: Conduct your A/B test over an appropriate time period to allow for meaningful data collection. Be cautious not to end the test too early or run it for too long.
- Utilise pre-existing benchmarks: If you have them, start with your pre-existing benchmarks for comparison.