{"id":87400,"date":"2021-04-27T08:00:55","date_gmt":"2021-04-27T12:00:55","guid":{"rendered":"https:\/\/www.drift.com\/?p=87400"},"modified":"2021-04-27T17:34:58","modified_gmt":"2021-04-27T21:34:58","slug":"ab-testing","status":"publish","type":"post","link":"https:\/\/www.drift.com\/blog\/ab-testing\/","title":{"rendered":"Engage Your Target Visitors with the Perfect Message. Introducing Drift A\/B Testing."},"content":{"rendered":"

**Data. It's the most objective way to measure results. As the saying goes, the numbers speak for themselves.**

I bet most digital marketers would agree. You need data to know which top-of-funnel messages are performing best. Are they engaging your target customers? Where are the visitors coming from? Are they helping you capture an email to create a new lead? Are they converting into a booked meeting or sales pipeline?

Top-of-the-funnel content has a big impact further down the funnel in generating sales-qualified pipeline. And with today's customers spending more than 65% of their buying process alone and online, it's imperative that your top- and middle-of-the-funnel content is *spot on*. It has to be informative and compelling enough to engage the right visitors when they're ready to interact with you.

But how do you know *what* the right message is? And what *variation* of the message attracts different types of customers?

**It's actually quite simple.** Test a few messages, review the data, and pick the winner. Let the winning message run for a bit, and then test, review, and adapt again. Testing different messages helps you see what you're *leaving on the table* and realize that your first message is often *not* the best variation.
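If you're curious what "pick the winner" actually means in statistical terms, here's a rough, generic sketch. It is not Drift's implementation, and the visitor and conversion counts are made up; it simply compares the conversion rates of two message variants with a two-proportion z-test.

```python
# Generic two-variant comparison -- illustrative only, with made-up numbers.
from math import erf, sqrt

def conversion_winner(conv_a, visitors_a, conv_b, visitors_b, alpha=0.05):
    """Compare two message variants with a two-proportion z-test."""
    p_a = conv_a / visitors_a          # conversion rate of variant A
    p_b = conv_b / visitors_b          # conversion rate of variant B
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)  # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-tailed
    if p_value < alpha:
        return ("A" if p_a > p_b else "B"), p_value
    return "no clear winner yet -- keep the test running", p_value

# Hypothetical example: variant A converted 120 of 2,000 visitors,
# variant B converted 158 of 2,050 visitors.
print(conversion_winner(120, 2000, 158, 2050))
```

In practice you shouldn't have to do this math yourself; the point is that the comparison and the winner get surfaced for you, so you can spend your time on the messages rather than the spreadsheet.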

So, we need to *test* to get the data. *And*, we need a simple way to *visualize* the data to determine the perfect message. The problem is that many marketing teams either aren't doing playbook A/B testing at all or they're using a legacy A/B testing product that is manual, time-consuming, and siloed across different UI views.

That's where Drift comes in.

## Do More, With Less, and in Less Time

**Our new bot playbook A/B testing capability empowers digital marketers to**:

1. Create and test different bot playbook messages to your defined audiences
2. See which message is performing the best
3. Use this data to optimize your playbook content to convert more visitors into qualified leads

    \"\"<\/p>\n

*Already using Drift? Get trained in how to use Drift A/B testing in only 4 minutes right here.*

## One Place to Manage All Your Bot Playbook A/B Testing

And best of all, marketers can do all of this in one place, with just two clicks, saving them time and making it easy to continuously optimize performance. They can create bot playbook A/B tests, compare results side by side, pick a winner, and make edits live, all within a single UI. It's fast, simple, and easy to use. And it encourages digital marketers to use data to create better content and increase qualified-lead conversions.

    \"\"<\/p>\n

## Increase Conversions with A/B Testing

I sat down with Tim Ozmina, our Drift playbook guru who runs our Drift-on-Drift experience, to learn how he uses A/B testing on Drift's website.

“It is so easy and saves me so much time. I can create and launch 10 separate A/B playbook tests in only 5 minutes.”

    \"\"<\/p>\n

Here are some of the results we've seen using our A/B testing 👇