Headline testing is the process of using live experiments to get to know your audience on a deeper level. By tracking what kinds of headlines are most successful at getting your audience to actually click into -- and read -- your articles, you can better adapt your homepage strategy to maximize your content exposure, audience engagement, and visitor loyalty.
When you create and run a headline test, new visitors who come to your site are served one of the headline options, and we record which version people click into the most. What’s unique about our testing methodology is that it exploits the success of winning headlines by 'playing' them more frequently as they prove to outperform the others. The winning headline is then played permanently to your entire audience.
If you’re interested, you can read more about the methodology we use, called Thompson Sampling, here.
- Main Metrics
- Getting Started
- Starting a new test
- Active and completed tests
- Test Results view
Trials is a count of sessions to your page where a test headline was loaded and served to a visitor.
Quality Clicks % - When a visitor clicks into a story and subsequently spends at least 15 seconds of engaged time, it is considered a quality click. Note: whenever a visitor reaches the threshold of 15 engaged seconds, the click is retroactively counted as a quality click. Quality Clicks % displays the ratio of Quality Clicks to total clicks for a given headline.
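In code terms, the quality-click rule above can be sketched like this (a minimal illustration with hypothetical function names, not Chartbeat's actual implementation):

```python
def is_quality_click(engaged_seconds, threshold=15):
    """A click counts as a quality click once the visitor reaches the
    engaged-time threshold, even if that happens well after the click
    was first recorded (the upgrade is retroactive)."""
    return engaged_seconds >= threshold

def quality_click_pct(engaged_times):
    """Quality Clicks % = quality clicks / total clicks for a headline.
    `engaged_times` holds the engaged seconds for each recorded click."""
    clicks = len(engaged_times)
    quality = sum(is_quality_click(t) for t in engaged_times)
    return quality / clicks if clicks else 0.0

# Five clicks on a headline, with these engaged-time readings (seconds):
print(quality_click_pct([3, 20, 15, 40, 7]))  # 3 of 5 clicks qualify -> 0.6
```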
Click Through Rate is the percentage of visitors who click on a given trial headline. This can be lower than 1%.
CTR Increase is the percent rise in CTR of the winning headline compared to the original for a single test.
(CTR of winner / CTR of original) - 1
Average CTR Increase is the average percent rise in CTR of all winning headlines compared to originals for all tests in the past seven days.
AVG((CTR of winner / CTR of original) - 1)
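The two formulas can be checked with a quick sketch (the helper names are hypothetical; CTRs are expressed as fractions, so 0.015 means 1.5%):

```python
def ctr_increase(winner_ctr, original_ctr):
    """Percent rise of the winner's CTR over the original's:
    (CTR of winner / CTR of original) - 1."""
    return winner_ctr / original_ctr - 1

def avg_ctr_increase(tests):
    """Average of the per-test CTR increases; `tests` is a list of
    (winner_ctr, original_ctr) pairs from the past seven days."""
    return sum(ctr_increase(w, o) for w, o in tests) / len(tests)

# A winner at 1.5% CTR against an original at 1.0% is a 50% increase.
print(ctr_increase(0.015, 0.010))                          # ~0.5, i.e. +50%
print(avg_ctr_increase([(0.015, 0.010), (0.011, 0.010)]))  # ~0.3, i.e. +30%
```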
Play % is the percentage of people being served a given headline, as well as the confidence that the headline is "better". For example, at 65%, a headline is served to 65% of people and has a 65% certainty that it is "better". Note that at a 95% play percentage a headline "wins" and is played 100% of the time.
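The "play it more as it proves itself" behavior described above falls directly out of Thompson Sampling. Here is a minimal sketch assuming a standard Beta-Bernoulli model; it illustrates the general technique, and the names and counts are hypothetical, not Chartbeat's actual code:

```python
import random

def thompson_sample_pick(headlines):
    """Pick the next headline to serve by sampling each headline's
    click-through rate from a Beta posterior over its click history.

    `headlines` maps a headline to its (clicks, trials) counts.
    Better-performing headlines draw higher samples on average, so they
    are "played" more often as the evidence accumulates.
    """
    best, best_draw = None, -1.0
    for headline, (clicks, trials) in headlines.items():
        # Beta(1 + successes, 1 + failures) posterior over the CTR
        draw = random.betavariate(1 + clicks, 1 + trials - clicks)
        if draw > best_draw:
            best, best_draw = headline, draw
    return best

# Hypothetical counts: the variant has a clearly higher observed CTR.
counts = {
    "Original headline": (40, 4000),  # 1.00% CTR so far
    "Punchier variant": (70, 4000),   # 1.75% CTR so far
}
picks = [thompson_sample_pick(counts) for _ in range(10_000)]
# The variant's play % dominates because it is almost certainly better.
print(picks.count("Punchier variant") / len(picks))
```

Each visitor gets the headline that "won" a fresh random draw, so the play percentage of a headline tracks the probability that it is the best one.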
Headline Testing is just one of the ways that you can be using data to make the most of your homepage audience. So that you can quickly incorporate headline tests into your homepage strategy without interrupting your team's workflow, the Headline Testing tool is built into the Heads Up Display itself.
To get started, make sure that you've installed the Heads Up Display, then select any link that’s receiving a pin. From there, toggle from the 'Performance' tab to the 'Headline Testing' tab at the top of the card and you’re ready to go.
Please note that Engaged Headline Testing is a premium feature. If you have any questions about your current plan, reach out to your Customer Success Manager, or send an email to email@example.com.
Starting a new test
Select a headline on the page that you want to start a test for, and click on its Heads Up Display pin. Next, toggle into the Headline Testing tab in the pin card.
If your articles feature multiple headlines, for example, the main headline and a sub-headline, or a section name as well as a headline, you’ll be prompted to specify which one you want to run an experiment for. Note that you cannot test a primary and sub-headline simultaneously in this case.
Enter the various headlines you want to test. The first headline will be auto-filled with the headline that appears by default in the page, and as additional headlines are entered they appear inline on the page so you can preview their appearance before starting the experiment.
While there's no limit to the number of headline variants you can test, remember that the more headlines you add, the slower an experiment will progress.
Simply select 'start' to begin testing.
Active and completed tests
To monitor the progress of a test, select the pin next to the headline currently being tested. Active headline experiments are indicated with a pie chart representing the number of headlines being tested and the play percentage of each. Headlines with a higher play percentage are performing better.
Editing tests, adding variants, removing variants, and editing the copy of headline variants will automatically stop and restart the test.
A test ends when the winning headline has been determined with 95% confidence and Chartbeat begins to automatically serve that headline 100% of the time.
To make sure that tests come to a conclusion as efficiently as possible, we have an alternate way to determine a winner, called 'soft convergence'. If 20 minutes have passed and we're 95% confident that no headline is better by a margin of 25%, the leading headline will win.
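Under the stated rule, and treating "better by a margin of 25%" as a relative margin on CTR (an assumption), the soft-convergence check could be sketched with a Monte Carlo estimate like this; every name and detail here is hypothetical:

```python
import random

def soft_converged(headlines, margin=0.25, confidence=0.95, draws=10_000):
    """Soft-convergence check, per the rule above: return True once we
    are at least `confidence` sure that no challenger beats the current
    leader's CTR by more than `margin` (here, a relative 25%).

    `headlines` maps a headline to its (clicks, trials) counts.
    """
    leader = max(headlines, key=lambda h: headlines[h][0] / headlines[h][1])
    lead_clicks, lead_trials = headlines[leader]
    for headline, (clicks, trials) in headlines.items():
        if headline == leader:
            continue
        # Monte Carlo estimate of P(challenger CTR > (1 + margin) * leader CTR)
        beats = sum(
            random.betavariate(1 + clicks, 1 + trials - clicks)
            > (1 + margin)
            * random.betavariate(1 + lead_clicks, 1 + lead_trials - lead_clicks)
            for _ in range(draws)
        )
        if beats / draws > 1 - confidence:
            return False  # a challenger still has a real shot at winning big
    return True

# Two headlines with nearly identical CTRs over many trials have converged:
print(soft_converged({"Original": (500, 50_000), "Variant A": (520, 50_000)}))
```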
As we find winning headlines for your active tests, our code begins serving this headline on your homepage to 100% of new visitors to the page. At this point, you may decide that it's best practice for your team to manually end the test by changing the original headline to the winning headline variant in your CMS.
To end an active headline test at any point, open the Heads Up Display pin for that headline, and in the 'Headline Testing' tab, select 'stop'. Or, in the Headline Testing results page, headlines that are autoplaying have a green play button next to them. Hover over this button to reveal the red 'Stop' button depicted in the screenshot below.
Test Results view
The results view shows you all currently running, completed, and canceled tests, as well as a series of overview metrics to help you better understand how much of an impact you're making on your traffic, and what your results can teach you about your audience.
From the Heads Up Display, select ‘Headline Testing Results’ at the bottom of the dock, from the Heads Up Display pop-out menu, or select ‘Headline Testing’ under the Optimization tab from the left-hand navigation.
Past 7 Days
The visualization at the top of the page shows your total number of completed tests from the past seven days, broken out into two sections. On the left, the pie chart shows you the percentage of the time that original or alternative headlines won, as well as how often tests resulted in no winner.
The bar graph in the middle takes the breakdown of original vs. alternative winners and splits it over the past week, so you can see on a day-by-day basis which headlines won more often.
The right-hand module surfaces the Average CTR Increase for the week. This number tells you how the average CTR of this week’s winning headlines compares to the CTR of the original headline variants. At a glance, it shows how much more click-through activity you’re getting as a result of your headline testing.
Beneath the top-of-page metrics, you can see details on all your active tests and your past 50 headline tests. Active tests update in real time and offer the option to cancel the test. Whereas active tests show a running count of how long the test has been playing and how many trials have been served for each option, completed tests show how long the test played and the total number of trials for each variant.
The bar graphs in the middle of the card display the ratio of quality clicks to total clicks for each headline. The right-hand number is the percentage of visitors who, after being served that headline variant, clicked through to the article page. The left-hand percentage shows how many of those clickers went on to spend at least 15 seconds of engaged time.
For example, this winning headline (above) was served to a total of 39,084 unique visitors, and 1.12% of those people who saw it clicked through to the article. Of those people who clicked through, 83% engaged for at least 15 seconds.
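A quick back-of-the-envelope check of those figures:

```python
trials = 39_084       # unique visitors served the headline
ctr = 0.0112          # 1.12% of them clicked through
quality_pct = 0.83    # 83% of clickers engaged for at least 15 seconds

clicks = trials * ctr
quality_clicks = clicks * quality_pct
print(round(clicks))          # 438 total clicks
print(round(quality_clicks))  # 363 quality clicks
```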
On the right-hand side, each test also lists its status (active, winner found, no winner found, or canceled), the CTR Increase for completed tests, who created the test, and the page it ran on.
Completed tests also mark the winning headline in green and, if that winning headline is being autoplayed on your page, with a green playing icon. Select the playing icon to stop the autoplay and revert to playing the original headline to all new, unique visitors to the page.
If you want to run deeper analysis on your headline testing results, beneath the CTR Increase module you can export all your results data as either of two CSVs.
“Tests.csv” is a summary of every test you’ve run, giving you insight into the optimal number of variants for your site, which user is running the most (or best) tests, and which section is running the most (or best) tests.
The other file, “Headlines.csv”, breaks down how every headline variant performed within every test you’ve ever run. Use this option to dig into characteristics of winning headlines such as character count, word count, and language.