Start here: how our SEO split tests work
If you aren't familiar with how we run the controlled SEO experiments that form the basis of all our case studies, you might find it useful to start with the explanation at the end of this article before digesting the details of the case study below. If you'd like to get a new case study by email every two weeks, just enter your email address here.
In this week’s #SPQuiz, we asked our followers on LinkedIn and X/Twitter whether unconventional title tags could boost SEO performance. Half of our voters said it depends on the change, while the rest split evenly between yes and no. Looks like context is key, so keep reading to see how these tests have played out in practice.
Poll Results
Title Tag Case Studies
Title tags have been around since the beginning of SEO. They are short, highly visible, and still matter today. Not only do they play a role in rankings, but they also have a major impact on click-through rate from search results. With such a small amount of space to work with, every character counts. Should you add more keywords, include pricing, or highlight content formats like video? These are the kinds of questions SEOs debate all the time, and the answers aren’t always obvious.
To find out what actually works, we’ve run a series of controlled SEO split tests across different industries. Just like with our internal linking experiments, the results reveal both clear wins and some surprising losses. Here are a few of the more interesting ones we’ve run.
Extra Keywords in Title Tags
Title tags are often the first place SEOs try to squeeze in more keywords. The logic is simple: if one keyword is good, then surely more must be better. For an e-commerce customer, we tested this on product listing pages by appending extra variations to the titles. A page that originally had the title “Women’s Dresses” became “Women’s Dresses, Best Women’s Dresses, Dresses for Women, Stylish Women’s Dresses”. Despite the broader keyword coverage, the test was inconclusive at the 95 percent confidence level: there was no measurable uplift in organic sessions for the variant group. This suggests that repeating near-duplicate phrases doesn’t add value and can make titles look cluttered or spammy.
Destination City or Airport Codes
For a customer of ours in the travel industry, we looked at whether adding airport codes to titles could capture more searches. Travelers sometimes search using codes such as LHR or JFK, so it seemed logical to include them in title tags. We changed titles like “Flights to London” to “Flights to London (LHR)” and also experimented with four-letter city codes. The result was a clear negative: organic traffic to the variant pages dropped by 16 percent at the 95 percent confidence level. While codes may appear in query data, for most users they added confusion rather than clarity. The takeaway is that insider jargon doesn’t always translate into SEO gains.
Asking a Question About Cost
Many users type full questions into Google, especially around pricing. For a retail customer, we wanted to see whether titles phrased as questions could better capture this intent. We tested changing titles from descriptive formats like “Dental Implants Options and Pricing” to question-based phrasing such as “How Much Do Dental Implants Cost?”. The result was a win, with the question-based titles delivering more than a five percent uplift in organic sessions. By mirroring the way users search, the variant titles felt more relevant and drove more clicks.
Age Ranges in Titles
For a customer of ours in the care industry, we wanted to test whether adding age ranges into titles could make them more relevant. Parents and caregivers often refine their searches by age, so it felt like a natural qualifier to highlight. A page originally titled “Care Services in London” was updated to “Care Services in London (Ages 5 to 12)”. The result was positive, with about a four percent uplift in organic sessions at a 90 percent confidence level. This suggests that clear, specific qualifiers can make titles more appealing to the intended audience.
Static vs Dynamic Prices in Titles
For a travel rental customer, we compared the impact of static prices versus dynamic prices in title tags for location pages. Price transparency can be a powerful driver of clicks, but only if the information is accurate. We ran two tests to compare static and dynamic pricing in titles. Static prices such as “Apartments in Paris from €59” led to a seven percent drop: outdated prices misled users and discouraged clicks. Dynamic prices that were updated automatically from live feeds saw a ten percent uplift in organic sessions. The comparison makes the lesson clear: freshness matters. Prices can improve click-through rate, but only when they are accurate and current.
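To make the static-versus-dynamic distinction concrete, here is a minimal sketch of how a server-rendered title tag might pull its price from a live source rather than hard-coding it. The function names, the €59 figure, and the lookup itself are illustrative assumptions, not the customer's actual implementation.

```python
# Hypothetical sketch: a title tag whose price reflects current inventory.
# `get_min_price` stands in for whatever live pricing feed a site uses
# (e.g. an internal API); the data below is illustrative only.

def get_min_price(location: str) -> int:
    """Placeholder for a live pricing lookup."""
    prices = {"Paris": 59, "Rome": 72}
    return prices[location]

def build_title(location: str) -> str:
    """Render the title tag at request time so the price never goes stale."""
    price = get_min_price(location)
    return f"Apartments in {location} from €{price}"

print(build_title("Paris"))
```

The point of the pattern is simply that the title is generated at render time from the same source of truth as the page itself, so the "from €X" claim cannot drift out of date the way a hand-written static title can.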
Adding “with Video” to Titles
For a media customer of ours, we tested whether highlighting video content in article titles would help. Highlighting content formats seems like it should be beneficial, so to test this, we added labels like “(video)” and “(with video)” to article titles, turning “Football Highlights” into “Football Highlights (with Video)”. Both versions performed worse: organic traffic dropped for the variant pages. Instead of enticing more clicks, the extra wording cluttered the titles and may have confused users. Some may even have assumed the content was only a video rather than a full article. This experiment shows that not every label adds value. In title tags, brevity and clarity usually win.
What did we learn?
Looking across these experiments, a few themes emerge. Adding more keywords doesn’t always help and can make titles worse. Titles that mirror how users actually search, whether as questions or with qualifiers like age ranges, tend to perform best. Accurate, fresh details like dynamic pricing can deliver strong results, while outdated or unnecessary information can actively harm performance. Above all, these tests show the importance of experimentation. Even the smallest changes to a title can move the needle significantly, and the only way to know for sure is to test. Results may vary across sites and industries, and we always recommend testing changes first.
To receive more insights from our testing, sign up for our case study mailing list, and please feel free to get in touch if you want to learn more about this test or our split testing platform more generally.
How our SEO split tests work
The most important thing to know is that our case studies are based on controlled experiments with control and variant pages:
- By detecting changes in performance of the variant pages compared to the control, we know that the measured effect was not caused by seasonality, sitewide changes, Google algorithm updates, competitor changes, or any other external impact.
- The statistical analysis compares the actual outcome to a forecast, and comes with a confidence interval, so we know how certain we can be that the effect is real.
- We measure the impact on organic traffic in order to capture changes to rankings and/or changes to clickthrough rate (more here).
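The forecast-versus-actual idea above can be illustrated with a simplified sketch: compare each day's observed sessions on the variant pages to the forecast counterfactual for the same day, then attach a confidence interval to the mean difference. This is only a toy illustration with made-up numbers; the platform's actual forecasting and statistical model is more sophisticated.

```python
# Toy illustration of comparing observed traffic to a forecast.
# The daily session counts below are invented for the example.
import math
import statistics

def uplift_with_ci(observed, forecast, z=1.96):
    """Mean daily uplift (observed minus forecast) with an
    approximate 95% confidence interval on that mean."""
    diffs = [o - f for o, f in zip(observed, forecast)]
    mean = statistics.mean(diffs)
    sem = statistics.stdev(diffs) / math.sqrt(len(diffs))
    return mean, (mean - z * sem, mean + z * sem)

observed = [1050, 1100, 1020, 1080, 1110, 1060, 1090]
forecast = [1000, 1010, 990, 1005, 1015, 1000, 1010]
mean, (lo, hi) = uplift_with_ci(observed, forecast)
# If the whole interval sits above zero, the uplift is
# unlikely to be explained by noise alone.
```

Because the forecast already absorbs seasonality and sitewide trends, any interval that excludes zero points to an effect of the change itself, which is the intuition behind the bullet points above.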
Read more about how SEO testing works or get a demo of the SearchPilot platform.