Since the rise of 'not provided' keywords in Google Analytics, SEOs have had to rely on multiple data sources to track the progress of on-page content optimisation. But, as our own example shows, this can only be a good thing...
When Google announced encrypted search was to be the default for all Google users, it dealt a heavy blow to SEO – as most keywords from organic search would now appear as 'not provided' in Analytics.
This meant it was much more difficult to track the efficacy of on-page content optimisation. After all, if we didn't know which keywords were bringing users to a page, how were we supposed to know if our carefully keyworded content was doing its job?
Now the dust has settled: 'not provided' keywords stand at 87.58% of the total, according to Not Provided Count.
SEOs have, for the most part, found clever ways around the issue – typically by combining data from several sources to bolster the reliability of their reports.
The fact is, savvy SEOs have always used several data sources when putting together reports. Relying on one set of data means setting yourself up for a fall when developments like encrypted search inevitably happen.
At the time of the announcement, we had several reliable data sets at our disposal already, with one of the most notable being Google Webmaster Tools' Search Queries feature. This shows the top keywords that brought traffic to a site, with impressions, clicks and average position metrics.
At first, many dismissed this data as being unreliable and inaccurate, with one of the most common gripes being that the numbers were rounded, or 'bucketed'.
However, with Google deciding to stop 'bucketing' the data at the start of this year, Search Queries has become a reliable source for actionable data insights.
Allow us to share an example. We'll show how tracking Search Queries data has provided one of our clients with a wider, more dependable view of their on-page content optimisation strategy.
Here are the impressions, clicks and average position of two keywords that we tracked over a six-month period, from January to July this year. In May, the client uploaded new, optimised content on separate pages of its website that targeted these keywords.
If we take the results from April as our benchmark and compare them to the latest figures, we can see:
Keyword one
Keyword two
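The benchmark comparison above boils down to simple percentage-change arithmetic. As an illustration only – the figures below are hypothetical, not the client's actual data – the calculation might look like this in Python:

```python
# Hypothetical Search Queries snapshots for two tracked keywords.
# Note: "position" is average position, so a *lower* number is better.
benchmark = {  # April (benchmark) figures
    "keyword one": {"impressions": 1200, "clicks": 45, "position": 18.0},
    "keyword two": {"impressions": 800, "clicks": 20, "position": 24.0},
}
latest = {  # July (latest) figures
    "keyword one": {"impressions": 3100, "clicks": 160, "position": 7.5},
    "keyword two": {"impressions": 1900, "clicks": 85, "position": 11.0},
}

def compare(before, after):
    """Return per-keyword changes between two Search Queries snapshots."""
    changes = {}
    for kw in before:
        b, a = before[kw], after[kw]
        changes[kw] = {
            "impressions_pct": round(100 * (a["impressions"] - b["impressions"]) / b["impressions"], 1),
            "clicks_pct": round(100 * (a["clicks"] - b["clicks"]) / b["clicks"], 1),
            # Positive means the keyword moved up the rankings.
            "position_change": round(b["position"] - a["position"], 1),
        }
    return changes

for kw, delta in compare(benchmark, latest).items():
    print(kw, delta)
```

Laying the deltas out this way makes it easy to spot whether all three metrics are moving together, which is the pattern we'd expect if the new content is doing its job.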
With impressions, clicks and position improving so strongly for these keywords, it would be reasonable to assume that our content optimisation efforts are working. However, these figures aren't counted in the same way as Google Analytics data – crucially, there are things this data doesn't show, which means we can't know for sure that it's our on-page optimisation work that has driven these results.
For instance, Search Queries clicks are counted at the search results page rather than on the site, so they won't match Analytics sessions exactly; average position is averaged across every variation of the query, location and personalisation; and the data tells us nothing about what visitors did once they arrived on the page.
Despite these drawbacks, this data does provide a good overall insight into the effectiveness of our campaign. We can't be absolutely certain that our work is driving results, but by bringing in data from other sources, we can drastically reduce the margin for error.
For example, we can look at Google Analytics data for the pages in question and look for traffic uplifts. And we can use results from rank monitoring tools to get a firmer idea of rankings over the same time period.
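To sketch what this cross-referencing looks like in practice – again with hypothetical figures, not the client's real exports – we can line up monthly Analytics sessions for the optimised page against the rank tracker's positions for its target keyword:

```python
import csv
from io import StringIO

# Hypothetical exports: monthly Analytics sessions for the optimised
# landing page, and average rank for its target keyword from a
# third-party rank-tracking tool.
analytics_csv = """month,sessions
April,310
July,720
"""
rank_csv = """month,rank
April,18
July,8
"""

def load(text):
    """Parse a small CSV export into a list of dicts."""
    return list(csv.DictReader(StringIO(text)))

sessions = {row["month"]: int(row["sessions"]) for row in load(analytics_csv)}
ranks = {row["month"]: int(row["rank"]) for row in load(rank_csv)}

# Cross-check the two sources: a traffic uplift is far more convincing
# when the rank tracker shows an improvement over the same period.
uplift_pct = round(100 * (sessions["July"] - sessions["April"]) / sessions["April"], 1)
rank_gain = ranks["April"] - ranks["July"]  # positive = moved up the SERP
print(f"Sessions up {uplift_pct}%, rank improved by {rank_gain} places")
```

When the Search Queries trend, the Analytics uplift and the rank tracker all point the same way, the chance that the improvement is a counting quirk in any single tool becomes very small.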
The final result is a data set with a very small margin of error, and a more insightful set of metrics than Analytics alone would provide. In essence, we're getting a 'before and after' picture of users' journeys.
We can monitor metrics like bounce rate and pages per session to evaluate the effectiveness of on-page content and conversion optimisation. We can also see how users behave when they encounter our results in Google SERPs, and optimise meta data accordingly.
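One simple way to quantify the SERP-behaviour side of this is click-through rate, derived from the Search Queries impressions and clicks: if CTR rises while average position stays broadly similar, that's a reasonable signal that a meta data change is working. The figures below are hypothetical:

```python
def ctr(clicks, impressions):
    """Click-through rate from Search Queries figures, as a percentage."""
    return round(100 * clicks / impressions, 2)

# Hypothetical figures before and after rewriting a page's title
# and meta description.
before = ctr(clicks=45, impressions=1200)
after = ctr(clicks=160, impressions=3100)
print(before, after)  # 3.75 5.16
```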
So, the fact that SEOs have had to move to using more data sources for reporting can only be a good thing. The more data you have at your disposal, the more you can discover about your SEO campaigns, and the more you can do to make things better.