Imagine you walk into a restaurant and, instead of a menu, you are given a list of all the ingredients the kitchen uses. You know what could go into a meal, but not exactly what the recipes are. This is essentially what has happened with the recent leak of Google Search's Content Warehouse API documentation. Google has stated, though, that a lot of context is missing from this leak.

For many SEO professionals, much of this leak does not come as a huge shock but rather confirms what was already believed. There are, however, a number of things that directly contradict what Google has previously communicated.

By and large, the leaked information has reinforced our approach and strategy, but there have been some key learnings that came as a surprise to the SEO world.

Google’s Sandbox

For a long time, it was believed that Google might suppress the rankings of new websites, keeping them in a metaphorical sandbox until they establish trust and credibility. Google's position has consistently been that no sandbox exists and that all websites are treated equally by its search algorithm.

This leak has brought to light the apparent existence of a sandbox mechanism that Google employs to control the visibility of new websites within its results. It essentially acts as a probationary period, designed to stop low-quality or spammy websites from quickly achieving high visibility and cluttering the results pages. While the intention is to limit low-quality websites and pages, it has a knock-on effect on legitimate new websites looking to build their brand.
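
To make the idea concrete, here is a minimal sketch of how a sandbox-style dampening could work in principle. The leaked documentation reportedly includes a hostAge attribute; everything else here (the probation window, the linear ramp-up, the function itself) is an assumption for illustration only, not Google's actual logic.

```python
# Purely illustrative sketch of sandbox-style dampening for new sites.
# "hostAge" is an attribute reported in the leaked documentation; the
# thresholds and formula below are assumptions, not documented values.

from datetime import date


def sandbox_dampening(first_seen: date, today: date,
                      probation_days: int = 180) -> float:
    """Return a hypothetical multiplier in [0, 1] applied to a new site's score.

    Sites younger than `probation_days` get a proportionally reduced score;
    established sites pass through unchanged.
    """
    age_days = (today - first_seen).days
    if age_days >= probation_days:
        return 1.0
    return age_days / probation_days  # linear ramp-up during probation


# Example: a site first crawled 90 days into a 180-day probation window
print(sandbox_dampening(date(2024, 1, 1), date(2024, 3, 31)))  # 0.5
```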

The key takeaway is the importance of patience and consistency when launching a new website, allowing time for Google to thoroughly assess your website's credibility and value to users. We have long been strong advocates for publishing high-quality content consistently, building reputable backlinks and engaging with your audience to demonstrate the qualities Google prizes: Experience, Expertise, Authoritativeness and Trustworthiness (E-E-A-T).

Domain Authority

SEO professionals have looked at domain authority as a way to assess a website's overall strength and credibility in the eyes of search engines.

Google, however, has consistently denied that domain authority is part of its algorithm, instead emphasising a more holistic approach to ranking that focuses on factors like content quality, relevance and user experience. This stance has led to a perception among SEO professionals that Google's algorithms are far more complex and nuanced than simply assigning a single number to a website's authority.

In the leak, though, we can see a metric called "siteAuthority". Similar in concept to domain authority, this metric suggests that Google assesses a website's overall authority and credibility and factors it into rankings further down the line. Again, this is not necessarily a bad thing, as it is aimed at ensuring the best user experience across different queries; however, it is another hurdle that new websites will have to overcome.
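
As a rough illustration of the concept, the sketch below blends a site-level authority score into a page-level score. "siteAuthority" is the name from the leak, but its scale and how it is combined with page-level signals are unknown; the weighting here is an assumption made to show the idea.

```python
# Illustrative sketch only: the weighting below is a made-up parameter,
# not a documented Google value.


def blended_score(page_relevance: float, site_authority: float,
                  authority_weight: float = 0.3) -> float:
    """Combine a page-level relevance score with a site-level authority score.

    Both inputs are assumed to be normalised to [0, 1].
    """
    return (1 - authority_weight) * page_relevance + authority_weight * site_authority


# Two equally relevant pages: the one on the more authoritative site wins
print(blended_score(0.8, 0.9))  # 0.83
print(blended_score(0.8, 0.2))  # 0.62
```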

Chrome Data Controversy

Google has faced scrutiny over its handling of user data and the extent to which it influences search rankings.

The crux of the issue lies in Google's potential use of Chrome to track user behaviour beyond traditional cookies. While user behaviour has historically fed into Search Engine Results Pages (SERPs) via consented cookie data, the concern is that Google appears to have explored methods beyond cookies within Chrome to monitor user activity, potentially influencing search outcomes.

Click Count Contradiction

Google has consistently maintained that clicks are not a direct ranking factor in its search algorithm, instead emphasising high-quality, relevant content as well as other on-page and off-page factors for determining search rankings. The leak, however, has revealed a system known as NavBoost that seems to contradict these claims.

NavBoost is described as a mechanism that leverages click-driven measures to influence search rankings, suggesting that user behaviour plays a role in determining which websites appear at the top of search results.

It categorises clicks into "badClicks" (bounces), "goodClicks" (engagement) and "lastLongestClicks" (dwell time). The idea is that user behaviour, such as how long someone stays on a page, can impact rankings, contradicting Google's previous statements that its algorithm does not factor in clicks. This raises the question: to what extent does user behaviour shape search results?
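
The sketch below illustrates how such click bucketing might work. The bucket names come from the leaked documentation, but the dwell-time threshold and session logic are assumptions made purely for illustration.

```python
# Sketch of NavBoost-style click classification. Bucket names are from the
# leak; the 10-second bounce threshold is a hypothetical value.

from dataclasses import dataclass


@dataclass
class ClickCounters:
    bad_clicks: int = 0           # quick bounces back to the results page
    good_clicks: int = 0          # visits showing meaningful engagement
    last_longest_clicks: int = 0  # the final, longest click in a session


def classify_click(counters: ClickCounters, dwell_seconds: float,
                   is_last_click_in_session: bool) -> None:
    """Bucket a single click using hypothetical dwell-time thresholds."""
    if dwell_seconds < 10:            # assumed bounce threshold
        counters.bad_clicks += 1
    else:
        counters.good_clicks += 1
        if is_last_click_in_session:  # the searcher stopped searching here
            counters.last_longest_clicks += 1


counters = ClickCounters()
classify_click(counters, dwell_seconds=4, is_last_click_in_session=False)
classify_click(counters, dwell_seconds=95, is_last_click_in_session=True)
print(counters)  # ClickCounters(bad_clicks=1, good_clicks=1, last_longest_clicks=1)
```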

Authorship

As mentioned, not much of this leak comes as a total shock. The importance of authorship, for example, was already known in most SEO circles; it feeds into the A (Authoritativeness) of the E-E-A-T framework that Google has encouraged for years.

What the leak has shone a light on, though, is just how valuable these signals are. Google is believed to assess the credibility and expertise of content creators, using authorship signals to evaluate the trustworthiness and relevance of web pages. The suggestion is that recognised authors and their previous work can influence search rankings.

In the world of SEO and digital marketing, this is all quite interesting and somewhat revelatory. But to suggest that it is a ground-breaking, stop-the-presses scoop that changes everything would be a touch dramatic. If the leak has revealed anything, it is that many of our beliefs were right from the outset: produce content aimed at generating good-quality clicks; building and demonstrating authority is only a good thing; and being consistent and patient will pay dividends down the line.

If you want to know more about what this API leak means for you or how we can help you with your SEO, please contact hello@fingo.co.uk.