Understanding Google Algorithms

Google tries to get a holistic view of websites. It avoids disclosing details about its core algorithms. However, some general information has been made available in the Google Webmaster hangout videos.

In general, Google algorithms try to answer the following questions:

What is the site about?
Where should we show it in rankings?
How should we show it in search results?
How much should we trust the information on the site?
How much should we trust the links to the site?

Many wish Google would tell them how to make their website rank higher, but its algorithms do not produce such data. Their purpose is to find the most relevant content for user queries. When Google provides feedback in Search Console, it makes sure the information is actionable and that users can do something about the reported issues.

Updates & The Release Process

• Google does not communicate about algorithm updates unless they provide value to the user.
• Before releasing a new algorithm or an update, Google performs many tests using existing data. This may include user reviews.
• The output of algorithms is double-checked and tweaked until Google is confident in it.
• The release cycle of an algorithm shortens as Google becomes more confident in it.
• Some live evaluations can be performed on a small percentage of traffic; these may include A/B testing.
• Google can collect live data after updates have been published to verify that they behave correctly.
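As a rough illustration of the kind of traffic splitting an A/B evaluation implies, the sketch below deterministically assigns a small fraction of users to a "treatment" bucket by hashing a user identifier together with an experiment name. The function name, parameters, and hashing scheme are assumptions for illustration, not Google's actual implementation.

```python
import hashlib

def assign_bucket(user_id: str, experiment: str,
                  treatment_fraction: float = 0.01) -> str:
    """Deterministically assign a user to 'treatment' or 'control'.

    Hashing the user ID together with the experiment name yields a
    stable, roughly uniform value in [0, 1), so the same user always
    lands in the same bucket and only ~treatment_fraction of traffic
    sees the experimental algorithm. (Hypothetical sketch, not
    Google's scheme.)
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF
    return "treatment" if fraction < treatment_fraction else "control"
```

Because the assignment is a pure function of the inputs, no per-user state needs to be stored, and repeated visits by the same user consistently see the same variant.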

SEO Mistakes & Best Practices

- Some algorithms are performed when a page is visited or revisited, some run at intervals (daily, weekly, monthly...) and some are performed manually from time to time.
- It may take a couple of weeks before new websites settle at the level Google wants them to be.
- Google tries to identify the different parts of a website; URLs can be used to identify its structure.
- Sometimes, subdomains are treated as part of main domains.
- Algorithms look at the ratio of valuable versus low-value content to assess the overall value of a website.
- If you fix issues on your website, algorithms will take them into account automatically, even if this can take some time.
- If you have a penalty, it must be revoked in addition to fixing your website's issues before any corresponding ranking demotions are lifted.
- Google has algorithms aimed at detecting and neutralizing the impact of negative SEO.
- Algorithms measure different things. They don't overlap.
- Algorithms can build a complete picture of a small website quickly, but this can take much longer for big websites.
- Google algorithms try to understand the context of keywords when different meanings are possible.
- Google algorithms are not good at guessing or reading between the lines: make sure keywords appear on your pages.
- Google has no definition of grey hat; there is only white hat and black hat. Anything that falls in between may be reviewed manually.
- There is a doorway page filtering algorithm.
- Google algorithms don't like conflicting ranking signals from websites.
- Google algorithms do assign weights to websites.
- Most Google algorithms try to be as granular as possible.
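To illustrate the point above about URLs revealing site structure, the sketch below groups a list of URLs by their first path segment to approximate a site's sections. This is a hypothetical illustration of the idea, not how Google actually analyzes site structure; the function name and grouping rule are assumptions.

```python
from collections import Counter
from urllib.parse import urlparse

def section_counts(urls):
    """Approximate a site's sections by grouping URLs on their first
    path segment (e.g. /blog/..., /shop/...).

    Hypothetical sketch: real crawlers would combine many more signals
    (internal links, breadcrumbs, sitemaps) to infer structure.
    """
    counts = Counter()
    for url in urls:
        path = urlparse(url).path.strip("/")
        # An empty path means the homepage; otherwise take the leading segment.
        section = path.split("/")[0] if path else "(root)"
        counts[section] += 1
    return counts
```

For a crawl containing mostly `/blog/...` URLs and a handful of `/shop/...` URLs, the resulting counts make the relative weight of each section of the site immediately visible.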

SEO Misconceptions & Mistakes

- Algorithms don't hold grudges: if something was bad in the past and you fixed it, they forget about it.
- Algorithms don't make exceptions for individual sites.
- Algorithms don't try to assess the usability of websites.
- Algorithms don't stop crawling and analyzing your site when it is under a manual penalty.
- There is no such thing as a Google sandbox (i.e., limitation) for new websites.
- A new website can outrank existing websites; algorithms don't prevent this.
- Google Analytics data is not taken into account to compute rankings.
- Quality related algorithms don't take manual penalties into account to produce metrics.
- Algorithms are as global as possible, not country specific.