SEO Mistakes & Best Practices

- Google does not index dynamic content hidden behind tabs.
- Google really wants to know what users will actually see on a web page, rather than content they merely could see.
- Google's algorithms discount content that is not immediately displayed to users, for example content revealed only after clicking a tab or a button.
- Web content that is never shown to the user is ignored for search results; algorithms try to detect such content.
- Google's algorithms separate primary content from boilerplate content (i.e., content present on every page, such as templates) to establish relevancy.
- Web content in footers, headers, and menus does not have much impact on relevancy, but content in <h1> headers does.
- Google's algorithms filter out pages full of advertising.
- Google differentiates popularity (visitors) and reputation/authority (backlinks).
- Sometimes, very similar content is treated as the same content.
- The text and content surrounding an image are taken into account to establish its relevance.
- Rich markup can be used for snippets; in the future, it may be used as a ranking factor at Google (see the rich markup sketch after this list).
- Google's algorithms may flag pages as soft 404s and then treat them as real 404s.
- Not only must content be high quality, people must also recognize it as such by linking to it.
- Shared content is interpreted as engaging content.
- If Google already has content in its index, it does not display it twice in search results.
- Algorithms do try to pull content from iframes.
- If you underscore and link every other word, content becomes unreadable. Hence, it is considered low-quality content.
- If the content of a source page and its 301-redirect target is the same, any algorithmic penalty can be forwarded too. Ditto for manual penalties.
- Content in a tab must be easy for the user to access in order to be taken into account for ranking.
- More than one <h1> per page is acceptable, but don't overdo it.
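
As mentioned in the rich markup item above, here is a minimal sketch of what such markup can look like, using Schema.org vocabulary in JSON-LD form. The product name, rating value, and review count are hypothetical placeholders, not values from this document.

```html
<!-- A sketch of rich markup for snippets; all values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.2",
    "reviewCount": "87"
  }
}
</script>
```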

SEO Tips

- Avoid hiding resources, or blocking access to them, with robots.txt (a robots.txt sketch follows this list).
- The more Google trusts a website, the more its metadata is trusted.
- User-generated content (UGC), such as comments and reviews, is considered relevant content on a webpage.
- Comments and reviews should be moderated for spam and low quality for better rankings.
- UGC is not fully trusted by Google algorithms.
- There is a maximum amount of content (in megabytes) that crawlers download per page.
- Use feeds and submit them to PubSubHubbub to help Google find out about new content.
- Hosting a lot of content that is identical to other websites' is a signal of low value.
- Publishing automatically translated content is a signal of low value.
- Do NOINDEX low-quality or obsolete content for better rankings (a noindex sketch follows this list).
- The age of web content is not a predictor of low value.
- Google's algorithms try to detect breadcrumbs; using markup helps (a breadcrumb markup sketch follows this list).
- Do use Schema.org's Article markup for in-depth articles (an Article markup sketch follows this list).
- Implementing a blog brings diversity in user queries and traffic.
- Is it content users would want to see? If not, then forget about it!
- User-generated content is, in general, of low quality.
- Don't develop an "all content is going to help with ranking" mentality.
- Creating lists of keywords is not a good idea, because they quickly look like keyword stuffing.
- Adding thumbs up/down buttons helps gather feedback from users about content quality.
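
A sketch of the robots.txt advice above. The paths are hypothetical; the point is to keep rendering resources (CSS, JavaScript, images) crawlable so Google can see the page as users do.

```
# robots.txt (hypothetical paths)
User-agent: *
# Blocking /admin/ is fine; it is not needed to render pages.
Disallow: /admin/
# Do NOT block rendering resources, e.g.:
#   Disallow: /css/
#   Disallow: /js/
# Keep them crawlable instead:
Allow: /css/
Allow: /js/
```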
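A minimal sketch of keeping low-quality or obsolete pages out of the index, as suggested above. Either the robots meta tag or the equivalent HTTP header works.

```html
<!-- In the <head> of the low-quality or obsolete page: -->
<meta name="robots" content="noindex">
<!-- Or, as an HTTP response header: X-Robots-Tag: noindex -->
```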
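A sketch of breadcrumb markup using Schema.org's BreadcrumbList type in JSON-LD; the names and URLs are placeholders.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://www.example.com/blog/" }
  ]
}
</script>
```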
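And a sketch of Schema.org's Article markup for an in-depth article; the headline, author, and date are placeholders.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example In-Depth Article",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2015-01-01"
}
</script>
```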