Webfia SEO/SEM Consulting ~ LinkedIn SEO/SEM Consultant

Google struggles with Angular
— tried submitting comprehensive sitemaps
— Google managed to find around 200 of the ~25,000 pages
— failed to index any more even after repeated sitemap submissions

AngularJS SEO
— it’s hard to find a fully indexed site built with Angular, and a lot of people are seeking help
— you can use Angular; you just need to know what to look out for to make the website crawlable


1. Remove hashtag
— remove the forced hash that Angular sets by default, e.g. example.com/#/page
— https://scotch.io/quick-tips/pretty-urls-in-angularjs-removing-the-hashtag
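A minimal sketch of step 1, assuming a standard AngularJS app (the module name 'app' is illustrative): enabling HTML5 mode on `$locationProvider` is what drops the default hash from route URLs.

```javascript
// Illustrative AngularJS config: html5Mode removes the default hash
// from routes, e.g. example.com/#/page becomes example.com/page.
// It also requires <base href="/"> in index.html, plus server-side
// URL rewriting so deep links serve the app instead of returning 404s.
angular.module('app').config(['$locationProvider', function ($locationProvider) {
  $locationProvider.html5Mode(true);
}]);
```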

2. Fix relative URLs
— relative link URLs were being appended to whatever the current URL already was
— i.e. this: <a href="en/countries/united-kingdom">…</a>
— instead of: <a href="/en/countries/united-kingdom">…</a>
— so following a homepage link would append an extra "/en" to the URL
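The appending behaviour above is just standard relative URL resolution, which can be checked in Node.js with the WHATWG URL API (`resolveHref` is an illustrative helper name):

```javascript
// Demonstrates why hrefs without a leading slash double up path
// segments: a relative href is resolved against the current page URL.
function resolveHref(href, currentUrl) {
  return new URL(href, currentUrl).toString();
}

// From a page under /en/, a relative href re-appends "en/":
resolveHref('en/countries/united-kingdom', 'https://example.com/en/');
// → https://example.com/en/en/countries/united-kingdom

// A root-relative href (leading slash) resolves correctly:
resolveHref('/en/countries/united-kingdom', 'https://example.com/en/');
// → https://example.com/en/countries/united-kingdom
```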

3. Pre-render
— pre-render the Angular pages server-side and direct Google to that version, while sending users directly to the client-side Angular version (PhantomJS was used for this)
— if the server can’t keep up with rendering the crawlers’ requests on demand, this can result in an increase in server errors
— the solution is to pre-render pages in advance server-side, so that when crawlers reach the server, the page is already rendered

4. Fragment Meta
— Googlebot requests the page with a URL query parameter that looks like: ?_escaped_fragment_=
— this parameter directs Googlebot to the server to fetch a pre-rendered version of the page
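A sketch of how the rewrite works under this scheme (since deprecated by Google; the helper name and encoding choice are illustrative): a "#!" URL becomes a query-string request the server can answer with pre-rendered HTML.

```javascript
// Rewrites a hash-bang URL into its ?_escaped_fragment_= form,
// e.g. example.com/#!/page -> example.com/?_escaped_fragment_=%2Fpage
function escapedFragmentUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url; // no hash-bang, nothing to rewrite
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```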

Topics SEO, AngularJS SEO
Source https://www.deepcrawl.com/knowledge/best-practice/angular-js-and-seo/
