In recent months, we’ve seen several important technical announcements from Google, such as an update on JavaScript crawling support, the migration toward mobile-first indexing, the release and expanded support of AMP in search results, and the growth of search results features, from rich snippets to cards to answers.
As a result, a number of technical items must be taken into consideration when doing an SEO audit to validate crawlability and indexability, as well as to maximize visibility in organic search results:
1. Mobile web crawling
Google has shared that a majority of its searches are now mobile-driven and that it is migrating toward a mobile-first index in the coming months. When doing a technical SEO audit, it is now critical to review not only how the desktop Googlebot accesses your site content but also how Google’s smartphone crawler does.
You can validate your site’s mobile crawlability (errors, redirects and blocked resources) and content accessibility (Is your content correctly rendered?) with the following technical SEO tools:
- Google page-level mobile validators: Google’s Mobile-Friendly Test and Search Console’s “Fetch as Google” functionality (with the “Mobile: Smartphone” Fetch and Render option) are the simplest and best ways to check how Google’s mobile crawler “sees” any given page of your site, so you can use them to check the mobile status of your site’s top pages. Additionally, Google Search Console’s “Mobile Usability” report identifies specific pages on your site with mobile usability issues.
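If you want to run a first-pass check on many pages before digging into the Google tools above, a small script can help. Below is a minimal Python sketch, assuming two things not stated in the article: the smartphone Googlebot user-agent string (Google documents and occasionally changes it, so treat the value here as an example), and a rough heuristic that a responsive page declares a viewport meta tag, which is one common signal mobile-friendly checks look at.

```python
import re

# Smartphone Googlebot user-agent string as documented by Google at the time
# of writing. Google may change it, so treat this as an example value and
# check the official documentation before relying on it.
SMARTPHONE_GOOGLEBOT_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile "
    "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def has_viewport_meta(html: str) -> bool:
    """Rough heuristic only: a responsive page normally declares a viewport
    meta tag, and its absence is one common reason a page fails a
    mobile-friendly check. This does not replace Google's own validators."""
    return re.search(
        r'<meta[^>]+name=["\']viewport["\']', html, re.IGNORECASE
    ) is not None
```

You could fetch each of your top pages with `SMARTPHONE_GOOGLEBOT_UA` as the `User-Agent` header and flag any page where `has_viewport_meta` returns `False` for manual review in the Mobile-Friendly Test.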
- SEO crawlers with a ‘Smartphone Googlebot’ option: Most SEO crawlers now let you specify or select a user agent, so you can simulate the behavior of Google’s mobile crawler. Screaming Frog SEO Spider, OnPage.org, Botify, Deepcrawl and Sitebulb all allow you to crawl your site as the mobile search crawler would. Screaming Frog also offers a “List” mode to verify the status of a specific set of pages, including how your mobile pages render.
- SEO-targeted log analyzers: Last year, I wrote about the importance of doing log analysis for SEO and the questions it allows us to answer directly. There are now log analyzers focused entirely on SEO, such as the Screaming Frog Log File Analyser (for smaller log files) and Botify and OnCrawl (for larger ones). These tools also make it easy to compare our own crawls against what the smartphone Googlebot has actually accessed and identify any gaps.
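The gap analysis these log tools perform can be illustrated with a short script. This is a hedged sketch, not a replacement for the tools above: it assumes your server writes the standard Apache/Nginx combined log format (request and user agent in the quoted fields), and it distinguishes the smartphone Googlebot from the desktop one simply by the presence of “Mobile” in the user-agent string.

```python
from collections import defaultdict

def parse_combined_log_line(line):
    """Extract (url, user_agent) from a combined-format access log line.
    Assumes the standard layout: ... "GET /url HTTP/1.1" status size
    "referer" "user-agent". Returns None for lines that don't match."""
    parts = line.split('"')
    if len(parts) < 6:
        return None
    request, user_agent = parts[1], parts[5]
    tokens = request.split()
    if len(tokens) < 2:
        return None
    return tokens[1], user_agent

def googlebot_crawl_gap(log_lines):
    """Group crawled URLs by Googlebot type and return the URLs the desktop
    crawler fetched that the smartphone crawler did not (the 'gap')."""
    hits = defaultdict(set)  # crawler type -> set of URLs it requested
    for line in log_lines:
        parsed = parse_combined_log_line(line)
        if parsed is None:
            continue
        url, ua = parsed
        if "Googlebot" not in ua:
            continue  # ignore regular visitors and other bots
        crawler = "smartphone" if "Mobile" in ua else "desktop"
        hits[crawler].add(url)
    return sorted(hits["desktop"] - hits["smartphone"])
```

In production you would also want to verify Googlebot hits via reverse DNS, since the user-agent string alone can be spoofed.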
If you want to learn more about Mobile-First SEO, you can check out this presentation I did a couple of months ago.
2. JavaScript crawling behavior & content rendering
Three years ago, Google announced that it was able to execute JavaScript in order to better understand pages. However, JavaScript tests — like this recent one from Bartosz Goralewicz or this one from Stephan Boyer — have shown that Google’s ability to render depends on how the JavaScript is implemented and on the framework that’s used.
It is therefore critical to follow certain best practices, such as taking a progressive enhancement approach to keep content accessible, to avoid deprecated techniques like the former AJAX crawling proposal, and to rely on JavaScript only when it is truly necessary. Indeed, tests run by Will Critchlow also showed improved results after removing a site’s reliance on JavaScript for critical content and internal links.
When doing an SEO audit, it is now a must to determine if the site is relying on JavaScript to show its main content or navigation and to make sure it is accessible and correctly rendered by Google.
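One practical way to spot JavaScript reliance during an audit is to compare the raw, pre-rendering HTML source against a list of content and links you consider critical. Below is a minimal Python sketch of that idea; the function name and the example phrases are hypothetical, and anything it reports as missing should be cross-checked against what “Fetch and Render” actually shows.

```python
def missing_from_raw_html(raw_html, critical_items):
    """Return the critical phrases or link paths that do NOT appear in the
    raw HTML source.

    raw_html should be the page source as fetched WITHOUT JavaScript
    execution (e.g. via urllib or curl). Anything reported missing here is
    presumably only reachable after rendering, which signals JavaScript
    reliance worth auditing against Google's Fetch and Render output."""
    lowered = raw_html.lower()
    return [item for item in critical_items if item.lower() not in lowered]
```

For example, feeding it the fetched source of a category page along with your main navigation paths would surface any internal links that only exist in the rendered DOM.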
[Read the full article on Search Engine Land.]