You, Me and the Googlebot

If you have heard it once, you have heard it a million times: the speed of a website influences everything from the user's digital experience to the algorithmic rankings determined by search engines.

 

The speed of a website, meaning how long it takes to fully load in the user's browser, is only one variable among hundreds (likely thousands) that may be undermining an enterprise's search engine optimization efforts and, ultimately, its ability to generate a positive response from an audience. Performance from a truly technical perspective is far different from performance of a tactical nature, and technical performance may ultimately be more telling of long-run SEO success than most 'Net professionals realize. Many factors can prevent one site from ranking above another, and the more one knows about the reasons, the closer he or she is to implementing a solution. The problem is that it is not always easy to identify the triggers of a poor-performing SEO campaign.

 

3 Recent Developments to Know for Better Search Engine Optimization

 

Let's review three new Google-related developments that matter to placement on the search engine results pages and might even provide a quick fix for the performance problems of a company's SEO initiatives.

 

LOCAL INDEXING:

Google recently announced the introduction of locale-aware crawl configurations for Googlebot on pages that it detects may adapt the content served based on the request's language and perceived location. Until now, Googlebot has requested pages without setting an Accept-Language HTTP request header and has used IP addresses that appear to be located in the U.S., so not all content variants of locale-adaptive pages may have been indexed completely. The new crawling configurations are enabled for pages Google detects to be locale-adaptive, so SEOs may notice changes in how Google crawls and displays a site in its search results.
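To make the mechanism concrete, here is a minimal sketch of the server-side locale negotiation that a locale-adaptive page performs, and that the new crawl configurations are designed to surface. The function name and the supported-locale set are hypothetical illustrations, not anything Google publishes; the parsing is a simplified take on the Accept-Language header format (e.g. "de-DE,de;q=0.9,en;q=0.5"):

```python
def pick_locale(header, supported, default="en"):
    """Return the best supported locale for an Accept-Language header.

    A simplified sketch: parses comma-separated language ranges with
    optional ;q= weights, then returns the highest-weighted language
    (or its base language, e.g. "de" for "de-DE") that the site supports.
    Falls back to `default` when nothing matches or the header is empty,
    which mirrors why Googlebot historically saw only the default variant.
    """
    candidates = []
    for part in header.split(","):
        piece = part.strip()
        if not piece:
            continue
        if ";q=" in piece:
            lang, q = piece.split(";q=", 1)
            try:
                weight = float(q)
            except ValueError:
                weight = 0.0
        else:
            lang, weight = piece, 1.0
        candidates.append((lang.strip().lower(), weight))
    # Highest-weighted language the site actually supports wins.
    for lang, _ in sorted(candidates, key=lambda c: c[1], reverse=True):
        if lang in supported:
            return lang
        base = lang.split("-")[0]
        if base in supported:
            return base
    return default
```

A request carrying "de-DE,de;q=0.9,en;q=0.5" against a site supporting English and German would be served the German variant; a request with no Accept-Language header at all (as Googlebot sent previously) falls through to the default.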

 

DESIGN & SEO:

Don't believe that Web design plays a role in SEO? Think again. In late 2014, Google announced that its indexing system had started rendering pages much like a modern Web browser does, with JavaScript and CSS turned on. To ensure that a website is rendering correctly and being indexed properly, allow Googlebot (within the robots.txt file of the website) to access the design-related files used on the page. Google has warned that preventing (i.e., disallowing) Googlebot from crawling these files can result in suboptimal rankings because it impacts how the search engine's algorithms render and index content.
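As a sketch, the relevant robots.txt directives might look like the following. The patterns are illustrative only; a real site should allow whichever paths actually serve its stylesheets and scripts, and Googlebot's robots.txt handling supports the * and $ wildcards used here:

```
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

The simplest check is to confirm that no existing Disallow rule covers the directories holding CSS and JavaScript files before adding explicit Allow lines.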

 

SEMANTIC MARKUP:

Semantic search is once again in the digital news. Google expanded a feature that enables musical artists to add their events to the search results pages; results can now show expanded answer cards with the on-sale date, the availability of tickets and a link to the artist's preferred ticketing site.
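Event cards like these are driven by structured data markup on the artist's own pages. A sketch of schema.org JSON-LD for a music event might look like the following; the artist, venue, dates and URL are all hypothetical placeholders, and the offers block is where the on-sale date (validFrom), ticket availability and ticketing link live:

```json
{
  "@context": "http://schema.org",
  "@type": "MusicEvent",
  "name": "Example Artist Live",
  "startDate": "2015-09-20T20:00",
  "location": {
    "@type": "Place",
    "name": "Example Arena",
    "address": "123 Main St, Springfield"
  },
  "offers": {
    "@type": "Offer",
    "url": "http://www.example.com/tickets",
    "availability": "http://schema.org/InStock",
    "validFrom": "2015-08-01"
  }
}
```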

 

For SEOs, the takeaway is that Google is putting more virtual stock in its Knowledge Graph and providing Web workers with additional opportunities to refine and improve their listings. Once again, countless variables, both technical and tactical, influence a website's SEO performance. Let these three developments open your digital eyes to what may be preventing your enterprise from achieving higher rankings, and let them inspire a closer look at the relationship a brand has with Googlebot and how it can be improved.

 

BONUS: Does Domain Name Choice Influence Ranking?

 

The SEO community has long debated whether domain names influence ranking. Find out whether exact-match domains, registration length and general transfer volatility matter to SEO today.