6 Ways to Get Slammed By Google

Posted on 9.19.2016

:: By Travis Bliffen, Stellar SEO ::


One of the keys to success online is to establish an effective SEO strategy, one that services a niche and attracts people to your site. This makes Google an incredibly important piece of the puzzle for website owners and content creators, as it’s what will likely drive most site traffic. 

While building a good website and creating a strong content plan isn’t rocket science, it takes patience and planning: take the time to create good content, and Google will reward you. However, Google can give, and Google can take away: while appearing on the front page of Google can be tough, it’s frighteningly easy to get dumped and banned from search results altogether.

One day you might have spectacular growth in traffic, only for that traffic to disappear and your site to be a ghost town the next. It’s happened to countless sites, and most of the time it’s impossible to know why. The dreaded “Google ban” can be devastating for a website’s traffic, but there are ways to protect your site.

Understanding Google

Google is a complex, ever-evolving algorithmic beast. It started off as merely an indexer, a list of websites that directly matched whatever keyword was being searched. Over the years, however, it has evolved into something so complicated it can be nearly impossible to understand. Of course, there are ways to “game” the system, but push Google up against a wall and it’ll push back, and you have far more to lose.

The search engine’s main goal is to provide the best possible results for a search query. “Best” is obviously rather vague, because everyone has different interests and tastes. What it ultimately comes down to is authority, credibility and quality (and quantity, so long as the other prerequisites are met). Websites often try to appear to have these traits so as to trick Google and boost their ranking in search results. As you might imagine, Google pushes away these sites with ease, banishing them from search results, sometimes for good.

The scary thing is that a website can be targeted by Google despite not knowingly doing anything wrong. That’s the harsh reality of something as big and complex as Google: it has no sympathy, and it expects everyone to play by the rules. You may think your modest blog is immune, but you never know what might be lurking in the basement of your website’s comments section or backend. That’s why it’s so important to be proactive.

6 Reasons Why Google Has Penalized Your Site

Hosted content may be infected

Any website that hosts malware or redirects from a search result to a suspicious website can be deindexed from Google and removed from results. Sometimes these sites purposefully infect visitors’ computers; other times a site is infected without its owner’s knowledge, which can be particularly devastating. Either way, it’s a no-brainer that Google is going to remove any site that is distributing a virus, Trojan or other harmful program.

The best way to avoid this would be to check and double-check any software you plan to distribute from your website, just to make sure it isn’t harmful. This is especially important for software you have received from another website and are planning to host on your own site. 

Hackers can also gain access to your site and use it to distribute malicious software without you knowing. Some 30,000 websites are hacked every day, and the hackers’ intentions vary, from manipulating keywords and meta tags to hosting malicious content and replacing safe software with malware.

Infected websites are generally flagged in search results with a warning that reads, “This site may harm your computer,” which visitors see before they ever reach the site. Thankfully, Google offers webmasters a number of useful tools, including Search Console and its security reports, to help identify and clean up the issue.

The site is using automatically generated content

Google lists automatically generated content as one of the “techniques” that webmasters should avoid. Automated content is written by specifically designed tools, and is generally published online without human review or curation. It can also be content that is automated using synonymizing or obfuscation techniques, or text that has been generated from scraping Atom/RSS feeds or search results.

Not only does your website risk being removed from Google search results, but you’re also likely to be locked out of the AdSense program -- good luck getting back in once you’ve broken the rules.

Google wants to offer a good user experience, and content that misleads users, redirects them or exploits the algorithm will be removed. If a page creates what Google calls a “bad user experience,” the site will be deindexed.

Bots ruin the user experience

Did you know that 30 percent of all Internet traffic comes from malicious spambots? That’s bad news for anyone who doesn’t protect their site. These bots can scrape and steal content, overload servers and in some cases alter a site’s natural SEO. Chances are that if you’re a victim of a bot attack, your site will receive an algorithmic penalty on Google.

This can happen any number of ways. The Penguin algorithm update focuses on backlinks and anchor text distribution, and if bots are altering these or injecting links to sites already banned from Google, your site can be hit with a penalty or ban. The Panda update focuses on content quality, which bots can also degrade.

Bots can not only devastate the functionality and user experience of your site, but they can also harm your site’s relationship with Google. Database startup Crunchbase experienced this when bots crashed its site by scraping long-tail pages.

These sorts of attacks can be stopped ahead of time. For large, high-volume sites, bot-mitigation software such as Distil Networks can block the majority of bots and automated attacks, letting webmasters shift their focus away from fending off malicious traffic and back onto their site’s organic growth. For smaller websites, there are several good plugins for popular content management systems that can help. Here are a few:

WordPress -  A great plugin called WP-SpamShield Anti-Spam blocks all automated spam bots. It’s compatible with all of the leading social plugins and registration forms, and is great for locking out the endless stream of spambot registrations and comments on WordPress sites.  

Drupal - The simple yet efficient Honeypot module is unobtrusive and powerful in its effort to stop bots from registering and commenting on a site (a bare-bones sketch of the honeypot technique follows this list).

Joomla - Securitycheck Pro is designed to protect your website without affecting your server's speed. Its firewall protects against more than 90 SQL, LFI and XSS attack patterns, and also offers a malware scan to fight against infected software.
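The same basic idea behind modules like Honeypot can be added to almost any form by hand. Below is a bare-bones sketch of the technique, not any particular plugin’s implementation; the field name, class name and form action are placeholders, and you would pair the markup with server-side code that rejects any submission where the hidden field has a value.

    <!-- Honeypot sketch: the trap field is hidden from human visitors with CSS,
         but most spam bots fill in every field they find. A non-empty value on
         the server is a strong signal that the submission came from a bot. -->
    <style>
      .hp-field { position: absolute; left: -9999px; }
    </style>
    <form method="post" action="/comments">
      <label>Name <input type="text" name="name"></label>
      <label>Comment <textarea name="comment"></textarea></label>
      <!-- Humans never see or fill this field -->
      <input type="text" name="website_url" class="hp-field" tabindex="-1" autocomplete="off">
      <button type="submit">Post Comment</button>
    </form>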

Link scheming

Don’t get sucked in by unsolicited emails or calls offering backlinks and a good Google rank. Google measures the credibility of a site partly by how many high-authority backlinks (sites that link to it) it has. If someone is offering to sell you links or trade links with you, ignore them. Google calls these practices “link schemes,” and its guidelines cover things like exchanging money, goods or services for links and excessive reciprocal linking.

A harmless “Link me and I’ll link you” agreement with another site may not seem like a big deal, but Google is known to clamp down on sites that do this. In 2013, lyrics website Rap Genius offered to promote blogs that linked to its pages offering Justin Bieber lyrics. Google banned the site from search listings soon after.

The takeaway: if a company is offering cheap, fast rankings that sound too good to be true, they probably are. Don’t fall victim to “cheap” providers or you will end up spending a lot more in the long run!

Abusing Rich Snippets

Google is firm when it comes to sites abusing rich snippets. These enhanced listings appear in search results, drawing on structured markup that site owners add so their pages look cleaner and more detailed in results. The feature requires webmasters to identify the important information in their content and mark it up so that Google can recognize it and use it in search results.
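For context, “marking up” content usually means adding structured data in a format such as schema.org JSON-LD. The snippet below is a minimal sketch describing this article itself; the properties you actually need depend on your content type and Google’s current structured data guidelines.

    <!-- Minimal schema.org markup (JSON-LD) describing the real content of a page -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "6 Ways to Get Slammed By Google",
      "author": { "@type": "Person", "name": "Travis Bliffen" },
      "datePublished": "2016-09-19"
    }
    </script>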

Because the feature exists to improve search results, it’s understandable that Google dislikes sites that abuse it. Rich snippets can be exploited with fake content such as bogus reviews, or with cloaking tactics that show Google one thing and deliver visitors something completely different. As with pretty much anything that ruins the user experience, Google will promptly remove the snippet, and potentially the site, from search results.

Neglecting Technical SEO

Technical SEO comes down to how reliable and accessible your site is: the easier it is for a search engine to crawl and index your content, the better your chances of ranking well. Unfortunately, many sites neglect technical SEO, focusing instead on cheap black hat tactics and overly aggressive link building to attract traffic.

Google values two things very highly: high-quality content and a good user experience. If you have one but not the other, your chances of a good Google placement are significantly decreased. Site speed, for example, improves the user experience, and as such Google factors speed into its search algorithm as a ranking signal.

Mobile friendliness is also very important. Google says that if your page isn’t mobile friendly, its appearance in search results may suffer. You can improve the accessibility and design of your mobile site with a responsive design, while separate URLs for desktop and mobile are also an option (albeit an outdated one).
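At its simplest, a responsive approach serves one page that adapts its layout to the screen rather than a separate mobile URL. The sketch below shows the two basic pieces, a viewport meta tag and a CSS media query; the class name and breakpoint are arbitrary examples.

    <!-- Responsive basics: tell mobile browsers to use the device width,
         then adjust the layout on narrow screens with a media query -->
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      .sidebar { float: right; width: 30%; }
      @media (max-width: 600px) {
        .sidebar { float: none; width: 100%; } /* stack columns on small screens */
      }
    </style>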

There are a few other key components of technical SEO that you need to manage to ensure healthy Google placement:

Create HTML and XML sitemaps: Without these, Google’s crawling bots may have trouble finding content on your website (a minimal XML sitemap example follows this list).

Improve quality of content: Get rid of any thin or duplicate content. The Panda update made this type of content very damaging for sites that hosted it, and Google continues to evolve and change the way it measures the usability of a site.

Use rich snippets the right way: As mentioned above, don’t abuse rich snippets by trying to trick Google into thinking it’s content that it’s not. Rich snippets provide a useful means of attracting real organic traffic to your website. 
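To make the first item above concrete, an XML sitemap is just a plain file that lists the URLs you want crawled, saved as something like sitemap.xml and referenced from robots.txt or submitted through Search Console. The URLs and dates below are placeholders.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Minimal XML sitemap: one <url> entry per page you want Google to find -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-09-19</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/blog/technical-seo-checklist</loc>
        <lastmod>2016-09-01</lastmod>
      </url>
    </urlset>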

Online real estate is becoming more valuable by the second, and with that, the tactics unscrupulous companies employ to claim that real estate for themselves are becoming more common and more complex. Monitoring your site is a must in 2017.


About the Author

Travis Bliffen is the founder of Stellar SEO, a Web design and marketing firm located in Franklin, TN. Travis and his team are equipped to handle any size SEO project and have helped numerous businesses to date build a rock solid online presence. When you are ready for more leads and sales, it is time to get #stellarized. Connect on Facebook or Twitter @theseoproz
