
From Privacy to Penalties: Is Google giving users what they asked for? (Part III)

Posted on 1.19.2015

:: By Ben Oren, Dynamic Search ::


This is the third and final part of a series (read Part 1, read Part 2) examining Google’s practices and standing policies, specifically in light of its stated objectives of increasing access to information and improving user experience. This article takes a deeper look at one of Google’s most controversial patterns of behavior, one that has been at the heart of heated debate: its repeated violations of user privacy.

Users’ Fragile Right to Privacy

The vast majority of Google’s income comes from advertising, totaling approximately 16 billion USD per quarter, according to the latest report. To optimize advertisers’ spend, Google draws on users’ private information, obtained by scanning their Gmail accounts, search history, browsing history and ad clicks. All of this information is stored on Google’s servers, where it is analyzed to help determine which ads should be displayed. The same information also feeds Google Trends, helping identify which topics are about to go viral. On its own, this doesn’t seem so bad – who wants to see irrelevant ads?

It doesn’t stop there. In 2012, the Wall Street Journal revealed that Google had exploited a loophole in the Safari browser’s privacy settings to track the usage habits of iOS and Mac users, even though those settings signaled that the users weren’t interested in sharing that information. Google systematically collected data about users without their knowledge or consent. After this came to light, Google was fined 22.5 million dollars – a ridiculously low, insignificant sum considering the volume of information it managed to collect and profit from.

Similarly, in early 2013, the New York Times reported on yet another scandal involving Google’s disregard for user privacy. After being sued by 38 U.S. states, Google admitted that while capturing imagery for its Street View tool, its camera-equipped vehicles didn’t just record footage of surrounding streets – they also collected 600 GB of data from nearby Wi-Fi networks, including passwords, e-mails and other private data. For this, Google was fined 7 million dollars, once again a laughable sum. The list of examples demonstrating Google’s less-than-innocent practices could go on, but the message is clear.

To anyone who thinks these are isolated incidents or human errors, consider this excerpt from an interview Eric Schmidt, then Google’s CEO, gave the Huffington Post on the subject of user privacy:

"If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place."

Confusing. Isn’t that the point of having privacy laws (or a personal computer) to begin with? Everyone has things they’d like to keep private rather than share with the entire Internet community. That doesn’t necessarily mean anything embarrassing – it can be as innocent as pictures of loved ones, private letters, or documents containing ideas still in development that haven’t yet been published or patented. Just because something is private doesn’t make it shameful or guilt-inducing; it simply means someone would prefer to keep it to themselves, as part of their innate, basic right to privacy.

In light of these repeated breaches, the European Union established, several months ago, a "right to be forgotten." It enables people whose names yield unfavorable results in Google search to file a request for Google to remove those results. Naturally, Google didn’t look kindly on this attempt to interfere with its practices, and quickly found a loophole. It conceded and did remove the results it was asked to remove. However, when searching for the person’s name, in place of the removed results the searcher sees a statement attesting that a result has been removed following a request under this rule. In doing so, Google ensures that few people will request the removal of unfavorable results; after all, what people assume upon seeing such a message could be far worse than the original reason the person requested the removal in the first place.

Is There Hope?

Google, the innovation giant, the legendary corporation scores of people turn to daily, has done an enormous amount to make data on the worldwide Web accessible to many. However, it has simultaneously created new and previously unknown problems that no one quite knows how to tackle. The unfortunate result is a sense of helplessness: users cannot control where their private information ends up or how it’s used, and site owners cannot control their visibility or the income that depends on it.

In light of these repeated breeches in trust, unfair competition and the apparent disregard for user experience, it’s high time to seriously reconsider the depth of this dependency on one service. After weighing the consequences of continuously (and sometimes unknowingly) giving one corporation nearly unlimited access to users’ private data as well as disproportionate control over their financial destiny, perhaps there’s room to consider, individually, which measures can be taken to feel protected.

Read Part 1, Read Part 2 of this three-part series


Ben Oren specializes in handling Web marketing efforts and boosting online conversion for large corporations in highly competitive niches, mostly in the U.S. and Europe. He is the Head Marketing Consultant at Dynamic Search, a reputable U.S.-based Web marketing agency serving small and medium clients worldwide. He is currently the Director of Web Marketing at WhiteWeb and a contributor to leading industry publications.
