Is the J.C. Penney SEO scandal relevant to the web performance industry?

If you haven’t yet read The New York Times article The Dirty Little Secrets of Search, you should. It reads like a shady underground exposé of an art smuggling ring — making SEO sound almost glamorous. One of the most interesting things I’ve read all week.

What struck me most is the distinction the article makes between activities that are actually illegal and those that are merely “Google illegal”:

Despite the cowboy outlaw connotations, black-hat services are not illegal, but trafficking in them risks the wrath of Google. The company draws a pretty thick line between techniques it considers deceptive and “white hat” approaches, which are offered by hundreds of consulting firms and are legitimate ways to increase a site’s visibility. Penney’s results were derived from methods on the wrong side of that line, says Mr. Pierce. He described the optimization as the most ambitious attempt to game Google’s search results that he has ever seen.

In 2006, Google announced that it had caught BMW using a black-hat strategy to bolster the company’s German Web site, BMW.de. That site was temporarily given what the BBC at the time called “the death penalty,” stating that it was “removed from search results.”

BMW acknowledged that it had set up “doorway pages,” which exist just to attract search engines and then redirect traffic to a different site. The company at the time said it had no intention of deceiving users, adding “if Google says all doorway pages are illegal, we have to take this into consideration.”

J. C. Penney, it seems, will not suffer the same fate. But starting Wednesday, it was the subject of what Google calls “corrective action.”

The article doesn’t specify the exact terms of this corrective action, but it notes that J.C. Penney’s ranking has plummeted from #1 down to #68 and #71 (and worse) for several key search phrases.

A unilateral ranking change of that magnitude would be devastating to any web site. When the “mall” of the world places your store in the basement of the annex building, sales plummet and business dies.

I’ve been wondering whether any of this J.C. Penney scandal is relevant to the performance industry. To date, I’ve always assumed that the automated optimization techniques used by Strangeloop and other solution providers fall into the “white hat” camp: we derive some of our techniques from Google’s own list of performance best practices, and Google itself has spearheaded the development of mod_pagespeed, an open source tool for automating fundamental best practices.
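For readers who haven’t seen it, mod_pagespeed runs inside Apache and rewrites pages on the fly. Here is a minimal sketch of what enabling it looks like in httpd.conf; the directive names come from the module’s public documentation, but the particular filter list is purely illustrative:

    # Load the module and switch on automatic rewriting.
    LoadModule pagespeed_module modules/mod_pagespeed.so
    ModPagespeed on
    # CoreFilters is the default, conservative set of optimizations.
    ModPagespeedRewriteLevel CoreFilters
    # Opt in to a few additional filters beyond the core set.
    ModPagespeedEnableFilters combine_css,extend_cache,inline_javascript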

However, when you examine Google’s design and technical guidelines, perhaps we should not be so confident. One area that caught my interest was cloaking:

Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.

Most, if not all, of the sophisticated automated optimization solutions — Strangeloop included — serve different content to users and search engines. We do this to optimize performance, not to game search results. I have asked my friends at Google to look into this and I have been repeatedly reassured that the techniques we (and by extension our competitors) employ are safe.
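To make the distinction concrete, here is a minimal sketch, in Apache rewrite rules with hypothetical file paths, of the kind of browser-group optimization I mean. Every visitor, Googlebot included, gets the same picture; browsers known to support WebP simply get a smaller encoding, and a Vary header declares that responses differ by user agent:

    # Sketch only: serve a pre-generated WebP variant of each JPEG to
    # Chrome, which supports the format; every other user agent,
    # Googlebot included, gets the original JPEG.
    RewriteEngine on
    RewriteCond %{HTTP_USER_AGENT} Chrome
    RewriteCond %{REQUEST_FILENAME}.webp -f
    RewriteRule ^(.+)\.jpg$ $1.jpg.webp [T=image/webp,L]
    # Tell caches and crawlers that responses vary by user agent.
    Header append Vary User-Agent

The optimization changes how the content is delivered, not what the content is, which is exactly where the cloaking question turns.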

Can someone at Google (Matt Cutts et al) confirm publicly for us that optimization by browser group is not “Google illegal”? That what we are doing is safe, helpful and harmonious with the Google mission of a faster web?

Has anyone else asked this question of Google? Can someone point me to a Google article that demonstrates that browser-based optimization is safe from the “cloaking” brush?

6 thoughts on “Is the J.C. Penney SEO scandal relevant to the web performance industry?”

  1. Pingback: Tweets that mention Is the J.C. Penney SEO scandal relevant to the web performance industry? — Web Performance Today -- Topsy.com

  2. As long as you serve the exact same content to Googlebot as you would to other users (depending on the segment), you’re fine, just like with IP geolocation tools and A/B testing.
    It’s cloaking when it’s done to manipulate rankings, or has the potential to manipulate them, as in BMW’s redirects, or showing Googlebot a keyword-stuffed page and users a clean page.

    http://googlewebmastercentral.blogspot.com/2008/06/how-google-defines-ip-delivery.html

  3. Josh, this is a great subject to clear up.
    Google isn’t always clear about what is good and what is not. The basic rule is that if the actual content remains the same, it should be OK and no penalty should be applied.
    There are threads and posts by the Google team, specifically around mobile and the mobile crawler, where they make it clear that redirection and serving different content based on user agent is legitimate, as long as the differences relate to device capabilities and presentation, not to content. See, for instance, http://googlewebmastercentral.blogspot.com/2009/11/running-desktop-and-mobile-versions-of.html

    Since you are working hard on delivering content in the best way for the requesting device (while keeping the content itself the same), you should be safe.
    It would be great, though, if someone from Google joined this thread and gave a stamp of approval, along with rules and guidelines.

  4. Pingback: FAQs: The 12 most-asked questions about how Google factors page speed into its search rankings — Web Performance Today

  5. Pingback: Why you should care about Google’s changes to its mobile AdWords algorithm — Web Performance Today

  6. Pingback: 8 things the web content optimization community can learn from Matt Cutts’s cloaking video
