Don’t be Evil? BULL S**T!

Everyone in SEO BrainTrust has already heard me rant about the duplicity and lies from Google (nofollowed, as all untrusted links should be!) regarding the so-called “secure search” initiative, but this is just too horrible for me to keep bottled up any longer. Hence, this public rant.

This supposed public-service initiative is nothing more nor less than the arrogance of absolute power, and it is the most damaging thing to hit webmasters in the history of webmastering. Somehow it is OK for Google to know every last detail of a user’s search behavior – which is unavoidable without proxies and constant cookie dumps – but the inconsequential percentage of that traffic that hits any one website is somehow a huge security problem that searchers, if only they knew, would simply freak out about. BULL SHIT!

But why should webmasters care? Here’s an example.

ANALYTICS STRIPPED OF ALL MEANING!

This is a snapshot from a private client’s GA account (revealing nothing sensitive). For the period covered, this is 10% of total organic traffic, and the bounce rate, at 80%, is fully 20% higher than for the terms we actually do get reported. So here’s the dilemma: How the F**K do I fix this?

For 10% of search traffic I cannot even begin to diagnose and decrease bounce rate, because I have no clue what the visitor was searching for. Thanks a lot, Google – that’s a big help improving the search experience. Clearly it does my client/partner no good to rank where only 2 in 10 visitors stick around, and that would pretty much define “poor search experience” too. But despite Google’s stated mission to be “all about the searcher,” they have intentionally removed – using a complete and outright lie – the one thing we can use to improve both the search experience and the visitor’s experience on our site.

Organic search has always been, and will always be, a symbiosis in which Google gets OUR!! content for free so they can run a paid-traffic division, while providing us “free clicks” in exchange. They have continued to “change the deal” over time, with more and more non-organic results reducing the value to us while increasing their revenues. OK, so maybe all’s fair in love and war (and business, but that’s just another war!), but if deceit and misdirection at our expense is not “doing evil,” then I am at a complete loss for what would qualify.

Add the completely ridiculous BS around G+ … a topic for another day … and I’m seeing FTC in big red letters on the horizon.

Dear Google: Power corrupts, and absolute power corrupts absolutely. With your 80% market share (like, who the hell is in the 20%?) you have now reached a level of arrogance that is life-threatening – to you!

The Purported Death of PageRank Sculpting

During the recent SMX Advanced conference in Seattle – which I was not able to attend (I do occasionally have to work for a living) – there was a confusing flurry of reports about comments attributed to Matt Cutts, culminating in the provocative (outlandish, even?) conclusion that nofollow no longer works to sculpt PageRank, but in fact now causes PageRank to "evaporate" instead.

Dan Thies was at the show, witnessed the entire sordid ordeal and has editorialized on the matter in the way that only Dan can in a post he calls Operation BENDOVER (Huh? You’ll just have to watch the video!).

Completely lacking as I am in Dan’s sense of humor, not to mention a suitable picture to trump the one he uses of me, I’ve instead resorted to my old standby — Math. So for the real PageRank computations that show why this reported obit just does not "add up," see The Math of PageRank Sculpting. And if you like that kind of thing, you’ll really dig the included PageRank algorithm written in 25 lines of Perl.
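Since the Perl lives over in that post, here’s a quick toy model you can run yourself. It’s a minimal Python sketch of my own (the three-page graph, damping factor, and iteration count are all made up for illustration; this is not the script from the math post) that runs the standard PageRank iteration under both interpretations of nofollow, so you can see exactly what "evaporation" would mean for the numbers:

```python
# Toy PageRank, three pages: A links to B and C, B and C each link back
# to A, and the A->C link is nofollowed. Graph and constants are made up.

def pagerank(followed, outdegree, n, d=0.85, iters=50):
    # PR(p) = (1 - d)/n + d * sum over q linking to p of PR(q)/outdegree(q).
    # `followed` maps page -> followed outlinks; `outdegree` is the divisor
    # applied to each page's PageRank, which is where the two models differ.
    pr = {p: 1.0 / n for p in followed}
    for _ in range(iters):
        pr = {p: (1 - d) / n + d * sum(pr[q] / outdegree[q]
                                       for q in followed if p in followed[q])
              for p in followed}
    return pr

links = {"A": ["B"], "B": ["A"], "C": ["A"]}  # followed links only

# Sculpting (the old story): the nofollowed A->C link is simply removed,
# so A's outdegree is 1 and B inherits ALL of A's passable PageRank.
sculpt = pagerank(links, {"A": 1, "B": 1, "C": 1}, n=3)

# Evaporation (the SMX story): the nofollowed link still counts in A's
# outdegree (2), so B gets only PR(A)/2 and the other half just vanishes.
evap = pagerank(links, {"A": 2, "B": 1, "C": 1}, n=3)

print("sculpting:  ", {p: round(v, 4) for p, v in sculpt.items()})
print("evaporation:", {p: round(v, 4) for p, v in evap.items()})
```

Under the evaporation model the totals no longer sum to what they should; that leak is the whole dispute in a nutshell.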

Finally, with humor and math taken care of, be sure to read Andy Beard’s take on the death of PageRank Sculpting, but just remember that the real point of most "news" in SEO is the humor.

“…Google will determine…”? Not on my site!

In the StomperNet forums today I responded to a member who noticed a Google post here. My acidic response is reproduced below.

That was the most useless, vague, non-actionable and *irresponsible* post I have EVER seen from Google. It looks like something from WebmasterWorld or the Warrior Forum. The examples used are just plain stupid, and the sweeping generalization they make about Google somehow figuring out URL parameters is dangerously silly.

  1. No one would consider rewriting a (so-called) dynamic URL into a "static" one while retaining the session id. I mean, DUH! If you are smart enough to even be able to enable mod_rewrite, how could you not know to turn off session ids when serving content to bots? A ridiculous example that serves to paint all rewriting as somehow dangerous. Worse still, why would anyone rewrite like the example shown? That’s plain stupid. (For the sane way to handle session ids and parameters, see the sketch after this list.)
  2. " … Google will determine which parameters can be removed …" — You have got to be Sh*t**g me! Is there anyone who can spell S-E-O that would like to just simply trust Google to "determine" which URLs should be the same and which should be different? Not me, thanks. My site. I’ll decide. If they get it wrong, you get flagged with widespread duplicate content and they don’t tell you about it.
  3. They leave completely unanswered the OBVIOUS (just look at the SERPs) problems they have today with session ids — not so good at "determining" after all, eh? At every single StomperNet Live event we’ve held, I have reviewed at least one site that had pages indexed at Google showing multiple different session id values. This is a widespread problem for sites that serve session ids to bots, and for Google to publicly post about "dynamic" URLs and sweep this under the rug while vaguely claiming to handle it borders on misrepresentation.
  4. They also don’t say a damn thing about parameter order — another place where they fail COMPLETELY to "determine". Example: p1=v1&p2=v2 leads to the same content as p2=v2&p1=v1, because named query parameters are NOT positional and may appear in any order, yet Google treats these as different URLs and will ignorantly and incorrectly index both as different pages. This problem appears in several CMSs today; Endeca in particular has it bad. (The sketch below shows how trivial it is to collapse these to one canonical URL.)
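Since Google apparently can’t "determine" any of this reliably, here’s how little code it actually takes on our end. This is a minimal Python sketch of my own (the session-id parameter names are just common examples, not an exhaustive list, and example.com is a placeholder; adapt both to whatever your platform actually emits). It strips session-id parameters and sorts the rest so that both orderings of the same parameters collapse to one canonical URL:

```python
# Toy URL canonicalizer for points 1 and 4 above: drop session-id style
# parameters, then sort what's left so parameter order can't spawn
# duplicate URLs. The SESSION_PARAMS names are common examples only.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

def canonicalize(url: str) -> str:
    parts = urlsplit(url)
    # Keep only non-session parameters, then sort by name so that
    # p1=v1&p2=v2 and p2=v2&p1=v1 yield the same canonical URL.
    params = sorted((k, v)
                    for k, v in parse_qsl(parts.query, keep_blank_values=True)
                    if k.lower() not in SESSION_PARAMS)
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(params), ""))

# Both variants (session id and all) collapse to one URL:
print(canonicalize("http://example.com/page?p2=v2&p1=v1&PHPSESSID=abc123"))
print(canonicalize("http://example.com/page?p1=v1&p2=v2"))
# -> http://example.com/page?p1=v1&p2=v2 (both times)
```

If a dozen lines of script can "determine" this, so can a search engine; the point is that we shouldn’t have to trust them to.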