“I wouldn’t say I’ve been missing it, Bob”

For more than two years now I have been working "full time" – a common euphemism for not sleeping – in StomperNet operations.  Just last week I turned over the last of these responsibilities to other people.  WooHoo!

Now I am just a faculty member, so I get to wax rhapsodic with members and do actual SEO research instead of chasing infrastructure cost reductions.

Of course, I still have a lot I am doing with StomperNet, including some bit parts in the upcoming launch starting on the 9th – you will definitely want to watch that!

My comrade Dan Thies made the same transition of late and beat me to the post – like tell me something new – so be sure to read his post as well. He’s on my blogroll.

Looking forward, I’m finally working on my own damn software for a change, along with some product plans that have long been on hold. Stay tuned.

The Purported Death of PageRank Sculpting

During the recent SMX Advanced conference in Seattle – which I was not able to attend (I do occasionally have to work for a living) – there was a flurry of confused reports of comments attributed to Matt Cutts, reports that led to the provocative (outlandish, even?) conclusion that nofollow no longer works to sculpt PageRank but instead causes PageRank to "evaporate".

Dan Thies was at the show, witnessed the entire sordid ordeal, and has editorialized on the matter in the way that only Dan can, in a post he calls Operation BENDOVER (Huh? You’ll just have to watch the video!).

Completely lacking as I am in Dan’s sense of humor, not to mention a suitable picture to trump the one he uses of me, I’ve instead resorted to my old standby — Math. So for the real PageRank computations that show why this reported obit just does not "add up", see The Math of PageRank Sculpting. And if you like that kind of thing, you’ll really dig the included PageRank algorithm written in 25 lines of Perl.
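
For readers who want a taste of that before clicking through, here is a minimal power-iteration sketch of my own (a toy, not the 25-line version from the linked post). The four-page graph and the 0.85 damping factor are illustrative assumptions. The part that matters to the sculpting debate is the inner loop: each page divides its PageRank equally among its outbound links, so what you count as an "outbound link" determines where the PageRank flows.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # A toy power-iteration PageRank, for illustration only.
    # The graph and damping factor are made-up assumptions.
    my %links = (
        home     => [qw(about products contact)],
        about    => [qw(home)],
        products => [qw(home contact)],
        contact  => [qw(home)],
    );

    my $d     = 0.85;                    # damping factor
    my @pages = sort keys %links;
    my $n     = scalar @pages;
    my %pr    = map { $_ => 1 / $n } @pages;  # start uniform

    for (1 .. 50) {                      # plenty of iterations for a toy graph
        my %next = map { $_ => (1 - $d) / $n } @pages;
        for my $page (@pages) {
            my @out = @{ $links{$page} };
            next unless @out;            # dangling pages pass nothing here
            my $share = $pr{$page} / @out;   # equal share per outbound link
            $next{$_} += $d * $share for @out;
        }
        %pr = %next;
    }

    printf "%-8s %.4f\n", $_, $pr{$_} for @pages;

In this classic formulation, removing a link from %links simply redistributes that page’s shares among the links that remain; the "evaporation" claim, as reported, amounts to saying a nofollowed link still counts in the divisor while receiving nothing. Which story is right is exactly what the math post works through.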

Finally, with humor and math taken care of, be sure to read Andy Beard’s take on the death of PageRank Sculpting, but just remember that the real point of most "news" in SEO is the humor.

Do People Choose Features or Do Features Choose People?

Classically, we think of product features as providing benefits that consumers weigh against competing products when making a buying decision.

What happens when you turn this around? How do your product features act to select the customers you get? Here’s a great case in point.

A couple of good friends and colleagues of mine, Jerry West and David Bullock, are putting on the third seminar in a series they call The SEO Rainmaker. It’s just a couple of weeks away, and if you can arrange to attend I highly recommend it, but my point here is to show how two features of the event served to pre-select the audience they attracted.

First, the event is on Thursday and Friday, not the weekend. If you guessed that they got push-back on this point … you’d be right! But it also stands to pre-select for people who have already left the J.O.B. and graduated to being the boss. Speaking from that vantage point, I don’t want seminars on the weekend because I already work long enough hours as it is, thank you! I view the weekday schedule as a positive, not a negative.

"Benefit" is just a synonym for "positive feature", and positive is in the eye of the beholder. What is positive for one group may be negative for another, and you have to decide which group you are appealing to.

Second, the day is only scheduled from 10 to 4. Is this positive or negative? For the person who expects to have their butt planted in a chair and their mind fed information, it’s a negative — they want the day scheduled from 9 to 9 to "get their money’s worth". But is that who you want?

At the latest three-day StomperNet Live event I had one-on-one meetings with partners and long-term StomperNet members booked for so much of the weekend that I missed most of the show. Ask any long-time business person and you will hear the same story — it is the contacts you make and the side conversations you have that make live events pay for the travel.

The value of the instruction you get at these events you could (mostly) get from online delivery. It is the personal interaction with the organizers and other attendees that can not be had any other way.

By the way, a third friend and colleague, Paul Lemberg, is guest speaking for Jerry and David, and I may drive up for Friday afternoon just for fun. So if you can make it, I’ll see you there, but please don’t "take vacation" to come. 😉

Extortion SEO – Take 2

Do a search for cydcor. This is a company that employs a large face-to-face sales force, so it naturally has a high public profile through contact with potential customers and employees alike, and sure enough there are complaints. But look closely: they are from 2002 and 2004!

And yet these old, unverifiable complaints from a site with no discernible editorial policy outrank:

  • A Cydcor Client Story at Reuters
  • Cydcor Opportunity Page at Monster
  • Cydcor Company Overview at Hoover’s
  • Cydcor’s LinkedIn Profile
  • Cydcor News at eMediaWire
  • More Cydcor News at PRWeb
  • Cydcor Investment Overview at BusinessWeek

How? A variation on the Google Bomb! But instead of the company website, it is the search result page itself that gets hijacked. With just 25,000 results for Cydcor, it takes only a few negative comments on these large complaint sites to rank right alongside the company name for these navigational queries.

Welcome to the tyranny that is lawless democracy.

Extortion SEO

It was just a matter of time until the so-called "democratic nature of the web" developed precisely the same problems that caused our founding fathers to reject democracy as a viable form of government for these United States. Specifically, the tyranny of the majority.

What they knew, and we have forgotten, is that the majority never needs protection, even from an oppressive government. It is the minority that the law protects.

How does this relate to SEO, you may ask? It is the "wisdom of the crowds" gone terribly wrong.

Imagine a site where people can complain about companies anonymously in an environment with no editorial review.  Since complainers generally have lots of free time on their hands, such a site will rapidly grow to enormous size and naturally rank for most company names with little or no effort.

Welcome to ripoffreport.  Examples next.

“…Google will determine…”? Not on my site!

In the StomperNet forums today I responded to a member who noticed a Google post here. Reproduced below is my acidic response.

That was the most useless, vague, non-actionable and *irresponsible* post I have EVER seen from Google. It looks like something from webmasterworld or the warrior’s forum. The examples used are just plain stupid and the sweeping generalization they make about Google somehow figuring out URL parameters is dangerously silly.

  1. No one would consider rewriting a (so-called) dynamic URL into a "static" one while retaining the session id. I mean DUH! If you are smart enough to enable mod_rewrite at all, how could you not know to turn off session ids when serving content to bots? It is a ridiculous example that serves to paint all rewriting as somehow dangerous. Worse still, why would anyone rewrite like the example shown? That’s plain stupid.
  2. "… Google will determine which parameters can be removed …" — You have got to be Sh*t**g me! Is there anyone who can spell S-E-O who would like to simply trust Google to "determine" which URLs should be the same and which should be different?? Not me, thanks. My site. I’ll decide. If they get it wrong, you get flagged with widespread duplicate content and they don’t tell you about it.
  3. They leave completely unanswered the OBVIOUS (just look at SERPs) problems they have today with session ids — not so good at "determining" after all, eh? At every single StomperNet Live event we’ve held, I have reviewed at least one site that had pages indexed at Google showing multiple different session id values. This is a widespread problem for sites that serve session ids to bots, and for Google to publicly post about "dynamic" URLs and sweep it under the rug while vaguely claiming to handle it borders on misrepresentation.
  4. They also don’t say a damn thing about parameter order — another place they fail COMPLETELY to "determine". Example: p1=v1&p2=v2 leads to the same content as p2=v2&p1=v1, and this is a REQUIREMENT of the HTTP spec (named parameters are NOT positional, so they may appear in any order), but Google treats these as different URLs and will ignorantly and incorrectly index both as different pages. This problem appears in several CMSs today; Endeca in particular has it bad. (A canonicalization sketch follows this list.)
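
To make point 4 concrete, here is the kind of canonicalization I would rather do myself than trust Google to "determine": sort the named parameters so every ordering of the same query maps to one URL. This is a sketch, not production code; the function name and example URL are mine, purely for illustration.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sort query-string parameters into one canonical order so that
    # p1=v1&p2=v2 and p2=v2&p1=v1 resolve to the same URL.  Named
    # parameters are not positional, so reordering is safe.
    sub canonical_url {
        my ($url) = @_;
        my ($base, $query) = split /\?/, $url, 2;
        return $url unless defined $query && length $query;
        my @pairs = sort split /&/, $query;
        return $base . '?' . join '&', @pairs;
    }

    print canonical_url('http://example.com/page?p2=v2&p1=v1'), "\n";
    # prints: http://example.com/page?p1=v1&p2=v2

A CMS that emits canonical URLs like this (and 301-redirects the non-canonical orderings to them) never has to hope that Google "determines" anything correctly.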

McAfee Revisited

UPDATE: So I’m a bit out of touch on this one, but McAfee actually backed away from what they were doing, in no small measure, it seems, from the stink she raised. 🙂 Read the full story at Cresta’s blog.

Oh, and I’ve restored the seal image to my commerce websites.

I promise to just let this go — as soon as I get this bit of satire posted!

For everyone who still wants to use the McAfee seal, here’s a logo providing "Full Disclosure" of what the ScanAlert "service" really means.

Satirical Commentary on Totally Dumb Ass move Made by McAfee

Enjoy. 😉

LEGAL NOTICE to McAfee: This is satirical commentary covered by Fair Use. If you don’t like it, tough shit. Maybe you should have thought of that before pillaging your customers’ traffic.

SEO Trick – Sub-Domains vs. Directories

Two SEO questions I get asked a lot:

  • How important is the URL to ranking, and
  • Which is better, sub-domains or directories?

In general, both have only a minor impact on ranking (though I think they matter for click-through), but I just saw an example bearing on the second question that is worth some thought.

In searching for "swing treeview" (a Java thing) at Google, the top two results are treeview-java-swing.qarchive.org and java-treeview.qarchive.org, and Google did NOT show the second as an indented listing, which it would have done if the two were treated as coming from the same domain.

If the same content were served via pages or directories at the root domain, the best this site would get is an indented listing, and even that is open to question.

This is likely a generally applicable result. Look at the results of a search for "blogspot", for example: predictably, there are pages and pages of blogspot sub-domains. The previous example is no different.

The lesson here is that sub-domains really are different domains (which we knew).

The action item is to find out which is easier to get:

  • Multiple listings from sub-domains, or
  • An indented listing from a single domain.

I’ll let you know what I find.

HackerSafe? Not Now. Now It’s HackerSOURCE. Yikes!!

McAfee has done something with the HackerSafe logo that I think totally crosses the line.  Thanks to Cresta’s blog post and subsequent tweet for pointing this out to me.

Today, I am pulling the seal off of my sites, disabling all the domains in the ScanAlert control panel, and penning a nasty-ass message to McAfee. Why, you ask?

The change they made is to the page you get when someone clicks the McAfee seal on your site. Right in the middle of that page is a link, "Attention Shoppers", that leads to http://secureshopping.mcafee.com/. Excuse me!! WTF do they think they are doing?? I’m paying them for the seal AND giving them traffic?? I don’t think so.

This demonstrates a really disturbing lack of understanding on McAfee’s part. So bad, in fact, that I’m not interested in even discussing the point with them. Any partner of mine that could let something this brain-dead-stupid ever see the light of day simply can not be trusted.


Portals vs. Mashups

A critical Information Technology objective for most mid-sized and larger companies these days is the integration of disparate business systems. The reasons are several, but Business Intelligence (BI) is a decent overarching label for all of them.

The JSR-168 and related WSRP standards are intended to facilitate this, but are they too late?

Something old … new again

Once upon a time, way back when a big disk was measured in megabytes and RAM in mere kilobytes, our stone-age solution to "integration" was to get different applications to at least run on the same terminal — and even that was rough. I don’t remember that we even had a name for it. Fast forward about 20 years and we call it "integration at the glass".

Of course, it’s more than just glass.

The real deal is that we now have a common UI framework (HTML/JavaScript) and a common network communication standard (HTTP) that allow even the oldest and cruftiest legacy apps [mostly] to appear together in harmony on the same desktop. So why build a portal when you already have a browser?

Granted, that is something of a simplification. There is still some tricky stuff to work out, like common authentication and inter-app state synchronization, to name two, but those are arguably as easy to solve near the desktop as they are on the server — the network connection is simply not the barrier it once was. We are no longer forced into a binary choice between thin client and thick client, but instead can spread application functionality almost arbitrarily across what was once the "great divide".

The choice is where you do your mashing.

If you mash using WSRP, we call the result a portal, but if you mash in the browser, you’re suddenly a Web 2.0 social app — and likely with a better valuation too. 😉
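
As a toy illustration of the server-side flavor, here is a sketch of mashing behind a single origin: one CGI script fetches two backend services and emits a combined page, so the browser only ever talks to one host. The backend URLs are hypothetical placeholders, and this is my sketch, not the application described below.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;

    # Fetch each backend "portlet" and stitch the results into one page.
    # The service URLs are made up for illustration.
    my $ua = LWP::UserAgent->new( timeout => 10 );

    my %panels;
    for my $svc (qw(http://backend-a.internal/report http://backend-b.internal/status)) {
        my $res = $ua->get($svc);
        $panels{$svc} = $res->is_success ? $res->decoded_content : 'unavailable';
    }

    print "Content-Type: text/html\n\n";
    print "<html><body>\n";
    print "<div class='panel'><h2>$_</h2><pre>$panels{$_}</pre></div>\n"
        for sort keys %panels;
    print "</body></html>\n";

Move that same fetch-and-stitch loop into browser JavaScript and you have the mashup; the logic is identical, only the place it runs (and the authentication and state problems you inherit) changes.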

Now, I do think there is an unanswered technology question here, and that is the life-cycle cost of these semi-thin clients — our tools are simply better and more mature for server-style implementation approaches. This is changing, though.

A real test case

An application we recently built at StomperNet is split in just this way. It is composed of two user-facing components, implemented in Firefox, that wrap local functionality as well as remote services hosted by a PHP server. There were some lessons learned, both positive and not, but on balance our approach provided solution features that would have been very difficult to deliver any other way. With the maturation of XULRunner we can expect this type of application partitioning to become even easier.