Archives for 2006

Static vs. Dynamic URLs

The terms ‘static’ and ‘dynamic’ do not describe any real technological difference in how pages are delivered. The real issue is clean URLs versus messy URLs, with query strings being the most common example of the latter. The engines are far better today at dealing with these URLs, so the historical advantage of clean URLs has dissipated considerably, but messy URLs remain a problem and their use should be minimized.
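To make the distinction concrete, here is a minimal sketch in Python (the URLs are made-up examples) of the difference the engines care about:

```python
from urllib.parse import urlparse, parse_qs

def is_messy(url: str) -> bool:
    """A URL is 'messy' when the page is addressed through a query string
    rather than the path alone."""
    query = urlparse(url).query
    return len(parse_qs(query)) > 0

# A clean URL: the path alone identifies the page.
print(is_messy("http://example.com/hammocks/rope-hammock"))            # False

# A messy URL: the same page hidden behind query parameters.
print(is_messy("http://example.com/page.php?id=118&cat=7&sid=ab12"))   # True
```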

“SEO Secrets”

Recently I was told that "there are no SEO Secrets" and to claim such a thing was deceitful. Hmmm, maybe. So I looked up the word "secret" — I already know what SEO means :-).

Secret:

  1. done, made, or conducted without the knowledge of others;
  2. kept from the knowledge of any but the initiated or privileged;


Now go look at some random search results. What percentage of these webmasters have any clue at all about SEO? Maybe 5%. So if 5% of people know something, is it still a secret? If not, then when does something go from "secret" to "not widely known" and from there to "common knowledge"?

I’m not even going to propose a number. Instead, I’ll rely on the second definition: SEO secrets are known to the "initiated or privileged". And it’s easy to find out who they are: they are at the top of the search results!

How Long Does it Take to See PageRank Changes?

This is a common question. I got it again today from an OptiSmarts subscriber:

I’ve found a lot of PR5 pages (and a few PR6 pages) on [my site] that don’t link to the homepage. I’ve now changed that and added nofollow to the unimportant links. What kind of a boost can I expect the homepage to get from, say 20 PR5 internal pages and a few PR6 pages pointing at it? Will I notice a difference?

That’s really three questions.

First, yes, you will notice a difference. All PR is additive, so more is always "a good thing", but you might not notice it on the Google toolbar because the toolbar is such a coarse measure. And with your existing home page PR of 6, the next step up, 7, is a very long way off. But don’t let that stop you from optimizing it, because it is the "real" number that is used internally at Google, and 6.2 is still better than 6.1.
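For the curious, the widely held belief (an assumption here, since Google has never published the formula or the base) is that toolbar PR is roughly the logarithm of the internal value. A toy sketch of what that implies:

```python
import math

def toolbar_pr(internal_pr: float, base: float = 8.0) -> int:
    """Hypothetical mapping from Google's internal PageRank to the 0-10
    toolbar scale, assuming the commonly believed logarithmic relationship.
    The base of 8 is a guess; Google has never disclosed it."""
    return min(10, max(0, int(math.log(internal_pr, base))))

# Each toolbar step needs roughly base-times more internal PR, which is
# why moving from toolbar 6 to 7 is "a very long way".
print(toolbar_pr(300_000))    # 6
print(toolbar_pr(900_000))    # still 6: a 3x internal gain the toolbar cannot show
print(toolbar_pr(2_500_000))  # 7
```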

Which leads us to the next question: how much boost should I expect? There’s no tellin’. The SEO game is really just about going in the right direction and stopping when you get there. Figuring an ETA while en route is not generally possible.

Which leads to the final question: when should I see the change? And that’s the worst news of all, I’m afraid :-(. PageRank is the slowest-changing aspect of Google ranking. I routinely see the PR on client sites take 3-6 months to adjust to major internal linking changes. It does not take that long when starting from scratch (there I can usually get PR in 60 days), so it is the rearrangement of PR that takes serious time. Here’s part of the reason why.

I’m sure everyone has noticed pages in the Google index that have been 404 for 6-9 months yet are still cached and showing PR. Google appears reluctant to drop a page once it is indexed. Since the distribution of PR within your site is all about the way pages are linked together, your changes will not be complete until Google has updated its entire picture of your site: missing pages, new pages, changed links and all. So long as the "old stuff" is hanging around, Google’s image of your site will differ from your new site design.

So, the simple answer to all of this is, just do it right and wait it out.

Awe Does Not Survive Arrival

An obtuse title to be sure — I just love doing that 🙂 — but this really is about SEO and Internet Marketing. I was once in awe of Michael Campbell, Stephen Mahaney, Marlon Sanders, John Reese, etc. They were all somehow larger than life.

Earlier today, yesterday by the time you see this, I sent out the personal story of my association with the person who is now "the most famous OptiLink customer", and for good reason, given the amount of money he makes and the stir he has caused! You might be in awe of Brad, as I was once in awe of others.

So now let me tell you "the story before the story".

Before you get to be a "guru", gurus appear to be cast in marble and somehow slightly taller and better looking than anyone else. That’s all PhotoShop. Once you get to be one, and that notice arrives some considerable time after you actually become a guru, you suddenly notice that they (my God, you!) are not marble after all and sadly 🙁 are not all that great looking either!

It’s not that gurus are not gurus. That’s not how "awe" dies. It is that all gurus were first people, did a few ultimately simple but not easy things that anyone can do, and yet still remained after the fact more-or-less the unchanged pre-guru people they were.

You might be wondering what the whole point of being a guru is in the first damn place. As it turns out, not much!

Ultimately, awe is fully and simply the result of not having done it yet. Once you yourself have done it, is it still awesome? Enjoyable? Yes. Worth doing again? You bet. But is there awe? No, there’s identity, understanding and comradeship: the bond that exists between more-or-less peers who have all had to overcome very similar challenges.

Rejoice, as I have, in the death of awe! Embrace in its place gratitude for your teachers, your connections to others of like mind, and your newfound opportunity to guide others on that path you have followed. All of it way better than awe.

Oh yeah, and about Brad: he’s just this guy! But… he is a guy who has actually "done it". So unless you’ve already done what Brad has done, or are getting the straight scoop from someone else who has, you might consider trying to hook up!

Adwords + Organic = Lawsuit

If you are not subscribed to SE News, then you should be. I read every issue in the first few days of the month. This month’s articles on Supplemental Results and Google Sitelinks are top notch. Mostly I agree with their articles and that’s more-or-less true this month as well, with a couple of exceptions.

In the list of "Top 10 Quality Indicators" there are a number of items presented as fact that I cannot back up with actual measurements. Then again, I don’t yet have the data to disprove them either, so I’ll let all that slide for now.

But there is a non-technical issue that keeps coming up that I will take issue with today. The referenced article [subscribe to read the whole thing as it does provide some good information and food for thought] claims:

"Now that Google has a spider to determine page quality for sites in their ad program, it won’t be long before that data is folded into Google’s organic search results."

In my opinion, this is just plain incorrect. Here’s why.


As one of several forms of actionable antitrust violation or "unfair trade practice", the U.S. Federal Trade Commission has identified the "tying arrangement", whereby use of one product or service is predicated on use of another. This is a very complex area of law, loaded with judicial discretion and the balancing of corporate rights against "public policy", so all bets are pretty much off before trial, but I’ll give you mine anyway.

The government’s case against Microsoft vis-a-vis Windows and IE was precisely a "tying arrangement" case, in which it was claimed that Microsoft improperly used its market influence in the PC operating system market to create an (unfair) advantage in the browser market. While this is a classic example of unfair practice, it is not the only one.

With respect to Google, if the paid advertising programs — either Adwords or Adsense — impacted organic ranking in a positive way, this would create a tying arrangement between its "free" search and its paid programs, coercing webmasters to buy advertising in exchange for better ranking.

Conversely, if participation in paid advertising programs created a negative influence on organic ranking compared to non-participants, this would constitute contract fraud, inasmuch as a material aspect of the advertising contract was not disclosed. The FTC might or might not act on this as an unfair trade practice, but you can bet some large firm of attorneys would be happy to take it as a class action lawsuit.

This also came up in my recent review of VEO for OptiSmarts subscribers, where the author of VEO claims various similar effects with Adsense. Again, same problem: undisclosed contract terms and tying arrangements.

In all cases, when it comes to tying organic and paid programs together, "there be dragons."

Why Google Can Not Track “Visitor Experience”

As with most of what is written about SEO, when you don’t know how computers work, everything is a mystery and (worse) impossible garbage sounds totally reasonable. This is not the first time in four and a half years that we’ve gone ’round about Google tracking page visit times, and it’s no more true this time than it was the last half dozen times. A web server sees only the requests a visitor makes; nothing in the protocol tells it when the visitor left a page, so there is no "visit time" to track. Give it another six months and we’ll hear it again.
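For the doubters, here is the gist in code. A server, and hence Google absent some script phoning home, sees only individual requests. The log line below is a made-up example in the standard Common Log Format:

```python
import re

# A typical access-log entry. Note what it contains: who asked, when they
# asked, and what they asked for. There is no field for how long the
# visitor stayed, because HTTP never sends a "leaving now" event.
log_line = '10.0.0.1 - - [14/Mar/2006:09:26:13 -0700] "GET /hammocks.html HTTP/1.1" 200 5120'

m = re.search(r'\[(?P<when>[^\]]+)\] "(?P<method>\w+) (?P<path>\S+)', log_line)
print(m.group("when"))   # 14/Mar/2006:09:26:13 -0700  (request time only)
print(m.group("path"))   # /hammocks.html
```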

SEO is Dead? Yeah Right, and Hell’s Frozen Too!!

I knew it was B.S. when I bought it, so I won’t return it, but I’ve just about run a highlighter dry marking up the crap in "Visitor Optimization". As soon as I stop laughing, I’ll post a less hostile and more informative analysis, but in twenty words or less … the technology that would make VEO possible simply does not exist. This is not an opinion; it is simply how the web works. Stay tuned, I’ll prove it, as soon as I settle down. 😉

Can You Have Too Much NoFollow?

This has come up many times, but a new OptiSmarts subscriber asked…

I have enjoyed your optismarts videos so far and am looking forward to future videos. Your speaking voice and style are very pleasant and I appreciate your ability to speak simultaneously to both novices and more advanced users.

I am in the top 10 for my 4 preferred keywords at Google (thanks to OptiLink) and do not want to do anything to lose my rankings. Is it possible to overdo the usage of the nofollow attribute to the point where Google thinks I am overoptimized?

…so I’ll answer again in greater detail.

"There are theoretically two ways nofollow can be a problem, one I’ve seen, and the other I have not. The first is simple misuse, generally due to lack of complete understanding, and the second is an actual "penalty" which I have never seen. Let’s spend a moment on both of these.

Misuse is really, really easy, and the problem starts with not knowing which pages you actually want to have ranked. A classic example is an Adsense site where nofollow is used to push all the PR to the home page. OOPS! The internal pages optimized for the low-traffic but high-conversion search phrases are likely de-ranked in favor of a page (the home page) that never earned any money. For many sites the wholesale use of Dynamic Linking (via nofollow) is not what you want; such sites should use it very selectively, whereas a site more focused on home page traffic can nofollow more extensively.
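Before making changes like these, it pays to audit exactly which links carry nofollow. Here is a minimal sketch using only the Python standard library; the sample page and URLs are invented for illustration:

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Sorts links into 'followed' and 'nofollowed' buckets so you can
    check that only the pages you mean to demote are being demoted."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        rel = (attrs.get("rel") or "").lower()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

page = '''
<a href="/hammocks/rope-hammock">Rope Hammocks</a>
<a href="/about" rel="nofollow">About Us</a>
<a href="/privacy" rel="nofollow">Privacy</a>
'''
audit = NofollowAudit()
audit.feed(page)
print("PR flows to:   ", audit.followed)     # ['/hammocks/rope-hammock']
print("PR withheld at:", audit.nofollowed)   # ['/about', '/privacy']
```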

But an outright penalty is another matter. I do not think this exists and am not aware of even a single case where nofollow can be objectively construed as the "smoking gun". Moreover, there are many very high ranking sites making effective use of nofollow without any sign of penalty. In a recent OptiSmarts video seminar I show one that is a household name. Others exist as well.

As a general rule, I find that "over-optimization penalties" are actually caused by something unrelated, but I do still keep looking.

Hilltop? You Must be Joking

Okay, this one comes up a lot, and it came up again today in an email from my good friend Michael Campbell so I thought I’d try to actually blog for a change.

It seems that someone (who shall remain nameless, for the moment at least) is raising a fuss about how Google is using the Hilltop algo, and you therefore have to have their brand new widget to rank and nothing else will do.

Drivel! Here’s part of my note to Michael:

To suggest that Google is using Hilltop is like saying they use the Meta Keywords tag. Please! Hilltop was never used, and will never be used, because it is totally old news that never outperformed Google circa 2000. Moreover, Google gets Hilltop for free, but they paid real money for what Taher brought with him, so if anyone is looking for a new algo at Google, they should look at TSPR and its cousins. We can be certain they are using BlockRank or its kin even now.

Worse, these guys seem to think that "Hubs and Authorities" is synonymous with Hilltop. Wrong again. Hubs and Authorities is Kleinberg’s algorithm, which Hilltop references as prior art, but they are not the same.

And finally, always follow the money: why would Google roll out a new algorithm when search quality is not a critical success factor? New revenue streams are what they are about. They already own search, so why would they need to "improve" it?

I’ve blogged this before, but…

Google is just sitting on search technology to keep some other pair of grad students from retiring before they grow facial hair. They use maybe one tenth of what they have licensed or patented. But they continue to buy and patent more because it keeps other grad students poor and drives SEOs crazy reading academic papers that don’t matter instead of turning the crank on what does matter.

Good Ranking at Google is about Balance

Ranking is a balance of several factors, the most important being title tag, (inbound) link text, and PageRank. Every page ranks based on its own measures of these factors sorted against all other pages in the index. Theoretically, every single page ranks for every single search query, but the vast majority don’t rank very well, so we never see this aspect.

The title tag is an on-page factor, and the most important of the on-page factors at all the engines, Google included. In fact, at Google I can get a blank page to rank in a competitive search so long as I use a good title tag and work hard on my text links. This would be much harder to pull off at Yahoo, where on-page text factors are more important.

The first of the off-page factors is link text. In their original technical papers the founders of Google told us that they take the link text pointing at a page and store it with the target page. This is huge. What it means is that the query engine has only to look at a page itself to analyze the links to that page. It also tells us that nothing other than the link text itself matters in a link between pages. There is some current argument that this is no longer true, but we’ll leave that for another day.
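In code terms, the idea from those papers amounts to an index keyed by the target page rather than the source. This toy sketch (the sites and anchors are made up) shows why query-time analysis is cheap:

```python
from collections import defaultdict

# Each entry: (source page, target page, anchor text).
links = [
    ("siteA.com/reviews", "mysite.com/", "rope hammocks"),
    ("siteB.com/blog",    "mysite.com/", "rope hammocks"),
    ("siteC.com/links",   "mysite.com/", "click here"),
]

# Anchor text is stored with the TARGET page, so at query time the engine
# can score a page's inbound link text without walking the link graph.
reputation = defaultdict(list)
for source, target, anchor in links:
    reputation[target].append(anchor)

print(reputation["mysite.com/"])
# ['rope hammocks', 'rope hammocks', 'click here']
```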

This accumulation of link text at a target page did not have a name in the spring of 2002 when I was completing OptiLink, so we had to come up with one to tell people what we were doing. We decided to name this analysis of the links Link Reputation, to differentiate it from Link Popularity, which names the mere counting of links.

Link Reputation, because of the way Google handles links, is not "transitive": a link from page A to page B adds to the Reputation of page B, but none of the links to page A have any influence on the Reputation of page B. For example, suppose we get a link to our hammock site from a site on gardening, and the gardening site has a bazillion links to it that say "gardening". Will our hammock site rank for gardening? No. This is easy to prove for yourself by looking at search results and doing some analysis with OptiLink.

But PageRank, by contrast, is transitive. It assigns a number to every page in the Google index using a recursive computation over the entire link graph of the index — a mathematical implementation of "what goes around, comes around". PageRank is one of those "elegant" ideas that takes some thinking to understand. If you are interested, take a look at my publications archive, my Dynamic Linking eBook, or my Mastering PageRank video.
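A toy implementation makes the recursion, and the transitivity, concrete. This is a minimal sketch of the published formula with the customary 0.85 damping factor; the four-page graph is invented for the example:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power-iteration PageRank over a dict of page -> outbound links."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page shares its rank among the pages it links to, so
            # rank flows transitively through the graph: what goes around,
            # comes around.
            inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * inbound
        pr = new
    return pr

# Every internal page links to home, so rank accumulates there.
graph = {
    "home":     ["products"],
    "products": ["home", "widget"],
    "widget":   ["home"],
    "about":    ["home"],
}
for page, rank in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page:9s} {rank:.3f}")
```

Run it and "home" collects the most rank, because the rank of every page depends on the ranks of the pages linking to it, which in turn depend on theirs.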

But for now let’s talk some about balance.

We’ve all seen cases where a page with few links can beat a page with many links if it has higher PageRank (from its few links). The converse is also true. So how do we get top ranking? First, we do the best we can on each of the major factors (title, link text, PageRank), and then we look at the pages we are competing against and find their weakest aspect. It might be that we can create better link text; maybe their title is not as good as it could be; or maybe we have to build more PageRank. Whatever it is, the search results themselves tell us what works; we just have to do better.