In August of 2005 I tested for — and passed 🙂 — my 3rd dan promotion test in tae kwon do. Here is the testing committee.
Left to right are Master J.K. Kim, Grand Master B.C. Kim (my instructor), Master Lim, and Master Park. With all due respect to TKD instructors the world over, I believe you will not find a finer collection of TKD talent anywhere.
Testing Committee for my 3rd dan test
Boards actually do hit back
Bruce Lee said "Boards don’t hit back" — which made for a great movie line, but it ain’t quite so: Newton’s laws assure us that boards hit us back with a force equal to the force we hit them with. That’s fine as far as it goes, but it’s not much help either. The trick, ultimately, is to strike fast and let the derivatives do the work on the relatively inflexible materials we break, while doing little or no damage to the highly flexible body structures we break with.
This esoteric description is neither necessary nor ultimately particularly helpful in actually breaking stuff — that’s purely about doing it, no matter what your brain tells you ;-).
Here are some samples of my breaking from my 3rd dan promotion [click’em to take a closer look].
Smiling through the pain
A certain amount of testing is just figuring out what you can make yourself do when you obviously can’t do anything more. To heighten the tension, and thereby make everything more difficult, every test is different, and the specifics of the challenges are invented by the testing committee seemingly on the fly.
Case in point: Master Lim had already directed the 1st dan candidates to do 100 kicks and the 2nd dan folks 200, so it was only logical that we 3rd dan candidates should do 300. Here I am near the end of that ordeal, gutting out the last of them. This picture catches me smiling — other moments might show more of a scowl, but I find the smile generally feels better.
Links that open a new window. Do they pass PageRank and Link Reputation?
Spiders are simple little animals, and the theory that guides the way search engines are built admits of few special cases, so generally speaking, a link is a link is a link. Let’s consider some cases.
The most frequent question is what happens in the target="_blank" case. This construct is so common that it would have staggering consequences if it did not act as a "normal" link. Moreover, treating it differently would be theoretically unsound: in terms of citation, linking into a new window is not conceptually different from linking in the current window.
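To see why this is so, here is a minimal sketch of how a spider-style parser sees links: it pulls the href out of every &lt;A&gt; tag and simply never consults the target attribute. This is illustrative only — not the code of any actual search engine.

```python
# Minimal spider-style link extraction: href is collected, target is ignored.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag; never looks at target."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            if "href" in attrs:
                self.links.append(attrs["href"])

html = '''
<a href="/page-a.html">normal link</a>
<a href="/page-b.html" target="_blank">opens a new window</a>
'''
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # both hrefs come out; target="_blank" changed nothing
```

Both links are collected identically, which is exactly the behavior you would expect of any spider built on a simple parser like this.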
The question of the use of style classes and DOM ids also comes up. I am certain that these are simply ignored by all search spiders, but that is a subject for another day (why spiders are so dumb).
BTW, OptiLink and OptiSpider both treat all of these links the same. The only exception is the rel="nofollow" attribute (Google’s simplified approach to Dynamic Linking) which is optionally processed by both programs so that our computation of Link Reputation will follow what Google does.
What kinds of “links” are spiderable
When most webmasters talk about links, they almost always mean "navigation". There are several ways to connect your pages together so that humans can get from one to the next.
By contrast, there is precisely one way for spiders to get from one page to another: the HTML <A> tag. Looking at a page in your browser, you often cannot tell which navigation is actually made of spiderable links. That is one of the primary uses of OptiSpider.
OptiSpider, like the search engines, is only able to follow <A> tags — all other kinds of navigation, such as JavaScript menus and form buttons, are completely invisible to it.
So, if you are uncertain whether your navigation is spiderable, just run OptiSpider and see which links are found and which are not.
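The "only <A> tags are spiderable" rule can be sketched in a few lines. Given a page containing a real link, a JavaScript onclick, and a form button, a spider-style parser finds only the <A> href — the other navigation simply vanishes. (A rough sketch, not OptiSpider’s actual code; the page content below is invented for illustration.)

```python
# Sketch: of three kinds of navigation, only the <a> tag is visible to a spider.
from html.parser import HTMLParser

class SpiderableLinks(HTMLParser):
    """Collects hrefs from <a> tags and nothing else."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.hrefs.append(value)

page = '''
<a href="/catalog.html">Catalog</a>
<div onclick="location.href='/specials.html'">Specials</div>
<form action="/search.jsp"><input type="submit" value="Search"></form>
'''
spider = SpiderableLinks()
spider.feed(page)
print(spider.hrefs)  # only "/catalog.html" -- the JS and form navigation vanish
```

The onclick handler and form action never register as links at all, which is why such navigation is invisible to spiders.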
What are the problems in ranking?
First, let’s define terms.
Dynamic content generally means that pages are generated by some chunk of software by reading data from a database. This is a common design pattern for big catalog style websites.
But no matter how the content gets generated, it still ends up being just plain-ol’ HTML by the time it leaves your web server. What the browsers and search engine spiders see is no more nor less HTML than so-called "static" pages. So the content itself is certainly not a problem — the URLs might be.
To generate the right output, a catalog style site will typically use one or more parameters in the query string. For example, /product.jsp?model=12&color=green&style=7.
Historically, search engines have not liked these complex URLs because they will routinely lead to a (nearly) never-ending sequence of pages. A "plain" URL — one without a query string — tended to have an advantage over complex URLs.
This is not as much of a problem today as it once was, but it is still better to avoid complex URLs. Doing so requires a URL rewriting engine like Apache’s mod_rewrite module, so it is fairly technical, but once done, a "dynamic" website can be made to look entirely "static".
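As a sketch of what such a rewrite can look like, here is a hypothetical mod_rewrite rule that would present the product URL from the earlier example as a "plain" path while the server still runs the original query string internally (the path layout is invented for illustration):

```apache
# Hypothetical: serve /product/12/green/7 as a plain URL while the
# server internally runs /product.jsp?model=12&color=green&style=7
RewriteEngine On
RewriteRule ^product/([0-9]+)/([a-z]+)/([0-9]+)$ /product.jsp?model=$1&color=$2&style=$3 [L]
```

Spiders then see only the query-string-free URL, and the "dynamic" site looks entirely "static" from the outside.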
Do the engines discount links from the same domain versus links from a different domain?
Not now. Not ever. To be sure, just follow the money.
Big sites involve big dollars, and it is big dollars that make the world go ’round, so you can bet that the now-public Google will have joined the other publicly financed engines in a gentle but certain catering to the American greenback.
Consider: If my name is Bill and I build a 10,000 page website to support my software business, I expect to have a high PageRank at Google and I expect to be able to control the Link Reputation of my home page using my thousands of internal links. Now suppose some disgruntled open-source weenie links to me with the text "Windows is Evil" and gets ranked ahead of me for my own product name? I would have good reason to be upset.
So, you can bet that at the very next social event for young billionaires, Bill will corner Larry and get it fixed.
But seriously, if internal links were significantly discounted relative to external links, then small sites would always gain an advantage over large sites. This is very bad. In general, big sites actually do deserve to rank better than small sites, external linking being more or less equal, and counting internal links accomplishes that automatically.
If you rely instead solely on external links, what you will find is that the first to get a top rank will continue to keep top rank because it is top ranked pages that get most of the links. This would make it even harder for a large site to displace a small site that happens to get top ranking.
And finally, just go look at some search results with OptiLink. Big sites have a clear advantage. Do they have more external links than small sites? Only some of the time. It still appears that links from any source are sufficient, so they might as well be your own.
Is a MiniNet an effective strategy for ranking against a million competitors?
The simple answer is yes. If that’s all you need to know, stop now and get back to work! 😉 But if you want to know why…
I am often asked variations of this question, and the answer is always yes. Huh? Much like in the Hitchhiker’s Guide, it is the question itself that is wrong, which is why it always leads to the same, unhelpful answer.
The capacity to rank depends on the total number of pages plus the strategy used to link those pages together. Megasite, MiniNet, blog — doesn’t matter — pages are pages, and the way they are organized into domains simply does not matter. It is the linking that matters.
Once you understand how to do the linking to make best use of the pages you do have, then you can get to the "right" question — the one the Vogons (almost) destroyed to build an intergalactic bypass. 😉 Fortunately I caught it just in time.
For any ranking task, the real question is "how many pages do I need?"
Pages are the ultimate source of ranking power. Smart linking lets you make the best use of that power. If you are not ranking where you want to, you must either use the power you have more effectively (via linking) or increase the raw power available (via more pages). Most ranking solutions involve some of both.
So back to those MiniNets…
Michael Campbell’s network structures are some of the best at using ranking power, so they are indeed a good place to start for most purposes (of course, there are always exceptions), leaving us only the real question of "how many pages". That is what OptiLink is designed to help answer. By examining the quantity and quality of linking employed by top ranking pages, you can estimate what you will need to build to be top-ranked yourself.