18-1

So I’ve had a couple of months to digest this one. I didn’t want to post anything until I could step back and look at the issue with a clear head. The Patriots are my home team, and as such there is always resentment when they lose. Like any loyal fan, I pass blame – the referees didn’t call the game fairly, we had a freak injury, the other team got away with something they shouldn’t have, etc.

In the end though, the more I think about it the more I come to the same conclusion. The Giants simply outplayed us. They wanted it more. This very thought frightens me.

The Patriots always won because they wanted it more than the other guy. They might have less talent, less speed, or fewer star players – it didn’t matter. Somehow they’d pull it through at the end. My fear is that after 3 Super Bowls, after years of success, and after being labeled the new dynasty by everyone else, they started believing their own hype.

We’re used to seeing Brady with the ball, under 2 minutes on the clock, and seeing fear in the other team’s eyes. They know he’s going to drive down the field. They know he’s going to pull off the comeback. They know that their worst nightmare is about to be realized.

The last 2 years we’ve gotten used to a different sight. Brady with the ball, under 2 minutes on the clock, and the other team stopping us.

Maybe it’s just other teams catching up. Maybe it’s parity catching up. I really hope it’s not us losing the core of our team – that hard-working, blue-collar, underdog philosophy that made us all proud to be Patriots fans. I’m thankful for what the Patriots have given us and for players like Bruschi. I realize we can’t win every year. But to get so close to the perfect season, to the greatest season in football history, to Mercury Morris finally shutting the hell up… and to fall short. I just don’t know.

Sadly, I find myself for the first time in a long time not wanting to watch football. Not caring about the draft. Not caring that we let possibly one of the best cornerbacks in the league go to sign an aging and (in the playoffs) underperforming wide receiver. Not looking forward to next season.

I miss that anticipation and love for the sport. I want it back. I fear it died on a Sunday in early February, when the undefeated became perhaps the greatest disappointment in football history.

I wish I knew where we went wrong.

Random Tidbit: Being a self-proclaimed – ok, maybe publicly proclaimed – geek, I found this blog post on why geeks make good lovers to be self-satisfying. Is it true? Find out. Date a geek.

Search engine optimization (SEO) techniques

Search Engine Optimization (SEO) is a strategy to allow a site to rank in search engines (Google, Yahoo, Ask) for particular terms. Typically the goal is to rank in the top ten for terms relevant to the main focus of the site and within the top 1-2 pages (typically 10-30 results) for secondary focus areas. Since most new sites are found via search engine results, this becomes the main source of traffic for smaller sites like blogs and startups. However, even the established web staples rely heavily on this referral traffic.

In order to SEO a site, a dual strategy is needed.

Internally, a site must have good technical design and well-written content. This makes it more attractive to search engines and helps with “natural indexing”: a search engine spider finding a single link to your site and being able to traverse the entire site tree to add it to its database.
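
As a minimal illustration (the page names and URLs here are made up), natural indexing only works if pages are reachable through plain anchor links a spider can follow:

    <!-- A spider landing on this page can follow each href and
         discover the rest of the site without any outside help. -->
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/archive">Archive</a></li>
      <li><a href="/about">About</a></li>
    </ul>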

Externally, a site must rely on strong inbound links in order to build the trust factor that search engines (mainly Google) associate with its domain. Means of accomplishing this include using social bookmarking sites (Del.icio.us, Ma.gnolia), social news / technology sites (Digg, Reddit), popular blogs (TechCrunch), and niche link building (inbound links from other sites that rank for the same search terms).

Internal Design

Internal design should focus on semantic web design and well-written content. On the web, it’s said that “content is king.” Well-written content will trump any attempts at “keyword stuffing”, hidden keywords, or any other “black hat” SEO strategies (those frowned upon and/or banned by search engines). While black hat strategies might earn a short-term gain, the search engines inevitably catch onto the strategy, resulting in a long-term loss: either the site’s trust is reduced so it ranks lower, or it is banned from the index altogether.

Semantic design is the practice of writing HTML code so that content on the page is contained in semantic elements. This movement came about after the fiasco of 1990s web design, including “table-itis”: using tables and other semantic elements non-semantically in order to display the page the way the designer wanted. With the widespread acceptance of CSS and its (mostly) complete implementation in browsers such as Firefox, Opera and IE 6+, the move to semantic design began in earnest and gained a foothold in the web standards community.

At its heart, semantic design is simply wrapping content in elements that describe it semantically: paragraphs in p tags, lists (often navigation links) in ul (or, if ordered, ol) tags, tabular data (like charts or Excel documents) in tables, definition lists in dl tags, and headers in h1-h6 tags. The use of non-semantic tags – mostly divs and spans – along with liberal use of classes, ids and Cascading Style Sheets (CSS) then allows the designer to keep semantic content in semantic tags while still displaying it in any manner they wish.
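
A minimal sketch of the idea (the content and the class name are invented for illustration):

    <!-- Semantic elements say what the content is... -->
    <h2>Weekly Update</h2>
    <p>Here is what happened around the site this week.</p>
    <ul class="site-nav">
      <li><a href="/archive">Archive</a></li>
      <li><a href="/contact">Contact</a></li>
    </ul>

    <!-- ...while CSS says how it looks, e.g. turning the
         navigation list into a horizontal menu: -->
    <style>
      .site-nav li { display: inline; list-style: none; }
    </style>

The markup stays meaningful to a spider no matter how the stylesheet changes the presentation.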

The reason semantic design is important is that it tells search engines what the data means: it outlines header hierarchies to allow for keyword sensing, and it shows how data is formed and related (paragraphs under a header forming a “section”, etc.). Since search engine spiders can only parse, not actually read, the data, this lets them parse the site more intelligently and results in better keyword matching for the site.

The final internal design facet is likely one of the most important: the title tag. Its content shows only at the top of the browser window, above the address bar, yet it is thought in the SEO community to be the most highly weighted element by spiders. Having unique, meaningful, concise, and useful titles on each of your pages is the first step to being indexed for the terms you want.
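
For example, on a hypothetical page about dog breeding (the wording is only illustrative):

    <head>
      <!-- Unique and descriptive: tells spiders exactly what
           this particular page should be indexed under. -->
      <title>Dog Breeding Basics: Choosing a Healthy Sire</title>

      <!-- Compare a generic title reused on every page, which
           gives spiders nothing to work with:
           <title>My Site</title> -->
    </head>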

After the title tag, it is surmised that the header elements h1-h6 are the next most heavily weighted internal elements, because they perform a function like a “table of contents” for the page. These should be used intelligently and not abused, though, as abusing them can be considered “black hat” as well.
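
Continuing the hypothetical dog breeding page, the headers should form the same outline a table of contents would (the indentation is only to make the hierarchy visible):

    <h1>Dog Breeding Basics</h1>
      <h2>Choosing a Sire</h2>
        <h3>Health Screening</h3>
      <h2>Whelping and Aftercare</h2>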

External Link Building

Beyond good internal design, a well-executed inbound linking strategy is key to SEO. In the SEO community this is thought to be the most important overall part of the process. Search results tend to support this thinking, as a site that has poor internal design but strong inbound linking for its terms will often rank higher (sometimes much higher) than a well-designed site with poor inbound linking.

Google is the largest search engine and likely the one that values this most. Although its algorithm is unknown, many hypotheses have been put forth by the SEO community, and search results seem to validate them.

The first hypothesis is that search engines (specifically Google) place an amount of “trust” on a domain and page (sometimes confused with PageRank). This trust for particular search terms corresponds one-to-one with how that page and domain rank for those same terms.

In order to build this trust, a site must come to be seen as an expert for its terms. Typically this is shown by inbound links that meet a combination of criteria. The most important is the number of links, combined with some sort of freshness multiplier. The more inbound links for a term, the more trust. The freshness multiplier comes into effect when, for example, an older site has more links for a term but has not received any recent links for those terms; a newer site with fewer overall links but many recent links for those terms might then have more trust. The logic is that data is timely, so new links earn more trust than many old ones.
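
To make the hypothesis concrete with invented numbers: if every link counted as one point and links from the last month counted double, an older site with 100 stale links would score 100, while a newer site with 40 old links and 40 recent ones would score 40 + 80 = 120 and out-trust it. The real weighting is unknown; this only illustrates the shape of the idea.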

Beyond the total number of links, there are links from other sites that already have trust for the terms. So, for example, if a site wishes to rank for “dog breeding”, having inbound links from other sites that rank well for “dog breeding” shows spiders that those trusted sites consider the linked-to site a peer.

Finally, the terms in and around the anchor text of the referring link assign terms to the target page. A link with the text “dog breeding”, continuing the previous example, would pass on trust for that keyword phrase. This is thought to be the least heavily weighted method.
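
In the dog breeding example, a referring site passing trust for the phrase might link like this (the URL is a placeholder):

    <!-- The anchor text "dog breeding" hints at what the target
         page should rank for; the surrounding words add context. -->
    <p>For advice on <a href="http://example.com/">dog breeding</a>,
       see this excellent guide.</p>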

There are many other hypotheses, but these seem to be the most prevalent and widely trusted.

Inbound links are typically generated through networking in the niche community a site is looking to enter, as well as by using popular social networking sites (Digg, Reddit, StumbleUpon, Del.icio.us, Ma.gnolia) to increase the exposure of the site and, hopefully, gain inbound links from various sources. A campaign of using social networking sites intelligently to garner inbound links is typically referred to as “viral marketing.”

In conclusion, SEO relies on both internal and external methods. The most important is a strong campaign of collecting links from valuable sources, preferably in the same niche. The second most important is strong internal design, so that when a spider reaches the site it has the best chance of indexing it correctly and ranking it for the preferred terms.

Random Tidbit: Want to learn more about SEO? Try reading some of the 15 most popular SEO websites. If you use WordPress, learn more about improving its SEO – I actually use a different plugin called Add Meta Tags. Finally, check my SEO page on Ma.gnolia for more interesting sites and tools I find.