Search Engine Optimization, or SEO, is simply the act of making the pages and navigation of your site as friendly to the search engines as possible. You make sure that you take into account all the factors that help tell the search engine what your page is about.
As far as I am concerned, it is about getting targeted traffic to the appropriate pages. It is not about driving sheer numbers to pages that do not serve their needs. In fact, sometimes it is useful to use your knowledge of SEO to reduce your traffic from inappropriate keywords.
I am also not much of a fan of most professional SEOs. There are certainly some good ones out there, but the truth of the matter is that most of them rely on the trick-of-the-day gleaned from the various SEO forums to get you to the top quickly and easily for now, while doing little to secure your long-term rankings. Then again, who can blame them? Most people aren't going to pay them to slowly build up traffic in a natural manner.
When putting links on one of your pages to other pages on the same site, there are three basic choices:
- Full URL - This is the protocol + domain + path version where there is absolutely no mistaking where you are intending to point. An example of this would be http://www.example.com/directory/file.html.
- Absolute Path - This version does not include the domain name, but does have the entire path from the root directory down. It always starts with a "/" instead of a directory name, filename, or a ".". An example would be /directory/file.html.
- Relative Path - This version is generally the shortest and points to the target relative to the current position in the directory tree. If you are in /dir1/file.html and you link to "../file1.html", you move up one directory and open /file1.html. If you link to "dir2/file2.html", you go down one level to /dir1/dir2/file2.html, because the browser looks for dir2 relative to your current location in dir1. Link to a file "file3.html" and you end up at /dir1/file3.html.
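The resolution rules above can be demonstrated with Python's standard-library `urljoin`; the example.com URLs and paths here are just the made-up examples from the list, not real pages:

```python
# Illustrating how the three link styles resolve against a current page,
# using urllib.parse.urljoin from the standard library.
from urllib.parse import urljoin

# Pretend the browser is currently viewing this page.
base = "http://www.example.com/dir1/file.html"

# Full URL: nothing to resolve, it points exactly where it says.
print(urljoin(base, "http://www.example.com/directory/file.html"))
# -> http://www.example.com/directory/file.html

# Absolute path: replaces everything after the domain.
print(urljoin(base, "/directory/file.html"))
# -> http://www.example.com/directory/file.html

# Relative paths: resolved against the current directory (/dir1/).
print(urljoin(base, "../file1.html"))    # -> http://www.example.com/file1.html
print(urljoin(base, "dir2/file2.html"))  # -> http://www.example.com/dir1/dir2/file2.html
print(urljoin(base, "file3.html"))       # -> http://www.example.com/dir1/file3.html
```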
Lately I've been trying to come up with some ideas for some new websites, and some of these sites seem to be better suited for software like Scoop than Drupal. I'm actually a member at several Scoop sites and really like the interface, as well as the ability to rate comments and gain Mojo.
The big problem is that Scoop sites are now being targeted by spammers, and there are no effective tools available to help deal with them. The idea that users will vote down story submissions that are full of spam seems to be valid, even on non-tech sites, but the only sites where users gain enough mojo to be able to hide spammy comments seem to be the tech-centric sites. That just doesn't work out well. And if you enable diaries, only admins can delete spammy diary entries.
On the various fora that discuss search engines, there are regularly posts by rather misguided webmasters who view Google as the enemy in their battle to get higher rankings in the search results.
Let me make this as clear as possible, Google is only your enemy if you make them your enemy!
In fact, I would go so far as to say that Google should be one of a webmaster's best friends. They send you traffic for free, and they will continue to do so in varying amounts for as long as you play by their rules. Remember, Google does not owe you any of that traffic or any given ranking.
Google is now making Google Analytics available free to AdWords users, as well as to anyone who serves up fewer than 5 million pages a month (like me). Google Analytics was the Urchin statistics package before Urchin was bought by Google.
Just for fun, I decided to sign up to use it on my Yahoo! store, since Yahoo! has such shitty statistics tracking. The first thing that became painfully obvious is that Google didn't plan for the popularity of their offer! Just signing up was painfully slow. The pages containing the forms you have to fill out were taking several minutes each to get served up.
Drupal actually has a pretty sucky standard design from both a processor use and search engine optimization perspective.
Fortunately, at least the SEO part of the equation is helped out with the addition of a couple of modules and a setting.
Most important is the "Clean URLs" setting, which uses mod_rewrite in Apache to change the URLs from http://www.example.com/index.php?q=node/107 to http://www.example.com/node/107. The "?q=" part, in addition to being rather ugly, has been known to confuse some search engine spiders. Not to mention that if you add more than one or two query values, some spiders just refuse to follow those links at all.
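For the curious, the rewriting itself is just a few lines of mod_rewrite. This is a sketch of the rules along the lines of what Drupal's stock .htaccess ships with; check the .htaccess bundled with your own Drupal version for the exact rules:

```apache
RewriteEngine on
# If the requested path is not a real file or directory on disk...
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# ...hand the path to Drupal internally as the q= query argument,
# so visitors and spiders only ever see the clean URL.
RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
```

The nice part is that the ugly "?q=" URL still exists internally; the rewrite just keeps it out of sight of the spiders.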
I've decided that I'm going to try out one Drupal module a day on this site, in addition to any content that I might add. Today's module is the one that generates a sitemap of all the files that are available for Google to crawl.
Google sitemaps give the webmaster the ability to tell Google which files exist and when they last changed. This reduces the bandwidth requirements for both Google and my website. It also lets Google know when a new page has been created, so you don't have to wait for Googlebot to find it on its own.
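The sitemap file itself is plain XML, and you can build one with nothing but the standard library. This is a minimal sketch, assuming the current sitemaps.org schema and a made-up URL and date; the Drupal module does all of this for you:

```python
# Building a minimal sitemap XML document with the standard library.
import xml.etree.ElementTree as ET

# Namespace of the current sitemap protocol (sitemaps.org).
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: list of (url, lastmod) tuples -> sitemap XML as a string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url        # the page's address
        ET.SubElement(entry, "lastmod").text = lastmod  # last change date
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example entry:
print(build_sitemap([("http://www.example.com/node/107", "2005-11-18")]))
```

Googlebot fetches this one small file, sees which lastmod dates have changed, and only re-crawls those pages.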