You’re Sabotaging Your Website’s Success!


Regardless of how much time an SEO spends optimizing a website, neglecting the basics and smaller details can seriously impact a website’s ability to succeed. Sure, you may have spent several months testing the effects of NoFollow sculpting, but if you got too caught up in that one approach and let the smaller details slip, you may have missed out on quite a bit of potential organic search engine traffic. So, while they may not be as exciting as some of the newer approaches in the world of SEO, the basics can be the difference between a website’s search engine success and failure. In the spirit of taking care of business on a basic level, let’s look at a handful of basic strategies you may be overlooking:

ALT Tags: In my opinion, ALT tags can be a more useful tool than META keyword tags. Why? Well, there are a couple of reasons. For one, although ALT tags aren’t the only element search engines use for indexing images, they do play a role. By including a few descriptive (non-keyword-stuffed) words that explain what the image is, you can give the search engine image crawlers a nudge in the right direction. Secondly, ALT tags can help you pick up some long-tail traffic. This is especially true if you are using images that fit with your content but aren’t directly related to the topic of your website. Once again, ALT tags aren’t a place to stuff keywords, but when used properly, they provide an extra opportunity to pick up some additional long-tail traffic. Finally, using ALT tags is a good usability practice. Whether it’s for someone using an Internet accessibility tool such as a screen reader or just in case your page fails to load properly, having a concise, accurate description of what an image is can help ensure the best experience for your users.
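As a quick sketch of what that looks like in practice (the file names and descriptions below are made-up examples, not anything from a real site), a descriptive ALT attribute is simply a short phrase describing the image:

<img src="colored-lego-bricks.jpg" alt="Pile of colored Lego bricks">

For purely decorative images, an empty ALT attribute (alt="") tells screen readers to skip the image rather than read out its file name.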

Robots.txt: Although I have always seen the robots.txt file as a fundamental element of SEO, I’ve talked to countless people within this field that don’t ever use this tool (and some that have never even really heard of robots.txt). In case you aren’t familiar with what a robots.txt file is, it’s a plain text file placed at the root of your site that controls what content you want search engines to crawl and index. Even if there’s nothing that you want to block the search engines from indexing, it’s still a good practice to create a robots.txt file. Here’s what a robots.txt file looks like if you want to allow the search engines to index all of your content:

User-agent: *
Disallow:

So, why would you want to use robots.txt to block certain content from being indexed? Well, WordPress is one of the best examples. If you are having duplicate content issues, you can use robots.txt to make the search engines index only one version of your content instead of multiple versions. With the help of robots.txt, you can tell search engine crawlers to only index your original posts instead of your archives, category pages, etc.
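As a minimal sketch of what that might look like for a WordPress site (the paths below are common WordPress defaults and examples — adjust them to match your own permalink structure):

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /category/
Disallow: /tag/

With rules like these, crawlers can still reach your individual posts, but the archive and category pages that duplicate their content are kept out of the index.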

Source Code: If you have ever looked at the source code for a page that was designed during the 90s, you probably cringed. Unfortunately, there are still people designing websites using the same awful source code structure that has been around for over a decade. Whether it’s the result of design software or just bad habits, ugly source code not only creates an unattractive website for human visitors, but it can also create problems for search engine crawlers.

Although search engines would like you to think otherwise, their bots aren’t perfect. So, in order to ensure that they can properly crawl your website, you need to make life as easy for them as possible. This means cleaning up your source code and getting rid of all the messy elements that can cause them to have problems indexing your site. If you are still using tables to design your website, it’s time to learn the new principles of web design. Even if you are using CSS, make sure it’s stored in an external file. Finally, although most fail to do so, I recommend using PHP files to store and call the different elements (such as JavaScript) that make up your website.
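As a rough sketch of the idea (the file names here are placeholders), the head of a clean page simply references external files instead of cramming styles and scripts inline:

<head>
  <title>Page Title</title>
  <link rel="stylesheet" href="style.css">
  <script src="script.js"></script>
</head>

This keeps the markup the crawlers actually parse short and focused on your content, while the styling and behavior live in separate, cacheable files.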

URL Canonicalization: Although the definition of URL canonicalization (“the process by which URLs are modified and standardized in a consistent manner”) sounds somewhat confusing, the concept is fairly easy to grasp. In regards to search engine optimization, URL canonicalization refers to search engines picking the best version of a URL to index (for example, the www or non-www version of the same page). Even though they do a good job of this, as I already mentioned, search engines aren’t perfect. This means that they can end up indexing multiple versions of your website, which then causes you to have a duplicate content issue on your hands.

This may sound like a difficult issue to tackle, but it’s actually not. With the use of a 301 redirect, you can ensure Google and other search engines index the www version of your website and not the non-www. To do this, simply create a file called .htaccess and paste the code below into that file (of course, you’ll want to replace example.com with the name of your website):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
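If you’d rather standardize on the non-www version of your domain instead, the same idea simply works in reverse (again, example.com is a placeholder for your own domain):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

Either way, the important thing is to pick one version and redirect the other to it, so the search engines only ever see a single canonical URL for each page.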
