Viewing a Site through a Search Engine Spider’s Eyes


Search engine optimization is a difficult balancing act. On the one hand, you need to make your site surfer-friendly, so visitors will click your links and, with luck, spend money while they’re there. On the other hand, search engine spiders couldn’t care less about flashy web design; they just want the bare-bones information. If a website were looked at through a spider’s eyes, it would be the dullest website on record. But that’s exactly what optimizers should do, to make sure a site is properly spidered and indexed.

People may balk at Google’s new terms, but many of Google’s policies are genuinely improving the web surfing experience. Gone are the days of overly keyword-stuffed content, so a credit card site no longer has to work in “credit card cheap site” or whatever other ungrammatical keyword phrase, and that’s a good thing. In a way, Google’s new SEO rules blur the line between how a spider and a surfer see a website, so that content written for spiders is still intelligible to regular readers.

Still, though, it makes sense to see how your site is viewed by spiders. There are a couple of ways to do this. Google’s Matt Cutts himself recommends the Lynx viewer, which “allows webmasters to see what their pages will look like when viewed with Lynx, a text-mode web browser. It is also, presumably, how search engines see your site. In addition to that, it can help determine if web pages are accessible to the vision impaired.” That’s optimization on several different levels.
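If you can’t install the Lynx viewer, a rough approximation of that text-only view can be knocked together with nothing but the Python standard library. This is a minimal sketch, not the Lynx viewer itself or how any particular search engine actually parses a page, and the URL is just a placeholder to swap for your own:

# Rough sketch: fetch a page and strip scripts, styles, and tags to
# approximate the bare-bones text a text-mode browser or crawler is left with.
# "http://example.com" is a placeholder URL.
from html.parser import HTMLParser
from urllib.request import urlopen


class TextOnly(HTMLParser):
    """Collects visible text, skipping the contents of <script> and <style>."""

    def __init__(self):
        super().__init__()
        self.skip = 0          # depth inside script/style elements
        self.chunks = []       # pieces of visible text

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())


html = urlopen("http://example.com").read().decode("utf-8", errors="replace")
parser = TextOnly()
parser.feed(html)
print("\n".join(parser.chunks))  # the "dull" skeleton of the page

Run it against one of your own pages and what prints is, more or less, the unglamorous text skeleton the rest of this post is talking about.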

The Firefox extension Web Developer has similar capabilities. With Web Developer, website owners can disable JavaScript, cookies, styles, and images to get a spider’s-eye view of their own web pages.

However, it’s interesting to view not only your own site this way, but the competition as well. Or not even the competition in your own niche, but brands so popular that they likely don’t even need the SEO help. In short, it’s instructive to see how a multitude of different websites are put together; it’s like looking at the skeleton of cyberspace. You’ll find some common mistakes on popular sites, but then, major brands can afford to play around more with Flash and Java. After all, a brand like McDonald’s will get searches for the term “McDonald’s” and probably isn’t that worried about attracting visitors who are just looking for information on cheeseburgers.

So big-time brands aren’t really the best litmus test, even if they’re still pretty interesting. What you should really look at are the second-tier sites: the sites that need a blog and regularly updated content to ensure people find them via searches, i.e., sites that are as information-based as they are image-based. If you run a second-tier site that is as graphics- and animation-heavy as a major brand’s, you could run into serious trouble, as eye-catching as it might be. Take away those graphics and you may see just how spare your site really looks to spiders, and that might jumpstart new content creation.
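If you want a rough number for that spareness, here is a minimal Python sketch that compares the visible text on a page against the size of its raw HTML. The ratio isn’t an official SEO metric, just a quick gauge for comparing your own pages, and the URL is again a placeholder:

# Rough sketch: measure how many characters of visible text survive once
# markup, scripts, and styles are stripped, as a crude gauge of how "spare"
# a page looks to a crawler. "http://example.com" is a placeholder URL.
from html.parser import HTMLParser
from urllib.request import urlopen


class TextCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skip = 0          # depth inside script/style elements
        self.text_chars = 0    # count of visible text characters

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.text_chars += len(data.strip())


url = "http://example.com"
raw = urlopen(url).read().decode("utf-8", errors="replace")
counter = TextCounter()
counter.feed(raw)
ratio = counter.text_chars / max(len(raw), 1)
print(f"{url}: {counter.text_chars} visible characters, "
      f"{ratio:.1%} of the raw HTML")

A page that is mostly graphics and animation will show very few visible characters, which is exactly the “spare to spiders” problem described above.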
