Monday, November 2, 2009

Content - SEO Friendly Content

Search engines obviously place a lot of emphasis on a site’s content when assessing its relevance to a particular topic, and there are many ways they evaluate it. Many years ago, search engines figured that if a word was mentioned frequently on a page or site, then that site must be relevant to that topic. However, it became far too common for people to stuff a site full of keywords; often the wording was unnatural, and some people hid text by making it the same color as the background. Search engines realized that this did not help provide their users with truly relevant results and have since lessened the emphasis they place on keyword density. While it is still important to use a target keyword on a page or site, the search engines are aiming for relevance, and density isn’t the only measure. Therefore, it is very important to use target keyword(s) found in the target audience’s natural vocabulary.

When looking at content, one small way that search engines determine the importance of a word is how it is presented on the web page. Is it linked to another page? Is it in bold? These are hints search engines can use, since it makes sense that a word made to stand out in some way carries some importance.
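For example, a phrase can be given this kind of on-page emphasis with ordinary HTML (the page topic and URL here are invented for illustration):

```html
<p>
  Our guide to <strong>organic gardening</strong> covers soil preparation,
  and our <a href="/composting-basics">composting basics</a> page explains
  how to start a compost pile.
</p>
```

Here "organic gardening" is bolded and "composting basics" is both linked and used as the link's visible text, both of which signal that those phrases matter on this page.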

A common content issue for websites is duplicate content (both on the same site and amongst many websites). There are many reasons why content may be duplicated throughout a site (like browser and printer friendly versions of the same page). Search engines feel it is important to direct their users to the most relevant and original version of the content. Further, it can confuse the engines when identical content appears on a site in more than one place. There are a few ways to deal with this situation. One is redirecting from one version of the page to the main version of the page using what site administrators refer to as a 301 redirect. The other is to use the robots.txt file, a simple text file housed in the root directory of a website’s file structure that instructs web spiders to ignore a particular web page.
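Both approaches can be sketched briefly (the file names below are hypothetical). On an Apache server, a 301 redirect is often set up with a rule like this in the site’s .htaccess file, sending visitors and spiders from the printer-friendly copy to the main page:

```apache
Redirect 301 /article-print.html http://www.example.com/article.html
```

Alternatively, a robots.txt file placed in the site’s root directory can tell all spiders to skip the duplicate page entirely:

```text
User-agent: *
Disallow: /article-print.html
```

The redirect consolidates the two versions into one, while robots.txt simply hides the duplicate from spiders, so the redirect is generally the stronger choice when it is available.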

It is also important to note that search engine spiders cannot read all types of content. They cannot read the text within Flash, they cannot interpret the words in an audio file, and they are unable to determine the words used within a video. However, site designers can use markup (such as the ALT attribute on image tags) to help web spiders know what a piece of media is about. The Flash, audio, or video can also be surrounded by regular HTML text that describes the content. Using ALT attributes is only really crucial when the majority of a website’s content is unreadable to web spiders.
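As an illustration (the file names here are invented), an image and an embedded Flash video might be marked up with an ALT attribute and surrounded by descriptive HTML text like this:

```html
<img src="rose-garden.jpg" alt="Red roses blooming in a backyard garden">

<object data="pruning-demo.swf" type="application/x-shockwave-flash">
  <param name="movie" value="pruning-demo.swf">
</object>
<p>The video above demonstrates how to prune rose bushes in early spring.</p>
```

The spider cannot watch the video, but the ALT text and the paragraph beneath the embed give it readable words to associate with the media.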
