Website Accessibility and Compatibility

How many websites do you come across on a daily basis that do not work? Either a page doesn’t exist, the page doesn’t load, or the page loads but doesn’t display properly.

The reason this can happen is that not enough testing has gone into the site. Some webmasters assume that if it works for them, it will work for everyone, right? Wrong.

Different people use different browsers; the most common are IE, Firefox, Opera and, for Mac users, Safari. Just because a website displays fine in IE does not mean it will display the same in Firefox, which is why it is important to test your website’s compatibility in different browsers.

As well as testing how your site displays in different browsers, it is just as important to test the accessibility of your site.

Website accessibility is becoming more of an issue, so much so that website companies can be sued if their websites are not accessible to all users. It is in the interest of everyone who runs a website to check the relevant laws on website accessibility and what you can do to ensure your website is accessible.

There are many tools you can use to ensure your website is accessible. The World Wide Web Consortium (W3C) website offers guidelines for web content accessibility which will help you tweak your site and bring it into line to make it fully accessible. Another tool you can use for testing is webxact.watchfire.com. WebXACT is a free online service that lets you test single pages of web content for quality, accessibility, and privacy issues.
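
As a rough illustration (and not part of any of these tools), here is a small Python sketch that fetches a page and flags any <img> tags with no alt attribute at all. The URL is a placeholder, and the check is far shallower than the W3C guidelines or WebXACT, but it gives a feel for the kind of thing they look for.

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class MissingAltChecker(HTMLParser):
        """Collect the src of every <img> tag that has no alt attribute."""
        def __init__(self):
            super().__init__()
            self.missing = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attrs = dict(attrs)
                # A missing alt attribute leaves screen readers guessing;
                # an empty alt="" is acceptable for purely decorative images.
                if "alt" not in attrs:
                    self.missing.append(attrs.get("src", "(no src)"))

    url = "http://www.example.com/"  # placeholder -- use your own page
    checker = MissingAltChecker()
    checker.feed(urlopen(url).read().decode("utf-8", errors="replace"))

    for src in checker.missing:
        print("Image without alt text:", src)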

If you use Firefox, a useful extension which acts as a screen reader is Fangs. Fangs will show you how screen readers view your pages, enabling you to tweak the textual content of your webpages.
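
If you want a very rough feel for the linear, text-only view a screen reader works from (Fangs itself does far more than this), a quick sketch along these lines can help. The URL is a placeholder.

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class TextLinearizer(HTMLParser):
        """Collect visible text in document order, skipping scripts and styles."""
        SKIP = {"script", "style"}

        def __init__(self):
            super().__init__()
            self.skipping = 0
            self.chunks = []

        def handle_starttag(self, tag, attrs):
            if tag in self.SKIP:
                self.skipping += 1

        def handle_endtag(self, tag):
            if tag in self.SKIP and self.skipping:
                self.skipping -= 1

        def handle_data(self, data):
            if not self.skipping and data.strip():
                self.chunks.append(data.strip())

    url = "http://www.example.com/"  # placeholder -- use your own page
    parser = TextLinearizer()
    parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    print(" ".join(parser.chunks))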

Another thing to consider about your website is how fast your pages load. The quicker your pages load, the happier your visitors are. You can test the speed of your pages by using tools such as OctaGate.

Although the majority of Internet users are now on broadband, there are still those who access the web via dial-up. Therefore you should still consider the speed of your site and the load times of your pages.
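
As a back-of-the-envelope check (no substitute for a proper tool like OctaGate), you can simply time a plain HTTP fetch of the page. The URL below is a placeholder, and this ignores images, CSS and scripts, which make up much of the real load time.

    import time
    from urllib.request import urlopen

    url = "http://www.example.com/"  # placeholder -- use your own page
    start = time.time()
    body = urlopen(url).read()
    elapsed = time.time() - start

    print(f"Fetched {len(body)} bytes in {elapsed:.2f} seconds")
    # A 56k modem moves roughly 7 KB per second at best, so a large HTML
    # page alone can mean many seconds of waiting for dial-up visitors.
    print(f"Rough dial-up estimate: {len(body) / 7000:.1f} seconds at 56k")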

Broken URLs are common on a lot of websites. A website may link to an external website, yet never check whether that site or page still exists some months or years down the line. You may even move a webpage on your own site and not update your own links. W3C’s Link Checker is a useful online tool which enables you to check your website for broken links.
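
For a quick sanity check between runs of W3C’s Link Checker, a small script can collect the links on one page and report any that fail to load. The starting URL below is a placeholder, and this sketch only looks one level deep.

    from html.parser import HTMLParser
    from urllib.error import HTTPError, URLError
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Collect the href of every anchor tag on the page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href and href.startswith(("http://", "https://", "/")):
                    self.links.append(href)

    page = "http://www.example.com/"  # placeholder -- use your own page
    collector = LinkCollector()
    collector.feed(urlopen(page).read().decode("utf-8", errors="replace"))

    for href in collector.links:
        target = urljoin(page, href)
        try:
            urlopen(target, timeout=10)
        except (HTTPError, URLError) as err:
            print("Broken link:", target, "->", err)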

There are many tools on the web that you can use to check your website’s compatibility and accessibility. Testing is a very important part of your website’s performance and success. Testing does not just happen the first time your website goes live; it should be a continuous part of your overall maintenance schedule.

Duplicate Content

One of the easiest (and most annoying) things a website owner can do is copy content from your website and place it on their own without permission and without crediting you as the original author.

Search engines aim to deliver unique information to their users and do not want duplicate information showing in their results. They do not penalise websites for duplicate content, but they do choose which version of the content to index and rank. This can be a problem for the owner of the original content if search engines choose the wrong version.

Websites that display content taken from various other sites are called “scraper sites”. Scraper sites are the most worrisome concern for webmasters when it comes to how search engines handle duplicate content.

Luckily for us, Google offers some potential solutions to prevent scraper sites, as well as those who duplicate content on their own sites, from getting the credit for our content.

There are times on your own website when you could have multiple URLs pointing to the same content. A way to tell Google which version to use is by specifying the URL you would like search engine spiders to use; this can be done in Google Sitemaps. Nothing is guaranteed, but you might get lucky and the bots will take note of your preference.
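
As a sketch of that idea, the snippet below writes a bare-bones XML sitemap that lists only the URLs you would prefer the spiders to use. The URLs are placeholders, and submitting the file through Google Sitemaps is still only a hint, not a guarantee.

    # List only the version of each URL you want indexed, not every duplicate.
    preferred_urls = [
        "http://www.example.com/",           # placeholder URLs
        "http://www.example.com/products/",
    ]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in preferred_urls:
        lines.append("  <url><loc>%s</loc></url>" % url)
    lines.append("</urlset>")

    # Save as sitemap.xml and submit it through your Google Sitemaps account.
    with open("sitemap.xml", "w") as f:
        f.write("\n".join(lines) + "\n")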

Another suggestion is to authenticate ownership of your content. Unfortunately, although a good idea, it may not be the best solution, as most people who take your content are not interested in authentication.

There was a myth going around that search engines check the creation dates of content to determine which version to use; however, Vanessa Fox of Google shot that one to pieces when she said “creation dates aren’t always reliable”, so I guess we can’t rely on that.

Other ways of dealing with scraper sites taking your content include contacting the scraper site’s webmaster and threatening legal action for breach of copyright (if your content is copyrighted); sometimes this brings positive results and the webmaster will remove your content.

The truth is, there is currently no magic formula that deals with duplicate content issues; however, search engines such as Google are talking to webmasters and are interested in people’s ideas for ways to deal with these issues. It may take some time, but at least they are moving in the right direction.

Google Webmaster Guidelines

How well your website ranks in Google’s search engine depends on many different factors, such as:

  • How competitive is the term you are targeting?
  • How old is your website?
  • How many links do you have in comparison to your competitors?
  • How large is your website?
  • How well optimised is it?
  • How well is it structured?

The list can go on.

If you’re new to search engine optimisation and a bit unsure how to go about getting your site listed in the search engines (Google in particular), Google have provided basic guidelines to help you.

Of course, Google are not going to give too much away. All the guidelines do is tell you how Google want you to structure your site in a way that suits them, and in return, they say your website will at least be indexed in their search engine and will rank. There are never any guarantees about the position your website ranks at, but it will at least rank somewhere.

So, here are a few dos and don’ts for you to consider when trying to get your website indexed and ranking in Google.

Design and Content

  • You should try to make every page reachable from at least one static text link
  • Include a Sitemap
  • Create useful unique content
  • Use keyphrases on your site that people are searching for relating to your products
  • Use text instead of images, as spiders do not understand the text in images (I’m not actually convinced of this one, as they also suggest making use of ALT text in images, which spiders DO understand as far as I’m aware)
  • Check for broken links
  • If you use dynamic URLs such as www.domain.com?1234, be aware that not all search engines spider these URLs
  • Try not to add more than 100 links to an individual page (see the quick check after this list)
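
As a quick check against that last point, the sketch below counts the anchor links on a single page. The URL is a placeholder.

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkCounter(HTMLParser):
        """Count every anchor tag that actually carries an href."""
        count = 0

        def handle_starttag(self, tag, attrs):
            if tag == "a" and dict(attrs).get("href"):
                self.count += 1

    url = "http://www.example.com/"  # placeholder -- use your own page
    counter = LinkCounter()
    counter.feed(urlopen(url).read().decode("utf-8", errors="replace"))

    print(f"{counter.count} links found on {url}")
    if counter.count > 100:
        print("Consider splitting this page; Google suggests staying under 100 links.")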

Basic “Quality” Principles

  • Create a site for users not for search engines
  • Don’t trick the spiders by using unethical tactics
  • Don’t submit to link farms

Specific “Quality” Guidelines

  • Don’t use hidden text
  • Don’t use cloaking or sneaky redirects
  • Don’t spam pages with your keywords
  • Don’t create lots of websites with the same content
  • Don’t use pages to install viruses, trojans or other badware/spamware
  • If your site is an affiliate site, ensure your content is unique and adds value

So there we have it. This is how Google would like you to design and run your website. These, of course, are just their guidelines; your website is your own and what you choose to do with it is your business, but if you want to give it a chance to rank in Google and not be pulled up for anything Google doesn’t like, then these are at least good guidelines to go by.