SEO Checklist Article 9. A Step By Step Guide to Checking Common Errors on Your Website

SEO Checklist (a quick user guide with actionable tasks)

This is my quick-start user guide and checklist for fixing common errors on your website. It is designed for intermediate users who have some knowledge of Bing or Google Webmaster Tools, an understanding of basic HTML, and access to their website's root directory to upload files.

There is no guarantee that all of your website's URLs will be indexed by the search engine providers, but with regular maintenance of your website and compliance with the guidelines, you will increase your chances of a fully indexed website.


1. Pages with 4xx Status Code – Client Errors

These error messages point to problems on your website, including broken links, restricted areas and links to pages that have temporarily moved. When a visitor clicks such a link, they may get a 4xx client error.

Result - This will have a negative impact on your website's search volume: users do not hang around when they have so many other choices and will soon click away. Your authority and reputation will ultimately be affected, leading to a poor experience for your users.

Remedy - It is important to check for these errors regularly using Webmaster Tools or any other tools you have, fix them promptly and submit the fixed pages to the search engine providers through their interface while logged in. If you have moved pages, consider using a 301 Moved Permanently redirect; if whole directories have gone, add Disallow rules to your robots.txt to restrict indexing.

Consider implementing a customised 4xx error page. This is a friendly information page that directs users to another URL when an error is encountered. A customised page can keep users on your website, leading to better retention: you can redirect them to a newer page, display your Sitemap or send them back to your homepage.
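If your website runs on Apache, a customised error page can be wired up with a couple of lines in the .htaccess file at the root of your site. This is a minimal sketch, assuming Apache; the page names 404.html and 403.html are placeholders for pages you would create yourself.

# .htaccess – serve friendly custom pages instead of bare server error messages
# /404.html and /403.html are placeholder pages you would create yourself
ErrorDocument 404 /404.html
ErrorDocument 403 /403.html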

2. Pages with 5xx Status Code – Server Errors

These indicate server errors: the messages are sent when the server is aware there is a problem but cannot handle the request. Typical examples are bad gateway, gateway timeout, proxy errors and service unavailable.

Result - This will have a negative impact on your website's search volume: users do not hang around when they have so many other choices and will soon click away. As with 4xx errors, your authority and reputation will be affected, ultimately leading to poor user retention and high bounce rates.

Remedy - If you know you have no errors with your Domain Name Server (DNS) configuration, A records or URLs, then contact your ISP to discuss the connection issues your users are having. All of these errors are listed in the Crawl area of Webmaster Tools.


3. Using a robots.txt file in the root of your website

A robots.txt file is very useful for controlling crawlers and restricting access to directories and URLs you want kept out of the index. Google recommends using this method when requesting removal of URLs from Google's search results. That way you can help ensure parts of your website are not indexed and do not show up in search engine results pages.

Result – Helps you get your principal information indexed (not guaranteed) and stops crawlers from discovering parts of your website you wish to restrict.

Remedy - You can manage your robots.txt file in Webmaster Tools; there is a section dedicated to this, including a testing tool, so you can make sure your robots.txt is returning the right information to the search engines. It is important to have a working robots.txt file in the root of your website; full instructions and the testing tool are available there.
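As a minimal sketch, a robots.txt placed at the root of the site might look like the following. The domain example.com and the directory names /admin/ and /temp/ are placeholders for areas you want to keep out of the index; adjust them to your own site.

# robots.txt – lives at http://www.example.com/robots.txt
# Apply the rules below to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /temp/

# Tell crawlers where to find your XML Sitemap (see item 4)
Sitemap: http://www.example.com/sitemap.xml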


4. Using an XML Sitemap in the root of your website

The same applies to the Sitemap: use it to list all your useful, current page URLs. These URLs may then be indexed by the search engines as they crawl your website. This is particularly useful for new websites, as the crawl may not discover pages that have no links pointing to them; a Sitemap lets you list them all. Also consider using "Fetch as Google" under Crawl in Webmaster Tools to fetch, render and submit a URL to the Google index along with its related links.

Result – Helps you get your principal URLs and files indexed (not guaranteed) and submits all your URLs, even those with no links that the crawlers might otherwise miss.

Remedy - You can manage and administer your Sitemap in Webmaster Tools under Crawl, where there is a facility to add and submit a fresh Sitemap. The Sitemap can also be tested from Webmaster Tools; the testing tool will highlight errors and compatibility issues, which need correcting before processing. If successful, the URLs from your Sitemap are indexed, but this is not guaranteed and Google may drop some of them if they are too similar.
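A bare-bones sitemap.xml, again placed at the root of the site, follows the sitemaps.org protocol. The domain, URLs and dates below are placeholders; most content management systems and online generators can build this file for you.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the crawlers to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-09-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products/motor-spares/ford</loc>
    <lastmod>2014-09-15</lastmod>
  </url>
</urlset>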


5. WWW and non-WWW versions of your Website

WWW is used less nowadays, but you may still need to set your preferred domain in Webmaster Tools to reduce the chances of duplicate data. I find setting the preferred domain in Google Webmaster Tools the simplest way to do this.

Result – Helps avoid duplicate content and your website being downranked. Indexing of both versions of your website will be avoided and the search results will represent your preferred domain name.

Remedy - To do this, add both the www.yourdomainname.com and the non-WWW version of your domain to Webmaster Tools. You can then go to Site Settings - Preferred Domain and set the domain version you wish to prioritise. Google will then regard all information on the other domain as the same, meaning it understands your preference and will not index both or present two versions of your website in its Search Results Page.
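The Webmaster Tools setting covers Google; you can also enforce your preference at the server. This is a minimal sketch for Apache with mod_rewrite, assuming example.com is your domain and the non-WWW version is the one you prefer, placed in the .htaccess file at the root of your site.

# .htaccess – permanently redirect the www version to the non-www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]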


6. Pages with 302 Status Code – Redirects

302s are considered temporary, so they don't pass any link juice. It is recommended that you use a 301 to redirect to a page that you have moved.

Result – Avoid using 302s: they can lead to search engines continuing to index the old URL and then disregarding the new one as a duplicate. You may even find that link popularity gets divided between the two versions, which will damage your rankings.

Remedy - Use a 301, not a 302, whenever you permanently move a page or website. A 301 preserves the link juice and avoids the duplicate content problems that can otherwise occur.
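On Apache the difference is a single word in the redirect rule. A minimal sketch, with placeholder page names and domain:

# .htaccess – use a permanent (301) redirect for a page that has moved
Redirect 301 /old-page.html http://example.com/new-page.html

# A temporary (302) redirect like the one below should only be used
# for genuinely temporary moves, as it passes no link juice:
# Redirect 302 /holiday-offers.html http://example.com/seasonal.html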


7. Pages containing Meta “refresh”

It is best to avoid pages with a meta refresh; this is a violation of the Google Quality Guidelines, and I would advise that you do not use them as you could get downranked. It is better to use a 301 redirect instead, as recommended by W3C.org.
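For reference, this is the kind of tag to avoid; the URL is a placeholder. A server-side 301 redirect, as in the sketch under item 6, achieves the same result without violating the guidelines.

<!-- Avoid: meta refresh redirect placed in the page <head> -->
<meta http-equiv="refresh" content="0; url=http://example.com/new-page.html">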

8. Pages with frames

Avoid using frames where possible: framed pages show more than one page in the browser window, so they don't appear properly to the crawlers, and if the search engines fail to index your pages properly the content from your framed pages won't rank very high. If there is a particular reason for using frames, consider adding a noframes tag so crawlers and non-frame browsers still see your content, as in the sketch below. Generally, try to avoid framed pages or consider the alternatives described by the W3C.
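If frames really are unavoidable, a noframes block gives crawlers and non-frame browsers something to read. This is a minimal sketch with placeholder file names and domain:

<!-- Frameset page: crawlers may not see menu.html and content.html properly -->
<frameset cols="25%,75%">
  <frame src="menu.html">
  <frame src="content.html">
  <noframes>
    <body>
      <p>This site uses frames. You can also browse our
         <a href="http://example.com/sitemap.html">site map</a> directly.</p>
    </body>
  </noframes>
</frameset>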


9. HTML and CSS errors and warnings

It goes without saying that you need to fix these where possible; you can validate your HTML and CSS using the W3C markup validation service.

Result – If the search engine crawlers, bots and spiders encounter too many errors in your markup, they may fail to index your content correctly or simply move on to the next website. Most pages contain some invalid markup, as do mine, but try to reduce the number to fewer than the 62 in my example from a client's website. I have 15 CSS errors that don't comply with the W3C recommendations, but they have had no noticeable effect on my rankings.

Remedy – Validate your markup using the W3C validation tools and get as many of the errors as possible corrected.


10. Page size

It is good practice to try to keep every page on your website under 256 kB. If you have bigger pages, particularly those containing images, consider breaking them up into indexed, numbered pages. Users can still browse your content and navigate your website, but you are sending better quality signals to the search engines and providing the high performance your users will remember.

Google provides a Site Speed report in Google Analytics under Behaviour; use this to analyse your PageSpeed Insights and get suggestions. There is also a Google Chrome extension you can add to your browser to analyse pages outside of Google Analytics.


11. Dynamic naming of URLs

Try to avoid dynamic URLs, for example ?/2548_6bcca/; automatic URL rewriting like this is not friendly to humans or search engines.

Result - Such URLs are easily forgotten and hard to recommend to other people; it is far easier to promote a sensible, descriptive name that reflects the content.

Remedy - Name your URLs sensibly in relation to the content and plan your website structure; for example, a products page containing motor vehicle parts could be called /products/motor-spares/ford. Try to avoid underscores and numeric values, which are easily forgotten and not memorable for search engines or users. Avoid page URLs with too many characters (up to 115 is the recommended limit) and make those URLs very descriptive of the content.
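If your pages are generated dynamically, a rewrite rule can expose a descriptive URL while the underlying script stays the same. A minimal sketch for Apache mod_rewrite; the script name products.php and its parameter are hypothetical placeholders:

# .htaccess – serve the descriptive URL /products/motor-spares/ford
# from a hypothetical dynamic script behind the scenes
RewriteEngine On
RewriteRule ^products/motor-spares/ford/?$ /products.php?range=ford [L]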


12. Broken links

It is essential that you use Webmaster Tools regularly to check your links; find broken links and fix them as quickly as possible, as these are one of the factors that relate directly to your quality score. These errors can affect your whole website's rank. Also check that you don't have too many links on one page.

Result - With many broken outgoing links you are sending a poor quality signal to the search engines, so your rankings may be downgraded. If a page has an excessive number of links, remember that Matt Cutts, head of Google's web spam team, recommended not going over 100 links per page, saying "it was a bad experience for users and you will be overwhelming them."

Remedy - It is therefore much better to have fewer links that are relevant and of high quality, making the user experience better. Provide useful information in your links and ensure their quality and reputation are high. Remember you have no control over external links; people change content regularly, so assuming they all work is dangerous. For a detailed analysis of one of your URLs, try the free W3C Link Checker.


13. Empty title tags on your pages

Title tags are one of the most important factors for the pages on your website. Make sure you create unique titles; be careful not to duplicate titles, make them too similar or leave them excessively long.

Result – The first line of your page listings in the Search Engine Results Page is in fact your Meta Title tag.

Remedy – Always plan your website structure and naming convention before altering page titles; this will help you avoid duplicate titles. Restrict the length of the page title to about 65 characters. Remember that narrow characters take up less space, so the number of characters you can use varies; the MOZ title tag preview tool is useful for checking this. If your title is too long it will be truncated with "…" at the right-hand end of the title in the Search Engine Results Page.

It is best practice to keep the title short and descriptive; it is not appealing to use all caps, slashes, exclamation marks or question marks. ALL CAPS reads like shouting, appears desperate and takes up more space. Such titles are not attractive to users and usually produce a poor Click Through Rate (CTR). Poor user experience signals about retention, clicks and bounce rates are likely to get your page downranked.

Plan your website and do not call your homepage "Home" in its title; call it what the page is about, and if you are targeting keywords for your business, include your business name and keywords in the title. Your website should be planned out well before you start renaming pages, so that you send very descriptive and clear signals to the search engines about what your website is about. Try the free Page Title and Description Tester; a sketch of a descriptive title tag follows below.
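As a minimal sketch, using my own business as the example, a descriptive homepage title sits in the page head and stays under the 65-character guideline:

<head>
  <!-- Unique, descriptive and roughly 55 characters long -->
  <title>SEO Consultant in Ipswich, Suffolk | Chris Rudland SEO</title>
</head>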

Example: for the search term "SEO Companies in Ipswich", my home page is displayed with the Meta Title in blue, the website URL in green and the Meta Description in grey.


14. Informative unique meta descriptions

Empty meta descriptions do not provide any guidance to the search engines trying to discover your content. The first line of your description is important: the search engines decide what your page is about when that line is read. If you have a page title of Cavalier King Charles – UK Independent Breeders Ltd, then describe the page in your meta description, including the word dogs, their habits, food, your awards and their pedigree. The search engines don't know that a Cavalier King Charles is a dog; it could be a fish or a car as far as they are concerned. If your ranking is good, the search results page will reward you by displaying your description under the first line of your listing.

Do not mislead the search engines; be honest. You can of course include your keywords or marketing text, which will help attract users, but do not overdo the keyword density. Limit your meta descriptions to 165 characters and make them appealing. Avoid duplicating meta descriptions: once you have a very good description for a page, do not copy and paste it to other pages, as this will not help your rankings or page score.
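Carrying on with the dog breeder example above, here is a sketch of a unique title and description pair in the page head; the wording is illustrative only and kept under 165 characters.

<head>
  <title>Cavalier King Charles – UK Independent Breeders Ltd</title>
  <!-- Honest, appealing and unique to this page -->
  <meta name="description"
        content="Award-winning independent breeders of pedigree Cavalier King Charles Spaniel dogs. Advice on feeding, habits and care, plus details of our current litters.">
</head>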

15. Directory listings

If you are considering adding your website and business to directory listings, for example Yelp, Yell or FreeIndex, make sure the content you enter is not simply copied and pasted to each listing. Some elements, including your website URL and your business name, address and phone number (NAP), will be identical.

Ensure you keep the NAP information consistent: do not vary the name of the street or abbreviate Road to Rd or Avenue to Ave. Always write a summary listing details about your company, its services, products and any awards. Keep the company name the same, but ensure your summary is not copied and pasted to each directory listing. Write fresh, informative content using keywords or marketing text that will encourage users to visit your website, and include any special offers with their terms and conditions.


Author: Chris Rudland

Date: 1 October 2014