SEO TECHNICAL SITE AUDIT
Once you have completed the on-page search engine optimization for your website, it’s important to look deeper at what is being revealed to search engines. Your technical SEO plays a big role in how your website is ranked. The SEO experts at Stuart Morris Consulting can provide you with an in-depth audit that will tell us why your website is or isn’t performing well. Don’t make the mistake of ignoring this factor, because it could leave you with a lower ranking, costing you traffic and potential business.
What Does an SEO Technical Site Audit Consist Of?
There are several key areas that search engines evaluate to determine how a website should rank. Taking a look at these areas will tell our experts exactly what search engines are seeing and what needs to be done to improve your current ranking.
We will issue you an SEO report for your website that will include:
- Crawl and index – robots.txt, sitemap, crawl errors, broken links and more.
- Speed and performance – page load time, expire headers, http requests and more.
- On-page – URL structure, page titles, alt tags, code to content ratio and more.
- Analytics – Google Analytics review, Google and Bing webmaster tools check.
- 404 Errors
- Missing Cache Validation
- Slow “Time to First Byte”
- Excessive Domain Calls
- Uncompressed Images
- Images on an External Server
- Use Progressive JPEGs
- Leverage Browser Caching
- Excessive Redirect Chains
- Combine Images using CSS Sprites
- Unspecified Image Dimensions
- Reduce the number of DOM elements
- Empty Reference Tag
- Meta / Page Language Mismatch
- Meta Title Tags
- Meta Description
- Meta Keywords
- H1 Tags
- Page Copy – Content
- Image Alt Tags
- Duplicate Content
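As a rough illustration of the on-page portion of the checklist above (this is a simplified sketch, not our actual audit tooling), a few signals such as the page title, meta description, H1 count, and images missing alt text can be collected with Python’s standard-library HTML parser:

```python
from html.parser import HTMLParser

class OnPageAuditParser(HTMLParser):
    """Collects a few on-page SEO signals from raw HTML:
    <title> text, meta description, H1 count, and images missing alt text."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1_count = 0
        self.images_missing_alt = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_page(html: str) -> dict:
    """Returns a small dict of on-page findings for one page."""
    parser = OnPageAuditParser()
    parser.feed(html)
    return {
        "title": parser.title.strip(),
        "meta_description": parser.meta_description,
        "h1_count": parser.h1_count,
        "images_missing_alt": parser.images_missing_alt,
    }
```

A real audit covers far more than this, but even a sketch like this makes missing titles, absent meta descriptions, and alt-less images easy to spot across many pages.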
Not only can our technical SEO experts audit your website, but we can also help you develop a plan to fix all of the issues we find that are holding your site back from a high ranking. Give us a call today to have a technical SEO audit done for your website. Want to learn more about SEO? Check out our SEO 101 page.
Search engine spiders and crawlers are the key to search engine optimization. If they are unable to crawl your web pages, they won’t be able to index and rank them. When you have error codes, broken links and other problems on your website, crawlers can’t do their job, resulting in your pages never making it into search engine results.
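One quick crawlability check you can run yourself is to test URLs against your robots.txt rules. As a minimal sketch (the robots.txt content below is a made-up example), Python’s standard `urllib.robotparser` can tell you whether a given crawler is allowed to fetch a URL:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

def is_crawlable(url: str, agent: str = "Googlebot") -> bool:
    """True if the named crawler is permitted to fetch the URL
    under the rules parsed above."""
    return parser.can_fetch(agent, url)
```

If a page you want ranked comes back as blocked, the robots.txt directives (or a stray noindex tag) are the first place to look.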
Plagiarism is unacceptable to search engines, even if you are taking content from your own site. It’s important to remove any duplicate content on your website. If you don’t, your internal pages will compete with one another, since only one of them will be indexed by crawlers. A good example of this is when eCommerce websites use manufacturer descriptions for the products in their online store. This is why it’s essential to write unique product descriptions, articles and any other content that shows up on your website. There are online tools you can use, like Copyscape, to determine if there is duplicate content on your website.
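Within your own site, exact duplicates are easy to flag without any external tool. One simple approach (a sketch, not how Copyscape or a search engine actually works) is to normalize each page’s text and hash it, then group pages that share a fingerprint:

```python
import hashlib

def content_fingerprint(text: str) -> str:
    """Normalizes whitespace and case before hashing, so trivially
    reformatted copies of the same text still match."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict) -> dict:
    """Groups page URLs by content fingerprint.
    Any group with more than one URL is duplicate content."""
    groups = {}
    for url, text in pages.items():
        groups.setdefault(content_fingerprint(text), []).append(url)
    return {fp: urls for fp, urls in groups.items() if len(urls) > 1}
```

Two product pages that both paste in the same manufacturer description would land in the same group, which is exactly the internal competition described above.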
Issues with XML Sitemap
To help search engines discover your content quicker and easier, you can submit your XML sitemap to them. This should be done regularly, as you continuously upload more content to your website.
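A sitemap is just a small XML document following the sitemaps.org protocol. As an illustrative sketch (the URLs are placeholders), one can be generated with Python’s standard `xml.etree.ElementTree`:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Builds a minimal XML sitemap (sitemaps.org protocol)
    from a list of (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
        ET.SubElement(url_el, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")
```

Regenerating this file whenever you publish new pages, and resubmitting it through the search engines’ webmaster tools, keeps crawlers pointed at your freshest content.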
Poor Site Navigation
If your site navigation isn’t user-friendly, then you’re going to lose points with search engines. Overall, your website should be easy to navigate, proper labels should be used for categories and links, and your directory structure shouldn’t be too deep.
Server Response Codes
When people click a link to a page on your website and get a response code like 301, 302 or 404, it means your server directives are interrupting access to certain web pages. A 301 means a page was permanently moved, a 302 means it was temporarily moved to another location, and a 404 means it’s missing. There are a variety of other errors that could occur as well. Getting rid of these problems is important for the integrity of your website: if someone comes to your site and sees error messages, it will increase your bounce rate, and that visitor will likely never return.
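The mapping from status codes to what they mean for crawlers can be written out directly. This sketch covers the common cases an audit cares about (the category labels are ours):

```python
def classify_status(code: int) -> str:
    """Maps an HTTP status code to what it signals to a crawler."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 308):
        return "permanent redirect"   # page moved for good; links should be updated
    if code in (302, 303, 307):
        return "temporary redirect"   # page moved for now
    if code == 404:
        return "not found"            # broken link or deleted page
    if code == 410:
        return "gone"                 # deliberately removed
    if 500 <= code < 600:
        return "server error"
    return "other"
```

An audit walks every internal link, records each code, and turns anything that isn’t “ok” into a fix: update the link, repair the redirect, or restore the page.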
Unfriendly URLs
It’s common for products on eCommerce websites to have unfriendly URLs, which consist of special characters that can trip up crawlers.
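The usual fix is to generate a clean “slug” from the product name: ASCII only, lowercase, hyphen-separated. A minimal sketch of that transformation:

```python
import re
import unicodedata

def slugify(name: str) -> str:
    """Turns a product name into a crawler-friendly URL slug:
    accents stripped to ASCII, runs of other characters collapsed
    to single hyphens, everything lowercased."""
    ascii_name = (
        unicodedata.normalize("NFKD", name)
        .encode("ascii", "ignore")
        .decode("ascii")
    )
    ascii_name = re.sub(r"[^a-zA-Z0-9]+", "-", ascii_name).strip("-")
    return ascii_name.lower()
```

A name like “Café Chair (Blue), 24 in.” becomes a readable, special-character-free path segment instead of a string of query parameters.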
Slow Loading Times
If you want to see your bounce rate soar, then having a website with slow loading times will do the trick. Slowness is often caused by overly dynamic designs that are packed with Flash files, heavy images and excessive scripts. These take up a lot of bandwidth and can slow everything down.
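One way to make “excessive” measurable is a simple performance budget: compare a page’s resource counts and weight against thresholds and report what’s over. The thresholds below are illustrative examples, not official search engine limits:

```python
def flag_slow_page(metrics: dict, budget: dict = None) -> list:
    """Compares simple page metrics against a performance budget
    and returns a warning per exceeded limit. Budget values are
    illustrative defaults, not published standards."""
    budget = budget or {
        "scripts": 15,     # external script files requested
        "images": 50,      # images on the page
        "page_kb": 2048,   # total transfer size in KB
        "ttfb_ms": 600,    # time to first byte in milliseconds
    }
    return [
        f"{key}: {metrics[key]} exceeds budget of {limit}"
        for key, limit in budget.items()
        if metrics.get(key, 0) > limit
    ]
```

Feed it numbers from your browser’s network panel or a crawler, and the pages dragging down your load times fall out of the report automatically.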
Need perfect SEO for your website?
Call us today to get an SEO consultation and be one step closer to delivering a successful online presence to your audience.