10 On-Page Technical SEO Factors to Assess in an SEO Audit

1. Sitemaps
The presence of a sitemap file on your site will help search engines:
Better understand its structure.
Understand where pages are located.
More importantly, discover and access pages they might otherwise miss.
XML sitemaps should be simple, with one URL location per line. They don’t need to be pretty.
HTML sitemaps can afford to be “prettier,” with a bit more organization, since they are meant for human visitors.
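As a quick audit check, a short script can confirm the sitemap exists and count the URLs it declares. Below is a minimal sketch using only Python’s standard library; the sitemap location shown is a hypothetical example, so swap in the site you are auditing.

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    root = ET.fromstring(resp.read())

# Each <loc> element holds one URL location
locs = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"{len(locs)} URLs listed in the sitemap")
for url in locs[:10]:
    print(url)
```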





2. Robots.txt
Identifying whether a robots.txt file exists on the site is a great way to gauge the health of your site. The robots.txt file can make or break a website’s performance in search results.
For example, if you set robots.txt to “Disallow: /”, you’re telling Google not to crawl anything on the site, because “/” is the root!
It’s important to make this one of the primary checks in an SEO audit because numerous site owners get it wrong.
If you want to allow all user agents to crawl the website, the directive should be set to “Disallow:” without the forward slash.
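A quick way to verify this during an audit is Python’s built-in robots.txt parser. The sketch below checks whether Googlebot may crawl a few key paths; the site and paths are hypothetical placeholders.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")  # hypothetical site
rp.read()  # fetches and parses the live robots.txt

# Paths that should normally be crawlable; adjust for the site under audit
for path in ["/", "/products/", "/blog/"]:
    allowed = rp.can_fetch("Googlebot", "https://www.example.com" + path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED'}")
```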


3. Crawl Errors
The Crawl Errors section of Google Search Console (GSC) will help you identify whether crawl errors currently exist on the site.

Finding crawl errors, and fixing them, is a crucial part of any website audit, because the more crawl errors a site has, the more trouble Google has finding and indexing its pages.

Ongoing technical SEO maintenance of these items is crucial to keeping a healthy website.
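GSC reports the errors Google itself ran into; as a supplementary spot-check, a short script can test a list of known URLs for 4xx/5xx responses. The sketch below is a simple example; the URL list is hypothetical and would normally come from the sitemap, a crawler export, or GSC.

```python
import urllib.error
import urllib.request

# Hypothetical URLs to spot-check
urls = [
    "https://www.example.com/",
    "https://www.example.com/old-page",
]

for url in urls:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, resp.status)
    except urllib.error.HTTPError as err:      # 4xx / 5xx responses
        print(url, err.code, "<-- crawl error")
    except urllib.error.URLError as err:       # DNS / connection failures
        print(url, "unreachable:", err.reason)
```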


4. Multiple URLs: Capital vs. Lowercase URLs
This issue can cause Google to see two or more URL versions of a page as separate sources of a single piece of content on your site.

Multiple versions can exist, from uppercase URLs to lowercase URLs, to URLs with dashes and URLs with underscores.

Sites with severe URL issues can even have all of the following:

https://www.seo.com/this-is-the-url
https://www.seo.com/This-Is-The-URL
https://www.seo.com/this_is_the_url
https://www.seo.com/thisIStheURL
https://www.seo.com/this-is-the-url/
http://www.seo.com/this-is-the-url
http://seo.com/this-is-the-url
What’s wrong with this picture?

In this case, seven different URL versions exist for one piece of content.

This is awful from Google’s perspective, and we don’t want to have such a mess on our hands.

The easiest way to fix this is to point the rel=canonical of all of these pages to the one version that should be considered the source of that single piece of content.

However, the existence of these URLs is still confusing. The ideal fix is to consolidate all seven URLs down to one single URL and set the rel=canonical tag to that same single URL.

Another situation that can happen is that URLs with trailing slashes don’t properly resolve to their exact preferred URLs. Example:

http://www.seo.com/this-is-the-url
http://www.seo.com/this-is-the-url/
In this case, the ideal solution is to redirect the trailing-slash URL back to the first, preferred URL, and confirm the rel=canonical is set to that preferred URL.

If you don’t fully control updates to the website, keep a regular eye on these.
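One way to keep that regular eye on things is a small script that fetches each URL variant and reports the final URL after redirects plus the rel=canonical it declares; every variant should end up pointing at the same canonical. The sketch below uses the hypothetical example URLs from above, and its canonical extraction is deliberately simplistic (a regex rather than a full HTML parser).

```python
import re
import urllib.request

# URL variants mirroring the examples above (hypothetical site)
variants = [
    "https://www.seo.com/this-is-the-url",
    "https://www.seo.com/This-Is-The-URL",
    "https://www.seo.com/this-is-the-url/",
    "http://www.seo.com/this-is-the-url",
]

link_re = re.compile(r'<link[^>]*rel=["\']canonical["\'][^>]*>', re.I)
href_re = re.compile(r'href=["\']([^"\']+)', re.I)

for url in variants:
    with urllib.request.urlopen(url, timeout=10) as resp:
        final = resp.geturl()  # final URL after any redirects
        html = resp.read().decode("utf-8", errors="replace")
    tag = link_re.search(html)
    href = href_re.search(tag.group(0)) if tag else None
    canonical = href.group(1) if href else "none"
    print(f"{url}\n  resolves to: {final}\n  canonical:   {canonical}")
```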

5. Does the site Have an SSL Certificate (Especially in E-commerce)?
Ideally, an e-commerce site implementation will have an SSL certificate.

But with Google’s moves toward preferring sites that have SSL certificates for security reasons, it’s a good idea to work out whether a site has a secure certificate installed.
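Besides loading the site over https:// in a browser, you can confirm the certificate from a script. The sketch below, using only Python’s standard library, opens a TLS connection and reports when the certificate expires; the hostname is a hypothetical example, and a failed handshake here is itself an audit finding.

```python
import socket
import ssl
from datetime import datetime, timezone

def check_certificate(hostname: str, port: int = 443) -> None:
    """Open a TLS connection and report the certificate's expiry date."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
    days_left = (expires - datetime.now(timezone.utc)).days
    print(f"{hostname}: certificate valid until {expires:%Y-%m-%d} "
          f"({days_left} days left)")

check_certificate("www.example.com")  # hypothetical hostname
```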

6. Minifying CSS & JavaScript Files
Identifying bloated CSS code, along with bloated JavaScript, will help decrease your site’s load time.

Many WordPress themes are guilty of bloated CSS and JavaScript; if time were taken to minify these files properly, those sites could see load times of 2-3 seconds or less.

Ideally, most site implementations should feature one CSS file and one JavaScript file.

When properly coded, keeping the number of these files down minimizes calls to the server, potential bottlenecks, and other issues.
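To get a rough picture of how far a page is from that ideal, a script can list the CSS and JavaScript files it references and their download sizes. The sketch below is intentionally simplistic (a regex over the HTML, no query strings or inlined assets), and the page URL is hypothetical.

```python
import re
import urllib.request
from urllib.parse import urljoin

PAGE = "https://www.example.com/"  # hypothetical page to audit

with urllib.request.urlopen(PAGE, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# Very simplistic extraction of .css/.js references from href/src attributes
assets = re.findall(r'(?:href|src)=["\']([^"\']+\.(?:css|js))["\']', html, re.I)

total = 0
for asset in assets:
    full = urljoin(PAGE, asset)
    with urllib.request.urlopen(full, timeout=10) as r:
        size = len(r.read())
    total += size
    print(f"{size / 1024:7.1f} KB  {full}")

print(f"{len(assets)} files, {total / 1024:.1f} KB total")
```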

7. Image Optimization
Identifying images that are heavy in file size and are increasing page load time is a critical optimization factor to get right.

This isn’t a be-all, end-all optimization factor, but it can deliver quite a decrease in load time if managed correctly.

When you’re done crawling your site, click on a URL in the page list, then click on the Image Info tab in the window below it:
You can also right-click on any image in that window to either copy or go to the destination URL.
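If you’d rather script this step, the sketch below flags heavy images by checking their Content-Length with a HEAD request. The image URLs and the 200 KB threshold are illustrative assumptions; adjust both to the site and budget you are working with.

```python
import urllib.request

THRESHOLD = 200 * 1024  # 200 KB: an arbitrary example budget

# Hypothetical image URLs, e.g. copied out of a crawler's image report
images = [
    "https://www.example.com/images/hero.jpg",
    "https://www.example.com/images/logo.png",
]

for url in images:
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=10) as resp:
        size = int(resp.headers.get("Content-Length", 0))
    flag = "  <-- consider compressing" if size > THRESHOLD else ""
    print(f"{size / 1024:7.1f} KB  {url}{flag}")
```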



8. HTML Errors / W3C Validation
Correcting HTML errors and W3C validation issues doesn’t by itself increase rankings, and having a fully W3C-valid site doesn’t help your ranking either, per Google’s John Mueller.

That said, correcting these sorts of errors can lead to better rendering in various browsers.

If the errors are bad enough, these corrections can also lead to better page speed.

But it's on a case-by-case basis. Doing these fixes by themselves won’t automatically lead to better rankings for every site.

In fact, it is mostly a contributing factor, meaning that it can help enhance the main factor: site speed.

For example, one area that can help is adding width and height attributes to images.

Per W3.org, if height and width are set, the “space required for the image is reserved when the page is loaded”.

This means that the browser doesn’t need to waste time guessing the image size, and can just load the image right then and there.
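A quick way to find images missing these attributes is to scan the page’s HTML. The sketch below uses Python’s built-in HTML parser and prints the src of every img tag that lacks an explicit width or height; the page URL is a hypothetical example.

```python
import urllib.request
from html.parser import HTMLParser

class ImgSizeChecker(HTMLParser):
    """Print the src of every <img> lacking explicit width/height attributes."""
    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if "width" not in attrs or "height" not in attrs:
            print("missing width/height:", attrs.get("src", "(no src)"))

URL = "https://www.example.com/"  # hypothetical page
with urllib.request.urlopen(URL, timeout=10) as resp:
    ImgSizeChecker().feed(resp.read().decode("utf-8", errors="replace"))
```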

9. Mobile Optimization & Testing
Mobile is here to stay, and there are many reasons for mobile optimization.

This includes the fact that Google said mobile-first indexing was being used for more than half of the web pages shown in Google search results at the end of 2018.

As of July 1, 2019, Google announced that mobile-first indexing is the default for all brand-new web domains.

This should be included in your audits because of how widespread mobile usage is now.

Mobile usability issues should be checked as part of every audit.
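A dedicated mobile testing tool gives the fullest picture, but one quick, partial check can be scripted: a responsive page normally declares a viewport meta tag, and its absence is a strong hint that mobile optimization is lacking. The sketch below checks for that tag on a hypothetical page.

```python
import re
import urllib.request

URL = "https://www.example.com/"  # hypothetical page

with urllib.request.urlopen(URL, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# A responsive page normally includes <meta name="viewport" ...>
has_viewport = re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.I)
print(URL, "declares a viewport meta tag" if has_viewport
           else "has NO viewport meta tag")
```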


10. Forcing a Single Domain
Despite many recommendations online, I still run into many websites that have this major issue.

And this is the issue of multiple URL variations loading, creating massive problems with duplicate content.

Here’s the thing: when you enter your web address in your browser, you can test variations of URLs:

http://www.seo.com/
https://www.seo.com/
http://seo.com/
https://seo.com/
https://seo.com/page-name1.html
https://www.seo.com/page-name1.html
https://seo.com/pAgE-nAmE1.html
https://seo.com/pAgE-nAmE1.htm
What will happen is that every one of these URLs loads when you enter the web address, creating a situation where many URLs serve the same page and giving Google further opportunities to crawl and index all of them.

This issue multiplies exponentially when your internal linking gets out of control and you don’t use consistent links across your site.

If you don’t control how you link to pages, and they load like this, you’re giving Google an opportunity to index page-name1.html, page-name1.htm, pAgE-nAmE1.html, and pAgE-nAmE1.htm.

All of those URLs will still serve the same content. This thoroughly confuses Google’s bot, so don’t make this mistake.
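To confirm a site forces a single domain, you can fetch each protocol/host variant and check that it resolves, via redirect, to the one preferred URL. The sketch below mirrors the hypothetical examples above; it only reports the final destination, not whether the redirect was a 301, so pair it with a header check if that matters for the audit.

```python
import urllib.request

PREFERRED = "https://www.seo.com/"  # the single version everything should resolve to

# Protocol/host variants mirroring the examples above (hypothetical site)
variants = [
    "http://www.seo.com/",
    "http://seo.com/",
    "https://seo.com/",
]

for url in variants:
    with urllib.request.urlopen(url, timeout=10) as resp:
        final = resp.geturl()  # urlopen follows redirects; this is the final URL
    status = "OK" if final == PREFERRED else "NOT consolidated"
    print(f"{url} -> {final}  [{status}]")
```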


