DIGITAL PANKAJ
TECHNICAL SEO AUDIT: HERE ARE 20 WAYS TO IMPROVE YOUR ON-PAGE/TECHNICAL SEO
1.) ONLY ONE VERSION SHOULD BE ACCESSIBLE Redirect all the other versions to a single version.
http://abc.com | http://www.abc.com | https://abc.com | https://www.abc.com
Only one version should be accessible in a browser. Final version -
https://www.abc.com
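As a quick spot check, a small Python sketch like the one below (using the third-party requests library; abc.com stands in for your own domain) prints where each variant actually ends up. All four should land on the same final URL.

import requests

# Placeholder domain - replace abc.com with your own.
variants = [
    "http://abc.com", "http://www.abc.com",
    "https://abc.com", "https://www.abc.com",
]
for url in variants:
    # Follow redirects and report the final destination of each variant.
    final = requests.get(url, allow_redirects=True, timeout=10).url
    print(url, "->", final)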
2.) CHECK WHETHER ALL THE IMPORTANT PAGES ARE INDEXED IN GOOGLE Enter the site command
(site:yourdomain.com)
Under the search box, you will see the pages Google has actually indexed for your website. If fewer pages are indexed than the site actually has, or more pages are indexed than expected, you need to identify the cause of the problem.
3.) PERFORM A FULL SITE CRAWL TO IDENTIFY TECHNICAL ISSUES Enter your domain in Screaming Frog & download the results. Check for noindex directives first & check the robots.txt file syntax as well.
1) Remove the noindex meta tag from the important pages. 2) If any important pages are blocked through robots.txt, remove those rules from robots.txt.
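For a quick spot check outside Screaming Frog, a rough Python sketch like this (the URLs are placeholders) flags pages that carry a noindex robots meta tag:

import re
import requests

# Hypothetical list of important pages to spot-check.
pages = ["https://example.com/", "https://example.com/services/"]
for url in pages:
    html = requests.get(url, timeout=10).text
    # Rough string match for <meta name="robots" content="...noindex...">.
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I):
        print("noindex found on", url)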
4.) CHECK THE CANONICAL VERSION OF THE URLS The same content on different URLs is problematic for search engines.
Implement a self-referencing canonical tag on every page, even if there are no other versions of a page, to prevent any possible duplicate-content issue.
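A minimal sketch of that check in Python, assuming the canonical tag appears in the raw HTML (the URL is a placeholder):

import re
import requests

url = "https://example.com/page/"  # placeholder page
html = requests.get(url, timeout=10).text
tag = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*>', html, re.I)
if tag:
    href = re.search(r'href=["\']([^"\']+)', tag.group(0), re.I).group(1)
    print("canonical:", href, "| self-canonical:", href.rstrip("/") == url.rstrip("/"))
else:
    print("no canonical tag found on", url)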
5.) CHECK YOUR XML SITEMAP FILE FOR ERRORS First, check whether your website has a sitemap.
(http://yourdomain.com/sitemap.xml)
If your website doesn't have a sitemap, create one right now.
1) Keep your sitemap in the root directory. 2) You can keep up to 50,000 URLs in a single sitemap file. 3) A sitemap file shouldn't be larger than 50MB.
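A small Python sanity check for those limits, assuming a standard sitemap at /sitemap.xml (example.com is a placeholder):

import requests
import xml.etree.ElementTree as ET

resp = requests.get("https://example.com/sitemap.xml", timeout=10)
size_mb = len(resp.content) / (1024 * 1024)
root = ET.fromstring(resp.content)
# Standard sitemap namespace per sitemaps.org.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = root.findall("sm:url/sm:loc", ns)
print(len(urls), "URLs,", round(size_mb, 2), "MB")
print("within limits:", len(urls) <= 50000 and size_mb <= 50)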
6.) CHECK FOR CLIENT-SIDE (4XX) ERRORS A 4xx error means Googlebot is unable to access your URLs. Log in to your Search Console.
Navigate to Crawl > Coverage and click on Error. For each 404 page, look for a relevant alternative page; if you find one, put a 301 redirect from the 404 page to it. If there is no alternative, you can leave it, since Google treats a 410 (Gone) the same as a 404 (Not Found).
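Beyond Search Console, a short Python sketch (placeholder URLs; uses the requests library) can spot-check a URL list for 4xx responses:

import requests

urls = ["https://example.com/old-page/", "https://example.com/blog/"]
for url in urls:
    # HEAD is enough for a status check; some servers require GET instead.
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if 400 <= status < 500:
        print(url, "returned", status, "- redirect it or serve a 410")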
7.) MAKE SURE YOUR WEBSITE IS MOBILE-FRIENDLY Test your website's mobile-friendliness first. Go to
Google's Mobile-Friendly Test tool.
Enter your website URL and hit Test URL. The test results page will show you the issues, if your page has any. Now make a list of the affected URLs and fix these issues to make your web pages mobile-friendly.
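One common culprit is a missing viewport meta tag; a rough Python check (placeholder URL) looks like this:

import requests

html = requests.get("https://example.com/", timeout=10).text
# Responsive pages normally declare a viewport meta tag in the head.
if 'name="viewport"' not in html and "name='viewport'" not in html:
    print("no viewport meta tag found - page is likely not mobile-friendly")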
8.) CHECK FOR BREADCRUMB ISSUES A breadcrumb is a type of navigation used to show users their current location on the website.
Also, breadcrumbs help search engines understand how your site is structured. If your website runs on WordPress and Yoast is already installed,
click on Yoast SEO in the left pane, go to Search Appearance, open the Breadcrumbs tab, click Enabled, and save the changes.
9.) CHECK FOR REDIRECT CHAIN ISSUES When a URL redirects from one location to another, and then to another again, this is called a redirect chain. When one URL redirects straight to the final URL, this is a proper 301 redirect.
#1.) Keep redirects to a minimum. #2.) Keep no more than 5 redirects in a chain, because some browsers only support a maximum of 5 redirects. #3.) To keep your crawl budget healthy, don't use more than 3 redirects in a redirect chain.
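The requests library records every hop in response.history, so a short Python sketch (placeholder URL) can measure the chain length directly:

import requests

resp = requests.get("http://example.com/old-url/", allow_redirects=True, timeout=10)
for hop in resp.history:
    print(hop.status_code, hop.url)  # each intermediate redirect
print("final:", resp.status_code, resp.url)
print("chain length:", len(resp.history))  # aim for 1 hop, never more than 3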
10.) ANALYZE YOUR TOP-LEVEL NAVIGATION Clear navigation helps both search engines and users find the right paths on your website.
#1. Target pages in top-level navigation – Make sure your target pages are always linked from the top-level navigation.
#2. Use descriptive text and primary keywords – Use descriptive text for the navigation labels.
#3. Avoid dropdown menus – According to usability studies, dropdown menus are annoying. As visitors, we move our eyes much faster than our mouse.
11.) CHECK YOUR STRUCTURED DATA MARKUP Search engines use structured data to generate rich snippets, which appear with your listing in search results. As a first step, analyze whether structured data is implemented on the website. If not, find the right structured data type first.
There are various types of structured data; you can implement them through the
Structured Data Markup Helper or Schema.org. Select the best-matched type and implement it correctly.
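As an illustration, a Python sketch can assemble a simple Article snippet as JSON-LD (all values are hypothetical; pick the schema.org type that actually matches your page):

import json

# Hypothetical values for an Article page.
data = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO Audit",
    "author": {"@type": "Person", "name": "Pankaj"},
    "datePublished": "2020-01-01",
}
# Paste the printed block into the page's head section.
print('<script type="application/ld+json">')
print(json.dumps(data, indent=2))
print("</script>")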
12.) CHECK THAT THE HREFLANG TAG IS IMPLEMENTED CORRECTLY If you serve the same content in multiple languages, you must implement the hreflang tag on your site.
Ways to implement the hreflang tag: #1. Use the link element in the head section of the HTML. Example code – <link rel="alternate" href="https://digitalpankaj.me" hreflang="en-fr" />
#2. Use an XML sitemap. Example code – <url><loc>https://digitalpankaj.me</loc><xhtml:link rel="alternate" hreflang="en-fr" href="https://digitalpankaj.me" /></url>
13.) CHECK THAT GOOGLE SEARCH CONSOLE & GOOGLE ANALYTICS ARE SET UP CORRECTLY How to check the GSC code on a website – press CTRL+U, then CTRL+F, and search for the code "google-site-verification". How to check the GA code on a website – press CTRL+U, then CTRL+F, and search for the code "UA-".
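The same view-source checks can be scripted; a rough Python sketch (placeholder URL; the gtag check is an extra assumption for newer Analytics setups) is:

import requests

html = requests.get("https://example.com/", timeout=10).text
print("GSC verification present:", "google-site-verification" in html)
print("GA tracking present:", "UA-" in html or "gtag(" in html)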
14.) CHECK FOR DUPLICATE & THIN CONTENT Duplicate content means similar or identical content on different URLs. Search engines hate duplicate content.
Make sure to keep your website safe from duplicate content. Enter your website on
Copyscape and hit enter, and it will show you the
duplicate content along with the URLs (if your website has any).
How to fix a duplicate content issue? #1.) Completely remove the pages. #2.) Or put a noindex meta tag on them. #3.) If those pages are important, write fresh content for them. If your web content is stolen by third-party sites, you can contact them to remove the content or to provide a link back (credit to the original source) to your website.
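For a rough in-house comparison of two of your own pages, Python's difflib can estimate how similar their raw HTML is (placeholder URLs; Copyscape remains the tool for finding external copies):

import difflib
import requests

a = requests.get("https://example.com/page-a/", timeout=10).text
b = requests.get("https://example.com/page-b/", timeout=10).text
# A crude similarity ratio over raw HTML; very high values suggest duplicates.
ratio = difflib.SequenceMatcher(None, a, b).ratio()
print("similarity:", round(ratio * 100), "%")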
15.) CHECK YOUR WEBSITE LOADING TIME To improve user experience and conversions, try to reduce your website load time to
under 3 seconds.
How to check & optimize your website speed? #1. Use
GTmetrix and Google's PageSpeed Insights tool.
#2. Review the recommendations and apply them to your pages to improve page speed.
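For a crude baseline outside those tools, Python can time a page download (placeholder URL; real speed tools also measure rendering, which this does not):

import requests

resp = requests.get("https://example.com/", timeout=30)
# elapsed covers the time until response headers arrived, not full rendering.
print("response time:", resp.elapsed.total_seconds(), "s")
print("page size:", round(len(resp.content) / 1024), "KB")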
16.) REVIEW YOUR WEBSITE FOR DUPLICATE META TAGS All your webpages should have proper meta title & meta description tags, and there shouldn't be any duplicate meta tags.
Head over to
Screaming Frog, enter your URL, and hit Start.
Click on the Internal tab, set the Filter below it to HTML, and click Export.
Then, for the URLs with empty meta tags, create the meta tags and optimize all the pages.
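A minimal Python sketch (placeholder URLs) that flags missing or duplicate titles across a small set of pages:

import re
from collections import Counter
import requests

urls = ["https://example.com/", "https://example.com/about/"]
titles = {}
for url in urls:
    html = requests.get(url, timeout=10).text
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    titles[url] = m.group(1).strip() if m else ""

counts = Counter(titles.values())
for url, title in titles.items():
    if not title:
        print("missing title:", url)
    elif counts[title] > 1:
        print("duplicate title:", url, "->", title)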
17.) ANALYZE YOUR URL STRUCTURE Your website's URL structure should be short, easily readable, and keyword-optimized.
Example - https://digitalpankaj.me/technical-seo-audit How can I check my URLs? In Screaming Frog, run your website crawl report. Click on the URL tab, export the data & analyze it.
Best practices for structuring URLs – 1) Include your primary keywords in the URLs. 2) Use - as a separator and avoid _. 3) Avoid stop words in the URLs (a, an, the, at, on). 4) Always use lowercase letters in the URLs. 5) Use 1-2 folders per URL. A scripted check of these rules follows below.
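Here is a hedged Python sketch that checks a URL path against those rules (the stop-word list is illustrative, not exhaustive):

import re

STOP_WORDS = {"a", "an", "the", "at", "on"}  # illustrative list

def check_url_path(path):
    issues = []
    if path != path.lower():
        issues.append("use lowercase letters")
    if "_" in path:
        issues.append("use - instead of _")
    if STOP_WORDS & set(re.split(r"[-/_]", path.lower().strip("/"))):
        issues.append("remove stop words")
    if path.strip("/").count("/") > 2:
        issues.append("use only 1-2 folders per URL")
    return issues

print(check_url_path("/blog/the_Technical-SEO-Audit/"))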
18) CHECK YOUR ALT TAGS FOR IMAGE OPTIMIZATION Google can't read images, so to make them readable you need to provide alternative text in the ALT attribute. The ALT text describes what's in the image. ALT text also helps your images rank well in Google Image Search.
Recommendations for image ALT text optimization – 1) Be descriptive. 2) Use your keywords. 3) Use no more than 125 characters.
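A short Python sketch using the standard-library HTML parser to list images that have no alt text (placeholder URL):

from html.parser import HTMLParser
import requests

class AltChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                print("missing alt:", attrs.get("src", "?"))

AltChecker().feed(requests.get("https://example.com/", timeout=10).text)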
19) ANALYZE YOUR INTERNAL LINKING STRUCTURE An internal link is a hyperlink that points from one page to another page on the same domain.
Proper placement of internal links helps search engines crawl the website faster, and it also helps link juice flow to all the internal URLs.
Internal linking best practices – #1.) Use relevant links. #2.) Use anchor text. #3.) Use deep links. #4.) Keep your links followed.
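A small Python sketch to list a page's internal links (placeholder domain), using standard-library URL parsing plus a rough regex for hrefs:

import re
from urllib.parse import urljoin, urlparse
import requests

page = "https://example.com/"  # placeholder
html = requests.get(page, timeout=10).text
domain = urlparse(page).netloc
for href in re.findall(r'href=["\']([^"\'#]+)', html, re.I):
    full = urljoin(page, href)  # resolve relative links
    if urlparse(full).netloc == domain:
        print("internal:", full)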
20) REVIEW YOUR ROBOTS.TXT FILE FOR SEO Robots.txt is a text file placed on your web server. It tells search engine bots which pages they should crawl and which they should not.
Why is robots.txt important? 1) It excludes private pages. 2) It saves your crawl budget by keeping bots off unimportant pages.
How to find your robots.txt file: enter robots.txt after your domain name and hit enter.
Example -
http://abc.com/robots.txt
If you find any important pages blocked by the robots file, remove those rules from the robots.txt file.
Optimize your robots.txt file carefully and double-check the syntax for any errors. Make sure the important pages you want to rank are indexable and not blocked by robots.txt.
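Python's standard library can verify those rules directly; a minimal sketch (reusing abc.com from the example above):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://abc.com/robots.txt")
rp.read()
# Check whether Googlebot may crawl a page you want to rank.
print(rp.can_fetch("Googlebot", "http://abc.com/important-page/"))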
https://digitalpankaj.me