What is technical SEO?
Technical SEO is the practice of helping search engine bots, or crawlers, find, crawl, and index your website.
To rank your website, it is essential that search engines know and understand it, and for that you need to check your website’s technical setup.
So, it doesn’t matter whether your website is new or old, whether it is a huge site with thousands of pages or you’re just getting started: you need to check its technical setup.
It is an important part of SEO, because without technical SEO your website won’t be visible to search engines, and without visibility, ranking is out of the question.
So now, let’s jump into the checklist and look at some of the technical aspects of SEO.
Technical SEO checklist:
Specify a preferred domain
First, specify the domain you want to use. A domain can be used with or without the www prefix; either version works, and the choice is yours.
But it’s important that you pick one and specify it, so that search engine crawlers do not get confused.
My preferred domain is nikitatomar.com, not www.nikitatomar.com, so I have to tell the search engines about it.
That way, when crawlers visit the website, they crawl nikitatomar.com and index it.
But if I do not specify nikitatomar.com as my preferred domain and the www version is also available, search engines may crawl and index both domains, or sometimes only one or the other.
So, the preferred domain must be specified clearly to search engine crawlers.
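As an illustration, on an Apache server this is often done with a 301 redirect from the www version to the preferred non-www version. The snippet below is a sketch using a placeholder domain; the exact setup depends on your host:

```apache
# Hypothetical .htaccess sketch: 301-redirect www.example.com to example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```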
XML sitemaps
XML sitemaps help search engine crawlers navigate your website easily and make the crawling process smooth and efficient.
A sitemap essentially describes your site’s structure to crawlers so that they have no difficulty crawling the website.
Think of a place you want to visit but are not familiar with.
You will struggle to find a particular address in that locality, but if you have a map of the location you can easily follow it and reach your destination.
Sitemaps are just like roadmaps: they help crawlers reach your web pages easily and make crawling fast, smooth, and efficient.
So, do not forget to optimize your sitemap.
Note: the sitemap you submit to search engines needs to be in XML format, not HTML.
Generating and submitting sitemaps is a simple process, as there are various tools and plugins that handle XML sitemap generation and submission.
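For reference, a minimal XML sitemap follows the sitemaps.org protocol and looks something like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-12-21</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/technical-seo/</loc>
  </url>
</urlset>
```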
Site structure
Site structure is the way you organize your website’s content. A simple, clear, hierarchical site structure is best SEO practice. It not only makes the website look organized and easy to navigate for users, but also for search engine crawlers.
It helps crawlers crawl and understand your website easily, and it makes efficient use of the crawl budget, because an easy-to-navigate structure lets crawlers reach the different parts of the website with ease.
Think of how difficult it would be to find a book in a library if the books weren’t kept in an organized way.
In a library where each book is shelved according to its category, we can easily find the book we want.
In the same way, if your site structure is not clear and organized, your website becomes difficult to navigate, both for users and for crawlers.
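A clear hierarchy might look like the hypothetical layout below, where every page sits under a category just a click or two from the homepage (the section names are only examples):

```text
example.com/
├── /blog/
│   ├── /blog/technical-seo/
│   └── /blog/on-page-seo/
├── /services/
└── /about/
```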
URL structure
URL stands for “Uniform Resource Locator”. It is the address of a particular resource or piece of content on the web. A URL consists of these elements:
- Protocol (http or https)
- Subdomain (www)
- Root domain (for my website https://nikitatomar.com, the root domain is nikitatomar)
- TLD, or top-level domain (.com, .net, .org, .in, .co, etc.)
- Subdirectory (it indicates the path in human-readable form; for example, in https://nikitatomar.com/blog/technicalseo everything after .com/ is called the subdirectory, category, or file name. It can be numbers as well, like https://nikitatomar.com/21-12-2021, but for optimization it is considered best practice to use the post name.)
You can change the URL structure according to your needs and purpose, but here are some tips to make it SEO friendly. Use them, as URLs do affect the SEO of a website.
Here are the tips you can use to optimize the URL structure of a website or blog:
- Make URLs short and simple.
- Use the https protocol, as it is considered safe and is therefore good for SEO.
- Use hyphens to separate words.
- Eliminate stop words like and, the, an, a, for, etc.
- Use lowercase letters; URLs are case sensitive, so sticking to lowercase is best practice.
- Avoid dates in your URL structure; when setting your URL or permalink, the best practice is to use the post name.
- Include the keyword you want to rank for in the page’s URL. For example, this blog post explaining digital marketing, https://aardigitals.com/what-is-digital-marketing/, includes the keyword “what is digital marketing” in its URL, as the author wants to rank it for that keyword.
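Putting these tips together, a hypothetical before-and-after might look like this (both URLs are made up for illustration):

```text
Before: https://example.com/index.php?p=123&cat=7
After:  https://example.com/blog/technical-seo-checklist/
```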
Canonical tags
If for some reason you have duplicate content (the same or nearly the same content under different URLs) on your website, you should clearly specify the preferred URL using canonical tags.
A canonical tag tells search engines which URL is preferred for a particular page or piece of content, so that crawlers can index your preferred URL for that content.
Some common causes of duplicate content are:
- http vs. https
- www vs. without www
- URLs are case sensitive; for instance, nikitatomar.com/page and nikitatomar.com/PAGE are treated as different URLs
- AMP URLs: AMP (Accelerated Mobile Pages) versions are duplicates, so a search engine will consider abc.com/post and abc.com/amp/post duplicates
To avoid duplicate content showing up, you need to use canonical tags.
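A canonical tag is a single link element placed in the page’s head section; it points every duplicate at the preferred URL (the URL below is a placeholder):

```html
<!-- Placed in the <head> of each duplicate page (and usually the preferred page itself) -->
<link rel="canonical" href="https://example.com/blog/technical-seo/" />
```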
Mobile first indexing
Earlier it was merely recommended to make your website mobile friendly, but Google has since announced mobile-first indexing, explicitly stating that it will index websites primarily based on their mobile version.
Mobile-first is an important ranking factor that definitely needs to be taken care of to rank your website on search engines.
One point worth mentioning is that Google has explicitly stated that there should be no difference between the content of the mobile and desktop versions of a website.
You can make the design mobile responsive, but that does not mean you can change the content: every piece of content and detail on the desktop version of the website needs to be included in the mobile version as well.
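A mobile-responsive design usually starts with the standard viewport meta tag in the page’s head; this snippet is generic, not specific to any one site:

```html
<!-- Tells mobile browsers to render at device width instead of a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1" />
```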
Include SSL certificate or use https
Using an SSL certificate definitely boosts your rankings, as search engines want their users’ experience to be safe.
It gives your website a reputation for being secure and improves the user experience, which supports your SEO efforts.
After installing an SSL certificate, make sure your website is being served over https and not the http version, and then your website is secured.
And search engines definitely prefer secure websites over unsecured ones.
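As a sketch, on an Apache server the http-to-https redirect is commonly handled with mod_rewrite (assuming .htaccess is enabled; the exact setup varies by host and many hosting panels do this for you):

```apache
# Hypothetical sketch: force https for all requests with a 301 redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```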
Schema markup
Schema markup is code you can add to a website to help search engines better understand your content, so that they can show more detailed information to users.
This extra detail appears in the form of a rich snippet (a Google search result that shows additional information, like ratings, a photo, contact details, or any information you provide using schema markup).
It is a type of structured data and is very helpful for SEO, as it provides more detailed information to both search engine crawlers and users.
Since it is useful both ways, do add or optimize schema markup to improve your website’s ranking.
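Schema markup is commonly added as a JSON-LD script in the page’s head, using vocabulary from schema.org. Here is a sketch for a blog post; the headline, author name, and date are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Technical SEO Checklist",
  "author": { "@type": "Person", "name": "Example Author" },
  "datePublished": "2021-12-21"
}
</script>
```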
Robots.txt
Robots.txt is a file that tells crawlers which pages or parts of your website they can crawl and index.
When crawlers visit your website to crawl it and collect information, they first check your robots.txt. Crawlers will crawl the pages that are marked as allowed.
If a page is marked as disallowed, crawlers will not crawl it and therefore will not index it.
The robots.txt file is a text file placed in the root folder of the website and is always named “robots.txt”; you can’t rename it to Robots.txt, robot.txt, or anything else.
Note: it always needs to be robots.txt.
Using robots.txt, you can direct crawlers to the pages you want crawled.
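A small robots.txt might look like the example below; the paths are placeholders, and the Sitemap line is optional but common:

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/sitemap.xml
```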
Crawlers keep crawling content across the web to collect more and more information and index it, in order to serve better and better results to users.
Crawlers have a crawl budget (the time allotted to crawl a website) for each site, and as soon as the crawl budget runs out, they leave your website, even if they have not yet crawled the parts you consider important.
To avoid this situation, you can use the robots.txt file to instruct crawlers.
Especially if your website has a large number of pages, you can disallow the parts you think are less important, so that crawlers spend their crawl budget on the important pages of your website.
You should definitely optimize your robots.txt file to support your SEO strategy.
To check a robots.txt file, just add /robots.txt to the domain you want to inspect.
For example, if the domain is https://xyz.com, then you go to https://xyz.com/robots.txt and you can have a look at that website’s robots.txt file.
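You can also check programmatically whether a given URL is blocked by a site’s rules; Python’s standard library ships a robots.txt parser. The rules and URLs below are made up for illustration, and the rules are parsed from a string rather than fetched from a live site:

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt, parsed from a string for demonstration
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) applies the rules for that user agent
print(parser.can_fetch("*", "https://example.com/blog/post"))     # True
print(parser.can_fetch("*", "https://example.com/private/page"))  # False
```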
Hreflang
Hreflang is used when you serve your website in more than one language,
and it also helps save you from duplicate content issues.
It tells Google which language you are targeting with a specific page, so that the search engine can show that page to users searching in that language.
If you search for something in Hindi, the search engine will show results in Hindi rather than English, Spanish, French, or any other language.
For example, kagazkalam.com is a blog with articles in two languages, English and Hindi; even if it did not use hreflang, the search engine would still try to show results to users according to their language preference.
Google says that using hreflang provides a signal to search engines,
but even if you do not use hreflang, the search engine will still try to find the alternate version of your website to serve results in the language the user is searching in,
as it always wants to provide the result best suited to the user’s query.
Still, it is good practice to use hreflang if your website is served in different languages, as it may otherwise face duplicate content issues,
since the content of the same page in different languages is nearly identical.
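Hreflang is typically added as link elements in the head of every language version of a page. Here is a hypothetical English/Hindi pair; the URL pattern is a placeholder, and x-default marks the fallback version:

```html
<!-- On both the English and the Hindi page, list every language version plus a default -->
<link rel="alternate" hreflang="en" href="https://example.com/en/post/" />
<link rel="alternate" hreflang="hi" href="https://example.com/hi/post/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/en/post/" />
```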
To sum up, whenever you make things easier for users and search engine crawlers, you are actually making it easier for yourself to gain a visible rank on search engines.
That was the beginner’s technical SEO checklist.
I would love to know whether this list was helpful to you, so comment and share your thoughts.