
SEO Manual Audit

Strict usage of domain with or without www:

Consistency throughout the URLs is important.

Prefer either www or non-www, but use the same form for every page on your site. This can be configured in the .htaccess file or in Google Search Console. My advice would be to set it in the .htaccess file, because there are search engines other than Google.

Neither www nor non-www has an inherent SEO advantage; it is a personal preference to choose one over the other. What affects SEO is consistency, specifically whether you keep adding or removing www from your site's URLs.

In order to benefit from this small thing, choose one option and stick to it. In short, don't change your site's URL once you have already introduced it to the well-known search engines.

Otherwise, they will treat the www and non-www versions as two different websites and may penalize both for duplicate pages, keywords, and content.

My website is configured to always redirect to the non-www version, and the canonical URL is also the non-www version.
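As a sketch of the .htaccess approach mentioned above (assuming Apache with mod_rewrite enabled; example.com is a placeholder for your own domain), a www-to-non-www redirect can look like this:

```apache
# Redirect www to the bare domain with a permanent (301) redirect.
# example.com is a placeholder — use your own domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

The [NC] flag makes the host match case-insensitive, and [L] stops further rewriting once the redirect fires.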

URL Structure:
Ask to share the content:

At the end of your article, ask the readers to share the content they have read on social media, whether Facebook, Twitter, or any other platform.

You can place social share icons at the bottom of your article, or you can use floating share icons at the side.

Readers will share an article only if it is worth sharing, so focus on writing an article worthy of going viral.


Minified versions:

Double-check your CSS and JS files; if you are serving the normal (unminified) versions, you need to minify them.

Minification removes nothing except extra spaces, tabs, and newlines. Without those extra characters the CSS file becomes smaller, and the smaller the file, the easier the transfer.
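To illustrate what minification does, here is a toy sketch only — real minifiers (e.g. cssnano or terser) handle strings, selectors, and many edge cases that this regex approach would break:

```javascript
// Toy CSS minifier: strips comments and collapses whitespace.
// Illustration only — do NOT use on real stylesheets.
function naiveMinifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')   // drop /* comments */
    .replace(/\s*([{}:;,])\s*/g, '$1')  // drop spaces around punctuation
    .replace(/\s+/g, ' ')               // collapse remaining whitespace
    .trim();
}

const input = 'body {\n  color: #333;\n  margin: 0;\n}';
console.log(naiveMinifyCss(input)); // body{color:#333;margin:0;}
```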


Color test:

Sometimes, even when the grammar is good, people still find a page difficult to read and understand because the reading is not easy. If you have used the right contrast on your web pages, it becomes very easy for the reader.

For example, if your website background is a dark color and you choose grey as the font color, there is a very high chance that readability will suffer. So whenever you use a dark background, use white or shades of white as the text color to give the reader better readability.

My advice would be to use shades of white as the background and shades of black as the text color.

Use a contrast checker tool for verifying the colors.
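As a sketch of how contrast can be measured programmatically, the WCAG relative-luminance and contrast-ratio formulas can be coded directly (colors given as [r, g, b] arrays in the 0-255 range):

```javascript
// WCAG relative luminance of one sRGB channel (0-255).
function channel(c) {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function luminance([r, g, b]) {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// WCAG contrast ratio: from 1 (no contrast) to 21 (black on white).
function contrastRatio(a, b) {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio([0, 0, 0], [255, 255, 255])); // ~21, the maximum
console.log(contrastRatio([128, 128, 128], [0, 0, 0])); // grey on dark: far lower
```

For reference, WCAG AA asks for a ratio of at least 4.5:1 for normal body text.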



Chatbots:

In recent years there has been development in the area of lead generation, and one technique is to have chatbots. These chatbots collect data from the user, such as a name and email id, and can answer a query if the answer is predefined; otherwise they collect the lead and let the agency know that somebody has a query.

Based on the query, the agency or support team can answer it. Having a chatbot in your application is highly valuable for existing customers as well as for new customers.

But having only a chatbot and never solving the customer's problem is another big headache, so if you have a chatbot, also have a support team to solve the problems it collects.

Forms and refreshes:

When it comes to forms, use AJAX as much as possible, because time on page is calculated based on the refreshes that happen on the page.

For example, a user visits a page and submits a form; if the page gets refreshed, the total time the user was present on the page is counted only up to the refresh. So even though the user stays on the same page and keeps reading, it will be counted as a new session and a new time on page.

It is important to have more time on page, so use AJAX forms: they do not refresh the page, they just send the data from your page to the database.
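A minimal sketch of such an AJAX submission using fetch; "/subscribe" is a hypothetical endpoint, and in the browser you would call submitForm from the form's submit handler after event.preventDefault() so no refresh happens:

```javascript
// Serialize a plain object into a URL-encoded request body.
function encodeForm(fields) {
  return new URLSearchParams(fields).toString();
}

// Send the data without reloading the page (browser-only).
// "/subscribe" is a made-up endpoint — point this at your own backend.
async function submitForm(fields) {
  const res = await fetch('/subscribe', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: encodeForm(fields),
  });
  return res.ok;
}

console.log(encodeForm({ name: 'Alice', email: 'a@example.com' }));
// name=Alice&email=a%40example.com
```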

File downloads:

If you want to allow users to download a file, put that file on a different server and provide a link to it, rather than keeping the file on your own server.

If you host the downloadable file on your own server, you may see warnings in Google Search Console saying a vulnerability was found on your web page, and that's not a good sign for Google ranking.

My advice would be to use third-party servers to host your downloadable files.

Push Notifications:

Push notifications let customers know that there is an update in your application; you can notify them even while they are on other applications or other websites.

The customer does not have to open an email, or even the browser, to receive a push notification, because push notifications are compatible with all the major browser vendors. Whenever there is a notification, it pops up on the computer (and on phones as well) and alerts the user.

Push notifications can also be used to get the customer's location, and based on that location you can promote the products wanted in that particular geography.


Mobile friendliness:

The days when people depended on laptops and computers are gone; every single person has a mobile phone in hand, and that phone has internet.

A phone with internet is enough to find a solution to any problem.

So make sure that your website is mobile-friendly. I would also suggest going for AMP pages, but even having Bootstrap is enough for your web pages to be responsive.


So check whether your WordPress theme is responsive; if it is, it should fit every single device. For example, the website should work fine on a computer wider than 1200 pixels and equally fine on a mobile phone at 400 pixels.

Use the right font size for different devices; a font size of 14 to 16 pixels is recommended, depending on the font you are using.


HTTPS:

We are in 2020, and if your website does not use the https protocol, people are not going to trust you; moreover, Google Chrome shows a "Not secure" label in the URL bar, so it is better to go with https.

It is not going to cost you much, and you can even have it for free from Cloudflare.

Recent studies show https websites outranking http websites. https is not a major ranking signal for Google, but not having https might put you down.

From my experience, I would suggest strictly redirecting http to the https protocol. For example, if you try visiting the http version of my site, you are automatically redirected to the https version.
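A sketch of that http-to-https redirect in Apache's .htaccess (assumes mod_rewrite; adjust if your host terminates TLS differently):

```apache
# Force https with a permanent redirect.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```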



Robots.txt:

This file contains crawl directives for your website, and it is used by the Google bot, the Bing bot, and other crawlers.

Robots.txt instructs these bots where to look and where not to look. For example, if you have a page with public information, you can allow the bots to look at it.

But sometimes you might have a directory you want to show only to registered users, or a confidential directory; in such cases you can disallow the bots from looking into that particular directory.


Make sure that you are not disallowing the bots from a directory which you want indexed in the search engines.
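A small robots.txt along those lines might look like this ("/members/" is a made-up private directory and example.com is a placeholder):

```
# Allow all bots everywhere except the private area.
User-agent: *
Disallow: /members/

Sitemap: https://example.com/sitemap.xml
```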


Page speed:

Page speed on mobile and desktop matters a lot these days, especially on mobile.

Mobile users wait at most 3 to 5 seconds for a page to load: if your web page hasn't loaded within 3 seconds, around 60 percent of visitors will leave, and by 5 seconds almost 90 percent will have given up on your website.

So always try to keep your page load time less than 3 seconds or at least less than 4 seconds.

If you are running an eCommerce site, it is especially important that the page loads within 3 seconds.


Canonical URL:

Only one canonical URL should be declared for a webpage, even if the page can be reached through more than one URL.

For example, if your website is https, then by default you have two versions of each page: the http page and the https page.

Another example: if your www subdomain also resolves and you have an https website, then you will have four versions of a page:

  • www with https
  • www with http
  • https without www
  • http without www.


So it becomes very important to choose which is going to be your canonical URL. On my website, the canonical URLs are the https URLs without www.
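The canonical is declared with a single link tag in the page's &lt;head&gt;; a sketch with a placeholder URL:

```html
<!-- Exactly one canonical per page; example.com is a placeholder. -->
<link rel="canonical" href="https://example.com/your-page/">
```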


Website structure:

Website structure is one of the key factors in UX. For example, take any webpage on your domain and count how many clicks it takes to reach it from the homepage: the fewer the clicks, the better the user experience.

But make sure that "fewer clicks" does not turn your website into junk by cramming all the links into the homepage and the other pages; sometimes a simple UI creates a better user experience than a cluttered one.


Page titles:

The page title of your webpage should be a catchy one, so that whenever a person reads the title, they feel like clicking it.

There is a term called CTR (click-through rate): whenever your page is shown on Google or any search engine, if more people click on your link, Google tries to rank you higher. This happens only when you have a catchy title.

Make sure that your title does not cross 8 words or 60 characters.
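That 8-word / 60-character rule can be checked with a tiny helper (a sketch; the titles below are invented examples):

```javascript
// True if the title respects the guideline: at most 60 characters and 8 words.
function titleOk(title) {
  const words = title.trim().split(/\s+/).length;
  return title.length <= 60 && words <= 8;
}

console.log(titleOk('SEO Manual Audit Checklist')); // true
console.log(titleOk('A very long and rambling title that keeps going well past any limit')); // false
```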


Page headers:

The first header of your webpage should match your title; if it doesn't, the visitor will leave the webpage as soon as they arrive.

Try to put the keyword as the first word in your header.

Try to have at least one H1 header and several H3 headers, probably 5 H3 to H5 headers.

Readability of content:
Subscribing to site:

At the end of every article, ask the reader to subscribe to your mail updates. You can use exit-intent popups or entry-intent popups to collect the email address.

My advice would be to use exit-intent popups to collect email addresses from the user.

An exit-intent popup not only increases your subscriber count, it also signals time on page to Google. With recent updates, Google's algorithm considers the last click as the end of time on page, so whether or not the user subscribes to your mail list, their click on the OK or Cancel button in the popup is counted towards time on page.

Add more than text:

Try to have more than text on your web pages. When you have only text, the website becomes monotonous, and to break this monotony you can include images and videos inside your pages.

These images could be graphs, charts, flow charts, or any other image; infographics are the best way to demonstrate a concept to the reader.

Videos could be from YouTube or from a third-party website, but having videos on your web pages not only increases readability but also increases the time on your site.

Text formatting:

When I started my website, I thought content is king, so I wrote a lot of content but forgot about formatting altogether.

Only when a friend told me how difficult my website was to read did I realize that if I want to increase the time on page, I have to format the content, because nobody likes to read plain black-and-white text all the time, right?

Always make formatting a habit: whenever you write a paragraph, format it once you complete it.

Broken links:

Check the web page for broken links, and fix any you find; otherwise, when a user clicks such a link, they will face a "not available" or similar error page, and that is not a good thing from the UX point of view.

Fix Any Missing or Duplicate Meta Tags:

Avoid duplicating your meta keywords and meta descriptions; when two pages have the same meta description, you might end up with only one of them showing up in Google.


Sitemap:

Sitemaps are the way to tell the search engine bots which pages are present. Sometimes you might have more than 100 or 200 pages, and you cannot keep track of every webpage or every link, so it becomes important to maintain a sitemap that lists all the pages on your website.

There are a number of tools to generate sitemaps if you are running a static website rather than WordPress.

WordPress has multiple plugins that can generate a sitemap for you automatically.

You can have separate sitemaps for web pages, images, and videos, but at minimum it is suggested to have a web pages sitemap, most commonly in XML format.
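A minimal XML sitemap sketch (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/seo-manual-audit/</loc>
  </url>
</urlset>
```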


YouTube channel:

Having a YouTube channel increases your brand awareness, not only among viewers but also with the search engines. When your YouTube channel has the same name as your website or brand, there is a chance that people who viewed your videos on YouTube will also visit your web pages.

A YouTube channel is one of the ways to spread your branding and increase the trust in your brand name.


Facebook Page and Group:

Having a Facebook group or page may not increase the traffic you get, but it definitely increases the awareness of your brand among users.

When I started this website, I posted in Facebook groups every single day for 3 months, which gave me a consistent stream of visitors from Facebook, because of which Google started ranking my articles.
If you are thinking about sharing your content on Facebook, then share it consistently for 3 months.


Linkedin Page:

People use LinkedIn as a job portal, and if you have a LinkedIn page, the search engines consider you to be a real firm, because firms are the ones that maintain pages on LinkedIn.

Software firms and other companies use LinkedIn as one of their job portals, so having a page and a link from LinkedIn is a kind of trust signal to the search engines.


References and outbound links:

When you are writing an article, keep links to the pages you are referring to. These links are not just a reference for the reader; they also show that whatever you are writing is related to a source you can point at.

So when a user looks for an article on the site you are referencing and does not find the content they want, there is a chance that your website gets ranked for the same keywords, because your website might have it.


Meta description and keywords:

In recent times everyone has been saying that meta descriptions and keywords are not required anymore. But in reality, the keywords and meta description do matter to Google and other bots.

Yes, there are discussions that once the details are on the page, what is the point of having a meta description? That is a fair point, but do you think Google or any other bot could index millions of pages using only the text of the content present on each web page? Is that even possible?

Based on my experience, Google bot and other bots crawl pages based on the descriptions. The description does not help you rank higher in Google, but it makes sure the bot crawls that particular page earlier than a page which has no description.

It is not mandatory to have a meta description and meta keywords, but they do help in early crawling.
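For reference, the meta description (and the optional keywords tag) is one line each in the page's &lt;head&gt;; the wording below is an invented example:

```html
<head>
  <!-- Invented example copy; descriptions are usually kept under ~160 characters. -->
  <meta name="description" content="A manual SEO audit checklist covering URLs, page speed, canonicals, sitemaps and more.">
  <meta name="keywords" content="seo audit, manual seo checklist">
</head>
```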


Allowing bots:

Search engines use bots to crawl your website, and you can declare whether you allow a bot to crawl your website or not.

If you have disallowed the bots, there is little chance that your website gets indexed in the search engines.

Make sure that you are allowing the required bots to crawl your website. Not all bots are search engine bots.


Keyword research:

Perform keyword research before you publish your article. If you have two articles on a similar topic, Google ranks only one of them, so try to combine similar keywords into a single article.

You can use different tools for your keyword research, such as:

  • Google Suggestion
  • Google Similar Searches
  • Keysage
  • Answer the public



Keyword density:

Everyone talks about keywords and keyword density, but nobody is sure whether it matters to the Google bot or any other bot.

Unless you are the head of Google Search, you can't decide whether it's important or not.

Based on my experience, a fair keyword density is 1 to 2 percent. For example, if your webpage has about 2000 words, you can repeat the keyword around 20 to 40 times.

If you look at any of my webpages, I usually have close to 30 occurrences of a particular keyword when I write around 2700 words in that article.

How would Google know that a particular web page is related to a particular topic if you don't have the keywords in that article? Think about it.
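The 1-2 percent guideline can be checked with a small helper (a sketch: it counts whole-word matches of a single-word keyword; multi-word phrases would need different handling):

```javascript
// Percentage of words in `text` that equal `keyword` (case-insensitive).
function keywordDensity(text, keyword) {
  const words = text.toLowerCase().match(/[a-z0-9']+/g) || [];
  if (words.length === 0) return 0;
  const hits = words.filter((w) => w === keyword.toLowerCase()).length;
  return (hits / words.length) * 100;
}

const sample = 'seo tips and seo tricks for better seo';
console.log(keywordDensity(sample, 'SEO').toFixed(1)); // 37.5 (3 of 8 words)
```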

Internal links:

If you look at any web page on my website, I normally do not go for external links; I always make sure that every single page is linked from another page of my website. These links are called internal links.

If you are not linking the pages, then when a Google bot crawls one particular page, it ends its crawling on that page alone. But when one page has internal links to multiple pages, the Google bot will follow them and crawl those pages too, which is a positive thing.

Moreover, if you are not interested in linking to one of your own pages, why would anyone else be interested in linking to it?


Submitting pages to search engines:

Submitting your newly created page to the search engine bots is very important. Even if you don't submit these pages, the bots will somehow find and index your web pages, but submitting them speeds up the process.


By submitting to the search engine bots, I mean not only the Google search engine but also the Bing and Yandex search engine bots.

It is far easier to rank in Bing than in Google, so never forget Bing.


Brand queries:

Give more importance to brand queries. For example, if you are creating a new tool, try to associate the tool's name with your branding, so that the number of brand queries reaching Google grows.

When the number of brand queries is high, Google thinks it can trust your website.



Social signals:

A few search engine marketers say that social signals are not important, but another set of search engine marketers say that social signals are important.

In my view, social signals are important because Google does trust them; it may not give them much weight, but they do matter to Google.

Let me put it this way: Facebook has a principle that whenever a post redirects to an external page when clicked, Facebook will not help that post reach more people.

But if a person visits a page from Facebook through that post click, then obviously that particular post is important to the person, isn't it? So when a topic is that important to people, Google also feels it should crawl that particular page and get the content from there.

And one more thing: if a page shows Google AdSense and the user visits it from Facebook, then obviously Google will try to crawl that particular page, because AdSense is present on it, and this may increase the possibility of showing more ads for Google.

Note: Traffic from Facebook will not have the same weightage as traffic from Google organic search. This is the reason why you get very little CPM for traffic from Facebook.

Social metadata:

Every social network has its own metadata; for example, Facebook has Open Graph data, or og data.

This social network metadata helps preview your links in comments and posts.
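An Open Graph sketch for the page's &lt;head&gt; (all values are placeholders):

```html
<meta property="og:title" content="SEO Manual Audit">
<meta property="og:description" content="Placeholder description shown in link previews.">
<meta property="og:image" content="https://example.com/cover.png">
<meta property="og:url" content="https://example.com/seo-manual-audit/">
```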


Language of website:

Content delivery network:

Having a content delivery network (CDN) makes your pages load faster, because the CDN stores your page content in different parts of the world.

If a user from Bangalore visits your page, the CDN serves the webpage from India, or even from the Bangalore location. Without a CDN, the webpage is always fetched from wherever the origin server is hosted, which could be Switzerland or the United States of America.


Browser level caching:

Browser-level caching helps content load faster: most of the files a page uses are stored in the browser, so when the same person opens any other page, all those files are loaded from the browser cache instead of being downloaded from the server again. So try to use browser caching.
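One way to enable browser caching on Apache is with mod_expires in .htaccess (a sketch; the 30-day window is an arbitrary example):

```apache
# Ask browsers to cache static assets for 30 days (requires mod_expires).
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 30 days"
  ExpiresByType image/jpeg "access plus 30 days"
  ExpiresByType text/css "access plus 30 days"
  ExpiresByType application/javascript "access plus 30 days"
</IfModule>
```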

Image Optimization:

Image optimization is not only about the image dimensions but also about the file size. When an image is around 20 KB, it is easy to load on the web page, but if you have 10 to 20 such images, the page becomes heavy to load.

So always try to keep image file sizes small.

The dimensions of the image also matter, so keep the image dimensions only as large as your layout requires.
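In the HTML, giving each image explicit dimensions and lazy loading helps keep the page light (the path, alt text, and sizes below are made up):

```html
<!-- width/height prevent layout shift; loading="lazy" defers offscreen images. -->
<img src="/images/traffic-chart.png" alt="Monthly traffic chart"
     width="640" height="360" loading="lazy">
```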


Loading javascript at the bottom:

Load all your script files at the bottom of your webpage. These JS files mostly handle animations and transitions, so if you place them at the bottom, there is a better chance that your content loads first; otherwise, the content has to wait until the browser loads the JS files.

I would suggest merging multiple JS files into one.
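Both pieces of advice can be combined: one merged bundle, loaded either just before the closing body tag or from the &lt;head&gt; with the defer attribute (the filename is a placeholder):

```html
<!-- defer downloads the script in parallel but runs it only after the HTML is parsed. -->
<script src="/js/bundle.min.js" defer></script>
```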


Error Pages:

Having custom error pages not only alerts the user about the problem with the page, but also increases the chance that they go back to an important page.

If you don't have custom error pages, a blank page appears with no information other than the error code. With custom pages, you can configure a redirect to your homepage or another important page after a given number of seconds, or provide links to the homepage and other important pages along with the error code.
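On Apache, custom error pages can be wired up in .htaccess (a sketch; the /errors/ paths are placeholders for pages you create):

```apache
# Serve custom pages for common error codes.
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html
```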


Plugins in WordPress:
Google My Business:

Having a Google My Business listing not only helps users find your shop or office, but also makes Google trust the business and the website.

Consider this example: when you are sick, would you take advice from an online doctor you have never met, or from a doctor you have already met who is also available online?

The same applies to Google: Google trusts websites that have a Google My Business listing, because people leave reviews there, and reviews do matter to Google.

So it is a trust indicator for Google that you have a physical office at a given location.


Use LSI keywords:

LSI (Latent Semantic Indexing) keywords are words that are commonly found alongside your target keyword. e.g. archive inspire = archives unlocked, the national archives about, archives strategy, the national archives about us, national archive at Kew, national archives exhibition, how to use the national archives, national archives discovery

Presence of Favicon:

Favicon is the small icon near the website title in the browser tab.


It makes the website easy to identify when many tabs are open, or when viewing browser history and bookmarks. Some search engines, such as DuckDuckGo, display the favicon near the URL in the search results.


Apart from improving usability, it can help catch the user's attention in the search results, so we can call it an indirect SEO technique.

<link rel="shortcut icon" type="image/png" href="/favicon.png">