With On-Page SEO you can make your web pages rank higher in SERPs (Search Engine Results Pages) and get organic traffic to your blog. It adds to your authority, credibility and relevance. SEO is the name of the game, but don’t rush to learn it. It has a learning curve; take it slow. Great quality content is a prerequisite for everything else. Content is the King, SEO is its chariot. Get to know the tricks of the trade and spread the word that your blog is out there.
An Introduction to SEO (Search Engine Optimization)
Websites, blogs and similar platforms are meant to add value to their readers. A search engine wants to remain relevant by providing the information a user wants. For this, a search engine looks for content which adds value, provides authoritative information, is well organised, loads fast on the user’s device, is free of scams, and scores on many other important factors.
The end objective is to have enhanced user experience.
Do you know that a search engine like Google is also one of the users of your blog or website? It needs to understand your content to make it available for its users who are searching for it. Both you and the search engine have a common goal, which is benefit to the end user. SEO is the meeting point of those common goals. SEO is all about making a search engine (its search algorithm) understand the content of your website or blog.
Search engines are also evolving and learning, through techniques like Big Data, Machine Learning, Deep Learning and, finally, Artificial Intelligence. The focus has increased manifold to look not just for keywords but also for the relevance and freshness of the content. Search Engine Results Pages (SERPs) list web sources based on complex factors ranging from site loading speed and up-to-date information to geographical relevance and semantics.
On-Page SEO, Also Known as On-Site SEO
On-Page SEO is much more than keyword research and keyword placement. It is about making Google understand what your content is all about. It means optimising your web page, post or content in such a manner that your page is ranked higher and your organic traffic increases.
It is basically a set of tweaks, settings and guidelines which, if followed, will help your page be discovered easily by the Google Search Engine. Remember, none of these tips or techniques alone will boost your visibility; collectively, they improve your ranking, relevance and credibility. Focusing or obsessing over a single technique will be an epic failure. It is a full package to support your content.
Once you do On-Page SEO, the results will not be visible instantly. Things take time to respond. In some cases people have reported changes after 48 hours, but mostly it takes more than a month for the effect to kick in. At the latest, people have seen the full impact in 5-6 months. That is a very small time compared to your dream. Have patience, trust yourself and create good content.
On-Page SEO Strategies and Techniques
Title Tags and Meta Description
Before throwing jargon and complex words at you, let us first understand the terms ‘Title Tag’ and ‘Meta Description’.
To begin with the basics, HTML is an acronym for HyperText Markup Language. It is the language used to code web pages, and it is where your content lives. An HTML document begins with a head section, identified by the <head> element, which contains descriptive information about the web page: the title, scripts, keywords, refresh rate and so on. The title tag and meta data both sit within this <head> element.
The name of a page which you see in your browser’s title bar, or in the page’s tab, comes from the ‘title tag’. It is contained in the <head> section of your HTML. The first requirement of on-page SEO is the content within this tag. As you know, it is the name of your page: the one which can be saved to favourites and the one which will be displayed by search engines. Always avoid one- or two-word titles. Instead, go for a longer title which describes your intent and agenda, though you should try to remain within the limit of 50 to 60 characters, including blank spaces.
The meta description is the summary or outline of your whole content. Search engines use the meta description to understand what the content is all about. Keep it at or below 150 characters. Google may take a few words, generally up to 150 characters, from your meta description to explain the purpose of your page to the end user on the search results page. You can call it the ‘excerpt’ of your webpage or post.
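Putting the two together, a minimal <head> section might look like this (the title and description here are invented for illustration):

```html
<head>
  <!-- Title tag: shown in the browser tab and as the clickable headline
       in search results. Kept within roughly 50-60 characters. -->
  <title>Improve Your Writing Skills – Workable Tips That Stick</title>

  <!-- Meta description: the short summary (around 150 characters) a search
       engine may show under the title on the results page. -->
  <meta name="description" content="Practical, workable tips to improve your writing skills and make an impression with every post you publish.">
</head>
```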
URL (Uniform Resource Locator) Fundamentals
For a starter, you can think of a URL as the web address a user types into the address bar; they hit Enter and reach a particular web page or website.
For example, https://rajatjhingan.com/ is a URL. Basically, it is the address of a web page. ‘https’ is the protocol – you cannot do much here. ‘rajatjhingan.com‘ is the domain name; you can call it the ‘nickname’, or the address of my home page. Computers talk in the digital language of 1s and 0s (binary numbers); URLs are for humans, to understand where the address is taking them. By design, computers allot number-and-letter combinations which only they can understand and remember, but a URL must be comprehensible to human beings.
A short, simple and descriptive URL is one of the key elements of efficient On-Page SEO. A normal user should be able to guess what kind of content a particular URL relates to.
For example, if the address looks like ‘https://rajatjhingan.com/165161/265.html‘, then no one can understand what this address is all about. But if the URL looks like:
https://rajatjhingan.com/2021/01/01/improve-your-writing-skills-make-an-impression-with-these-workable-tips/ , then you know that this is a webpage on rajatjhingan.com, probably published on 1st January 2021, on the topic of improving writing skills.
An even cleaner version could be https://rajatjhingan.com/improve-your-writing-skills/
Highly complex URLs create a lot of problems for search bots like Googlebot. The bot ends up consuming more bandwidth while indexing your page, or may be unable to index your content at all. A URL carrying ‘session IDs’ creates too many duplicate URLs. Unnecessary elements in a URL, such as referral codes, or date parameters that are not needed, also end up creating duplicates.
A clean URL, by contrast, is well optimised for SEO and positively affects page ranking. Use hyphens (-) and not underscores ( _ ) in your URLs, and keep them short yet descriptive. Ideally, you should delete all the unwanted elements like numbers and IDs from your URL, and you must use your ‘focus keyword’ in it.
No Low-Quality or Thin Content
Content is the king and no one can threaten his throne. Remember one thing: your website is meant for a purpose, and that purpose should not be cheating or fraud. At the end of the day, whether it is B2B (Business to Business), B2C (Business to Consumer) or C2C (Consumer to Consumer), the end beneficiary of your website or blog is a human being. You can fool people once or twice, but not forever; it is not a sustainable practice. In a rush to increase traffic or views, people apply shortcuts which are counter-productive in the long term, and even in the not-so-long term. On-page SEO particularly penalises such practices. Let’s see what it is all about.
Creating too-generic content is not helpful to anyone, not even to yourself. If you are making the effort to write an article, then research the topic properly and try to solve some issue or give a well-researched opinion. There is really no benefit if you write something and no one reads it. Another issue is duplicate, copied or near-identical content. It’s not just about Google: users will bounce off your page the moment they land on it, and Google does not rank such pages or content. The main thing is that your content should add value to the reader.
The same goes for auto-generated pages, pages with too many affiliate pop-ups, or pages carrying more advertisements than content. These are all annoying to the reader, and to Google’s ranking algorithms too. Google is becoming smart, and scamsters are being narrowed down in their options. Remember those pages where you click on a link, another page opens, and this unending game leads nowhere? Yes, doorway pages and similar tricks are now taken very negatively. Also a big no to pages which are designed to look like other famous websites and blogs. All these bring down your ranking drastically.
Look Out for Article (Content) Length – Word Count
I don’t know what you have heard, or who has advised you what. The next logical step after ‘Thin and Low Quality Content’ in On-Page SEO is obviously the length of the content. Shallow articles of around 300 to 500 words are discouraged by Google’s ranking algorithms. Don’t run after the algorithm for now. Just understand one point: creating value-additive, high-quality content is the key. I have been hammering this point almost since the stone age of blogging. No shortcuts. Blogging is business: do it ethically.
The length of your articles (content length) affects your ranking directly. An in-depth, high-quality article which provides deep insights cannot be packed into 500 words or fewer. It does not matter how flashy a headline or keyword you research, choose and target; if it adds no value, Google can sense it and will surely trash it. Instead of writing in a rush, take time, research well, develop an understanding and then write your article.
Take it above 2000, yes, ‘Two Thousand’ words per article, and make this a norm. The only exception is when you are writing on a very general topic and have a very detailed infographic to support it; only then is an article of 200-500 words ‘a little’ understandable. Successful bloggers generally aim for between 1700 and 2500 words. A tutorial, an in-depth review or an analysis is bound to go beyond 3000 words.
Role of Images in Search Ranking
Being on the internet and creating content is all about making the user/reader and the Google algorithm understand your message effectively. Images are important and play a very positive role in telling your story. Remember, even in school, kids don’t like reading books which have all text and no images. They look dull, like a law book or an income tax manual. Yes, now you’ve got the point. But don’t overdo it either: too many images will turn your page into a kind of photo album with too little content. Search engines prefer content with at least one or two images, embedded videos, GIFs and so on. But hey, you need to keep those images tightly relevant to your content.
If you understand the background coding, then here is a tip: use HTML elements for images, like <img> or <picture>, instead of CSS. Also do not forget the ‘alt text’ attribute. Let me simplify. A Google robot, AI or algorithm cannot see a picture, but it needs to know what the picture is about and how it fits into your content. Additionally, sometimes an image does not load on a user’s device and a ‘text’ is shown instead. Yes, this same ‘text’ which describes the image is known as ‘alt text’. Write a short, descriptive alt text instead of a lengthy one.
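For instance, a picture of roses (the file name here is hypothetical) would be embedded like this, with the alt text doing the describing:

```html
<!-- The alt attribute is what search bots 'see', and what users see
     if the image fails to load. Short and descriptive, not lengthy. -->
<img src="fresh-red-roses.jpg" alt="Bouquet of fresh red roses" width="640" height="427">
```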
Now you may think that images can slow down the loading speed of your page, which will negatively impact your optimisation. Well, here is a way to optimise your images for Google’s image crawlers: compress them, so that the file size is reduced without visibly affecting quality. Online tools are available for image compression; just Google them and you’ll be amazed how many pop up.
In the end, images enhance the user experience, and adding alt text helps Google’s bots know what each image is all about. Use commonly used web formats like JPEG and PNG.
Keyword and Keyword Density
In the world of blogging, or of any other website, content is created for the users, let me say the ‘public’ or ‘audience’. If you are creating content only for yourself, ‘diary writing’ is much preferred. But when it comes to writing online, you should know what people want to read, and what solutions or insights they are looking for. Condensed, this means you should know the ‘keywords’ which users are punching into search bars to look for information. Every niche has its own keywords. A person looking for flowers may enter ‘fresh flowers’ or ‘roses’, and a tourist may search ‘nearest hotel’, ‘places to visit’ or ‘flights available from place to place’. These are all keywords. They tell you what your audience is looking for.
Content which is relevant drives traffic. It is normally informative, well written and directed at the user; it solves a problem, suggests solutions, can be easily shared and adds value. These days SEO has evolved towards long-tail keywords. This means people are looking for customised solutions, or specifically referring to the part of the problem where they need help. Instead of saying ‘pain’, the user says ‘pain in the right elbow’. This is the level of specificity the user is demanding. Listen to them.
Semantic keywords are the new addition, i.e. keywords where you specify the intent as well. For example, ‘laptop’ is a generic keyword, but a long-tail semantic keyword is “budget laptop for college students regular use”. Google is getting close to understanding the parameters of human language.
Keyword stuffing is counter-productive when it comes to On-Page SEO. It will drag down your page rankings like anything. Don’t use unnecessary keywords, and you need not use them in every second or third sentence. Though there is no single ideal limit, keep keyword density close to 1 to 2 percent of the total word count of your article; in a 2000-word post, that is roughly 20 to 40 occurrences. And let them be evenly spread. There are many plugins and software tools which help in managing keywords.
Structured Data Enables a Seamless On-Page SEO
Ever searched for a product and noticed that the Google Search page also shows its ratings, price, discounts etc.? Yes, now you’ve got it.
In such a search result, the extra information greatly enhances the user experience. The result can tell the user the number of pages, the language, a brief summary and even the ratings. A search engine wants to give the best results to its users, and for this it needs to know and understand what you have on your webpage and how to interpret it.
Structured data is basically code that is added to your HTML. It can convey not just price, rating or date of release, but other factors too, like the opening and closing hours of a business, its branches and so on. It makes your search result look good on the results page and more likely to be clicked.
Structured data should be related to your content. Do not add structured data to blank pages, and if a piece of data is not meant for or visible to the user, structured data should not be added for it. This coding is done within your HTML, using in-page markup related to that page.
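As a sketch, here is what structured data can look like using the JSON-LD format with the schema.org vocabulary, one of the markup styles Google reads. The product name, rating and price are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Budget Laptop",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.3",
    "reviewCount": "89"
  },
  "offers": {
    "@type": "Offer",
    "price": "499.00",
    "priceCurrency": "USD"
  }
}
</script>
```

With markup like this in place, the page’s search result becomes eligible for the ratings-and-price style of listing described above.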
Fresh Content – Post Frequently
Any business, even in the physical world, develops its credibility by increasing its visibility in the public eye. If you are an occasional blogger, or are not serious about your website, then its credibility and authority are highly dented.
There is no fixed number like 2 or 3 or 4 posts every week. It all depends on the quality, the length and the time taken to create that quality and length. But post regularly. If you are posting 2-3 posts per week, then continue with that pattern. Make yourself visible and credible.
Next comes the issue of adding new content regularly. Yes, adding new content contributes positively to your page ranking. You also need to update already-published content to keep it relevant.
Freshness of content matters: a HubSpot benchmark this year showed, once again, that posting more frequently improves Google rankings. So not just publishing new content, but polishing old content is also needed. Keep that in mind.
HTTPS – Security Matters in On-Page SEO
HTTPS is the acronym for Hypertext Transfer Protocol Secure. Don’t be afraid of such a complex word; let’s understand it. In simple terms, it is a mechanism (a protocol, or set of algorithms) meant to guard or protect, i.e. to provide security, when communication takes place between two systems. Hackers may intentionally target your communication and tamper with it if it is not taking place under the protective umbrella of HTTPS. Don’t worry, you don’t have to do any programming or anything technical of any sort.
First, understand the background. Until now, security was provided by a protocol called SSL (Secure Sockets Layer); the newer version of SSL is Transport Layer Security (TLS). HTTPS provides security over SSL or TLS by encrypting the data, so that hackers cannot decode or manipulate the data being transferred. Again, you need not worry: no user intervention is needed. It’s just algorithms running in the background. You only need to grant some permissions by clicking a few options in settings and doing a little tweaking.
Since mid-2018, many browsers have opted to warn users if a page is not secure; Google Chrome has already implemented this. Many anti-virus products with higher security settings block such pages. Additionally, unsecured pages are sitting ducks for hackers to insert malicious code into your website or blog, crashing it or even stealing or redirecting traffic. Many standard web-hosting service providers have provided for a smooth transition from HTTP to HTTPS, and also provide a free SSL certificate as part of the complete package.
Simply put, you only need to opt for a hosting service provider which provides HTTPS-level security and an SSL certificate. The rest is handled by hosting service providers and content management software. Even in the address bar of this page you’ll see the ‘lock’ icon, which confirms that the communication is secure.
Crawling and Discovery of Your WebPage
Just as the telephone directory of olden days stored the phone numbers of all the people in an area, Google maintains an index which holds the address (URL) of webpages along with a little description of each. This is called the Index, and the process of recording, updating and maintaining it is called ‘Indexing’.
Crawling is when Googlebot, aka the Spiders, sniffs around for new or updated pages or content. Basically, it is automated software. It discovers pages and then adds them to the index.
The more interlinked and updated your web of links, the more easily the spiders can crawl and understand it. If links are broken and the web of links is not maintained properly, the spiders won’t be able to navigate it. They try to understand your content with the help of all the tools I have mentioned so far, plus sitemaps to understand the structure of your website. If they can crawl it easily, these minions will go back to their master Google and happily report positive things about you. Or else… Grrr. Ah, you just need to manage all the little tweaks. Nothing complex, all doable.
Sites which provide better crawlability are ranked higher and are obviously more easily discoverable. Interlinking your blog pages is essential: spiders follow the web of links, just remember this. You should also have a ‘Home Page’ where the latest posts are updated; getting new things from a single page increases the efficiency of crawling. You can also tell the spiders what not to explore. This can be done by tweaking a file known as ‘robots.txt’. I would suggest that, for now, you do not venture into that field without proper technical knowledge. Stick to the On-Page SEO things which are less coding-dependent if you are doing this for the first time or are still discovering SEO techniques.
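For the curious, robots.txt is just a small plain-text file at the root of your site. A minimal sketch might look like the following; the disallowed path and the sitemap location are only examples:

```
# Applies to all crawlers
User-agent: *
# Tell spiders not to explore this area (example path)
Disallow: /wp-admin/
# Point crawlers to the sitemap describing your site's structure
Sitemap: https://rajatjhingan.com/sitemap.xml
```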
Mobile Friendly Page
Mobile phones and tablets have increased the penetration of the internet. People on the go simply punch keywords into their smartphones to look up their queries. Your web presence will be deeply impacted if your website is not mobile-friendly.
A search engine will not promote a site or rank it higher if it cannot be shown on devices other than laptops and desktops. The machine a user uses to access your content also matters. Even in WordPress you can preview the mobile version of your web page.
Optimising your webpage for mobiles is an important part of On-Page SEO; it greatly affects your ranking. Being mobile-friendly does not mean that you have to create two themes or two websites. Many CMSs (Content Management Systems) provide responsive website layouts, which change size and content layout as per the device on which they are viewed. Additionally, you can check your web page with Google’s mobile-friendly tester tool.
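One tiny piece of HTML sits behind most responsive layouts: the viewport meta tag in the <head>. Responsive themes normally add it for you, but this is what it looks like:

```html
<!-- Tells mobile browsers to match the page width to the device's
     screen width instead of rendering a zoomed-out desktop view. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```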
404 Pages: Most Overlooked, Yet Important for On-Page SEO
404 is an error code. It means the web page or site the user is looking for either does not exist or is not available. Sometimes it is written as ‘404 Not Found’.
Sometimes the user has clicked a broken link. There could also be an error on the part of the user, who typed the address of your page or website with some typos, or simply the wrong URL. The 404 error code tells them that they have hit a dead end.
You should have a custom 404 page which directs the lost user to your working pages: your most popular page, your home page or related content. On the whole, it improves the user experience and keeps visitors engaged.
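A bare-bones custom 404 page (the links here are placeholders for your own pages) could be as simple as:

```html
<!-- Tell the lost user what happened, then route them to working pages. -->
<h1>404 – Page Not Found</h1>
<p>The page you are looking for may have moved or never existed.</p>
<p><a href="/">Go to the home page</a> or try a popular post:</p>
<ul>
  <li><a href="/improve-your-writing-skills/">Improve Your Writing Skills</a></li>
</ul>
```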
Never block your 404 page from crawling via the robots.txt file; it will put a dent in your ranking. Not having such a page at all is in itself a big disadvantage, and a 404 page which is not consistent with the style of your website is another drag.
Google also provides a Change of Address tool to help you tell Google that you have changed your site address, so ‘please do update your index’.