1. What is On-Page SEO: Definition and introduction

On-page SEO is the practice of optimizing individual web pages in order to rank higher and earn more relevant traffic in search engines. On-page refers to both the content and HTML source code of a page that can be optimized, as opposed to off-page SEO, which refers to links and other external signals. (Moz)

In this post I will try to explain in as much detail as possible what factors affect your on-page SEO in 2018, and what strategies and techniques you should use to properly optimize your pages.

The following factors are not presented in any specific order, simply because every point has its own importance. The content of this post is segmented into 9 parts that can be used as a checklist.

You will notice that while speed optimization and website responsiveness are ranking factors that affect every page of your website, they will not be covered in this post. The reason is simple: they are both very important and complex topics, and they will be discussed in future dedicated posts.

Note: this guide is still being edited. Some parts are still under review and will be published soon.


2. Keyword research: Know what you should optimize for

Before you start any SEO action, you need to perform keyword research to understand what your potential customers will search for when they are looking for your products. You must also understand the level of competition for each keyword opportunity you discover.

For most of you (especially if you are not an SEO initiate or professional), ranking on certain terms will be very hard, if not totally impossible. If big companies are willing to pay up to 80 dollars per click on AdWords for certain queries, chances are that ranking for those terms with pure SEO will be a very long and hard task.

To perform keyword research you don’t need premium software; there are plenty of free tools available on the internet today, the first one being the AdWords Keyword Planner, but you can also take a look at SERPs Rank Checker or other free alternatives. I will not list premium programs here; just search Google if you want to find them.

Once you have extracted a full keyword list and refined it down to a list of accessible keywords, start segmenting it semantically. During this process you should notice that a logical website architecture, as well as an internal linking scheme, starts to emerge.


3. Copywriting: Craft proper communication

Keep calm and work on your content

Content, content, content… Did I say content already?

Your copy is by far the main factor that will push your pages up the ranking ladder, but do not write for Google! You should always write for your visitors. Being relevant, informative and unique will not only impact your rankings, it will also improve your user retention and your conversion rates.

If “wiki” sites show up at the top of Google for a lot of queries, it is simply because they answer questions in a very specific way.

If you try to rank on the term “SEO” because you are an SEO company, chances are your page is not providing what most users are looking for when they query a search engine for that term. The truth is they are more likely looking for a definition, not a service provider.

That being said, content optimization is divided into several parts:

Content: Excellence and readability

When writing content you must know your objectives and always keep your audience in mind if you want to market your company efficiently.

Some questions you should ask yourself before publishing a new piece of content:

  1. Is my copy clear enough? Does the reader understand what he’s getting?
  2. Is it engaging and pleasant to read? Does it provide a good user experience?
  3. Does it highlight benefits or just describe features?
  4. Is it informative enough for consumers to be able to make a decision or achieve something?
  5. Does it differentiate from other pieces of content on the same subject?

If the answers to these questions are mostly “yes”, there is a good chance your content provides added value and a good user experience. If, on the other hand, the answers are “no”, I advise you to rework your content before pushing it online.

Headings: Segment your content and organize your thoughts

H1 through H6 are heading tags. They are used to structure the page content, with H1 being the main heading. Think of them as the headings in a university paper: the main heading for an article is the H1 tag, and there should be one, and only one, H1 tag per page.

Sub-headings that break the article into sections are H2 tags. Should any of those sections contain their own sub-sections, these would then use H3 tags, and so on. As you can imagine, cases in which you use all six levels of headings on a page are very rare.

<h1>Main page heading</h1>
  <h2>1st section heading</h2>
    <h3>1st section sub-part 1 heading</h3>
    <h3>1st section sub-part 2 heading</h3>
  <h2>2nd section heading</h2>
    <h3>2nd section sub-part 1 heading</h3>
  <h2>3rd section heading</h2>
    <h3>3rd section sub-part 1 heading</h3>
    <h3>3rd section sub-part 2 heading</h3>

For quite some time, search engines have put additional emphasis on keywords that appear within heading tags, but like a lot of factors this has been heavily abused. Not so long ago it wasn’t rare to find entire paragraphs of text embedded in H2–H6 tags. As a direct result, headings no longer have quite the impact they once did.

A widespread mistake is using headings to style content. This is not what they are for: if you want a part of your page to look like your headings, you should use some CSS instead, as in the sketch below.
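For instance, here is a minimal sketch of that approach (the class name and the style values are mine, purely illustrative):

<style>
  /* Mimics the look of an H2 without carrying its semantics */
  .looks-like-h2 {
    font-size: 1.5em;
    font-weight: bold;
    margin: 0.75em 0;
  }
</style>
<p class="looks-like-h2">This text looks like a heading but remains a simple paragraph.</p>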

Keyword usage: Concepts and ideas, not one keyword

You are most certainly expecting a magic formula about keyword density here, and you will be disappointed, because there is none. Search engine technology has changed a lot over the last decades, and things have been accelerating in the last few years.

Pages can actually rank for a query without having the actual keyword in their copy, so how is this possible? One explanation could be a particularly well-crafted backlink profile, but manipulating backlinks to earn rankings gets more dangerous and trickier every year.

The truth is that Google has gotten better at understanding user intent and at analyzing synonyms and topic-related keywords on a webpage. Search engine algorithms have far more factors at their disposal with which to judge relevance, and have long outgrown such a basic method as counting the number of times a term appears on the page.

When taken to extremes, excessive use of keywords can trigger spam filters, resulting in a decline in the SERPs, which is exactly the opposite of what you are trying to achieve with SEO. This is nothing new; in the following video, Matt Cutts was already talking about it back in 2011.

What about a semantic approach to the problem?

You might have heard about TF-IDF (term frequency–inverse document frequency).

“TF-IDF is a numerical statistic that is intended to reflect how important a word is to a document in a collection or corpus. It is often used as a weighting factor in searches of information retrieval, text mining, and user modeling. Variations of the TF-IDF weighting scheme are often used by search engines as a central tool in scoring and ranking a document’s relevance given a user query. TF-IDF can be successfully used for stop-words filtering in various subject fields, including text summarization and classification. One of the simplest ranking functions is computed by summing the TF-IDF for each query term; many more sophisticated ranking functions are variants of this simple model.” (Wikipedia)
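In its most common textbook form (this is the generic information-retrieval formula, not anything Google has confirmed using), the weight of a term t in a document d, within a corpus of N documents, is:

tf-idf(t, d) = tf(t, d) × log(N / df(t))

where tf(t, d) is the number of times t appears in d, and df(t) is the number of documents in the corpus that contain t. A rare term (small df) therefore gets a high weight, while a term present in every document tends towards zero.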

Write great content, run a TF-IDF report on your words and get their weights. The higher the numerical weight, the rarer the term; the smaller the weight, the more common the term. Check your results and compare them with your competitors’. By working smartly you will also find terms you would never have thought about, and adding these terms to your copy can make a big difference.

This is what a TF-IDF analysis looks like in Website Auditor (each colored line represents a competitor).

Remember that the more your content “makes sense” to the user, the more it is valued by the search engine. From what I have observed, with a large panel of words having a high TF-IDF weight in your content, your page has a higher chance of ranking high in the SERPs.

Emphasis tags: A good way to make some content stand out

You can read a lot of articles stating that words in bold or italic carry extra weight for search engines, but in fact they don’t have a major impact on your rankings. They might help you get into the top 300, but they will have little to no benefit on competitive queries.

The main use of these tags is to make parts of your text stand out. You will want to use them to help your users understand your page content at a single glance.
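In HTML, this is typically done with the <strong> and <em> tags; a minimal illustration (the sentence itself is invented):

<p>Our image compressor is <strong>completely free</strong> and works <em>without any registration</em>.</p>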

Internal links: Help navigation and give more info

Internal links help your visitors find more information on a specific subject on your site. If you write an article about “how to build a wooden house” and you mention a specific way to create a wooden wall, feel free to link to an article explaining “how to build a wooden wall”. This will help your visitors digest your information easily, but it will also help search engines group your pages semantically and define the main subjects of your different website areas. If you want to see a good example of correct internal linking, check out Wikipedia, even if I wouldn’t go as far in terms of link quantity.

Using absolute vs relative URL paths:

As far as I can tell, there is no difference between the two options in terms of pure ranking. But since you have the choice of using one or the other, which one should you opt for? I personally tend to use relative URLs for internal linking because they keep the markup lighter, which should make pages slightly less resource-intensive to serve on high-traffic websites.

Absolute URLs have their advantages too: not all scrapers remove internal links, so they offer some degree of protection. In case of content scraping, you could still end up with links pointing back to your site.
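For illustration, here is the same internal link in both flavors (the domain and path are placeholders):

<!-- Relative URL: resolved against the current domain -->
<a href="/blog/how-to-build-a-wooden-wall/">How to build a wooden wall</a>

<!-- Absolute URL: keeps pointing at your domain even if the page is scraped -->
<a href="https://www.example.com/blog/how-to-build-a-wooden-wall/">How to build a wooden wall</a>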

External links: Provide reliable sources

External links are actually a powerful SEO feature that is rarely used to its full potential. Linking to pages considered authoritative sources on a subject gives a hint to bots and spiders that you know what you are talking about and that you are willing to give your users access to as much information as possible.

When this factor was discovered, a lot of SEOs started exploiting it by pointing links at what is one of the most authoritative sites on the internet, one that has articles about anything. If you haven’t recognized it, I’m talking about Wikipedia.

As a direct result, Google seems to ignore such links, and Josh Bachynski even claims to have noticed that adding a link to a Wikipedia page negatively impacted the rankings of one of his test pages.

Using proper external linking is a nice little strategy that can help you build authority, but if you’re building a business website, make sure you do not link to your competitors. Link to studies and research on the subject instead.


4. Images: Optimize your visual assets for search


Before we take a deeper look at how you can optimize your images for SEO, I would like to point out a couple of important things.

Choose your images wisely: images draw attention both to themselves and to the surrounding areas. Consider the placement of your images carefully; they should sit near the information you intend to highlight.

Use original and high-quality images: many users ignore stock photos. Using an original picture will provide a better user experience and a higher CTR from image search results.

To optimize them, be sure to use the file name as well as the alt and title attributes. The title is the name of your image, while the alt is a description of it. You can include some keywords in there, but make sure not to spam them. The following example is NOT to be followed:

<img src="/img/best-seo-company-NY.jpg" title="Best seo company in NY" alt="Best SEO company in NY">

Image file name:

You want a search engine to understand what an image is about, and using keywords in the file name is a good way to do so. If your image shows the Statue of Liberty, naming it DSC3448.jpg is probably not the best solution; renaming it statue-of-liberty-new-york.jpg is a better alternative. Notice that my keyword “Statue of Liberty” is at the start of the file name, because it is the main subject of the picture.

Image alt text:

“In situations where the image is not available to the reader, perhaps because they have turned off images in their web browser or are using a screen reader due to a visual impairment, the alternative text ensures that no information or functionality is lost.” (Wikipedia)

Make sure the alt text is descriptive and relates to both the image and the page. You should include one of your identified keyword expressions, because alt text is used by search engines to understand what an image is about.

When filling in the alt attribute, you must keep your website’s accessibility in mind. If you want to learn more about how to provide the best image experience to people with various disabilities, please read https://www.w3.org/WAI/tutorials/images/.

Remember to always think about your website’s accessibility when filling in these attributes.

Image title:

The title attribute is for human readers only. It is the text revealed when you hover over the image without clicking. Title text should be written as a call to action to prompt the reader to act. Do not focus on it for SEO purposes.
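Putting the three elements together, a healthier version of the earlier “bad” example could look like this (the file name, alt and title texts are illustrative):

<img src="/img/statue-of-liberty-new-york.jpg" alt="The Statue of Liberty seen from the Hudson River at sunset" title="Discover our New York photo tours">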

Image file size:

Images have a major impact on a webpage’s loading speed. For this reason, an image’s file size should be kept as small as possible.

You must avoid resizing your images through markup. Using a line of code to display an image at 300px width when the file itself is 1200px wide is a very bad idea. Try as much as possible to serve images at the right dimensions, even if you have to upload multiple files to your server.
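One way to do this in plain HTML is the srcset attribute, which lets the browser pick the best file for the display size (the file names and breakpoints below are illustrative):

<img src="/img/statue-of-liberty-800.jpg"
     srcset="/img/statue-of-liberty-400.jpg 400w,
             /img/statue-of-liberty-800.jpg 800w,
             /img/statue-of-liberty-1200.jpg 1200w"
     sizes="(max-width: 600px) 400px, 800px"
     alt="The Statue of Liberty seen from the Hudson River">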

You must also reduce your images’ file weight to the minimum possible. Using the “Save for Web” option in Photoshop is not enough. Fortunately, there is a fantastic website called Optimizilla that offers an image compression service for free.

It uses a smart combination of the best optimization and lossy compression algorithms to shrink JPEG and PNG images to the minimum possible size while keeping the required level of quality.

© Optimizilla.com – Online Image Compressor – by far one of the best tools available

One of the major advantages of this platform is that it offers a live visual preview of your image allowing you to find the exact settings that fit your needs.


5. Video: Make the most out of your iframes

The use of videos to market products or services is already widespread, and the trend keeps growing. Video content is a fantastic way to engage visitors, and a lot of companies want to use it to their advantage.

The biggest video hosting platform, and one of the easiest to use, is YouTube. Its huge user base, its streaming quality and its free-to-use model make it a good solution for a lot of businesses.

Uploading a video to YouTube and embedding it on a website is a quick and easy process, but it has one major inconvenience: while you can optimize your YouTube video to rank high in both Google SERPs and YouTube SERPs, it is hard to use the video to rank your own page higher. This is mainly because the YouTube embed code uses an iframe.

For quite some years, iframes created problems with crawling. Either the bots could not access the content inside an iframe, or they got stuck inside it. When they could not get out of the iframe and back to your page, they could not index the rest of your content. Kind of problematic, isn’t it?

It seems that today this is no longer an issue. Most bots can get in and out of iframes without any problems, but another problem persists: the content indexed inside an iframe is not considered content on your page. When you think about it, this is totally logical, since the content physically lives at another URL.

Until recently there was a noframes tag, but with the arrival of HTML5 it has been deprecated.

The noframes tag is a fallback tag for browsers that do not support frames. It can contain all the HTML elements that you can find inside the body element of a normal HTML page. (w3schools)

So how can you provide more info about an iframe without spamming? 

As with images, you can add a title to your iframe code, as well as a description between the opening and closing tags. The main impact is to increase your webpage’s accessibility, but this text is also picked up by the bots.

<iframe width="560" height="315" src="...URL..." title="...Iframe content title..." frameborder="0" allow="autoplay; encrypted-media" allowfullscreen>
  Description of your video content
</iframe>

I haven’t done enough testing on this subject to tell you whether you should input your whole video transcription or simply a more general description. As far as I’m concerned, I use a short description of the video content.

I’m planning to test this more seriously very soon, and I will update this part when I get tangible results. Meanwhile, be very careful with this feature: over-optimization is always around the corner.


6. The page URL: Keep it short, readable and meaningful

Google claims its search engine can understand any type of URL, even one containing /?id=12, but using a clean URL structure will help your users understand your website structure and navigate directly through the address bar.

When using clean URLs, it’s also good to know that you should keep your URL as short as possible (4 to 5 words). Including a keyword is also considered by many to give a little ranking boost.
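For example (the domain is a placeholder):

https://www.example.com/?p=12&cat=7          → valid, but tells the user nothing
https://www.example.com/on-page-seo-guide/   → short, readable and keyword-rich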

Do not use special characters and keep it lowercase

While Google seems to handle special characters correctly, it is not the only search engine on the web. Non-ASCII characters can be tricky for search engines to handle, and I have never seen any proof that they help rankings (I ran some tests with French accents and couldn’t notice any particular impact). For these reasons, you should avoid using them.

Another factor to take into account is the likelihood of a user sharing your URL. Using characters such as %20 in URLs is not user-friendly, and it decreases the probability that a user will actually copy and paste your URL. (source: Matt Cutts)

Some people capitalize the first letter of each word in their URL, I guess to grab attention, but I would say this is not recommended. The reason is simple: capital letters in URLs can create errors when someone types your URL. You could handle this server-side, but why make it complicated when you can keep it simple? You should consistently use lowercase as a standard.

Choosing between - and _ as a word separator

Some years ago there was a slight difference in how Google treated - and _ in URLs. It had to do with a programming language; Matt Cutts explains it in the following video.

This could raise a question about the use of _ for compound words, but Vincent Courson (@VincentCourson) confirmed to me on Twitter that nowadays - and _ make no difference, as they are globally treated in the same way. He also stated that you should use consistent URLs across a domain, and that compound words are no exception.

The use of canonical URLs

On modern websites it is very frequent to have content available via several URLs, and e-commerce websites are a great example of this. While a user easily understands that the following URLs are the same page, if a bot can access them, they will, by default, be considered separate pages:

  • https://example.com/catA/
  • https://www.example.com/catA/
  • https://www.example.com/catA/page-2 (pagination is debatable)
  • https://www.example.com/catA?sort=price-desc
  • https://www.example.com/catA/?sort=date-asc

A duplicate content penalty does not exist as such, but when a search engine is able to crawl the same content on several pages, this can impact your site’s rankings.

You can help search engines rank only one version of your pages by adding a canonical tag to each of them. This is achieved very easily by adding the following code in your header:

<link rel="canonical" href="https://www.example.com/catA/" />

Basically, you are telling the bots that they should index the canonical URL instead of the actual page URL. It is considered good practice to add a canonical URL to a page even when it is the same as the actual page URL (a self-referencing canonical).


7. Your website architecture: Organize your content

Using a correct website architecture helps Google digest your website as a whole, as well as helping your visitors find their way easily through your content. A good UX and efficient crawlability generally result in better rankings. But how should you organize your pages?

While you might think that website architecture simply consists of linking all your pages together, the fact is that when it comes to SEO, things are more complex than that. On the subject of website architecture, there are two main schools of thought: silos and semantic cocoons.

While both have similarities they also have differences. I’ll try my best to explain what they both are so you can choose the one that best fits your needs.

(The detailed write-up on silos and semantic cocoons is still under review and will be added here soon.)


8. The meta tags: Data for bots, but not only bots

Before we proceed, I would like to underline that both the title tag and the meta description are visible in the SERPs. They should describe your content, but if crafted properly they can be an easy win.

Some consider CTR a ranking factor; I disagree. While Rand Fishkin (@randfishkin) provided evidence that a high number of clicks on a search result could impact the ranking of a page, the effects are only temporary. The following Google Inc. paper – Learning to Rank with Selection Bias in Personal Search – clearly states:

Click-through data has proven to be a critical resource for improving search ranking quality. Though a large amount of click data can be easily collected by search engines, various biases make it difficult to fully leverage this type of data.

Previous research shows that in order to reliably leverage click-through data one has to account for multiple sources of bias including: position bias, presentation bias, and trust bias. Therefore, directly using click-through data may result in noisy and biased training data, which will negatively impact the downstream applications.

We discussed the infeasibility of using existing click models in personal search

That being said, you must still try to optimize your CTR as much as possible, simply because a higher percentage directly means a higher number of visitors on your page.

The title tag

This is the main page name, the one that shows up in your browser tab. It is very important to make it informative and unique, and to include your main keyword, for two reasons: it is the first thing users will see in the SERPs, and it has a big influence on SEO.

Generally speaking, when it comes to Google it is recommended to keep a title under 70 characters, but in reality the title length is measured in pixels, not in characters. Knowing this, it is easy to understand that 70 characters is not a golden rule.

How to make the most out of a title tag

There are two important things you need to consider when writing your title tags:

  1. Your main keyword must be present towards the beginning, because Google gives more importance to the first words.
  2. Your title must be attractive and encourage users to click on your result rather than your competitor’s.

By applying these two simple pieces of advice, your titles will help you rank and convert.
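A minimal sketch of the resulting tag (the wording and the brand name are invented):

<title>On-Page SEO Guide: 9 Factors to Optimize in 2018 | YourBrand</title>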

The meta description tag

Even if some minor search engines still use the meta description as a ranking factor, Google and the other major search engines don’t. Spamming your keywords in here is useless, but this doesn’t mean you can forget about meta descriptions.

This description shows up right under your title tag in the SERPs, so it is one of the first things users will see and it will influence your page’s CTR. You need to make it appealing.

If you don’t know what to write, take a look at the AdWords results for the query. When companies spend a lot of money on AdWords, they generally run A/B tests to find the best-converting phrases and maximize their ROI. You should take advantage of this free insight.
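The tag itself lives in your page header; the copy below is purely illustrative:

<meta name="description" content="Learn how to optimize every on-page factor: content, headings, images, URLs, meta tags and structured data. A practical 9-part checklist.">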

The meta keywords

As with meta descriptions, major search engines no longer use meta keywords as a ranking factor. You should avoid using them for a very simple reason: you would be providing your competitors with insight into the main keywords you are targeting. In a competitive world, you must reveal as little as possible about your plans and tactics.

The meta robots

This is a tag you should definitely take a look at. Located in the header, it tells the search engine two different things: should the bot follow the links on the page, and should the search engine index the page. By default, bots will index the page and follow the links.
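For example, to keep a page out of the index while still letting bots follow its links, you would place the following in the header:

<meta name="robots" content="noindex, follow">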

If some pages are not being indexed, you should definitely check this tag. To err is human, and leaving a stray noindex in place happens more often than you might think.


9. Structured Data: Provide even more info to the bots

Example: a SERP result displaying recipe microdata for “chocolate fondue”.

One of the great things about structured data is that Google can decide to display it (or not) directly in the SERPs. This provides a better search experience and generally increases the result’s click-through rate.

Depending on the industry or niche you target, structured data might already be widely used; this is the case for recipe websites. However, a lot of websites, especially local business websites, are not taking advantage of this very nice feature.

Microdata is an HTML specification used to nest metadata within existing content on web pages. Search engines, web crawlers, and browsers can extract and process Microdata from a web page and use it to provide a richer browsing experience for users. Search engines benefit greatly from direct access to this structured data because it allows them to understand the information on web pages and provide more relevant results to users. (Wikipedia)

Structured data markup is generally implemented either with HTML (Microdata) attributes or with a JSON-LD block; a sketch follows the list below. It provides additional information about the content of a web page.

Some frequently used data types:

  • Local business: name, address, phone, logo, area served, …
  • Product: brand, model, color, description, price, …
  • Recipe: cooking time, nutrition information, ingredients, …
  • Review: item reviewed, review text, review rating, …
  • Event: date, time, organizer, location, price, …
  • Person: name, gender, phone, email, image, awards …
  • For the full documentation: please visit https://schema.org/
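As an illustration, here is a minimal JSON-LD sketch for a local business (every business detail below is invented):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "telephone": "+1-212-555-0100",
  "url": "https://www.example.com/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Main Street",
    "addressLocality": "New York",
    "postalCode": "10001"
  }
}
</script>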

These tiny bits of code are 100% invisible on your pages but can make a very big difference in helping bots understand your content. Make sure you use them.


