To begin with (it never hurts to repeat it), SEO is an acronym that stands for Search Engine Optimization.

In concrete terms, it is a huge and heterogeneous set of strategies and activities aimed at making it easier for search engines to read the contents of our website, with the aim of improving its organic positioning. By “organic” we mean the SERP (Search Engine Results Page) results that are not paid for (the latter are immediately recognisable among the search results thanks to the “Ad” label, and typically occupy the first 3-4 positions on the page).

Intro

SEO: meaning and scope

The first result of SEO is, without a doubt, improved visibility for a site. But those who know the trade know well that visibility as an end in itself is not an indicator of the validity of the activities being carried out. The ultimate goal of SEO, as of all other digital activities, is to bring conversions.

For this reason, once it has generated qualified traffic to the site, SEO also takes care of all the “on-site” factors that can facilitate users’ browsing and lead them to the moment of conversion.

Bases

How search engines work
(and search engine positioning)

To make the search engine understand the content of our website, it’s essential that we understand how it thinks, and how it retrieves and processes information.

The way any search engine works can be simplified into three phases, which are repeated every time a search is performed.

Phases

The scanning phase (also known as the crawling phase)

This is the process by which the search engine uses software, also called a crawler or spider, to scan the web for new content. The crawler finds your website and literally reads everything it can about it, moving through the pages via the links inside them. Your task is to make the crawler’s job as easy as possible, with a series of measures that we will go through one by one.

The indexing phase

While the crawler is still scanning, the search engine archives the data that has been read in its database, and sorts it by relevance to the keywords being searched.

The positioning phase (also known as the ranking phase)

As we were saying, once analysed, the scan results are sorted and returned to the user within the SERP. This is what is known as ranking. It is influenced by over 200 factors, and based on these factors the SERP may vary from one search to another and even from one user to another!

It’s important to point out that Google in particular has stated that the weight of ranking factors varies from one semantic area to another: some factors weigh more in certain sectors than in others.

But that’s not all! Through machine learning processes, the search engine learns whether the results it has served are really useful and actually respond to the search intent, based on feedback from users. If they are not, the ranking factors are re-weighted and the ranking is updated.

What’s more, the exact structure of the algorithm (which is continually updated) is secret: all the rules and directions to follow are based on Google’s own statements, on observation of changes in the SERP (taking advantage of the time lag with which updates become effective first in the United States and then in Europe), as well as on insights, tests, etc.

For all these reasons, no one can offer you absolute certainty or a guarantee of positioning your website for one or more keywords.

The lingo

Algorithm changes and positioning on Google

The very concept of SEO has evolved a lot over the years. Of course, they’ll have told you too that

SEO is dead

But, just like cats, SEO has died and been resurrected at least nine times, and almost every resurrection coincides with an update that good old Google has made over the years to its indexing algorithm.

Google updates are continuous and very frequent, but some have become particularly famous for their impact on the SERP.


2003

It all began back in November 2003 with “Florida”, which marked the end of “keyword stuffing”, that is to say, the spasmodic repetition of a keyword on the page: in the text, in the code and in dozens of other fantastic hiding places (like using white text on a white background). This dirty job (which someone had to do) was carried forward by the “Austin” update the following year.

2005

2005 was the year of Local Maps, fundamental for returning results to users based on their geographical location.

2009

2009, another year to remember, saw the launch of Google Caffeine. Although it was not the update with the greatest (at least immediate) impact on the SERP, Caffeine remains one of Google’s most important updates, as it was created to speed up the work of crawlers and guarantee near real-time search results.

Also in 2009, more attention began to be paid to the brand element: sites that can boast a strong brand, as well as a good SEO strategy, are considered more credible and reliable.

2011

But the most famous updates in recent history are surely these two: Panda and Penguin. Panda, released in 2011, hit sites with duplicate, low-value content that was not relevant to users’ search intent. At the same time, Google published a guide to writing high-quality content. The following year, Penguin came down hard on sites that force links through the purchase or exchange of links, even of bad quality. Today, in fact, sites with a natural backlink profile are favoured.

2013

With the advent of Hummingbird (2013), Google aimed to analyse content on the web from a semantic point of view, interpreting users’ search intent in a way that overcame the ambiguities of language.

2014

In 2014, another important update showed us how Google is increasingly attentive to local searches. This is Pigeon, which affects not only the SERPs of local searches, but also global ones.

2015

April 21, 2015 is better known as Mobilegeddon day. The update, announced by Google as early as February, had the main effect of giving priority to mobile-friendly websites. The change did not affect desktop searches.

2018

We are slowly getting to today and to the Google Core Update released at the beginning of August 2018, mistakenly re-baptised the Medic Update due to the negative impact it had on medical, health and wellness sites.

In fact, this update concerns “Your Money, Your Life” sites, that is, all sites dealing with topics that can positively or negatively affect the health and happiness of users (including the medical sector, but also the financial one). Given the impact such sites may have on people’s lives, Google seems to have decided to assign particular importance to their authority and reliability.

2019

We end (for now) with the March 2019 Core Update, the third largest core update since Google began publicly confirming its interventions.

Last June, finally, Google announced another “Broad core algorithm update”.

As you may have guessed, Google’s work to improve the interpretation and proposal of content to users is constant.

The best way to avoid “sinking” after the next algorithm update? Behave well, follow the rules and pursue quality in every aspect, both for the search engine and for the user.

Everything that you’ve ever wanted to know

SEO Guide for Dummies: what you absolutely must know about SEO

Based on the best-known ranking factors to date, optimisation interventions can be divided into two macro-areas: “on-site” and “off-site”. However, any activity performed or improvement made to your site is in vain if not supported by scrupulous keyword research.

Keywords

Keyword research: finding the right SEO keywords

A careful keyword analysis is the pillar that supports the entire architecture of an SEO strategy. The search for the “best” keywords to describe our activities can be done using different tools, ranging from professional suites (like SEMrush, SEOZoom, Ahrefs, or even the Google Ads Keyword Planner) to the search engine’s own suggestions. (Side note: we’ve also written an article dedicated to the best SEO tools!) What’s more, we always advise performing a thorough benchmark analysis of your organic competitors (a word to the wise: these may not coincide exactly with your market competitors). It’s important to know the competitive situation in the SERP where you want to position yourself and to understand what others are doing, in order to do better.

Going back to keyword choice, it can often be difficult to juggle very general keywords with high search volumes (the so-called “vanity keywords”) and very specific queries. Granted that there is no universally correct strategy, we advise selecting a mix of keywords of various types, aiming for long-tail positioning, especially when your domain does not have a very high authority.

The “long tail” concept was introduced by Chris Anderson in 2006 to describe a market situation in which many low-demand products, taken together, can constitute a higher market share than a few highly demanded products.

Applying this vision to SEO, aiming at specific words that describe the product or service well means appearing in fewer searches but, at the same time, facing less competition and reaching more qualified traffic, since the long tail responds to well-defined search intents.

Finally, once you have identified the keywords you want to position yourself with, you need to create perfectly optimised pages to intercept and respond to users’ needs. The structure of the site and the distribution of contents within it must be carefully studied, to avoid a scenario in which more than one page competes for the same SERP, generating that annoying phenomenon better known as cannibalisation (as those nice proton-pack-clad gentlemen from New York taught us: “don’t cross the streams”).


On-site

SEO On-site Optimisation

“On-site” means all those “parts” that are optimised within a site. Here too we can identify four fundamental areas.

Keywords
Before optimising a site, always test that the keywords you want to position for are consistent with the search intent of the users looking for them. For this reason, it is essential to continuously study the SERP and the contents of the other sites that are already positioned.

Contents
“Content is King”: another quote that you’ve almost certainly heard, or even used yourself. But it’s true: no matter how many ranking factors exist, content remains Google’s first “pretext” for understanding what a page is all about.

Find out how to write an article from an SEO point of view

Crawlability
Of course, another fundamental factor is how the search engine is allowed to read certain content. This is why HTML, CSS and JS become fundamental allies … and too many people forget that. You can then add further elements, such as structured data within the HTML code, which allow the search engine to interpret content that would otherwise have no meaning, like numbers for example.

It is also important to provide the search engine with a sitemap, which is nothing more than a hierarchical index, in XML format, of all the pages of a website available for crawling. The sitemap should be sent to Google via the Search Console, and its location can also be indicated in the robots.txt file.

The robots.txt file is another element that indicates the quality of a site. It contains instructions to prevent all or some search engines from crawling some or all of the pages on the site. In general, however, Google does not recommend blocking a site’s resources, except in strictly necessary and very specific cases (for example, a section still under development).
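
To make this concrete, here is a minimal sketch of a robots.txt file and of the sitemap it points to. The domain, paths and date are hypothetical placeholders, to be adapted to your own site:

  # robots.txt (placed at the root of the domain, e.g. www.miodominio.it/robots.txt)
  User-agent: *
  # Keep compliant crawlers out of a section still under development
  Disallow: /dev/
  # Tell the engines where the sitemap lives
  Sitemap: https://www.miodominio.it/sitemap.xml

  <!-- sitemap.xml: a minimal index with a single page entry -->
  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.miodominio.it/categoria/sottocategoria/prodotto</loc>
      <lastmod>2019-06-01</lastmod>
    </url>
  </urlset>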

Internal linking
The way contents are related within the site is important, and is linked to the study of the site tree, that is, the internal structure of a site. The way pages link to each other lets you give more strength to one page than to another. A practical example: the homepage. The homepage is generally the “strongest” page of the site, because all the others link back to it. In turn, the homepage has the task of redistributing the organic authority that the site receives from incoming external links to the most important pages.


In practice, the optimisation of a webpage involves various factors. To begin with, we’ll list the ones you just can’t do without.

First: optimise the URLs (i.e. the addresses used to reach a page). The structure of a URL is fundamental for search engines, as it reflects the structure of the site itself and its subdivision into categories and subcategories (example: www.miodominio.it/categoria/sottocategoria/prodotto). It’s also essential that URLs contain the keywords, as we’ve said before, and that they “talk”: that is, they let even users immediately understand the page’s topic. So they should be static and without parameters.

Another very important factor is the title of the page. The page title is a piece of meta information displayed in the SERP (so what is it? Simply put, it is the blue title that characterises each result). The title must be representative and consistent, and it should have a maximum length of 55-60 characters, even if the true maximum length is measured in pixels (currently the maximum displayed is 600 pixels). The title, too, must abide by the keyword rule. Here the concept of “prominence” also matters: the closer the keywords are to the beginning, the better.
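
A minimal sketch of these two factors (the URLs and the title text are hypothetical):

  <!-- "Speaking", static URL:           www.miodominio.it/ricette/primi-piatti/carbonara -->
  <!-- URL with parameters (to avoid):   www.miodominio.it/index.php?id=382&cat=7 -->

  <head>
    <!-- Main keyword near the beginning ("prominence"), within roughly 55-60 characters -->
    <title>Carbonara: the original recipe step by step | MioDominio</title>
  </head>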

Meta description
The meta description is, together with the title, the element that does the most to capture people’s attention. Even if it does not directly affect positioning, the meta description is a fundamental ally in improving the click-through rate. This is why it is important that each page has an optimised and captivating description, containing a call to action and the keywords. In fact, when users perform a search, Google highlights the keywords present in the snippet in bold. Currently, the recommended length for the meta description is between 50 and 155 characters, spaces included (though obviously the most important thing is that it is effective).
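
Continuing the same hypothetical example, a captivating meta description with a keyword and a call to action might look like this:

  <head>
    <title>Carbonara: the original recipe step by step | MioDominio</title>
    <!-- Between 50 and 155 characters (spaces included), keyword plus call to action -->
    <meta name="description"
          content="Discover the original carbonara recipe: ingredients, doses and step-by-step instructions. Read the full guide!">
  </head>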

Headings
Headings or headers help search engines understand the content of the page and then classify it. The numbering sets the hierarchy of the contents: from H1, identifying the title of the page content, down to H6, with increasing numbers referring to deeper subheadings.

  • It is essential that an H1 is present on every page of the site, that its content is unique and that it possibly contains the keyword (or keywords) that best identify the page (without stuffing, which is counterproductive).
  • The H2, H3, etc. tags must be used in hierarchical order and must provide a clear description of the content they introduce, as in the sketch below.
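
A minimal sketch of a correctly ordered heading hierarchy (the headings themselves are hypothetical):

  <h1>Carbonara: the original recipe</h1>   <!-- one unique H1 containing the main keyword -->
  <h2>Ingredients</h2>
  <h2>Preparation</h2>
  <h3>Step 1: the guanciale</h3>            <!-- H3s nested under the H2 they detail -->
  <h3>Step 2: eggs and pecorino</h3>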

Paragraphs
Remember to use the <p> element to enclose the body text. The <p> element exists to delimit and separate paragraphs, and allows search engines to understand that the text inside it is relevant content for users’ searches. With the <strong> tag, on the other hand, you can highlight key concepts (not just keywords!) in bold within the paragraph, to aid reading. When you close a paragraph (</p>) and open a new one (<p>), you create two separate pieces of text on the page. Inside a <p>, to start a new line you can use the <br> element (break), which, unlike the other elements, has no closing tag.
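
Putting these elements together in a small sketch (the text is hypothetical):

  <p>Carbonara is a traditional Roman dish.<br>
  Its exact origins are debated.</p>   <!-- <br> starts a new line inside the same paragraph -->

  <p>The <strong>original recipe</strong> uses guanciale, eggs and pecorino.</p>   <!-- <strong> highlights a key concept in bold -->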

Images
Images help to give voice to the text of your site and positively contribute to its positioning. If you want to optimise an image for search engines remember to:

  • carefully choose the file name;
  • compress images and reduce file size to reduce loading times;
  • fill in the ALT text and the Title, using the keyword you want to position the page with (but without forcing it). Google understands the content of images thanks to the ALT text, an attribute created to describe images to blind users, or shown in place of an image when the browser fails to load it.
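
For example (the file name and texts are hypothetical), an image optimised along the lines of the list above would look like:

  <!-- Descriptive, compressed file; ALT and Title contain the keyword without forcing it -->
  <img src="/img/carbonara-original-recipe.jpg"
       alt="Plate of carbonara made with the original recipe"
       title="Carbonara: the original recipe"
       width="800" height="533">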

In addition to contributing to overall optimisation, images can also play an important role in conversions.

Does what we’ve said so far make sense to you? Good, but know that only a fraction of sites can claim to have optimised all these aspects. Since 2017, Instilla has been performing an annual study on the degree of digitalisation of startups in Italy. In 2018, the study found that only a third of the 9,705 startups registered in the Enterprise Register had a functioning website meeting the minimum requirements for good usability. Of these, only 3.23% also exceeded what we consider a basic level of SEO: just 97 out of 3,001 companies. As you can see, there are still many obstacles for companies along the path to a perfect online presence. You can learn more by consulting our 2018 SEO Startup Report (Italian).


Off-site

SEO Off-site Optimisation

Did you know that Google was the first to invent the concept of “off-site”? At one time, to classify the content of a page, search engines started from the number of times the searched keyword or query appeared on it.

In 1996, Larry Page and Sergey Brin developed PageRank, an algorithm (later patented by Stanford University) that considers all the hypertext links on the web in order to quantify the relative importance of a page within it, taking up a system similar to the one used to evaluate the quality of academic papers. Just as a study cited by various articles and subsequent studies is considered authoritative in a given sector, in the same way a page that receives many incoming links will in turn be considered authoritative, and therefore favoured among the search results.

An estimate of Google’s perception of a site’s “authority” can be provided by some third-party tools, such as Moz and Majestic. Both calculate the so-called “authority” of a domain (or of a specific directory) and analyse the site’s backlink profile. Specifically, the backlink profile highlights the situation of incoming links, using the Trust Flow as an index. Trust Flow is a percentage metric that estimates the reliability of a site based on that of the sites it receives backlinks from, and therefore on the quality of those links. The Citation Flow is also important: it is a metric that reflects instead the quantity of incoming links.
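
For reference, in the simplified form given in Page and Brin’s original paper, the PageRank of a page $A$ is:

  $$PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \dots + \frac{PR(T_n)}{C(T_n)} \right)$$

where $T_1, \dots, T_n$ are the pages linking to $A$, $C(T_i)$ is the number of outbound links on page $T_i$, and $d$ is a damping factor, typically set to 0.85. In plain words: each page passes a share of its own authority to the pages it links to, divided among all of its outgoing links.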

Black Hat and White Hat SEO

The effort to place a site among the top positions of the best SERPs is an ongoing commitment, as you are always looking for new strategies to beat the competition. And as with any kind of competition, there are numerous ways to “prepare”. In SEO too, there is a great variety of techniques that contribute to reaching the best positions, but not all of them are correct and conform to the standards dictated by search engines. Hence the eternal contrast between the Jedi and the Sith of SEO: White Hat and Black Hat techniques.

White Hat SEO is the most ethical and secure method of climbing and maintaining SERP rankings, but reaching high positions with these techniques takes time.

These include:

  • creating quality content designed for users;
  • focusing on content marketing;
  • thinking of mobile browsing;
  • giving priority to the user experience (UX);
  • carrying out excellent keyword research;
  • using structured data markup;
  • optimising the meta tags;
  • optimising internal links;
  • for local businesses, creating a Google My Business listing;
  • undertaking link building that involves only real websites, with quality links that are actually relevant to your site and your users;
  • etc.

In other words, White Hat SEO focuses first on the quality of links and content, rather than on quantity. Conversely, Black Hat SEO can quickly increase a site’s visibility, but with short-lived results and high risks. These techniques do not follow search engines’ ranking guidelines and, if identified, can get your site de-indexed.

These include:

  • massive comments on blogs and forums;
  • content automation;
  • doorway pages, i.e. pages that automatically redirect users to another page (this trick gets the search engine to index one page, often well positioned, while showing another one to users);
  • cloaking (another technique in which the content presented to the search engine spider is different from that presented to the user’s browser);
  • hidden text or links;
  • keyword stuffing;
  • link farms, link wheels and link networks, i.e. different ways of creating a set of web pages for the sole purpose of linking to a landing page, in an attempt to improve its positioning in search engines;
  • rich snippet markup spam;
  • creation of pages, subdomains or domains with duplicate contents;
  • etc.

Some content is particularly suitable for receiving external links or being reshared.

When creating new content, we need to think about what will solve our users’ problems. Here too, keyword research is the starting point for understanding how users use the web, what they are looking for, and the language they use.

Link earning refers to natural link acquisition: creating an exhaustive technical guide (like this one 😉) that meets the needs of a niche or community; creating an eye-catching infographic that deserves to be reshared and that illustrates, for example, how to solve a problem; writing a step-by-step guide, scientific research or a report; creating a viral video; etc. Originality and relevance are the two features you can’t do without in content designed to earn links.

Link building instead refers to all the activities aimed at acquiring links that do not occur spontaneously. It can be done in different ways: through article marketing sites, guest posts (free or paid), directories, private blog networks (PBN), links from comments … good link building essentially means knowing how to find the right mix for a site’s backlink profile.

SEO and other silly things

SEO and co.:
SEO in a multichannel strategy


SEO and SEM: what’s the difference?

When it comes to promoting websites on search engines, another discipline is often mentioned alongside SEO: Search Engine Marketing, or SEM. SEO and SEM are often confused and used interchangeably, or set in opposition. SEM is the set of all web marketing activities performed to increase the visibility and traceability of a website in search engines, generating qualified traffic. These activities range from benchmarking strategies to monitoring, up to evaluating the returns of both individual actions and the overall strategy, with specific analytical tools.

So, SEM is an umbrella that contains SEO, SEA (Search Engine Advertising) and SMO (Social Media Optimisation), but it also includes direct sales, customer care, and the creation and analysis of collected databases.

It would be nearly impossible to disregard one of these factors and still achieve an efficient web marketing campaign, since these are techniques that cannot work independently, but must be part of a single overall strategy.

SEO VS SEA

SEA stands for Search Engine Advertising, and is the branch of SEM that deals with increasing the visibility of a site through paid advertisements on search engines or partner sites. The most used platform for SEA is without a doubt Google Ads (formerly Google AdWords). You may have heard it called PPC (an acronym for “pay per click”) in layman’s terms, but this is only one of the various ways of paying for an ad, which is normally allocated through an auction.

In fact, you can’t think of creating a campaign with Google Ads without having optimised your site or created a fast, SEO-friendly landing page.

In the same way, you couldn’t hope to create a successful blog without caring about sharing on your social network profiles.

SMO: SEO and Social Media

According to Kaplan and Haenlein, social media is “a group of web applications based on the ideological and technological assumptions of Web 2.0, which allow for the creation and exchange of user-generated content”.

As many as 13 categories of social media have been identified. One of these is social networks, but alongside them there are also blogs and microblogs (such as Twitter), forums, professional networks (such as LinkedIn), virtual game and virtual reality worlds, video sharing (YouTube!), and others. With Social Media Optimisation, or SMO, we’re talking about the strategic process of generating original online content aimed at motivating an audience to interact with a particular website, brand or product (source). This content comes in the most disparate formats and basically serves to generate interest and be shared.

So, its importance for SEO is obvious: creating quality content and promoting it on as many channels as possible is the best way to increase the probability of obtaining spontaneous links.

To encourage spontaneous diffusion, it becomes fundamental to aim for content that is not only interesting but also viral: that is, content that stimulates sharing, expanding the network and thus reaching new people.

From here, moving public interaction from a social media page to your own site, with a resulting increase in visits, contributes both to improving positioning in the medium to long term and to increasing your online reputation (Online Reputation Management).

We’ve seen how, in recent years, Google has increasingly given weight to online reputation in determining positioning with its updates, making this a factor that every business should keep in mind.

In conclusion, the relationship between SEO and social media is much more complex than you might have believed, not to mention that each channel has its own optimisation “secrets”, which is why we’ve investigated it further in this article on Social Media Optimisation.

SEO for E-commerce

Working with SEO in mind even from the early stages of designing an eCommerce site is essential for achieving a good Conversion Rate.

The basic SEO strategy for an eCommerce site is not so different from that for any other site. First, we need excellent keyword research and an accurate analysis of our competitors, in order to then move on to meticulous optimisation of all on-site factors.

For an eCommerce site, loading times, image quality and user experience will be particularly important. In fact, eliminating any obstacle that may come between the user and the moment of conversion is crucial. It’s also important to give a lot of attention to internal links, facilitating navigation with a linear site structure, organised by categories and sub-categories.

Finally, good positioning of an eCommerce site depends a lot on the ability to write good product details from an SEO perspective:

  • write unique and original product descriptions;
  • use keywords related to the product name, its variants and synonyms;
  • use heading tags judiciously;
  • implement the appropriate structured data markup (schema.org);
  • make sure each page responds to one query: an eCommerce site is particularly exposed to the risk of cannibalisation and duplicate content, and knowing how to correctly implement the rel="canonical" attribute can avoid many problems (see the sketch below).
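
A minimal sketch of these last two points for a hypothetical product page (all names, URLs and values are placeholders):

  <head>
    <!-- rel="canonical" tells the engine which version of the page is the preferred one -->
    <link rel="canonical" href="https://www.miodominio.it/shoes/running/model-x">

    <!-- schema.org structured data for the product, in JSON-LD format -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Model X running shoes",
      "description": "Lightweight running shoes for long distances.",
      "offers": {
        "@type": "Offer",
        "priceCurrency": "EUR",
        "price": "89.90",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>
  </head>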

Local SEO

SEO for local businesses (also known as Local SEO) covers all the activities that improve the visibility of a web project at the local level, that is, within the specific geographical area in which the company operates.

To this end, we work on local search keys which, despite having lower search volumes than generic ones, are more specific and much more focused on the target of interest (think of positioning your restaurant’s website: it would be simpler to rank for “seafood restaurant Milan” than for a generic “restaurant”).

Moreover, by creating a Google My Business listing, you can verify your business on the search engine and take control of the information provided. In particular:

  • the business name;
  • the category of services the business falls into;
  • the location (or locations) of the business;
  • opening and closing times, including special hours;
  • photos of the business;
  • the official website;
  • the telephone number of the business.

Owning a GMB listing brings great advantages, including:

  • greater visibility in the SERP for targeted searches;
  • the possibility of being found on Google Maps;
  • the ability to receive user reviews and show your rating;
  • the possibility of interacting directly with users, sharing posts and answering their questions and reviews;
  • being easily contactable thanks to the “click to call” function (fundamental on mobile).

Instilla was born with a particular vocation for Local SEO. Check out Primi sulle Mappe, our local SEO service. 

SEO and Mobile

Today, with mobile searches having surpassed desktop ones, optimising a website for mobile devices has become a requirement for visibility on search engines: either by providing a compatible version or by using a responsive layout that adapts to the type of device the user is browsing from.

In fact, since March 2018, Google has finally begun implementing the SERP revolution it had announced, ranking search results based on their mobile version. With the introduction of the Mobile First Index, Google started using a mobile user agent to crawl websites.

Basic requirements of a mobile friendly site include:

  • quick page loading times, even from smartphones;
  • resources usable on all devices, with a satisfying UX;
  • content equally visible in all versions.
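
The starting point of any responsive layout is the viewport meta tag; a one-line sketch:

  <!-- Scale the page to the device width instead of emulating a desktop screen -->
  <meta name="viewport" content="width=device-width, initial-scale=1">

From there, CSS media queries adapt the layout to each screen size.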

Again, Google provides a test tool to find out if a site is mobile friendly: https://search.google.com/test/mobile-friendly

Want to learn more? We covered the topic of SEO for mobile devices in this article.

SEO and Voice searches

Voice search is certainly one of the “novelties” changing the way users approach search.

Launched by Google in 2009, Voice Search (the voice recognition technology that lets users search simply by using their voice) experienced a boom with the spread of voice assistants (such as Siri, Cortana or Alexa) and home automation.

It is very difficult to say exactly how many queries are made on search engines by voice today. However, the general trend in recent years shows that voice searches cover an increasingly large percentage of total searches, and this definitely influences the approach to SEO strategies.

But what are the differences between voice searches and traditional ones? And how do they actually impact SEO optimisation?


Voice Search is based on two main aspects:

  • the ability to recognise human language;
  • the ability to understand and interpret the specific meaning of a request.

As previously mentioned, with the Hummingbird algorithm Google had already improved its ability to recognise search intent, and it is slowly refining a very powerful semantic engine.

Clearly, searches made through voice are less mechanical than typed ones. We move from keyword searches to real long-tail queries, close to spoken language. Beyond the form, the context and purpose of a search also change.

This affects the way content is conceived, articulated and linked together, as well as the tone of voice used, and gives us insight into how to optimise the user experience.

Voice Search also goes hand in hand with contingent needs, which often translate into searches at the local level. This is why Local SEO is once again a precious ally in facing this new evolution.

SEO Copywriting

SEO Copywriting is a writing technique that combines the principles of search engine optimisation with those of good writing, to create content that can position itself but also satisfy and actually interest users.

The SEO copywriter must therefore combine solid writing skills with a thorough knowledge of on-site optimisation. In addition, he or she must be able to study the evolution of user queries, using the tools available.

Returning to some of the concepts we’ve already explained, let’s summarise which features characterise a good SEO copywriting job:

  • formally correct writing and quality content;
  • naturally inserted keywords;
  • not limiting yourself to the use of keywords, but also using synonyms, plurals, related terms, etc.;
  • use of bold, underlined and italics to highlight important concepts;
  • attention to meta tags (title and meta description in the first place);
  • use of keywords in the URL;
  • use of bullet points;
  • use of internal links and useful external links incorporated in the text (striking the right balance with optimised anchor texts).

There are various SEO writing techniques: from the so-called pillar article (long texts of 2,000 words or more that cover a given topic vertically and exhaustively) to product pages, or pages with short text enriched with videos or infographics.

What must remain constant across these jobs is the careful study of queries, in order to intercept the user’s interest.

ASO, optimisation for App Store searches

App Store Optimisation (ASO) is the process of optimising mobile apps in order to rank them higher in an app store’s search results and make them visible to potential customers. In this case too, the final goal is conversion, that is, getting users to download your app. Searches within the stores are still the largest discovery channel available for your app and, with over 3 million mobile apps in the major stores, being found becomes a crucial problem.

ASO strategies are divided into two phases, “pre-launch” and “post-launch”, so as not to waste the work done. The basic parameters taken into consideration for optimisation are:

  • Application name
  • Subtitle and promotional text
  • Description
  • Visuals: logo, screenshots, images
  • Category (the choice of category, as we saw with the GMB listing, must be consistent with what is offered, and can be crucial in determining an increase or a collapse in traffic towards your app. If you aren’t sure which to choose, search for apps similar to yours and choose accordingly).

If you’ve been paying attention in the previous sections, by now you’ll have realised that, in this case too, it is crucial to start with a thorough keyword analysis and an in-depth study of the target.

Then there are secondary factors that depend on user interactions and that make the difference:

  • the number of downloads
  • evaluations and reviews

ASO, like any other optimisation, takes time to implement correctly, as well as to show results. But for those who take the time to understand, iterate and test, the results can be incredible!

Were you able to make it this far without getting lost?

Great! By now, you will surely have a more complete view of SEO’s complexity. But why stop there?

If your goal is to get into the discipline on a practical level and study every aspect of SEO, from its history to technical SEO, from the semantic web to HTML and JavaScript, what you need is an advanced SEO course. Lucky for you, you’re in the right place at the right time.

Find out more about the Instilla SEO Masterclass in collaboration with Lacerba (updated to 2019).

Want to find out more?