The History of SEO: How Has Online Search Evolved?

Tracing the history of SEO is a bit like trying to retrace the history of the handshake. We all know that SEO exists and that it is an essential part of our professional activity, but we rarely look into its origins, preferring to concentrate on its day-to-day use.

However, SEO is a relatively new concept, and one that evolves frequently. It seems to belong to Generation Y, its origins generally being placed around the year 1991.

Moreover, during its relatively short existence, it has developed and evolved at a steady pace. To be convinced of this, one need only consider the number of modifications Google has made to its algorithm.

So where does the history of SEO begin, and how did this practice become so important? Join us on a journey through time to answer these questions and discover a story that is worth its weight in gold.

The appearance and evolution of search engines


In 1945, Vannevar Bush, director of the now-defunct Office of Scientific Research and Development, published an article in The Atlantic in which he describes a system for collecting, retrieving and inserting data so that it can be shared. The description of this system corresponds to what Google has become today.

A few decades later, in 1990, Alan Emtage, a student at McGill University, develops Archie, which some see as the very first search engine. Even if the point is debated and the tool remains primitive, it represents at the time the “best way to access information on other Internet servers”.

The following decade sees various crucial developments, with the advent of search engines as we know them today:

  1. February 1993: Six Stanford students create Architext, which will eventually become the search engine Excite. Some experts, such as Search Engine Land, argue that Excite “revolutionizes the way information is cataloged”, facilitating searches by “ranking results based on the analysis of keywords found in the crawled content”.
  2. June 1993: Matthew Gray launches World Wide Web Wanderer, renamed Wandex a few weeks later.
  3. October 1993: Martijn Koster presents ALIWEB, a search engine allowing webmasters to submit their own web pages. Unfortunately, it will remain largely unknown.
  4. December 1993: There are now at least three search engines powered by crawlers (JumpStation, RBSE spider and the World Wide Web Worm), which analyze both servers and content published online to provide more relevant results.
  5. 1994: AltaVista, Infoseek, Lycos and Yahoo all bring their own search engines to market.
  6. 1996: Google co-founders Larry Page and Sergey Brin begin developing a search engine they initially call BackRub.
  7. April 1997: AskJeeves (which will later become Ask.com) appears on the market.
  8. September 1997: Google.com is officially registered as a domain name.

Finally, nearly 12 years later, in June 2009, Microsoft launches Bing, the successor to Live Search, Windows Live Search and MSN Search.

That is when SEO comes into play. As the use of search engines becomes popular, webmasters begin to realize their potential.

As members of the Moz SEO community point out, “upon discovering that a few simple actions could manipulate search engine results, website owners realized that the Internet could be a significant source of income”.

However, at the time, search results are not of good quality, and this is precisely why SEO will take on so much importance.

The evolution of research and SEO

SEO in the 1990s

In the 1990s, search engines gain popularity and the number of households with an internet connection grows. Access to information therefore becomes easier. However, the low quality of this information poses a real problem.

Search engines were generally content to match the terms chosen by users against the proposed results. A growing number of webmasters therefore resorted to keyword stuffing (the constant repetition of the same keyword in a text) to improve the ranking of their web page for that keyword.

Remember, though, that keyword stuffing is a black-hat SEO technique that is penalized under Google’s guidelines, because it makes a text difficult to read and understand.

Thanks to this black-hat technique, webmasters could generate more traffic and earn more profit from the paid advertisements displayed on their websites.
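To illustrate the idea, a keyword-density check of the kind modern SEO tools use to flag stuffing can be sketched in a few lines of Python. This is only an illustrative sketch: the 8% threshold and the function names are hypothetical, not values any search engine has published.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the share of words in `text` equal to `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Hypothetical threshold: flag pages where one word exceeds ~8% of all words.
STUFFING_THRESHOLD = 0.08

page = "cheap flights cheap flights book cheap flights today cheap flights"
density = keyword_density(page, "cheap")
print(f"{density:.0%}")                 # → 40%
print(density > STUFFING_THRESHOLD)     # → True: this page looks stuffed
```

A 1990s engine that simply matched query terms would have rewarded exactly this kind of repetition, which is why the practice spread so quickly.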

In addition to stuffing keywords, webmasters used inbound links to inflate their authority.

At the time there were no real ranking criteria, and by the time the algorithms were adjusted to remedy these problems, other black-hat practices, not covered by those adjustments, had already appeared.

When Larry Page and Sergey Brin embark on the development of Google, this is one of the problems they want to solve. In 1998, they publish an article in which they state that “advertising is the dominant economic model for search engines”, but they recognize that “the objectives of this business model are not always to provide quality search results to users”.

It is in this same article that the two students mention PageRank for the first time: the technology Google uses to rank search results based not only on keywords, but also on the quality of websites.

Some even believe that it is this technology that has paved the way for SEO as we know it today.
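The core idea behind PageRank can be illustrated with a minimal power-iteration sketch in Python. This is a didactic toy, not Google’s production algorithm; the 0.85 damping factor is the value quoted in the original 1998 paper, and the tiny three-page graph is invented for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal power-iteration PageRank over a dict {page: [linked pages]}.

    Each page's rank is split among the pages it links to; a damping
    factor models a surfer who occasionally jumps to a random page.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                share = rank[page] / len(outgoing)
                for target in outgoing:
                    new[target] += damping * share
            else:                                # dangling page: spread rank evenly
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
# "c" collects links from both "a" and "b", so it ends up with the highest rank.
```

The key departure from 1990s engines is visible here: a page’s score depends on who links to it, not on how often a keyword is repeated on it.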

SEO in the early 2000s

The early 2000s also mark the beginning of Google’s hegemony. As the company develops a less advertising-oriented technology, it also begins to publish guidelines on “white hat” practices (those considered to be in line with Google’s rules) to help webmasters optimize their sites without resorting to the questionable methods that were widespread in the 1990s.

From 2000 to 2002: the desire for quality content

However, Google’s guidelines do not yet have any real impact on website SEO, and few webmasters bother with them. This is partly due to the way PageRank works: it relies on the number of inbound links to a particular web page (the more it has, the better the page ranks), yet at the time there is no way to measure the authenticity of these links.

In the early 2000s, inbound-linking techniques still allow pages to rank even when they match no search criteria.

Nevertheless, during a 2001 television interview in which journalist Charlie Rose asks Larry Page and Sergey Brin the reasons for their success, the latter insists that at the time Google is simply “a search engine” that considers “the web as a whole, and does not limit itself to determining which terms appear on each page”.

This interview paves the way for the first major updates to search engine algorithms, which then begin to analyze these terms with more attention.

From 2003 to 2004: the first SEO penalties

It is in November 2003, with the Florida update, that this approach to the web, no longer limited to the simple analysis of terms, actually takes shape.

The number of sites whose SEO suffers from this update is very high, but it is important to note that it also benefits many sites. This is the first time that websites are penalized for using techniques such as keyword stuffing, signaling Google’s intention to put the user’s interests first, mainly by offering them quality content.

In addition, in 2004, one of the earliest versions of Google’s voice search appears. The New York Times nevertheless describes the experience as inconclusive: users have to dial in to Google’s phone platform before clicking on a link, the handset in one hand and the mouse in the other.

This invention nonetheless heralds the future importance of SEO on mobile devices (a point we will come back to later).

2005: A rich year for SEO

2005 is a pivotal year in the history of search engines. In January, Google, Yahoo and MSN team up to develop the nofollow attribute in order to, among other things, reduce the amount of spammy links and comments posted on websites, especially blogs.
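In practice, the attribute is just a `rel="nofollow"` annotation on a link, telling search engines not to pass authority through it. A blog platform wanting to apply it to user-submitted comments might do something like this sketch; the regex approach and the function name are illustrative assumptions only, and real software should use a proper HTML parser rather than regular expressions.

```python
import re

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every <a> tag that has no rel attribute yet."""
    def _fix(match: re.Match) -> str:
        tag = match.group(0)
        if 'rel=' in tag:
            return tag  # leave existing rel attributes untouched
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\b[^>]*>', _fix, html)

comment = '<p>Great post! <a href="http://spam.example">click here</a></p>'
print(add_nofollow(comment))
# → <p>Great post! <a href="http://spam.example" rel="nofollow">click here</a></p>
```

With links marked this way, spamming blog comments no longer improves a site’s PageRank, which is exactly the incentive the 2005 initiative aimed to remove.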

Then, in June, Google launches personalized search, which takes into account a user’s search and browsing history to offer more relevant results.

In November comes the launch of Google Analytics, still used today to measure traffic and the return on investment of marketing campaigns.

2009: restructuring of SEO

The year 2009 reflects a desire for restructuring in the world of search engines.

Bing is born in June, and Microsoft launches an aggressive marketing campaign to position it as the search engine producing significantly better results than Google. However, Bing comes nowhere near toppling Google, and its advice on content optimization does not really differ from that of its competitor.

According to the Search Engine Journal, the fact that Bing tends to give more weight to keywords in URLs, as well as to capitalized terms and pages from high-authority websites, is the only notable difference between the two search engines.

In August, Google previews Caffeine, the new update to its algorithm, and calls on users to help test this new infrastructure. Moz explains that it is “designed to accelerate crawling, expand the index, and integrate indexing and ranking in near real time.”

It will take almost a year for Caffeine to be fully rolled out, as the algorithm must also help the search engine gain execution speed.

In December, however, a real-time search tool allows Google to include tweets and breaking news in its results. This launch confirms that SEO is no longer exclusively the business of webmasters: journalists, web editors and community managers must now also optimize their content for search engines.

SEO since 2010

When you search on Google, it is sometimes fun to watch the suggestions appear as you type. They are the result of Google Instant, a technology introduced in September 2010. Initially, Moz reports that it “reduces SEO to ashes”, before realizing that it has no real impact on rankings.

Google Instant, just like the other phases of SEO’s evolution since 2010, represents yet another step in Google’s main mission: placing the user at the center of everything.

2010 also marks the advent of social content’s importance for SEO. In December, Google and Bing add social signals to their arsenal. These initially allow, for example, posts published by a Facebook user’s network to be displayed when they match a search that user performs.

Google also begins to rank Twitter profiles that have a large number of inbound links.

2011: SEO and the Year of the Panda

By 2011, the policy of penalizing sites that seek to manipulate Google’s algorithm is no longer in its infancy.

Some incidents are more notorious than others, such as the one involving Overstock in 2011. At the time, the Wall Street Journal reports that domains ending in .edu generally carry greater authority with Google, which Overstock uses to its advantage. The site targets keywords such as “vacuum cleaners” or “bunk beds” and asks educational institutions to provide inbound links in exchange for discounts offered to students and faculty. These links allow Overstock to improve its ranking for searches including the targeted keywords, until the company stops the practice, which Google penalizes shortly afterwards.

It is also in February of that year that Google launches Panda, an algorithm update aimed at demoting content farms: sites offering large quantities of poor-quality content, frequently updated and produced with the sole aim of capturing more search traffic. Such sites tend to have a high advertising-to-content ratio, which Panda was designed to detect.

The Panda algorithm has been updated many times: 28 updates by July 2015, although the impact of most of them remains difficult to measure.

2012: The Year of the Penguin

In April 2012, Google takes a new step in its process of rewarding quality sites and launches the first of many Penguin updates.

Penguin targets sites that show particular ingenuity in their use of black-hat SEO practices, such as those whose supposedly informative content is riddled with hyperlinks unrelated to the title of the page.

It is also interesting to note that 2012 sees a return to Google’s original anti-advertising stance with the “Above The Fold” update, whose primary objective is to demote sites with too many advertising spaces above the fold, in the top half of their pages.

Google will no longer content itself with targeting spammy content. Previewed in June 2013 and officially rolled out the following May, its Payday Loan algorithm update focuses on queries especially likely to return spammy results.

Google thus adjusts its ranking system to eliminate spam from its results, and while this adjustment does not necessarily impact SEO work on legitimate sites, it shows how seriously Google takes the quality of its search results.

Google and local search

In the tradition of its animal-themed algorithm updates, Google launches the Pigeon algorithm, which has a significant impact on local search results.

The algorithm seems designed to improve searches on Maps, which then begin to be processed with the same technologies as other search functions, such as the Knowledge Graph, spelling correction and synonyms. Local search is becoming unavoidable at the time, and has never ceased to be.

SEO in 2015: the user experience

The most important SEO announcement since 2010 is undoubtedly Google’s mobile update of April 2015. By downgrading sites that lack a user-friendly, functional mobile version, it marks the end of an era in which SEO focused exclusively on keywords and content, which must now also take the concept of responsive design into account.

Google announces this update in February and offers a mobile-friendliness test that allows webmasters to identify and correct potential problems before its official launch.

SEO in 2016: Validation of some updates

2016 is the year in which several Google updates are validated and permanently integrated into Google’s core algorithm.

In January 2016, Panda is integrated into Google’s main algorithm; on September 23, 2016, Penguin follows and begins running in real time.

In August 2016, Google tackles mobile pop-ups and announces penalties for their overly intrusive use.

Finally, in November 2016, Google announces the rollout of its mobile-first index, which, from 2017, will index websites according to their mobile version.

What is the future of SEO?


As you can imagine, search engines will not stop there, and the future holds even more SEO changes in store.

SEO: The Importance of Mobile

The use of mobile devices is rising sharply. According to one study, 51% of digital media is consumed on mobile, compared with 42% on desktop. It therefore makes sense for SEO to continue exploring this sector.

This is already apparent in Google’s drive to shape a user-friendly mobile experience. We can also expect voice search to become one of SEO’s next spearheads.

The use of this method, which has a history of its own, is increasing: it now represents 20% of searches on Google and 25% of those performed on Bing. It is likely to develop further with the rise of voice-activated personal assistants such as Amazon’s Alexa.

Although there is as yet no clearly defined process for voice search optimization, partly due to a lack of data, this area should develop rapidly and become another essential pillar of SEO.

Proximity search

Proximity search raises the question of optimizing search results at the local and regional level.

This is particularly true for voice search: Yelp and other aggregators already answer many voice queries about points of interest in the user’s immediate area.

This SEO opportunity for local businesses nevertheless requires them to ensure that their listing is complete, accurate and optimized for ranking on third-party sites.

SEO and social networks

While Google’s introduction of real-time search in 2009 already had some social ramifications, social networks are today an essential element of any SEO strategy.

When Google begins, for example, to index tweets in 2011, the move suggests a future in which searching for information on social networks will happen under the same conditions as traditional searches. This indexing may represent the future of Google: a time when the use of search engines will be totally different from what we know today.

If you type a celebrity’s name into Google, say Barack Obama, the first page of results will include his Facebook and Twitter profiles. Indeed, these are among the first things a user looks for when performing this kind of search.

Even with all the changes of recent years, search engines will not stop there, and the history of SEO will continue to be written. That is why a high level of competence, a strong ethical sense and a good knowledge of the latest technological trends are necessary.

But we know it is not always possible to dedicate someone to this task, which is why we continue to create the best possible informative content on SEO.
