Coevolution of sites and search engines

or the degradation of content quality by SEO

SEO or not SEO, that is the dilemma


Just as a living organism tries to survive in a hostile environment by outcompeting other organisms, so a website tries to position itself as high as possible in a search engine's results, because its food is the visits it receives at the expense of competing sites.

Google, now the search engine par excellence, building on its unchallenged dominance, decrees which sites live and which die under the aegis of its content-indexing algorithm.

Google is therefore the environment that websites, or rather their human creators, must confront, and this is why legions of them dedicate themselves to optimizing websites for search engines, the art now known to most by its English acronym, SEO.

While Google's algorithm evolves over time, trying to reward quality sites, on the other side websites evolve hand in hand to convince that same algorithm that they are of high quality and absolutely relevant to the surfer's search query.

Google's evolution is still far from assessing the quality of a site as a human being would, and this means that all the art of SEO, instead of producing sites of quality for a human reader, pushes the evolution of the content and structure of web pages so that, to the eyes of an indexing algorithm, they appear as relevant as possible to a search query, at the expense of everything else.

To SEO, or not to SEO, that is the question:
whether 'tis nobler to delight the net with skillful prose
or to please its censor with redundancies and banal phrasing?
The last places among the unknown, or the top of the results?

And so every author of content runs into that Shakespearean dilemma. Write with their own literary flair, or bend to Google, censor of the internet, and write simple, banal texts, with the likely keywords repeated ad nauseam and artfully placed in the text, suppressing the natural instinct to reach for synonyms?

To be the author of a site that excels in the quality of its texts and the organization of its content, but sinks into oblivion, or the author of a crafty site that makes the Google Analytics graph explode?

In an ideal world, with a Google perfectly able to assess the quality and relevance of a site, the art of SEO would not exist. In the real world it does exist, and the evolution of the species has produced sites where each page is designed to satisfy a search query, with the aberrations that follow.

Human beings now address search engines and the voice assistants of their smartphones in natural language, and therefore ask questions.

How can I rank first in the search engines?

How can you earn money on the Internet?

How do I lose weight?

How many eggs can I eat a week?

Who won the US elections?

For each such question you now find on the net a web page titled with the human's question, and Google, finding in that page an obvious relevance, puts it at the top of its results. This leads the authors of a site to avoid treating a topic organically, breaking it up instead into a myriad of pages, so that each one answers a particular question. Each page also has to be rich in text to be looked upon favorably by Google, but you cannot write a treatise on every topic, so the same content ends up duplicated and blended across that myriad of pages.

In fact, a mere glance at the information sites (fitness, economics, and gardening, to give just a few examples) in the first places of Google's results shows that they do not have behind them an editorial staff organizing the site's various themes organically and removing obsolete or redundant content. The de facto motto is: «The more, the better!» So they pay people to write content in abundance because, another fact, Google rewards sites that are constantly updated. The same subject, even one that could be treated in a single small page, is repeated in many variations by a thousand different authors. If we consider that many sites copy one another, the result in terms of content quality is devastating. In the end, the first-place page for a search query may treat its topic badly, while a page of high quality languishes with a bad placement in the results pages (SERP).

While in the software world programs have to be designed to fit user behavior ever better, in the world of search engines it is humans who have to adapt to an indexing algorithm, and they do so with fervor, since the engine of it all is not Google but the revenue that a popular site can generate.

Ultimately, yes, we optimize our websites, but not too much...

Topics: SEO