Google updates its algorithm around 500 times a year, which works out to roughly one change every 17.5 hours.
Sometimes they are small adjustments that go unnoticed. Other times they are sweeping changes capable of turning the results upside down.
Either way, every update has the potential to affect your site in some way.
How do you prepare?
First, by following good SEO practices. That is the best way to keep Google from putting you in its crosshairs.
And second, by staying informed. If you know at all times what the search engine is doing, you can act immediately and correct any problem as soon as possible.
Here you will find all the Google changes you need to know about.
This set of updates is intended to favor mobile-optimized sites in the search results shown to smartphone users.
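As a rough illustration (not an official checklist), one of the basic things a mobile-friendly page needs is a configured viewport so the content scales to the screen. A minimal sketch, with placeholder styles:

```html
<!-- Hypothetical page head: the viewport meta tag lets the browser scale content to the device width -->
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Fluid images help avoid horizontal scrolling on small screens */
    img { max-width: 100%; height: auto; }
  </style>
</head>
```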
The In-depth Articles update is designed to highlight articles that cover broad topics in depth.
It works as follows: Google identifies queries that call for more than a quick answer and includes a block of in-depth articles on the results page. These articles stand out for their thoroughness and high quality.
At the moment, in-depth articles are available only on Google.com and Google.co.uk.
Accelerated Mobile Pages (AMP) is a Google initiative that makes it possible to build ultra-light pages from existing web technologies.
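As a sketch of how the two versions of a page are usually tied together (the URLs below are placeholders), the standard page advertises its AMP counterpart and the AMP page points back to the canonical original:

```html
<!-- On the regular page: declare the AMP version -->
<link rel="amphtml" href="https://example.com/article/amp/">

<!-- On the AMP page: point back to the canonical original -->
<link rel="canonical" href="https://example.com/article/">
```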
These are updates inferred from signals picked up by webmasters around the world and by SERP analysis tools.
Google limits itself to showing up to four ads at the top for highly commercial searches and does away with the ad column on the right. Although it is strictly an AdWords change, it directly affects the CTR of organic results.
The update introduces HTTPS as a ranking signal. This means that sites with an SSL certificate (2,048-bit key) are favored in the rankings. The aim is to promote the widespread adoption of HTTPS for a more secure web.
Dubbed the Quality Update by the Search Engine Land (SEL) team, this update changes how Google's ranking algorithm assesses quality signals. Even so, it does not target specific sites or spam-related activity (that is, it is not a punitive change).
Since the search engine has not shared further details, the only references available are the 2011 post on building high-quality sites and this guide on creating valuable content, both published by Google.
Google App Indexing, or Firebase App Indexing, helps create a consistent user experience across websites and apps.
App developers can send Google the content of their Android and iOS applications, and the search engine will crawl it as it does web pages. If the content is indexed correctly, it will appear in search results along with deep links into the app.
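On Android, for example, deep linking typically relies on an intent filter in the app manifest that associates the app with the site's URLs. A simplified sketch, where the activity name, host, and path are placeholders:

```xml
<!-- AndroidManifest.xml (excerpt): lets https://example.com/products/... open inside the app -->
<activity android:name=".ProductActivity">
  <intent-filter>
    <action android:name="android.intent.action.VIEW" />
    <category android:name="android.intent.category.DEFAULT" />
    <category android:name="android.intent.category.BROWSABLE" />
    <data android:scheme="https"
          android:host="example.com"
          android:pathPrefix="/products" />
  </intent-filter>
</activity>
```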
RankBrain is the name of a new interpretation system that gives Google a better understanding of search queries, especially the most ambiguous ones. It is integrated into the Hummingbird search algorithm and uses artificial intelligence to determine what the user is looking for.
According to Google, RankBrain is the third most important signal for ranking web pages (content and links come first). It does not add new factors to the ranking algorithm, but rather adjusts the weight of existing ones based on the user's query.
Although the system uses machine learning, it does not really learn on its own and must be retrained regularly. Another curious detail of this learning method is that the very engineers who designed it cannot explain how it works internally; they simply do not know.
The goal is to clear hacked sites with malicious content out of Google's results. The change comes three months after the security firm Sophos revealed how spammers use disguised PDFs to siphon traffic from the search engine.
Located above the organic results, local business listings show businesses in the locality or place related to the search query entered.
Designed to combat low-quality content, Panda penalizes sites that offer fraudulent duplicate content (copied or rewritten), as well as content farms and sites with an excessive ratio of ads. At the same time, it rewards those that create unique, meaningful content for their users.
The first rollout affects 11.8% of English-language queries, and from then on updates occur every few weeks. In August 2011 it begins to handle Spanish-language queries, with an impact of between 6% and 9%.
Panda is now integrated into Google's core algorithm, which continues to be updated regularly. Google stopped confirming Panda updates in July 2013, after announcing that it would be incorporated into the core algorithm.
This is the history of Google Panda updates. The entries marked (*) are unofficial updates based on studies by Glenn Gabe and other sources.
Doorway pages are created both inside and outside a site with the aim of getting multiple pages into the search results, although they ultimately send the user to the same destination. The update is intended to keep these pages from ranking in the search engine results.
To provide up-to-the-minute information, Google adds new sources to its results, including feeds from social networks, news, and freshly indexed content.
Dubbed Pigeon by the Search Engine Land team, the update is intended to provide more useful, relevant, and accurate maps and local search result listings.
According to Google, the new algorithm is not intended to act against spam, but rather to "expand search capabilities by including hundreds of factors used in web search, with features such as the Knowledge Graph, spell checking, and synonyms, among others." It also handles parameters related to distance and location more accurately.
Designed to combat web spam, Penguin affects sites that attempt to manipulate the rankings with artificial links. At the same time, sites that do not use these techniques benefit from the space freed up in the search results.
Google automatically looks for patterns in the link profiles of the sites in its index. When it finds a match, it applies an algorithmic penalty, which translates into a drop in the rankings. Some of the affected sites become candidates for a subsequent manual review.
Although official data indicate the update's impact is smaller than Panda's, many sites notice its influence because of the widespread practice of building links against the search engine's guidelines.
New versions are currently in development. In fact, a future version capable of updating itself without human intervention is expected.
The algorithm works as a filter, demoting the rankings of sites that repeatedly violate copyright, such as portals for illegally downloading and streaming music and movies. To detect these kinds of sites, the search engine relies on takedown requests filed under the DMCA (Digital Millennium Copyright Act).
Sitelinks are links shown under the first search result that lead to internal pages for faster browsing.
The search results include author rich snippets with the author's photo and a link to their Google+ profile. In principle, you must link your Google+ account with your content pages to qualify.
Although this feature no longer works, Gary Illyes encouraged webmasters to keep the authorship markup, because Google might use it in the future.
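For reference, the authorship markup in question tied a page to its author's Google+ profile, roughly like this (the profile URL is a placeholder):

```html
<!-- Authorship markup (now defunct): link the content to its author's Google+ profile -->
<link rel="author" href="https://plus.google.com/your-profile-id">
```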
This update introduces a new algorithm aimed at cleaning up the search engine results for spam-prone queries, such as those related to loans, drugs, and pornography.
The Page Layout algorithm affects sites where the content is buried under a pile of ads.
Having to scroll to find the content you are looking for makes for a poor user experience, which is why Google wants to keep such pages out of its results. With the Page Layout algorithm, sites that show only ads above the fold rank worse.
Hummingbird is the new Google search algorithm, designed to open the way to true conversational search.
With the increased use of mobile devices, Google has to be prepared to handle longer, more colloquial queries, such as "how far is Madrid from here" or "compare butter with olive oil." Hummingbird adds the ability to understand the meaning of words, make comparisons, and apply filters in a given context, which allows the search engine to provide accurate results for complex queries.
Everything else remains the same: the Penguin, Panda, and even PageRank algorithms are still active and functioning as before.
The introduction of the Knowledge Graph is a step toward semantic search.
Basically, it covers search queries related to people, places, and things. For example, if you search for the name of a well-known person, Google not only shows the typical matching results but also a panel with semantic information gathered from across the web.
This update is intended to reduce the visibility of exact-match domains (EMDs) in the search engine results.
Until then, any domain bearing the exact words of a search was very easy to rank. For example, a domain called "palabraclave.com" ("keyword.com") was likely to appear on the first page for the query "keyword" without doing anything else. The EMD update puts an end to that.
EMD does not mean that exact-match domains are condemned by Google. They are still a good bet, just not as lucrative as they once were.
According to Matt Cutts, the update affects 0.6% of queries, but Moz detects an impact of 10.3%.
Venice is designed to improve local search results. It does not affect Google Places results, but rather local organic results.
Before Google introduced this improvement, you had to type the name of your town for local pages to show up in the results. Now you can run a generic search and find pages from your own city.
The update is based on the geographic location of searches, an option Google already offered before Venice, but one that did not work properly.
The introduction of Search plus Your World (SPYW) represents a considerable advance toward personalized search.
With this update, Google began to introduce Google+ social data into its results. Specifically, you can find information shared by you and your Google+ contacts, as well as Google+ people and pages related to the specific topic of your searches.
Since the launch of Search plus Your World, there are fewer reasons not to be on Google+.
This change is meant to reward fresh content in the results, whether it is new, recently updated, or the object of an interaction (someone gives it a +1 or writes a comment).
It is especially relevant for queries in which time plays an important role, such as news, seasonal events, and any information likely to go out of date. In these cases, recently updated pages are more likely to rank in the search results.
Schema.org is a joint initiative between Google, Bing, and Yahoo! to provide a universal markup scheme. With Schema.org you can tag your content and be sure the major search engines will understand it.
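A minimal sketch of what that tagging can look like, using the JSON-LD syntax (one of the supported formats, alongside microdata and RDFa); all names and values here are placeholders:

```html
<!-- Structured data describing an article, embedded in the page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Google algorithm updates",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2016-05-01"
}
</script>
```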
Google releases an update to go after sites that use negative reviews to boost their rankings. The news comes a few days after The New York Times published an article describing how the online store DecorMyEyes used that technique.
Matt Cutts confirms a recent development whereby the search engine considers social signals from Facebook and Twitter when determining positions in the rankings. Note: in August 2015, John Mueller denied that social signals have a direct influence on Google's rankings.
Going a step beyond Google's suggestions, the search engine begins to display results as you type in the search box. Despite the uproar it caused, the impact is minimal.
The update allows a domain to appear more than once in the results for brand searches, unlike before. The change took the SEO community by surprise, and it is unclear whether it is an experiment or a glitch in the search engine.
Caffeine is the name of Google's web indexing infrastructure, designed to speed up crawling, increase the size of the index, and incorporate real-time indexing and ranking of pages.
So dubbed by the WebmasterWorld community, Mayday gives visibility to quality sites for long-tail searches. The impact is significant for sites with thin content, and it can be considered a precursor to Panda.
Google believes that speed matters. The faster a site is, the better the user experience and the more time visitors spend on it.
Hence the search engine introduces loading speed as a ranking factor. Sites that load quickly benefit in the rankings, although the factor carries little weight compared with others such as the relevance of the page.
The update focuses on aspects such as reputation, trust, and authority, which favor large brands in the rankings. The change was noticed by the WebmasterWorld community and shortly afterwards acknowledged and explained by Matt Cutts.
Microsoft, Yahoo, and Google announce the new canonical tag, which makes it possible to identify canonical pages transparently to the user. It is a major breakthrough in the fight against duplicate content.
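In practice, the tag is a single line in the head of each duplicate or parameterized page, pointing at the preferred URL (the URL below is a placeholder):

```html
<!-- Tells search engines which URL is the preferred (canonical) version of this content -->
<link rel="canonical" href="https://example.com/original-article/">
```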
A well-known and widely used feature since its launch: as you type in the search box, Google autocompletes the search by suggesting related terms.
Currently Google is able to show answers directly in the search box.
The WebmasterWorld community detects ranking fluctuations during March, but the details of the update are not known. Matt Cutts joins the conversation to ask for webmasters' opinions.
To increase indexing capacity without sacrificing performance, the search engine creates an additional (supplemental) index. The pages placed in it are not strong enough to make it into the main index, which is updated more frequently.
According to Matt Cutts, it is not an update but an accumulation of small changes to the search engine.
Google moves beyond the typical listing of ten links by introducing new types of results, such as blogs, videos, images, news, maps, and books.
Google starts updating its infrastructure, which brings changes in how it handles redirects and canonicalization, among other things. The rollout extends until March 2006.
It consists of a series of updates designed to minimize the effect of web spam built on artificial links (reciprocal links, link farms, and paid links). The rollout took place between September and November, with October being the month of greatest impact.
Significant ranking fluctuations are observed. Matt Cutts claims they are caused by the search engine's quarterly refresh of backlinks and PageRank, though many webmasters are convinced it is an update.
Google begins to use search history to provide more accurate results.
Google introduces a new feature in Webmaster Tools that allows webmasters to submit sitemaps in XML format. With it, you can tell the search engine which pages you want indexed.
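A minimal sitemap sketch in the XML format the protocol defines (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the search engine to know about -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2016-05-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/google-updates/</loc>
    <lastmod>2016-04-20</lastmod>
  </url>
</urlset>
```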
It is believed that the update changes how the search engine handles site canonicalization problems (www vs. non-www) and duplicate content.
The nature of the update is unclear. Some think it involves changes to the method Google uses for latent semantic indexing, while others believe it affects the sandbox. The possibility that the update is related to suspicious links has also been considered.
The nofollow attribute is introduced as a result of cooperation between Google, Yahoo, and MSN (Bing). The attribute lets webmasters withhold credit (and PageRank transfer) from the page a link points to, which is a major step in combating spam, especially in comments.
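The attribute is added to individual links; a typical use on a user-submitted link, with a placeholder URL:

```html
<!-- rel="nofollow" tells search engines not to pass credit (PageRank) through this link -->
<a href="https://example.com/some-page/" rel="nofollow">visit my site</a>
```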
Google makes several changes, including expanding the index, Latent Semantic Indexing (LSI), and paying more attention to link text (anchor text) and the authority of the linking source. On the other hand, it weakens optimization based on text formatting (headings, bold, italics, etc.).
Google goes a step beyond Florida, dismantling practices such as keyword-stuffed meta tags and invisible text. The impact is broad.
The update deals a blow to the abusive tactics of the late '90s, such as keyword stuffing. It affects a huge number of websites.
From this point on, Google stops updating its index monthly and starts doing so daily.
It probably consists of improvements to the search engine's infrastructure so it can update its index more frequently.
It picks up where Cassandra and Boston left off in Google's battle against link manipulators. The way the search engine values inbound links changes considerably.
It focuses on artificial links, such as multiple links from the same site, cross-linking between domains, and hidden links. This update also introduces reconsideration requests for penalized sites.
It takes its name from the SES Boston conference where it was announced, and it is Google's first official update. It allows more detailed analysis of inbound links, resulting in a better assessment of their quality and of whether they were created artificially.
The details are not clear; all that is known is that it produced major ranking changes, with even 404 pages showing up in the results. Although the quality of the results seemed to drop, it can be considered the moment Google started its war against spam and manipulation.
I update this post whenever major changes to the Google algorithm are confirmed. However, if you want more information or want to check whether your site has been affected by a change, visit the following links: