The Riveting History of SEO: From the '90s to Today
Last Updated on September 12, 2024
Published: December 20, 2023
Digital marketers are all vying to create content optimized for SEO, an acronym for "search engine optimization," but how did this practice come about in the first place? Read on to learn all about the history of SEO, how it has evolved, and where it's headed.
When did SEO begin?
The concept of SEO began before the inception of Google. According to legend, the rock band Jefferson Starship was at the center of SEO's birth in 1995. Promoter Bob Heyman received an angry call from the band when they were on the road and couldn't find their webpage.
Several fansites had been writing so passionately about "Jefferson Starship" that they surpassed the actual band's page on SERPs (search engine results pages). So Heyman and his partner Leland Harden boosted the number of references to "Jefferson Starship" on the official page, which catapulted it back to the top of the rankings.
Who invented the term SEO?
According to this anecdote, Heyman and Harden invented the term SEO, but experts wonder how much of the story is true. Around that time, you could boost search performance just by including some inbound and outbound links, repeating your focus keyphrase several times, and posting content on a website that functioned properly. By the mid-to-late '90s, several tech pioneers had begun using these tactics and seeing significant results.
In 1997, the Webstep Marketing Agency was the first group to use the phrase "search engine optimization" in their marketing materials. And around 1998, Danny Sullivan, founder of Search Engine Watch, started popularizing the term and helping clients optimize their content to rank well in search engine results.
By 2003, the term "search engine optimization" had appeared in Wikipedia for the first time, cementing its place in internet culture, and an industry of consultants and analysts formed to help companies rise to the top of search engine results.
The failed attempt to trademark the term "SEO"
Although the SEO industry was coming into its own, many marketers were using black hat tactics such as keyword stuffing to improve search performance. In 2007, Jason Gambert, who may or may not be an actual person, tried to trademark the term "SEO." Gambert allegedly wanted to protect the integrity of search engine optimization and save the internet from companies preying on consumers.
Not surprisingly, other content marketers and SEO experts weren't too excited about this development, and neither was the U.S. Patent and Trademark Office, which denied the application in 2008.
SEO and the rise of Google
While SEO predates Google, the search engine juggernaut founded by Larry Page and Sergey Brin has dominated the SEO conversation for more than two decades. So even though people talk about search engines generically, Google is the one that matters most in the history of SEO. Considering that Google accounts for almost 93% of search engine use, that dominance probably isn't going away any time soon.
One of the main reasons Google broke away from the pack of Yahoo, AltaVista, Dogpile, Infoseek, Ask Jeeves, and others is that it provided better results for search queries. It has invested heavily in machine learning to constantly refine how it evaluates pages. As of 2024, it weighs more than 200 factors when determining search rankings, a number that has grown over time.
How has SEO evolved?
Since Google has been the dominant force in the SEO industry for years, many key dates revolve around the Google Core updates. Looking at the history of search engine optimization through a Google lens is a logical way to track how conventional search has changed over the years and which criteria matter most.
Google has frequently introduced new algorithms and features to improve the overall quality of search results. Matt Cutts, Google's longtime head of webspam until he left the company in 2016, led many of these updates. Every algorithm update can bring agony or ecstasy to content marketers, as domains are re-indexed and the definition of "relevant content" shifts.
How have Google algorithms changed?
The history of SEO is full of changes to Google's algorithms, all working toward the same goal: serving users more relevant content. These updates tend to do one of three things: filter out spam and black hat tactics; prioritize fresh, relevant, local content; or provide relevant results for semantic searches.
This evolution shows how conventional search has become increasingly refined, and how each algorithm builds on the ones before it. Keep in mind that Google continues to upgrade and refine each core update after release, so even an algorithm that's more than a decade old still shapes search results today; version 4 of an algorithm will be quite different from version 1. Here are a few key algorithms that have shaped the history of SEO.
- 1996, PageRank. This early project, created by Sergey Brin and Larry Page at Stanford, helped pave the way for the juggernaut that would become Google. It ranked pages largely by the number and quality of the links pointing to them, treating each link as a vote of confidence (a toy version appears after this list). While this algorithm created a strong foundation for automatic indexing of the Internet, it was still vulnerable to black hat tactics.
- 2003, Florida. This marked the first major Google algorithm update, designed to filter out sites with large numbers of poor-quality links. Google released it in November, and it caused a massive upheaval for many sites right at the peak of the Christmas shopping season. Unfortunately, many quality sites were incorrectly flagged, significantly hurting small businesses. However, the introduction of link analysis helped shape the trajectory of SEO.
- 2004, TrustRank. The TrustRank algorithm went a step beyond the foundation Florida laid, filtering spam and black hat techniques out of search engine results. It helped identify how trustworthy domains were, so users got quality results.
- 2010, Google Caffeine. This update gave Google search results a nice jolt of energy: indexing capacity and speed increased dramatically, so results could prioritize fresher content.
- 2011, Google Panda. Google Panda was designed to direct people to higher-quality sites, like news organizations, and minimize the impact of content farms with thinly written, poorly cited articles.
- 2011, Google Freshness. Some content, like a good recipe, can be timeless, but most of it needs regular updates. The Google Freshness algorithm did exactly what you’d expect; it prioritized fresh, relevant content and built on Caffeine's improvements.
- 2012, Google Penguin. Like Panda, Google Penguin took another swing at spammy websites, weeding out content that used stealthy techniques to boost its rankings. It targeted sites that built networks of pages full of external links pointing back at them to inflate their domain authority and position in search results.
- 2013, Google Hummingbird. This robust algorithm marked a massive shift toward prioritizing natural language in search queries over keyword stuffing and unnatural attempts to cram every target term into an article. The change set a new trajectory and laid the groundwork for future advances in artificial intelligence and natural language processing.
- 2015, Google Mobilegeddon. This update provided a big boost for websites optimized for mobile. Mobile optimization creates a much better user experience for people using smartphones to surf the web, and Mobilegeddon's prioritization of mobile-friendly sites helped catalyze that shift. Mobile's share of search grew from around 20% in 2013 to 63% by 2021.
- 2015, Google RankBrain. This significant algorithm, in some ways an extension of Google Hummingbird, brought more clarity to unstructured queries. Its main job is to understand what you’re looking for, even if you don’t put any “quote marks” around your query. For example, if you type in John Smith, it doesn’t look for all the Johns and all the Smiths and identify where they overlap; it looks for the distinct entity “John Smith.” RankBrain was also a major building block in the growing role of natural language processing in search, including local search.
- 2016, Google Possum. Does Google ever ask your location when you’re searching? Google Possum paved the way for local results. This algorithm update provided significant changes for local SEO and offered local businesses a way to connect with their audience. So if you look up “best fried chicken,” you’re probably not going to see results from a universal search of all the chicken places in the world. Instead, you’ll see fried chicken purveyors from your local region.
- 2017, Google Fred. If your website had been relying on black hat SEO tactics, you probably felt a dip after Google implemented Fred. This algorithm sought to penalize low-quality websites stuffed with ads and thin on quality content.
- 2018, Google Medic. Anyone who has ever asked Dr. Google for answers to medical questions knows how important it is to have trustworthy medical sources in your search results. This algorithm helped prioritize websites with medical authority, like the Cleveland Clinic or Mayo Clinic, over others without medical authority.
- 2019, Google BERT. This powerful algorithm built on Hummingbird’s foundation, prioritizing search intent and longer, conversational queries. BERT has improved search quality by better judging relevance.
- 2023, Google Gemini. This update didn't replace BERT, but works in tandem with it. Gemini focuses on generative tasks and broader applications of AI. It aims to handle more complex queries, generate coherent responses, and engage in sophisticated conversations.
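Since PageRank is where Google's story begins, it's worth seeing the mechanics up close. Below is a minimal sketch of the classic iterative PageRank calculation on a toy four-page link graph. The graph, damping factor, and iteration count are illustrative assumptions, not Google's actual implementation, which has long since become far more complex.

```python
# Minimal PageRank sketch on a toy link graph (illustrative assumptions only).
# Each key is a page; its value is the list of pages that page links to.
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about", "contact"],
    "contact": ["home"],
}

damping = 0.85  # probability that the "random surfer" follows a link
pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}  # start with equal scores

for _ in range(50):  # iterate until the scores stabilize
    new_rank = {}
    for page in pages:
        # A page's score is a baseline "teleport" term plus the damped sum of
        # the shares it receives from every page that links to it.
        incoming = sum(
            rank[src] / len(targets)
            for src, targets in links.items()
            if page in targets
        )
        new_rank[page] = (1 - damping) / len(pages) + damping * incoming
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda item: -item[1]):
    print(f"{page}: {score:.3f}")
```

Pages that collect links from many well-linked pages end up with the highest scores, which is exactly the dynamic that early link schemes exploited and that later updates like Penguin tried to rein in.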
How BERT has changed
Google has updated BERT many times. Let's look at the timeline of how it has evolved.
December 2019: BERT Expansion
Google announces that BERT is now being used across multiple languages beyond English. This expansion helps improve search result relevancy for a wider audience by understanding the context of queries in different languages.
February 2021: Passage Ranking
Google integrates BERT into its Passage Ranking system. This update, known as the “Passages” update, allows Google to understand and rank individual passages within a web page more effectively. This enhances the ability to retrieve relevant information from long documents based on specific query intent.
November 2021: Improved Language Understanding
Google updates BERT to further refine its ability to handle nuanced language and complex queries. This includes improvements in understanding conversational queries and contextually rich language, enhancing the overall search experience.
March 2022: Enhanced Multilingual Capabilities
BERT’s integration is further enhanced to support more languages and regional dialects. This update continues to refine how BERT understands and processes queries in different languages, improving search relevancy globally.
April 2023: Integration with Google’s AI Ecosystem
Google announces advancements in BERT’s integration with its broader AI ecosystem. This update enhances BERT’s performance on complex and conversational queries, benefiting from new AI technologies and techniques developed by Google.
Google Gemini
You've probably seen the new AI-generated text box that appears at the top of some Google search results. This is the integration of Google Gemini, an advanced suite of AI models developed by Google DeepMind, officially introduced in December 2023. It's a rebranding and evolution of Google's earlier AI models, incorporating advancements in natural language processing and understanding.
You might be wondering, "How are SEO trends going to change if Gemini presents the information to searchers automatically with generative AI?"
First, Gemini's generative AI isn't employed with every search, only those where the information can be presented most accurately and logically in that format. But Gemini is more than a generative AI integration; it's also an update to the algorithm. Gemini’s advanced natural language understanding (NLU) helps Google better interpret the intent behind user queries. This means search results can be more accurately tailored to match the context of what users are actually looking for, which is, hopefully, your site.
Websites with high-quality, contextually relevant content are likely to perform better because Gemini rewards content relevance. With Gemini’s emphasis on understanding context and relevance, creating high-quality, informative, and engaging content becomes even more crucial for SEO and content strategy.
How does the history of SEO affect the future of SEO?
Producing great content has become a highly specialized process that often requires a team of experts and complex data sets and models. That's a far cry from the keyword stuffing and link swapping of yesteryear's blogging.
The core of SEO strategy has remained constant for decades. Search engines want to instantly give readers the best possible answers to their search queries. And companies want to be the ones providing those answers to potential customers. Content quality matters. As machine learning and artificial intelligence evolve, we'll see increasingly specialized search results that are based on relevance.
What are some basic tenets of SEO?
Let's look at some of the core elements of SEO when it comes to content and overall website optimization.
- Keyword research. Identify relevant keywords and phrases that users are searching for. This involves understanding search intent and selecting terms that match your content's topic. While there are lots of SEO tools out there to help you find keywords, this is Rellify's bread and butter. We use deep machine learning to find the right topics and keywords that will resonate well with audiences and search engines.
- On-page SEO. Optimize elements on your own site to improve visibility. Content, meta tags, canonical tags, URL structure, and internal linking are all part of on-page SEO (a minimal audit sketch appears at the end of this section).
- Off-page SEO. This involves external factors that influence your site’s authority and relevance, such as backlinks, social media presence, Google My Business, and online reputation.
- Technical SEO. Make sure that your website’s technical aspects support search engine crawling (robots.txt) and indexing. Optimize page speed, mobile-friendliness, and secure connections (HTTPS).
- User experience (UX). Enhance the overall experience for users, including site navigation, mobile responsiveness, and page load speed. It'll help reduce bounce rates.
This list isn't exhaustive, but it gives you a basic idea of how to go about content creation in the context of SEO. Plenty of web analytics tools can give you insights and help you track every aspect of your SEO. Rellify makes monitoring your content's performance simple by integrating it into your content process. After publishing, you simply add your URL to the file in our content platform and click on “Monitoring.” Data and insights are imported directly from web analytics tools into the platform.
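To make the on-page and technical items above more concrete, here is a minimal sketch of a script that checks a page for a few basic on-page elements: the title tag, meta description, and canonical link. It uses only Python's standard library; the URL and length thresholds are illustrative assumptions, and a real audit would cover much more (headings, structured data, page speed, mobile-friendliness, and so on).

```python
# Minimal on-page SEO check using only the Python standard library (illustrative).
from html.parser import HTMLParser
from urllib.request import urlopen


class OnPageChecker(HTMLParser):
    """Collects the title, meta description, and canonical link from a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def audit(url: str) -> None:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    checker = OnPageChecker()
    checker.feed(html)

    print(f"Title ({len(checker.title)} chars): {checker.title!r}")
    print(f"Meta description: {checker.meta_description!r}")
    print(f"Canonical URL: {checker.canonical!r}")

    # Rough, commonly cited guidelines -- treat these as assumptions, not rules.
    if not 30 <= len(checker.title) <= 60:
        print("Warning: title length is outside the ~30-60 character range.")
    if checker.meta_description is None:
        print("Warning: no meta description found.")
    if checker.canonical is None:
        print("Warning: no canonical link found.")


if __name__ == "__main__":
    audit("https://example.com")  # hypothetical URL; replace with your own page
```

In practice you'd fold checks like this into a crawler or lean on an established auditing tool, but the sketch shows how the on-page elements listed above translate into things you can verify programmatically.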
How can I make sure my content remains optimized for SEO?
Google continues to refine its algorithms, usually rolling out a few major core updates each year. It's pretty clear about some of its ranking factors and guidelines, like prioritizing articles according to E-E-A-T (experience, expertise, authoritativeness, and trustworthiness). But even if you're doing everything according to the latest guidelines, you might occasionally see an unexpected dip in your analytics.
If you notice your content seems to be losing traffic without any logical reason, don't panic! It's probably due to a new Google core update. You might need to tweak a few things, but if your content:
- is well-written ...
- has sound technical SEO ...
- is based on extensive keyword research ...
- and answers the questions people are asking ...
then your organic traffic should return or even improve during the next update. For this reason, you've got to keep long-term results in perspective. Sometimes you'll see quick wins, but annual performance is often a more accurate gauge than monthly metrics.
These algorithm fluctuations are one reason companies like Rellify are so important. With a state-of-the-art custom Relliverse™ from Rellify, you can employ AI-assisted machine learning to crawl and cluster industry-specific data to find what's already resonating with your target audience. So regardless of the next chapter of SEO, you can be confident your content marketing efforts deliver strong results. Be sure to reach out to a Rellify expert to learn more about how your business can get 10x the returns with 10x less effort.