What is the history of SEO? One of our clients recently posed the question, and it’s a fun one! So grab a beverage, then hop in our digital DeLorean as we travel back in time to the conception and birth of search engine optimization.

SEO’s Origin Story

Technological progress is a chain link. Innovations are built on the backs of others. To wit, the development and history of SEO are inextricably linked with that of the Internet. So that’s where we’ll start.

The Internet’s Birth

Believe it or not, the Internet’s origins date all the way back to the 1920s, when engineers Ralph Hartley and Harry Nyquist began exploring what would become information theory. The two men — both working at Bell Laboratories in the United States (Nyquist was a Swedish émigré) — didn’t collaborate directly, but each created theoretical scaffolding that gave shape to advanced technological inquiry.

In the 1940s, Claude “The Father of Information Theory” Shannon aligned the conceptual puzzle pieces and crafted a workable framework for error-free transmissions, which involved an advanced understanding of signal-to-noise ratios and bandwidth.

Computational engineering grew increasingly popular starting in the 1950s, and in 1969 the U.S. Department of Defense commissioned ARPANET, the first wide-area packet-switched network.

From there, the pace quickened. On January 1, 1983, ARPANET switched to Transmission Control Protocol/Internet Protocol (TCP/IP), and standardized computer-to-computer communication made its debut. A little less than a decade later, on August 6, 1991, the World Wide Web went live thanks to Tim Berners-Lee’s Hypertext Transfer Protocol.

First Came the Websites…

The first website — http://info.cern.ch/hypertext/WWW/TheProject.html — detailed instructions for using the “Internet Superhighway.”

Early platforms were mostly linked to universities and high-tech research centers. For example, Tim Berners-Lee maintained the World Wide Web Virtual Library; Stanford ran a site for its Linear Accelerator Center (SLAC); and Nikhef, the Dutch National Institute for Subatomic Physics, hosted another early site.

In 1992, the Exploratorium, a San Francisco science museum, launched one of the first websites intended for the masses. From there, more commercial platforms began popping up.

…Then Came the Search Engines

Once the Internet was up and running, people quickly realized it would be convenient to have a way to search its contents — and poof, search engines were born. The earliest versions were crude, manually compiled databases. But engineers soon began experimenting with algorithm-driven bots that roamed the Web, collected data, and carried it back to their respective motherships.
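That fetch-and-return pattern is still the heart of every crawler: pull a page, record what’s on it, queue up the links it points to, and repeat. Here’s a minimal sketch in Python over an in-memory “web” — all names and the toy data are illustrative, and a real bot would fetch over HTTP, parse HTML, and respect robots.txt:

```python
from collections import deque

def crawl(web, seed):
    """web: url -> (text, [outlinks]). Builds a keyword index via a
    breadth-first fetch-and-return loop, starting from a seed URL."""
    index = {}                             # word -> set of urls containing it
    queue, seen = deque([seed]), {seed}
    while queue:
        url = queue.popleft()
        text, outlinks = web[url]          # "fetch" the page
        for word in text.lower().split():  # carry its contents back to the index
            index.setdefault(word, set()).add(url)
        for link in outlinks:              # follow links we haven't visited yet
            if link in web and link not in seen:
                seen.add(link)
                queue.append(link)
    return index

# Toy data, purely for illustration.
toy_web = {
    "a": ("archie indexed ftp archives", ["b"]),
    "b": ("aliweb indexed the web", ["a", "c"]),
    "c": ("webcrawler crawled everything", []),
}
idx = crawl(toy_web, "a")
# idx["indexed"] == {"a", "b"}
```

Searching then becomes a dictionary lookup into the index — which is essentially what those early engines offered.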

Archie

While there’s some dispute among Internet historians, many consider Archie to be the first search engine. It was conceived by Alan Emtage, a graduate student at McGill University tasked with helping connect the school to the Internet. Debuting in 1990, the tool built a searchable index of files stored on public FTP archives.

Notably, the name wasn’t an homage to the classic comic — it derived from the word “archive.” Later Gopher-era search tools did lean into the joke, though, taking the names Veronica and Jughead.

Archie never became popular publicly, but it holds a venerated place in the pantheon of Internet creations. Today, it’s no longer accessible online, but the University of Warsaw Interdisciplinary Centre for Mathematical and Computational Modelling preserves a legacy version.

Aliweb

In late 1993, Aliweb hit the scene. Its name is an acronym for “Archie-Like Indexing for the Web.”

Those who don’t count Archie as the first search engine — because it only indexed FTP archives — give the honors to Aliweb. The brainchild of Martijn Koster, it was presented at the First International Conference on the World Wide Web at CERN in May 1994.

Some folks assume that WebCrawler, which we’ll get to below, came before Aliweb. But the latter predated the former by several months.

Notably, Aliweb didn’t use a crawling bot at all. Webmasters submitted their own index files describing their sites — an approach that kept bandwidth use low but limited the index to sites whose owners bothered to sign up.

WebCrawler

Close on Aliweb’s heels in 1994 came WebCrawler — one of the oldest surviving search engines. Created by Brian Pinkerton while he was studying at the University of Washington, WebCrawler launched with an index of more than 4,000 websites. By February 1996, it was the second most visited site on the Web. It lost ground as more options hit the market — but it’s still operational today!

Infoseek

Infoseek was the other big engine to come out of 1994. Developed by Steve Kirsch, it was one of the first search platforms to sell advertising on a cost-per-thousand-impressions (CPM) basis. Then, in 1998, it launched one of the first behavioral targeting ad systems.

In 1999, the “Big Mouse” — The Walt Disney Company — bought Infoseek. But the engine ended up a Dot-Com Bubble casualty: by 2001, the entertainment conglomerate had shuttered the service and laid off the staff.

Notably, one of Infoseek’s engineers, Li Yanhong — better known as Robin Li — moved to Beijing and co-founded Baidu, which became the “Google of China.”

Inktomi / HotBot

Founded in 1996 by Eric Brewer and Paul Gauthier — a professor and grad student pair out of UC Berkeley — Inktomi built search technology that it licensed to other platforms rather than running a destination site of its own.

Pronounced “INK-tuh-mee” after Iktomi, a trickster spider from Lakota legend who outwitted larger adversaries, the engine initially powered HotBot and was eventually licensed by a string of other portals. Inktomi raised roughly $36 million in its 1998 IPO and went on an acquisition spree. But the early-noughties recession swept in on the wings of Y2K fears, and Inktomi crumbled when the Dot-Com Bubble burst in 2001.

Ultimately, Yahoo! acquired what was left of Inktomi in 2003. In 2016, the HotBot.com domain sold for $155,000; today, it’s owned by a Seychelles-based VPN company that runs it as a privacy-focused search engine.

AltaVista

AltaVista came next. Built in 1995 at Digital Equipment Corporation by Paul Flaherty, Louis Monier, and Michael Burrows, AltaVista was the first searchable full-text database of the Web with a simple form interface. It quickly caught on: in 2000, 17.7% of users named AltaVista their favorite engine. To put that in perspective, only 7% of people at the time stanned Google.

Over the years, it changed hands — from Digital Equipment Corp. to Compaq, then to Overture Services, and finally to Yahoo! — before going defunct in 2013. Today, the URL forwards to Yahoo! Search.

Enter SEO

In the mid-1990s, webmasters started optimizing online platforms to rank well in AltaVista, HotBot, Infoseek, and WebCrawler. But back then, SEO existed on a very different plane than it does today. To start, instead of letting bots find and index web pages, you had to submit them to search indexes manually.

Secondly, meta tags were almighty. Webmasters stuffed them to oblivion with keywords, and it worked! Some enterprising developers even hid keyword phrases in their page designs — white text on a white background, for instance — and reaped the rewards. By 1996, search engine optimization had become a burgeoning industry.

And once that started, search platforms began trying to thwart the underhanded tactics — a cat-and-mouse game between search engineers and SEOs. Link farms littered the Internet, paying for backlinks became the norm, and websites churned out schlock by the hour through content mills to generate self-made buzz. Basically, the Web quickly filled up with a whole lot of crap.

The Emergence of Google

Traditionally, we delineate eras as BCE and CE — or BC and AD. In the online search world, the dividing line is BG and AG: before and after Google. Created by Larry Page and Sergey Brin, Google is the GOAT of search engines.

It started in 1996 as a Stanford research project called “BackRub”; the company incorporated as Google in 1998. What differentiated it from other engines was that Google rated a website’s prominence based on the number and quality of its inbound links. They called the metric “PageRank,” and the better a site’s PageRank, the higher it showed up in the results. Thanks to that innovation, Google’s results were arguably better and more reliable than those of other search engines.
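The intuition behind PageRank — a page matters if pages that matter link to it — can be sketched as a short power-iteration loop. This is a toy illustration of the published idea, not Google’s production algorithm; the 0.85 damping factor comes from the original PageRank paper, and the toy link graph is made up for the example:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to.
    Returns an importance score per page (scores sum to 1)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                    # dangling page: spread rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                               # split this page's rank among its links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page web; every page links to "home".
web = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
    "orphan": ["home"],
}
ranks = pagerank(web)
```

Note how “home,” which every other page links to, ends up with the highest score — exactly the inbound-link signal described above.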

A loyal following soon flocked to it. People loved the simple design, and the results were difficult to game. By 2001, Google was reportedly using over 200 signals to rank pages. In 2005, it started personalizing results — so Jack, a stockbroker in New York, could get wildly different results than Jane, a stay-at-home hippie mom in Washington State.

New Rules for a New Century

In 2000, search engines started to get a lot savvier about “spamdexing” — the practice of tricking algorithms. It’s also when they started banning sites from their databases for using underhanded tactics. In 2005, Google drew a line in the SEO sand when it canned Traffic Power, a highly successful SEO company that deployed black-hat techniques. Not only was Traffic Power booted, but so were many of the company’s clients.

Google’s Major Algorithm Updates

Since Google is the king of Searchlandia with over 90% of the market share, its functionality and mechanisms are the most important to SEO professionals. And while the company has made thousands of small algorithm updates over its lifespan, a handful of major ones rocked the search engine optimization industry.

2010: Fresher Content

In 2010, Google adjusted its algorithm to incorporate fresher content into its search results. The update changed the pace of content creation because there was an incentive to produce timely articles regularly.

2011: Panda

In 2011, Panda hit the digital streets, forever changing search engine optimization. It was, as they say, a “game-changer.”

Prior to Panda, it was fairly easy to game search engines. You’d repeat keywords in certain parts of a page, tend to the meta tags, and fill up content farms with articles that included link-backs to your site. Voila. That was the long and short of it. The pieces didn’t even have to be good — heck, they didn’t have to be comprehensible. They just needed to include specific keywords a certain number of times.

But Panda swept in and changed all that. Seemingly overnight, Google deindexed a bunch of websites that relied on duplicate content and downgraded platforms that had artificially clawed their way to the top. Low-quality sites that once dominated the rankings disappeared into “search-beria” — aka page 50 and beyond of the SERPs — where no user dares to lurk.

2012: Penguin

The following year, Penguin arrived. What Panda did for duplicate and low-quality content, Penguin did for spammy links. Google deindexed link farms, and sites that relied on them felt the pain. Like its predecessor, Penguin rocked the search engine world and affected far more platforms than Google anticipated.

At this point, quality content with natural backlinks became the foundational pillars of SEO.

2013: Hummingbird

Arguably, the 2013 release of the Hummingbird algorithm brought search into the semantic era. Earlier algorithms performed elementary keyword matching; Hummingbird took things to the next level with natural-language processing that weighed the meaning of a whole query, not just its individual words.

The change meant that Google bots now had a better grasp on “conversational” language patterns, and they began to heavily punish websites that relied solely on keyword placement and stuffing.

2014: Pigeon

Named after the homing bird, 2014’s Pigeon algorithm update affected local search results. Essentially, it pushed local listings up in the rankings. So, for example, if someone in Denver queried “chiropractors near me,” they’d get a very different list than a Scottsdale resident who typed in the exact same search.

2015: Mobilegeddon

Another massive moment in search, Mobilegeddon was the algorithm that added signals for responsive layouts. In other words, sites that worked well on both mobile phones and large screens leapfrogged those that couldn’t scale up or down.

2016 to 2018: Updates to Previous Algorithms

Between 2016 and 2018, Google tweaked and updated previous algorithms. Though it’s never been verified, search analysts believe a lot of work was done on Hummingbird during this period.

2019: BERT

BERT stands for Bidirectional Encoder Representations from Transformers — an advanced language model that reads text in both directions to understand each word from its full context. Know how Gmail offers to finish your sentence after you type a few words? That predictive functionality is powered by similar language-model technology.

In 2019, the technology was incorporated into search algorithms, making quality content even more imperative from an SEO perspective.

2021: User Experience Update

Bloated, sluggish sites saw a massive decline in search rankings starting in 2021, thanks to the Page Experience update, which folded Core Web Vitals — metrics for loading speed, interactivity, and visual stability — into the algorithm. Functionality and UX took center stage with this fix.

SEO Today

These days, SEO is a multidisciplinary field. Thankfully, companies like Google and Bing have forged less antagonistic relationships with digital marketers, and most search engines now provide tools that webmasters can use to optimize website performance.

Generally speaking, to rank well, a website must sit on a secure HTTPS domain; load with lightning speed; be properly coded, with a structured-data framework; look good on all screen sizes; attract multitudes of natural backlinks; and feature high-quality content that’s published regularly. It’s a giant job, and doing it right takes resources, patience, and perseverance.

Connect With SEO Experts

Rounded Digital has been helping clients climb search engine rankings for over a decade. We know our craft and have the skills needed to get you where you want to be. If you’re ready for more traffic, increased conversions, and a bottom line on fire, let’s talk.