A History of SEO

Wednesday, July 20th, 2016

Search engine optimization, like the Internet itself, has been around for only about 25 years. However, as the Internet has undergone dramatic, fast-paced change, so has SEO. From its origins as a way to manipulate search engines, it has matured into a respected field that simultaneously attracts a target market to a specific website and helps deliver the most relevant results for a user’s web search.

Along the way, search engine optimization has helped to transform the Internet into a place where people can quickly find what they need, when they need it. For example, thanks to SEO, websites have drastically improved user experience, allowing people to find relevant answers to their search almost instantly. Companies have joined social media, posted videos, created mobile responsive sites, developed thousands of informative blogs, and targeted their content at the people who want to see it the most.

This is not to say that the development of SEO has always been easy. Along the way, search engines have sometimes allowed poor content to populate the Internet; websites have made mistakes that have damaged their online reputations, and user demands have tested the creativity of marketers. The future of search engine optimization, however, is promising. Search engine optimization is becoming increasingly integrated into the fields of web design, web development, social media, and content marketing. As it does so, it continues to respond to user needs and help websites to deliver information that their target audiences need to find the products and services that are best for them.

In order to accurately understand and plan for the future of SEO, you must understand its history. Following is a brief background of search engine optimization. In particular, this narrative focuses on the trends that have shaped how websites have created and distributed their content online over the years.

The Need for SEO

The first website was launched in 1991. Its creation unleashed a flood of websites that resulted in the existence of more than 2,700 websites in 1994 and more than 257,000 websites in 1996. While those numbers might seem small in the face of the 1 billion websites online today, they represented a growth of more than 25 million percent in just 5 years.

With the rapid expansion of websites on the Internet, it became difficult for visitors to sort through their many choices to find the sites that had information relevant to them. Even sorting through 2,000 websites to find reliable information on a topic is time consuming and difficult. Websites and their web pages needed to be organized to help users find what they needed efficiently. It was this need that led to the rise of search engine optimization.

The Rise of Keywords

The first tools to try to make sense of the teeming wilderness of the early Internet were Archie and Gopher, developed in the early 1990s. However, search engine optimization really had its origins in the release of more developed search engines like Excite. Excite, founded in 1993, quickly became one of the most popular destinations on the Internet because of its ability to collect and present relevant information (such as news, weather, and more) to users. Its distinguishing characteristic from an SEO standpoint was its ability to prioritize web pages based on certain keywords.

Excite eventually went bankrupt, after rejecting an opportunity to buy Google and after suffering the financial impacts of poor advertising revenue and an unsuccessful merger. However, its idea of ranking web pages at least in part on the keywords they used remained alive and well in the search engine world and continues to play a role in successful SEO today.

The Rise of Inbound Links

Of course, keywords are not the only way to prioritize web pages, as Google demonstrated in the late 1990s. The growing search engine company built its service on the insight that a web page’s value could also be measured by the number of links pointing to it from other sites. The thinking was that the more websites that find a certain web page valuable enough to link to, the more likely it is that the page will satisfy user queries. Google ran with this idea in PageRank, an algorithm that would play a large role in search engine optimization for many years.

Created by and named after Google co-founder Larry Page, PageRank was one of the first true search engine algorithms. It worked by counting the links to each website and then ranking websites based on the number of links they received. It also attempted to analyze the value of each link, assigning more importance to links that seemed more valuable. The more links (especially valuable ones) a web page received, the better the page would rank. In addition, starting in 2000, Google displayed a PageRank score in the Google Toolbar to show visitors (and website owners) where a page ranked in Google’s world.

PageRank had a huge impact on search engine optimization. Marketers could improve their websites’ performance by earning more inbound links to their web pages, and they could immediately see on the toolbar how successful their strategy was in Google’s eyes. While Google only retired the public PageRank score this year, its importance in Google’s rankings gradually declined as more advanced methods of determining page value were developed. Despite that decline, however, the focus on inbound links that it established continues today, as marketers still try to grow the number of links to their sites, and as Google continues to reward websites that have large numbers of valuable inbound links.
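The link-counting idea described above can be sketched in a few lines of code. The following is a toy illustration only, not Google’s actual implementation: the sample link graph, the damping factor, and the iteration count are all assumptions chosen for demonstration.

```python
# Toy power-iteration sketch of the PageRank idea: a page's score is
# built from the scores of the pages linking to it, so links from
# important pages count for more.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score (the "teleport" term).
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Distribute this page's rank evenly across its outbound links.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # A page with no outbound links spreads its rank over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical three-page web: "c" is linked to by both "a" and "b",
# so it ends up with the highest score.
graph = {"a": ["c"], "b": ["c"], "c": ["a"]}
print(pagerank(graph))
```

The scores always sum to 1, so a page’s rank can be read as its share of importance in the whole link graph.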

The Problems of Early SEO

The reliance on keywords and inbound links for determining rankings led to a number of problems that plagued SEO from the 1990s to the early 2000s. These problems degraded the quality of the content that appeared in the search results, leaving users with unreliable and disappointing results much of the time.

This proliferation of poor content resulted from the fact that web crawlers were easy to manipulate. For example, when crawlers began ranking pages based on the number of keywords the page contained, they could not distinguish between relevant content and meaningless content that simply repeated the same keyword over and over. As a result, many web pages would rank high in the search results simply because marketers filled the page with lots of keywords.

In addition, websites began engaging in artificial link building practices. Strategies such as purchasing links meant that websites could quickly obtain many bogus inbound links. These links were not earned by solid content but rather purchased or created for the purpose of getting the website to rank better in the search results. With web crawlers unable to accurately analyze the quality of these links, Internet users were bombarded with web pages that were useless for their purposes.

Google Algorithm Updates

The problem of poor content began to be resolved in 2003, when Google took the lead in revising its algorithms to make them more difficult to trick with black hat SEO techniques. These updates gradually transformed how web crawlers understood website content. Over time, they allowed the algorithms not only to analyze certain characteristics of a web page but also to assess how relevant and high quality its content was. As a result, black hat SEO techniques became easier to detect and more difficult to get away with. Instead of being able to stuff a page with 100 repetitions of a keyword, for example, marketers now had to make sure that the keyword was placed in the midst of content that meaningfully addressed the topic.

While Google updates its algorithms many times a year, a few updates have stood out because of the extent of change they introduced to the field of search engine optimization. Panda, introduced in 2011, penalized websites whose content was repetitive or overloaded with keywords (a practice called keyword stuffing, a holdover from the 1990s that had not yet been fully eliminated). The Penguin update in 2012 improved Google’s ability to determine which links were valid and which were artificial or irrelevant. And an update in 2015 pushed businesses to create mobile responsive websites in order to rank favorably in the search results.

In addition, Google began concealing the details of its algorithms from marketers, so they could not see exactly how rankings were computed. Many updates are also rolled out with little advance notice. These practices make it more difficult for marketers to develop black hat techniques to manipulate the algorithms, and as a result the algorithms produce much more meaningful results for users.

Impact of Google’s Algorithm Updates

These algorithm updates were an influential factor in changing how marketers approached SEO. When Google began changing its algorithms, these changes were accompanied by penalties for websites that did not comply with the new standards in search engine optimization. These penalties could be as minor as a drop in the rankings, or as severe as a lengthy (and public) ban from the search results. There were a few of these very public bans, including businesses as well-known as BMW and J.C. Penney. Google used these bans as an opportunity to remind others that black hat SEO tactics were unacceptable.

Because of the combination of new algorithm updates and stiff penalties, websites began to cut back on the amount of keyword stuffing, cloaking, and link buying in which they had been engaged. The field of search engine optimization began to focus less on manipulating the algorithms to get low quality content on top of the search results and began to focus more on creating quality content that met users’ needs. As this shift occurred, the quality of the content available online began to improve, and users found it easier to locate meaningful information through their Internet searches.

This shift toward better content is an ongoing process. A few SEO “experts” still try to game the system through black hat techniques that trick algorithms into seeing poor content as the best answer to users’ questions. This is in part why search engine algorithms are generally kept secret, and why search engines like Google release frequent updates to them. However, the trend is toward more complex algorithms that do an ever-better job of understanding content’s value, quality, and relevance for a particular search. As a result, search engine optimization continues to shift toward creating the type of content that users want, instead of simply trying to rope visitors in through deceptive strategies.

A Shift Toward User Experience and Personalization

Of course, while Google’s algorithms have had a significant impact on the shape of SEO over the years, they have not been the only factor changing how marketers approach their online presence. In 2008, Google Suggest became a standard feature of Google search. As its name implied, it suggested search queries to users as they typed, drawing on popular and related searches. Around the same time, local search features were introduced to help users find information on businesses, products, and services in their areas. These new features soon became only a small part of a rapidly growing trend: personalization and user experience.

Personalization in SEO means that the web search experience is tailored to the particular needs of the individual user. User experience means that the focus is on how easy it is for the user to find the content they need. This is why Google now displays instant answers directly on its search results pages for certain queries. While sometimes unpopular with SEO marketers, these answer boxes improve the user experience.

Likewise, the importance of high quality content has continued to rise. The reason is that high quality content meets users’ needs. When they are searching for something online, users want to receive in-depth, engaging, accurate, and reliable information about that topic. Because Google’s algorithms are so advanced, it is becoming increasingly necessary for websites to produce this type of content if they want to rank well and attract users to their websites. Other features of SEO, such as reliable link building practices, smart keyword optimization, and social media engagement, also come from a need to deliver a personalized and high quality user experience in order to rank well in the search results.

Mobile Responsiveness and SEO

One of the most significant changes to search engine optimization in modern times has been Google’s emphasis on mobile responsiveness in websites. This term refers to the ability of a website to adjust its size and content display to make it easy to use on many different kinds of devices (from smartphones to desktop computers). In 2015, Google made mobile responsiveness a factor in how it ranked web pages: pages that are mobile responsive receive higher rankings than pages that are not.

This shift by Google reflected how users now approach online searches. With so many people searching on smartphones, and with many searches taking place across multiple devices, websites needed to function equally well on any type of device. Otherwise, a user might find it difficult to complete certain tasks (such as filling out forms or performing basic navigation) on a smartphone, tablet, or other device with a smaller screen.

This change means that good search engine optimization must now take web design and web development into consideration instead of just focusing on content. A successful search engine optimization campaign must include both high quality content and a design that performs well on a variety of devices. If you can create a site that delivers great content on any type of device, you are much more likely to end up in a favorable position in the search results, experience a lower bounce rate, and earn more visitors and business than would have been possible otherwise. Web design and development are now integral parts of successful SEO.

SEO Today and in the Future

Today, search engine optimization encompasses almost every area of web design and development. It has become a broad field that focuses on developing websites and content that meet user needs. As a result, SEO includes content creation, back end and front end website development, social media strategies, and more. That is why it is increasingly difficult to engage in DIY SEO. Instead, you typically need the services of SEO experts who understand the field, stay on top of its frequent changes, and have the time to devote to a meaningful and successful SEO strategy. As the trend toward personalization, user experience, mobile responsiveness, and increasingly intelligent algorithms continues, the most successful SEO campaigns will come from websites that stay on top of user needs and offer immediate, high quality solutions for them.

Ready to increase the visibility of your site with an up-to-date SEO campaign? Contact us for a quote today.
