Search engine optimization is a method of affecting the visibility of
a website or a web page in a search engine's search results.
As an Internet marketing strategy, SEO considers:
1. how search engines work,
2. what people search for,
3. the actual search terms or keywords typed into search engines, and
4. which search engines are preferred by their targeted audience.
Optimizing a website may involve editing its
content, HTML and associated coding to both increase its relevance to specific
keywords and to remove barriers to the indexing activities of search
engines.
Promoting a site to increase the number of
backlinks, or inbound links, is another SEO tactic.
History
Spider and Indexer
Initially, all webmasters needed to do was submit the address of a page, or URL, to
the various engines, which would send a spider to crawl the page, collect
information from it, and send that information back to the engine's server to be
indexed. The indexer extracts various information about the page, such as the words
it contains, where they are located, and any weight for specific words, as well as
all the links the page contains, which are then placed into a scheduler for
crawling at a later date.
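The crawl-and-index pipeline described above can be sketched with a toy inverted index. The page contents and `build_index` function below are illustrative assumptions, not any engine's actual implementation:

```python
import re
from collections import defaultdict

# Hypothetical crawled pages: URL -> extracted text
pages = {
    "/a": "search engines index pages",
    "/b": "engines crawl and index the web",
}

def build_index(pages):
    """Build a toy inverted index: word -> {url: [word positions]}."""
    index = defaultdict(dict)
    for url, text in pages.items():
        for pos, word in enumerate(re.findall(r"[a-z]+", text.lower())):
            index[word].setdefault(url, []).append(pos)
    return index

index = build_index(pages)
print(index["index"])  # {'/a': [2], '/b': [3]}
```

A real indexer additionally records per-word weights (for example, words appearing in titles or headings) and feeds the extracted links back into the crawl scheduler, as described above.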
According to industry analyst Danny
Sullivan, the phrase "search engine optimization"
probably came into use in 1997.
Early versions of search algorithms relied on
webmaster-provided information such as the keyword meta tag, or index files in engines. Meta tags provide a guide to each page's
content. Using meta data to index pages was found to be less than reliable,
however, because the webmaster's choice of keywords in the meta tag could
potentially be an inaccurate representation of the site's actual content.
Inaccurate, incomplete, and inconsistent data in meta tags could and did cause
pages to rank for irrelevant searches.
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's
control, early search engines suffered from abuse and ranking manipulation. To
provide better results to their users, search engines had to adapt to ensure
their results pages showed the most relevant search results, rather than unrelated
pages stuffed with numerous keywords by unscrupulous webmasters. Since the
success and popularity of a search engine is determined by its ability to
produce the most relevant results for any given search, poor-quality or
irrelevant search results could lead users to find other search sources.
Larry Page and Sergey Brin founded Google in 1998. Google attracted a loyal following among the growing
number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink
analysis) were considered as well as on-page factors (such as keyword
frequency, meta tags, headings, links and site structure) to enable
Google to avoid the kind of manipulation seen in search engines that only
considered on-page factors for their rankings. Although PageRank was more
difficult to game, webmasters had already developed link building tools and
schemes to influence the Inktomi search engine, and these methods proved
similarly applicable to gaming PageRank. Many sites focused on exchanging,
buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for
the sole purpose of link spamming.
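PageRank itself can be illustrated with a short power-iteration sketch over a hypothetical three-page link graph. The graph, damping factor, and iteration count here are illustrative choices, not Google's production values:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank for a small link graph with no dangling pages."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Rank flowing into p from every page q that links to it
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

# C is linked by both A and B, so it ends up ranked highest
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # C
```

This is what makes PageRank harder to game than on-page factors alone: a page's score depends on the scores of the pages linking to it, which is precisely the quantity link farms tried to inflate by fabricating incoming links at scale.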
By 2004, search engines had incorporated a wide
range of undisclosed factors in their ranking algorithms to reduce the impact
of link manipulation. In June 2007, The New York Times' Saul Hansell stated that
Google ranks sites using more than 200 different signals.
The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages.
In 2005, Google began personalizing search results for each user. Depending on a
user's history of previous searches, Google crafted results for logged-in users.
In 2008, Bruce Clay said that "ranking
is dead" because of personalized
search. He
opined that it would become meaningless to discuss how a website ranked,
because its rank would potentially be different for each user and each search.
In 2007, Google announced a campaign against paid
links that transfer PageRank. On June 15, 2009, Google disclosed that they had taken
measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links.
In December 2009, Google announced it would be
using the web search history of all its users in order to populate search
results.
Google Instant, real-time search, was introduced in late 2010
in an attempt to make search results more timely and relevant. Historically,
site administrators had spent months or even years optimizing a website to increase
its search rankings. With the growth in popularity of social media sites and blogs,
the leading engines made changes to their algorithms to allow fresh content to
rank quickly within the search results.
In February 2011, Google announced the Panda update, which penalizes websites containing
content duplicated from other websites and sources. Historically, websites had
copied content from one another and benefited in search engine rankings by
engaging in this practice; however, Google implemented a new system that
punishes sites whose content is not unique.
In April 2012, Google launched the Google Penguin update, the goal of which
was to penalize websites that used manipulative techniques to improve their
rankings on the search engine.
In September 2013, Google released the Google Hummingbird update,
an algorithm change designed to improve Google's natural language processing
and semantic understanding of web pages.
Methods to get a higher rank
Getting indexed
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic
search results.
Pages that are linked from other search engine indexed pages do
not need to be submitted because they are found automatically.
Search engine crawlers may look at a number of different
factors when crawling a site; not every page is indexed by the search
engines. The distance of pages from the root directory of a site may also be a
factor in whether or not pages get crawled.
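The click-depth factor can be sketched as a breadth-first crawl that records each page's link distance from the root. The internal link graph below is a made-up example, not data from any real crawler:

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to
links = {
    "/": ["/about", "/products"],
    "/about": ["/team"],
    "/products": ["/products/widget"],
    "/team": [],
    "/products/widget": [],
}

def crawl_depths(root):
    """Breadth-first traversal recording each page's link distance from the root."""
    depth = {root: 0}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # visit each page only once
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

print(crawl_depths("/"))  # '/products/widget' sits two clicks from the root
```

Pages many clicks away from the home page may be crawled less often or skipped entirely, which is why relatively flat site structures are commonly recommended.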
Preventing crawling
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots.
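A minimal sketch of how a compliant crawler honors robots.txt, using Python's standard `urllib.robotparser`; the rules shown are a made-up example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt served at https://example.com/robots.txt
robots_txt = """\
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# Compliant spiders check permission before fetching a URL
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
```

For the per-page route mentioned above, placing `<meta name="robots" content="noindex">` in a page's head asks engines to drop that page from their index even if it gets crawled.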
Increasing prominence
A variety of methods can increase the prominence of a webpage within the search results.
- Cross linking between pages of the same website to provide more links to its most important pages may improve its visibility.
- Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.
- Updating content so as to keep search engines crawling back frequently can give additional weight to a site.
- Adding relevant keywords to a web page's meta data, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic.
- URL normalization of web pages accessible via multiple URLs, using the canonical link element or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
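The URL normalization point in the list above can be sketched in Python: the function below collapses a few common variants (host case, default ports, trailing slashes) into one canonical form. Which variants to collapse is a design choice, and real sites usually pair such normalization with a `<link rel="canonical">` tag or 301 redirects:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url):
    """Collapse common URL variants to a single canonical form (illustrative rules)."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower()                          # host names are case-insensitive
    if scheme == "http" and netloc.endswith(":80"):
        netloc = netloc[:-3]                         # drop the default HTTP port
    elif scheme == "https" and netloc.endswith(":443"):
        netloc = netloc[:-4]                         # drop the default HTTPS port
    if not path:
        path = "/"                                   # empty path means the root
    elif len(path) > 1 and path.endswith("/"):
        path = path[:-1]                             # strip a trailing slash
    return urlunsplit((scheme, netloc, path, query, ""))

print(normalize_url("HTTP://Example.COM:80/widgets/"))  # http://example.com/widgets
```

With every variant mapped to the same string, inbound links to any of the variants can be credited to a single page rather than splitting its link popularity.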
Online advertising, also called Internet advertising, uses the Internet to deliver
promotional marketing messages to consumers.
Like other advertising media, online advertising frequently involves both a publisher, who integrates advertisements into its online content, and an advertiser, who provides the advertisements to be displayed on the publisher's content.
Other potential participants include advertising agencies, who help generate and place the ad copy; an
ad server, which delivers the ad and tracks statistics; and advertising affiliates, who do independent promotional work for the advertiser.
Online advertising is a large business and is growing rapidly.