Cracking the Google Code… Under the GoogleScope


Posted: Sep 19, 2005




By: Lawrence Deon

About the Author

Lawrence Deon is an SEO/SEM Consultant and author of the popular
search engine optimization and marketing model Ranking Your Way
To The Bank.


Google’s sweeping changes confirm the search giant has launched
an all-out assault against artificial link inflation and declared
war on search engine spam in a continuing effort to provide
the best search service in the world… and if you thought you
had cracked the Google Code and had Google all figured out, guess again.

Google has raised the bar against search engine spam and
artificial link inflation to unrivaled heights with the filing
of United States Patent Application 20050071741 on December
31, 2003. On March 31, 2005 it was available online for the
first time.

The filing unquestionably provides SEOs with valuable insight
into Google’s tightly guarded search intelligence and confirms
that Google’s information retrieval is based on historical data.

What exactly do these changes mean to you? Your credibility and
reputation online are going under the GoogleScope! Google
defines its patent abstract as follows:

A system identifies a document and obtains one or more types of
history data associated with the document. The system may
generate a score for the document based, at least in part, on
the one or more types of history data.

Google’s patent specification reveals a significant amount of
information both old and new about the possible ways Google can
(and likely does) use your web page updates to determine the
ranking of your site in the SERPs.

Unfortunately, the patent filing does not prioritize or
conclusively confirm any specific method one way or the other.

Here’s how Google scores your web pages.

In addition to evaluating and scoring web page content, the
ranking of web pages is admittedly still influenced by the
frequency of page or site updates. What’s new and interesting is
what Google takes into account in determining the freshness of a
web page.

For example, if a stale page continues to procure incoming
links, it will still be considered fresh, even if the page
header (the Last-Modified value, which tells when the file was
most recently modified) hasn’t changed and the content itself
has not been updated.
According to the patent filing, Google records and scores the
following web page changes to determine freshness:

· The frequency of all web page changes
· The actual amount of the change itself… whether it is a substantial change or a merely redundant or superfluous one
· Changes in keyword distribution or density
· The actual number of new web pages that link to a web page
· The change or update of anchor text (the text that is used to link to a web page)
· The number of new links to low-trust web sites (for example, a domain may be considered low trust for having too many affiliate links on one web page)

Although no specific number of links is indicated in the
patent, it might be advisable to limit affiliate links on new web
pages. Caution should also be used in linking to pages with
multiple affiliate links.

Developing your web pages augments page freshness.

Now I’m not suggesting that it’s always beneficial or advisable
to change the content of your web pages regularly, but it is
very important to keep your pages fresh, and that may not
necessarily mean a content change.

Google states that decayed or stale results might be desirable
for information that doesn’t necessarily need updating, while
fresh content is good for results that require it.

How do you unravel that statement and differentiate between the
two types of content?

An excellent example of this methodology is the roller coaster
ride seasonal results might experience in Google’s SERPs based
on the actual season of the year.

A page related to winter clothing may rank higher in the winter
than the summer… and the geographical area the end user is
searching from will now likely be considered and factored into
the search results.

Likewise, specific vacation destinations might rank higher in
the SERPs in certain geographic regions during specific seasons
of the year. Google can monitor and score pages by recording
click-through rate changes by season.
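A seasonal adjustment of this kind could be sketched as follows. This is purely illustrative and not Google's actual mechanism: it assumes a page's click-through rate (CTR) is recorded per season and that its score receives a multiplier in seasons where searchers historically click it most.

```python
# Illustrative only: boost a page in seasons where its recorded CTR
# runs above its own yearly average. The function name, data shape,
# and ratio formula are all assumptions for the sake of the example.
def seasonal_boost(ctr_by_season: dict, current_season: str) -> float:
    """Multiplier > 1 when the page earns above-average clicks this season."""
    avg_ctr = sum(ctr_by_season.values()) / len(ctr_by_season)
    if avg_ctr == 0:
        return 1.0
    return ctr_by_season[current_season] / avg_ctr

# A winter-clothing page: clicked heavily in winter, ignored in summer.
winter_coats = {"winter": 0.08, "spring": 0.02, "summer": 0.01, "fall": 0.05}
print(seasonal_boost(winter_coats, "winter"))  # 2.0
print(seasonal_boost(winter_coats, "summer"))  # 0.25
```

The same page is rewarded in winter and demoted in summer without any change to its content, which is the roller-coaster effect the article describes.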

Google is no stranger to fighting spam and is taking serious new
measures to crack down on offenders like never before.

Section 0128 of Google’s patent filing claims that you shouldn’t
change the focus of multiple pages at once.

Here’s a quote from their rationale:

“A significant change over time in the set of topics associated
with a document may indicate that the document has changed
owners and previous document indicators, such as score, anchor
text, etc., are no longer reliable.

Similarly, a spike in the number of topics could
