Friday, August 12, 2011

The Complete Google Algorithm



Google’s war on content farms and low-quality websites has officially been launched in the form of a major update to the search engine’s algorithm. The changes, which went live this week, affect 11.8 percent of all search results, which means most site owners will feel the impact, for better or worse.

The blogosphere was buzzing this morning with site owners complaining about a major drop in traffic from Google search. Some have seen search traffic fall by more than 50 percent, essentially overnight. When Google first announced its war on content farms, I suspected the changes would hurt many bloggers as well. However, I did not expect them to hit authoritative sources as hard as they have.
Many SEO gurus have attempted to sketch a rough outline of what the Google algorithm might look like. Based on their research and educated guesses, the formula might look something like this:


Google's Score = (Kw Usage Score * 0.3) + (Domain * 0.25) + (PR Score * 0.25) + (Inbound Link Score * 0.25) + (User Data * 0.1) + (Content Quality Score * 0.1) + (Manual Boosts) - (Automated & Manual Penalties)
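To make that speculation concrete, here is a minimal Python sketch that simply plugs sub-scores into the formula above. Keep in mind that every weight and input is SEO guesswork, not anything Google has confirmed:

def estimated_google_score(kw_usage, domain, pagerank, inbound_links,
                           user_data, content_quality,
                           manual_boosts=0.0, penalties=0.0):
    # Weights copied from the speculative formula above; note they are
    # community guesses and do not even sum to 1.0.
    return (kw_usage * 0.3
            + domain * 0.25
            + pagerank * 0.25
            + inbound_links * 0.25
            + user_data * 0.1
            + content_quality * 0.1
            + manual_boosts
            - penalties)

# Made-up sub-scores on a 0-10 scale:
print(estimated_google_score(7, 6, 5, 8, 4, 9))  # 8.15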

The Algorithm: A Work In Progress

According to Google’s Amit Singhal, the now-famous search algorithm is tweaked twice a day on average. Most changes are not publicized, and most affect only a very small percentage of searches.
Scott Huffman, whose team is responsible for improving the Google search results, explains, “On the one hand, we want to be moving quickly and we want to make great changes. On the other hand, we don’t want people to come to Google and say they don’t recognize it.”
He also said that Google understands what people want when they search. “People are not just expecting a search engine to return every document that has most of the words typed in a query box. They want the context understood; there are a lot of nuances hidden within that.”

Testing The Algorithm Changes

Initially, potential changes to the algorithm are tested on a network of computers that simulate real Google searches. If the results look promising, teams of evaluators test the change by rating the relevance of the results it produces.
The next step is real-world testing, in which the changes are blended into normal searches. “At any given time,” Huffman reveals, “some percentage of our users are actually seeing experiments.” They don’t know it, of course, and the first round of testing ensures none of them will have a bad experience.
If the results continue to be positive, the change might be phased into the existing algorithm. Google handles more than 1 billion search queries every day.
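Google has not published how it buckets users into those live experiments, but a common industry pattern is deterministic hash-based assignment. The sketch below uses invented names and percentages and is a generic illustration, not Google’s internal mechanism:

import hashlib

def in_experiment(user_id, experiment_name, rollout_percent):
    # Hash (experiment, user) so each user lands in a stable bucket
    # and always sees the same variant of the experiment.
    digest = hashlib.sha256(f"{experiment_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000   # buckets 0..9999
    # rollout_percent of 1.5 admits buckets 0..149, i.e. 1.5% of users.
    return bucket < rollout_percent * 100

# Example: roughly 1.5% of users would see the candidate ranking change.
print(in_experiment("user-42", "candidate-ranking-v2", 1.5))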

The Future Of Google Search

Google assures people that the algorithm will never be complete. It’s a work in progress, they explain, and plenty more changes are planned for the future.
Right now, Huffman says, they’re thinking about ways to improve how Google understands and derives inferences from the many languages used around the world.
Singhal himself envisaged what he called the “ultimate dream”: a day when search engines understand users so well that they predict what users want to know and prompt them with messages on their smartphones.

The Official Statement From Google:
Many of the changes we make are so subtle that very few people notice them. But in the last day or so we launched a pretty big algorithmic improvement to our ranking–a change that noticeably impacts 11.8% of our queries–and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites–sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites–sites with original content and information such as research, in-depth reports, thoughtful analysis and so on…
It’s worth noting that this update does not rely on the feedback we’ve received from the Personal Blocklist Chrome extension, which we launched last week. However, we did compare the Blocklist data we gathered with the sites identified by our algorithm, and we were very pleased that the preferences our users expressed by using the extension are well represented. If you take the top several dozen or so most-blocked domains from the Chrome extension, then this algorithmic change addresses 84% of them, which is strong independent confirmation of the user benefits.
Google's keyword search function is similar to that of other search engines. Automated programs called spiders or crawlers travel the Web, moving from link to link and building up an index that records which keywords appear on which pages. Google consults this index when a user enters a search query and lists the pages containing the same keywords that appeared in the user's search terms. Google's spiders may also have more advanced functions, such as the ability to tell the difference between Web pages with actual content and redirect sites, pages that exist only to funnel traffic to a different Web page.
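The core idea, an index of keywords built by following links, can be shown with a toy inverted index. The pages below are hard-coded stand-ins for what a real spider would fetch over HTTP:

import re
from collections import defaultdict

# Stand-in pages; a real crawler would fetch these and follow their links.
pages = {
    "http://example.com/a": "Google updates its ranking algorithm",
    "http://example.com/b": "How spiders crawl and index the web",
}

# Build the inverted index: keyword -> set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in re.findall(r"[a-z]+", text.lower()):
        index[word].add(url)

# A query returns the pages that contain every query term.
query = "index web"
results = set.intersection(*(index[w] for w in query.lower().split()))
print(results)  # {'http://example.com/b'}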


Keyword placement plays a part in how Google ranks sites. Google looks for keywords throughout each Web page, but some sections matter more than others. Including the keyword in the page's title is a good idea, for example. Google also looks for keywords in headings; headings come in a range of sizes, and keywords in larger headings carry more weight than keywords in smaller ones. Keyword dispersal matters too: webmasters should avoid overusing keywords, but many people recommend using them regularly throughout a page, as the sketch below illustrates.
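A toy scorer can make that weighting idea concrete. The tag weights here are invented for illustration; Google's actual weighting is unpublished:

from html.parser import HTMLParser

# Hypothetical weights: a keyword in the title counts most, large
# headings more than small ones, plain body text least.
TAG_WEIGHTS = {"title": 5.0, "h1": 3.0, "h2": 2.0, "h3": 1.5}
BODY_WEIGHT = 1.0

class KeywordScorer(HTMLParser):
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.stack = []    # open tags, so we know where each text run sits
        self.score = 0.0

    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        # Weight each keyword occurrence by its enclosing tag.
        tag = self.stack[-1] if self.stack else None
        weight = TAG_WEIGHTS.get(tag, BODY_WEIGHT)
        self.score += data.lower().count(self.keyword) * weight

html = "<title>Buy widgets</title><h1>Widgets on sale</h1><p>Our widgets are great.</p>"
scorer = KeywordScorer("widgets")
scorer.feed(html)
print(scorer.score)  # 5.0 (title) + 3.0 (h1) + 1.0 (body) = 9.0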