Saturday 14 January 2017

Google Algorithm for SEO Part 2 (Updates in Year 2005)


Hi Friends, today I am very happy to bring you the second part of the Google Algorithm updates series. In this part I am going to cover the updates from the year 2005 to 2008, along with their effects on websites, and also share the basic reason why Google started updating its algorithm so frequently. Let's start with the updates of the year 2005.

At the start of January 2005, many sites engaged in link exchanges noticed that, because of those exchanges, the strength of their pages was going down. Yet placing links to other important sites is often necessary for users, such as links to government sites and other special resources. Search engines understood the need for an attribute that could preserve link juice, so they introduced "nofollow". When we use this attribute on a link, Google and other search engines do not follow that link and do not pass value from your page to the linked page, and in this way the link juice of the page is saved.
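As a quick illustration (my own minimal sketch, using a hypothetical address), nofollow is simply a value added to the rel attribute of an ordinary anchor tag:

```html
<!-- A normal link: search engines follow it and pass link juice -->
<a href="https://www.example.com/resource">Useful resource</a>

<!-- A nofollow link: search engines are asked not to pass link juice -->
<a href="https://www.example.com/resource" rel="nofollow">Useful resource</a>
```

Nothing changes for the visitor; the hint is only for search engine crawlers.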

After this, some alert webmasters noticed their websites' rankings going up and down due to some change they observed on new sites. Some said it was due to the sandbox concept, and others said it was due to LSI; I think both are correct, as the sandbox concept applies to a new website while the LSI concept applies to a site that is about a year old. Google never officially accepted this change, but webmasters gave the update a name: the Allegra update.


One update came in the month of May with a very interesting name, Bourbon. The name is based on a local street: Brett Tabke of WebmasterWorld was participating in a WebmasterWorld conference in New Orleans in 2005, so this Google update was named after the famous local Bourbon Street.

This update came to light as an effort to clean the SERPs of sites with duplicate content, sites ranking on irrelevant backlinks, sites engaged in link farming that still held good positions in Google's SERPs, and sites suffering from www and non-www issues. A page reachable at both its www and non-www address can be treated by Google as duplicate content, which is why Google also introduced a preferred-domain setting in Google Webmaster Tools, letting you decide which form of your domain you would like to rank in the search engine.
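Besides the Webmaster Tools setting, the www/non-www issue is commonly fixed at the server level. Here is a minimal sketch (my own illustration, assuming an Apache server with mod_rewrite enabled and a hypothetical domain example.com) that permanently redirects the non-www form to the www form, so only one version of each page exists:

```apache
# .htaccess — send non-www requests to the www host
RewriteEngine On
# Match requests whose Host header is exactly example.com (case-insensitive)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# 301-redirect to the same path on www.example.com
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The 301 (permanent) status tells search engines to consolidate the two addresses into one.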

In June 2005, Google's team decided that users should be able to customize their search, for example by specifying preferences such as location, file extension, and language. This by itself is not an algorithm update; the algorithm update is that the SERP now returns results according to these changes in your search settings.

As webmasters, we used to submit our HTML sitemaps to get all our pages indexed. Many times these sitemaps were heavily decorated, so search engine spiders faced problems crawling the pages due to excessive use of JavaScript or complex code. To avoid such problems, Google decided on a new sitemap format, the XML sitemap. It is very lightweight, has one specific format for all types of sites, and with its help it became very easy for Google to index all the pages of a site.
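For reference, here is what that format looks like (a minimal sketch following the standard sitemaps.org protocol, with a hypothetical domain example.com):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Root element declaring the sitemaps.org protocol namespace -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-01-14</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about.html</loc>
    <lastmod>2016-12-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

The file is usually saved as sitemap.xml in the site root and submitted through Webmaster Tools; only the loc tag is required for each URL, the rest are optional hints.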

Some webmasters noticed in the month of September 2005 that there was some kind of fluctuation in rankings, so they assumed another algorithm update, and many reputed SEO sites also reported an update, called Gilligan. Google, however, did not claim this as an update; they said it was just a routine data refresh, and the ranking fluctuation was due to that. In other words, nothing major changed in this update other than the data refresh.
