Tuesday, 13 August 2013

What the heck are the Google Panda & Penguin updates?

Newcomers to the SEO industry often ask, 'What the heck is a Google Panda update?' Google these days is criticized not only by its competitors but also by its users and the general public. The reasons range from its privacy policies around people's data to its apparent desire to control content across the internet.
Image Credit - Click Finders
Most people today have grown up using Google as their search engine. The habit of turning to Google for every kind of query has become so strong that they now use it for everything. This not only increases the search engine's revenue but also gives it more control over the content it indexes.


Around two years ago, Google realized that it now had the upper hand on the internet, and in April 2012 it rolled out the Penguin update, a new algorithm in its indexing functionality. Using this algorithm, Google was able to take down websites that relied on black-hat SEO techniques and on cloaking to defraud its indexing bots.

Google Panda, on the other hand, was rolled out in 2011. That update is not just about a website's search engine optimization; it also weighs design, trustworthiness with the public, speed, and many other parameters.

The search engine mentality:


Every search engine works according to a strategy drawn up by the strategy department of its company. Looking back at history, the main purpose for which Google was built was to find important information using keywords (for example, to dig out theses and research papers available on a server such as CNET's). Today, however, those keywords are given more specialized tasks, such as ranking pages according to their worth and prominence.
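To make 'ranking by keywords' concrete, here is a toy sketch in Python, assuming nothing more than a plain term-frequency score over a couple of made-up documents (real engines combine hundreds of signals, so this is purely illustrative):

    # Toy keyword ranking: score each document by how often the query terms
    # appear, normalized by document length.
    def score(document: str, query: str) -> float:
        words = document.lower().split()
        if not words:
            return 0.0
        hits = sum(words.count(term) for term in query.lower().split())
        return hits / len(words)

    docs = {
        "page-a": "panda update explained panda panda seo",
        "page-b": "a long gardening article with one mention of panda",
    }

    # Sort pages so the highest-scoring one comes first for the query.
    query = "panda update"
    for name, text in sorted(docs.items(), key=lambda kv: score(kv[1], query), reverse=True):
        print(name, round(score(text, query), 3))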

Search engines mostly do this by sending crawling robots, or spiders, across websites. These crawlers feed all of a page's code into the search engine's servers, where it is tagged accordingly. HTML5 and CSS3 have played an important part in speeding up this tagging process, as well as in speeding up website design.
Image Credit - Mrsthiessenswiki
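As a rough illustration of what a spider does, here is a minimal Python sketch that fetches a page, pulls out its links, and queues them for later visits. The start URL is a placeholder, and a real crawler would also honor robots.txt, throttle its requests, and tag content far more carefully:

    # Minimal crawler sketch: fetch a page, extract links, queue them to visit.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Collects href values from <a> tags while a page is parsed."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=5):
        """Breadth-first crawl from start_url, visiting at most max_pages."""
        queue, seen = [start_url], set()
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except (OSError, ValueError):
                continue  # skip unreachable pages and non-HTTP links
            collector = LinkCollector()
            collector.feed(html)
            # Resolve relative links against the current page before queueing.
            queue.extend(urljoin(url, link) for link in collector.links)
        return seen

    print(crawl("https://example.com"))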

Impact on websites:


Websites in the old days were neither numerous nor popular. The search engines available then only had to index pages and display them when someone searched for them. However, as more websites appeared on similar topics, individual keywords lost their worth, and most of the sites that had sat at the top lost their rank. Later, Google and several independent programs rolled out keyword-checking tools to measure the competition on a single keyword.

Moreover, as competition grew through overuse, website developers and some techies began searching for ways to rank their websites higher. And because the search engines judged sites by prominence and quality of content, these techies started inflating that worth through backlinking, spamming, and keyword stuffing.

As a result, their websites' rank on the search engines increased, and the traffic that poured in earned them revenue.

However, the websites ranked higher this way did not have quality content; they simply had a quantity of keywords. Complaints about those sites started rolling into the search engines' inboxes, and in response Google developed new algorithms, with Bing and Yahoo following suit.
Image Credit - bruce krasting

Understanding the system:


Now, anyone who wants to reach the top of the results for a single keyword has to follow the guidelines laid down by the search engines. Bing has its own guidelines, and Google has its own. Notably, Google posted 23 questions on its blog after rolling out the Panda update, and these questions are considered a short summary of the Google Webmaster Guidelines.
Image Credit - SEO design solutions

How to rank well?


There is no exact answer to that question. Some sites rank higher with little hard work, while others put in a lot of effort and still do not rank well. However, there are a few things loved by all, including the search engines, the public, and even the search engine techies.

They are:

1. Authenticity:


Every piece of information placed on a website needs to be authentic. That means if the reporter of a story witnessed the event himself, he should report it under his real name; if he lifted it from somewhere else, the source needs to be cited alongside the information.

2. No Plagiarism:


Stories placed on the internet are sometimes copied by spammers and republished on other websites. Nevertheless, to rank higher, a no-plagiarism policy needs to be enforced: all lifted information must be rephrased, and any information that is not first hand requires a citation (a small similarity sketch appears after this list).

3. Proper Coding:


The information described above needs to be placed on a website that is not only designed properly, with responsiveness and good color contrast, but whose code also passes the W3C validation tests before it is deployed to the cloud.
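As a practical illustration of that last point, here is a small Python sketch that submits a page to the W3C Nu HTML Checker's JSON endpoint. The URL, the out=json parameter, and the headers follow the checker's public interface as I understand it, so treat them as assumptions and confirm against validator.w3.org:

    # Sketch: send HTML to the W3C Nu HTML Checker and print reported problems.
    import json
    from urllib.request import Request, urlopen

    def validate_html(html: str) -> None:
        req = Request(
            "https://validator.w3.org/nu/?out=json",  # assumed public endpoint
            data=html.encode("utf-8"),
            headers={
                "Content-Type": "text/html; charset=utf-8",
                # The service expects a User-Agent identifying the client.
                "User-Agent": "example-validator-client/0.1",
            },
        )
        with urlopen(req, timeout=30) as resp:
            report = json.load(resp)
        # Each message carries a type ("error" or "info") and a description.
        for msg in report.get("messages", []):
            print(f"{msg.get('type')}: {msg.get('message')}")

    validate_html("<!DOCTYPE html><html lang=\"en\"><head><title>t</title>"
                  "</head><body></body></html>")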

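And as promised under 'No Plagiarism' above, here is one minimal way copied content can be detected, using word shingles and Jaccard similarity. This is a standard textbook technique, not necessarily what any particular search engine runs:

    # Sketch: compare two texts with 3-word shingles and Jaccard similarity.
    # A score near 1.0 means the texts are near-duplicates.
    def shingles(text: str, k: int = 3) -> set:
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

    def jaccard(a: str, b: str) -> float:
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    original = "google panda rewards sites that publish original quality content"
    copied = "google panda rewards sites that publish original quality content daily"
    print(round(jaccard(original, copied), 2))  # a high score flags likely copying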
Guessing the future:

Image Credit - Best degree program

Google first rolled out a major algorithm update in 2003, with later updates following in 2011 and 2012. These updates have drawn plenty of criticism from many communities, including news groups and journalists. Most of the criticism boils down to the complaint that search engines have taken the public's job into their own hands, acting as gatekeepers instead of giving the public the whole of the information.

But if we think about the future, the next big thing in the web community is the semantic web, or Web 4.0. This evolution of W3C standards will make websites able not only to carry out tasks but also to decide the content for particular individuals themselves.

The platforms will be based mainly on server-side scripting with a little AJAX or JavaScript. The main difference will lie in how content and data gathered from users around the internet are used. This system is currently under construction and goes by the name 'big data'.
