Monday, April 18, 2011

SEO UPDATE ON GOOGLE’S TOP SECRET PANDA ALGO CHANGE! (Dewaldt Huysamen) - Special Report

TAME THE BEAR NOW!!!

Including 19 Things You Must Do To Protect Your Rankings From

Michael Campbell



As you all know, pandas are usually very cuddly and subtle creatures until provoked; then they become very dangerous and aggressive and can attack humans on the spot. This is exactly what happened with Google and their new algorithm change, code named Panda. This is one serious bear.

I have done some intensive research and read a lot of other people's comments and articles concerning the algo change, but I have to say Michael Campbell nailed it best, and his report is the most up-to-date and correct.

To get all the details on this algo change, also nicknamed Farmer by the SEO crowd, you'll have to read this whole report compiled by me.

So the content farms were hit the hardest, and so were the article directories and even some press release directories. Yet some of the sites which were questionable still remained. How did they do it?

What you get in this report is actionable advice: 19 steps from Michael Campbell that you can do right now to improve your rankings. And as an added bonus, I tell you the one thing that everyone does out of habit that is actually causing your rankings to fall. I tell you how to fix it, so you can get found ahead of the competition.

Introduction

Here is some brief history first of all that explains how we aggravated the bear.

In late 2009, Google's Caffeine algo update added feed results from Twitter, Facebook and other social media websites to their search results pages. Problem was, these results were accidentally filled with dirt from content farms, scrapers and low value pages.

Then throughout 2010, Google filtered out most of these dirty gibberish pages and most of the scraped content. Most of the "auto posting" software was also detected, because of the major evidence it leaves behind.

When we started this year, Google warned us in the Official Google Blog that the big Panda was about to be released from its cage.

Why did they do this? “To find more high quality sites in their search engine.”

So this angry bear affects over 11.8% of Google's overall queries, which means nearly 12 million pages are affected per day.

To give you a bird's-eye view of the damage that has been caused, some of the article directories' keyword rankings went down over 90%, according to SearchMetrics.com.

A lot of the article directories were hit because most of the articles found on them these days are of low quality, and it was found that after reading the first paragraph of an article, people would go back to Google and search for other results.

This is not true for all articles on article directories, or for the similar low quality press releases found on press release sites.

There are some good quality articles to be found on article directories, but overall they were not always worthy of the first place spots they held in the search engines.

Sites with excessive advertising were also hit hard. You know the ones.

They are the ones with a 3-to-1 advertising-to-content ratio above the fold, many with blinking banners and animated GIFs that don't adhere to the IAB's advertising standard of 15 seconds max for animation.

One example is an article site that lost nearly all its rankings: it had three large AdSense blocks, a blinking banner, and two large ad blocks from three separate ad networks.

Most of this content was user generated, in tiny, impossible-to-read 8-point type, except the comments like "Great post," which are obvious plants. Sheesh!

Well, I hope the spammers are happy now. All their rubbish was dumped in the last oasis of links for years, and now the water is no good.

Article marketing is near death now, and content mills and MFA sites are dead.

But wait, do not worry, for in the words of Albert Einstein, "In the middle of every difficulty lies opportunity."

Google’s Modus Operandi
So what does Google like to do? They hire a bunch of students, record the behaviour patterns of the humans, program the machine to mimic the human behaviour, then get rid of the humans.

In the past this method has worked well for Google; the Florida update was based on it. So it will come as no surprise that Google is using human reviewers again.

The "raters," as Google likes to call them, visit a list of sites and vote on them with a custom Chrome browser extension. Their behaviour gets fed into a database. Then the engineers program the algo to mimic what the humans did.

In the Wired.com Q&A with Amit Singhal and Matt Cutts, the Google reps revealed that they told their "outside testers" or raters to ask themselves questions like these:

Would you be comfortable giving this site your credit card?

Would you be comfortable giving medicine prescribed by this site to your kids?

Do you consider this site to be authoritative?

Would it be okay if this was in a magazine?

Does this site have excessive ads?

Then Amit Singhal stated, "...based on that, we basically formed some definition of what could be considered low quality."

In addition, Google also released the Site Blocker Chrome browser extension. Anyone could download it and block sites they don't like. Thousands of Chrome users did just that.

Interestingly enough, Google did not use this collected data in the Panda update, but they did look at it, and now use it to justify the algo change to the public.


As Amit Singhal said in the Wired.com interview, "...we compared and it was 84 percent overlap, between sites downgraded by the Chrome blocker and downgraded by the update. So that said that we were in the right direction."

It's also interesting to note that Google's blog states, "...while we're not currently using the domains people block as a signal in ranking, we'll look at the data
and see whether it would be useful, as we continue to evaluate and improve our search results in the future."

Note the words "not currently using the domains" which implies they could add this in the future. It would be easy to
game such data by hiring an army of people in a developing country and voting down your competition... but I'm sure they're aware of that.

Or maybe they're not, because a few sentences later in the interview Matt Cutts states, "Our most recent algorithm does contain signals that can be gamed." Sheesh, no kidding. I'll talk about those more later.


Continuing the Wired.com interview about using outside human raters, Matt Cutts stated, "I think you look for signals that recreate that same intuition, that same experience that you have as an engineer and that users have. Whenever we look at the most blocked sites, it did match our intuition and experience."

This is very interesting, and makes me think of the movie AI. So the algo was programmed to mimic human intuition. Think about your own behaviour for a moment…

The average person can tell if they like a song in less than three seconds. A TV show in less than one second. A website in less than 10 seconds. Just think how fast you are with the radio buttons in the car, or the TV remote.

So what do people do if they do not like a website? They click off in less than 30 seconds. The faster they leave a website, the poorer its quality.

Temporal Distortion


There used to be a search engine called "Direct Hit," circa 1998 – 2000. It powered several search engines until Ask.com bought the technology over a decade ago.

Rankings were determined by temporal duration, meaning how long someone stays on your site before bouncing back to the engine for more search results. This is the major component of the Panda algo.

And yes, there are still over 200 factors that determine your rankings. All the off-site and on-site factors, along with social reach, apply.

Social reach? It's the number of followers you have. The number of retweets or shares that you get. What you say about others. Discussion about you and your company. They all contribute.

(If your SEO source thinks social reach is new in this algo change they are about two years behind. You better hire a new SEO company.)

It's clear that the new algo is all about how valuable people find your site to be, and this is determined by how long they spend on it.
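To make this dwell-time idea concrete, here is a minimal sketch in Python (my own illustration with made-up numbers, not Google's actual code) of how visit durations could separate a sticky page from a bouncy one:

```python
# Illustrative only: classify pages by average visit duration.
# A visit shorter than 30 seconds is treated as a bounce back to Google.
BOUNCE_THRESHOLD = 30  # seconds

def page_quality_signal(visit_durations):
    """Return (average duration, bounce rate) for a list of visit durations in seconds."""
    if not visit_durations:
        return 0.0, 1.0
    avg = sum(visit_durations) / len(visit_durations)
    bounces = sum(1 for d in visit_durations if d < BOUNCE_THRESHOLD)
    return avg, bounces / len(visit_durations)

# Hypothetical visit logs for two pages
sticky_page = [120, 95, 240, 60, 180]   # readers stay and engage
bouncy_page = [5, 8, 3, 12, 6]          # readers click straight back to the SERP

print(page_quality_signal(sticky_page))  # high average, no bounces
print(page_quality_signal(bouncy_page))  # low average, all bounces
```

On a real site you would pull these durations from your analytics package; the 30-second threshold is just the rule of thumb mentioned above, not a number Google has published.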

Now people might ask: how can they track this? Well, easy. Clear your browser cookies, then go to Google.com and see how many new cookies are on your PC.

Do you have any plugins that show Google PageRank? Do you have the Google Toolbar? Guess what: your online activity is being tracked.


Got Google Analytics on your site? Well, guess what: they are using that data too. They're tracking visitor behaviour while visitors interact with your site.


Between the cookies, the plugins, toolbars and stats, they track everything. Every click. Every site you visit. Every page. And most important, how long you stay on each page.

So it isn't just about content anymore. Content may be king, but the king must be dressed in royal clothes and live in a palace, so that you want to hang out with the king for extended periods of time and return to spend more.

So in other words, the visitor's experience on the page, and how engaging, pleasing and attractive the content is, are major factors; they are the human intuition which Matt Cutts was referring to.



So the most important thing you can do is prevent a visitor from returning to Google to continue his search somewhere else.

Accidentally Killing Your Rankings
Almost everyone I know that is serious about their website and its rankings heads over to Google and types in their keywords to see where they rank. Fair enough. Except now you are sending one of two messages…

If you search, but don't click on any listings, you are telling Google that you did not find anything clickworthy. This happens either with a manual search, or with an automated rank checking tool.

You are telling Google that none of the top 10 are usable. So if you had a top 10 spot and didn't click on your listing, you are telling Google that the search results suck and they have to re-evaluate them, thus hurting your ranking.

If you do click on your listing, and we are all guilty of this, there is nothing like the feeling of clicking through to your own website from the Google SERPs. Happy and glowing, you head back to Google and search for another keyword.

And here lies the big problem. You clicked your listing and, in a matter of seconds, went back to Google. The message you are sending Panda is that the page content sucks. You clicked, landed and returned, all in a few seconds, which is the primary indicator of a low quality site…

So do not do that anymore! M’Kay?

And yes, this can be gamed, as Matt Cutts stated: getting people from different IPs to click on your listing and stay on your pages for longer than 10 minutes, or at random intervals of between 5 and 15 minutes.

But guess what: Google tracks IP addresses, just like with AdSense fraud protection. They will catch on, and your site will be banned.
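As a rough sketch of how that kind of IP-based detection might work (my own illustration, not Google's or AdSense's actual system), simply counting clicks per IP exposes the pattern:

```python
from collections import Counter

# Illustrative only: flag IPs that click the same listing suspiciously often.
MAX_CLICKS_PER_IP = 3  # hypothetical threshold for one listing

def suspicious_ips(click_log):
    """click_log is a list of IP strings, one entry per click on a listing.
    Returns the IPs that clicked more than MAX_CLICKS_PER_IP times."""
    counts = Counter(click_log)
    return sorted(ip for ip, n in counts.items() if n > MAX_CLICKS_PER_IP)

# Hypothetical click log: one IP hammers the listing, the rest look organic
clicks = ["10.0.0.5"] * 8 + ["192.168.1.20", "172.16.0.9", "10.0.0.7"]
print(suspicious_ips(clicks))  # the repeat offender stands out immediately
```

A real system would be far more sophisticated (proxies, timing patterns, browser fingerprints), but even this naive count shows why hiring a click army is a losing game.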

So do not take part in these games or black hat tactics. Instead, focus on creating high quality pages that deliver the best solution to your visitor's search, preventing them from going to another page or searching for other results for your keyword.

19 Steps to Prevent Low Quality Pages

Google's Michael Wyszomierski stated that, "...low quality content on part of a site can impact a site's ranking as a whole. You should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages, or moving them to a different domain, could help your rankings for the higher quality content."


1) Check your stats for pages with high bounce rates. Then improve them, or remove them, or you'll lose them in the rankings. Keep in mind that low quality pages can drag down your entire domain.

2) Let me repeat... The single most important thing you can do, is to prevent someone from clicking the back button, returning to Google for more search results. You must provide sticky content in terms of your design, layout, copy, graphics, media and navigation.

3) Try to avoid overly huge navigation. Remove graphical elements like big amateurish buttons and clip art. Use real photographs of people, like the ones you can purchase from iStockPhoto.

4) If a picture doesn't help tell the story, or enhance it, leave it out. Don't add artwork to a page for the sake of colour. There's nothing wrong with white space. In fact, it provides a necessary relief.

5) If you duplicate content from another site, you might think it's curation, or syndication. But if you do so excessively without adding value, it's considered low quality. So you'll want to minimize posting low value content, where all you do is summarize what others have said.

6) Don't be afraid to speak up. Tell us what you know. Share what you've experienced. Be original. Add your thoughts. Interject your opinion. Don't just aggregate... create!

7) Avoid pages with too many (30+) links out. You might think a page of links would make it sticky, so the viewer doesn't go back for more search results. But unless you provide a paragraph detailing each link, it's likely to have the opposite effect.

8) Be sure to put the confidence builders like the credit card logos, secure shopping logos, verified site logo, a toll free 800 number, Better Business Bureau and your other professional trade association logos on your pages. Doing so builds instant trust in the mind of the consumer.

9) Put up all the required legal pages, or lump everything on one "About Us" page. That means having terms of use, privacy policy, spam policy, external links policy, earnings disclaimer, compensation disclosure, DMCA notice, refund policy, etc.

10) If you have AdSense on your pages, you must have wording explaining Google's DoubleClick DART cookies: how they follow you around the internet, tracking your personal viewing habits, so they can determine what kinds of ads you might be interested in.

11) When it comes to ads on your site, low quality doesn't necessarily mean ad heavy. There's plenty of ad-heavy sites still in the index. It's when the ads interfere with the content, or make the content hard to read, requiring additional effort on behalf of the reader to find where the story starts, continues or stops.

12) Another low quality tactic is where a site gives a top 10 list, but the list isn't on one page. The list is actually 10 paragraphs spread over 10 pages, forcing you to look at dozens of ads with slow
load times.

13) Speaking of which, slow loading pages are almost always low quality. They rely heavily on ad servers and they often hang for 10 seconds or more, while waiting for all the ads to load.

14) Fix your spelling and grammatical errors. If you fail to spell check, it's a red flag that your page is low quality.

15) Add some video or sticky content. That's how many video sharing sites emerged unscathed by the Panda: because of the video. They don't experience the traditional bounce back ratios of text based sites. If the user doesn't like the video, they tend to search for others on the same topic before moving on to another site.


16) Videos, audio and photos all engage the audience. At least for a few minutes. As my research shows in the whitepaper, the Ultimate Heatmap, photos of attractive people hold attention longer. So do some types of ads. You can read the paper to find out more.

17) Pick a colour harmony and test your colours. For example, in my research I found that people looking for life insurance quotes hate purple, but love aqua. I tripled conversions, simply by changing the background colour. It affects your bounce rates too.

18) Avoid Arial and Helvetica fonts for your body copy. They were designed for signage in bus and train stations. Use a font that was designed for reading on the computer screen, such as Georgia, Trebuchet or Verdana. The easier it is on the reader, the longer they'll stay.

19) The most important single element of all that everyone can fix... readability. Avoid putting light text on a dark or black background, as it tends to sparkle and
tire the eye. Keep column widths narrow (60 characters max) and the font size at least 12 point, with plenty of line spacing (aka leading) for maximum comfort, legibility and readability.
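Step 1 above is easy to automate. Here is a minimal sketch, assuming you can export per-page bounce rates from your stats package; the URLs, numbers and 70% threshold are all hypothetical:

```python
# Illustrative only: flag pages whose bounce rate exceeds a chosen threshold.
BOUNCE_RATE_LIMIT = 0.70  # 70%; pick a threshold that fits your own stats

def pages_to_fix(stats):
    """stats maps page URL -> bounce rate (0.0 to 1.0).
    Returns the worst offenders, highest bounce rate first."""
    offenders = [(url, rate) for url, rate in stats.items() if rate > BOUNCE_RATE_LIMIT]
    return sorted(offenders, key=lambda item: item[1], reverse=True)

# Hypothetical export from an analytics package
site_stats = {
    "/panda-guide": 0.35,
    "/old-press-release": 0.92,
    "/thin-article": 0.81,
    "/contact": 0.55,
}
print(pages_to_fix(site_stats))  # improve or remove these pages first
```

Work down the returned list: improve each page, move it to another domain, or remove it, since a few pages like these can drag down the whole domain.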

Conclusion

The average time spent on a website is 57 seconds. Are you trying to be average, or are you striving for more? One way boosts your search engine rankings, the other has an angry Panda Bear waiting.

Dewaldt Huysamen, SEO

Via: http://www.ileadtutorials.com/Latest-News/google-panda-seo-update.html