Best Google Panda Tips, Don’t Get Punk’d Guide!

As of 24th Feb 2011, a major update to how Google ranks sites has affected 12% of search results and halved many sites’ visitor numbers. Named the Farmer or Panda update, it’s only affecting US Google results as I write, but if you’re outside the US it is coming to you soon. Here’s how to find out if you have already been hit, whether you are going to be, and what to do about it.

Can your business handle a 50% drop in organic (non-paid) visits from Google? That’s what might be coming your way courtesy of Google’s Panda algorithm update.

Before we get into the whys and wherefores, find out if you’ve been hit by Panda …

Panda update: If your site gets slapped by the Panda then you are going to have to wait for the next update to find out if any changes you’ve made will get the Panda off your back. Below is a full list of Panda update dates for you to use when checking your site’s analytics reports as shown on this page:

• 24 Feb 2011 (USA only)
• 11 April 2011 (all English-language results)
• 10 May 2011
• 16 June 2011
• 23 July 2011
• 12 Aug 2011 (all languages, and probably not a new ‘English’ update)
• 28 Sept 2011
• 9 Oct 2011
• 13 Oct 2011
• 20 Oct 2011
• 18 Nov 2011 (Google’s tweet announcing this Panda update)
• 15 Jan 2012
• 28 Feb 2012
• 23 March 2012
• March to April 2012. There were several non-Panda-related updates which could have affected your site.
• 19 April 2012. Don’t confuse changes resulting from this update with those from the Penguin webspam update of (approx) 24 April. That’s a whole other thing.
• 24 April 2012. This is the Penguin webspam update, not Panda.
• 27 April 2012. Yes, just eight days after the previous Panda. Apparently this was a ‘small’ update.
• 24 July 2012. Said to have affected 1% of searches and not thought to have been an important one.
• 20 Aug 2012. 1% of searches again affected.
• 27 Sept 2012. 2.4% of English queries impacted; belatedly announced.
• 28 Sept 2012. Non-Panda – this one hit Exact-Match Domains. Less than 0.7% of “low quality” sites affected this time.
• 5 Nov 2012. Panda – 1.1% of English-language queries in the US; 0.4% worldwide; confirmed, not announced, as of 6 Nov.
• 21 Nov 2012. Panda – 0.8% of English-language queries; 0.4% worldwide; confirmed, not announced, as of 30 Nov.
• 21 Dec 2012. 1.3% of English-language queries affected; officially confirmed as a “refresh”.
• 22 Jan 2013. 1.3% of English-language queries affected; officially confirmed as a “refresh”.
• 15 March 2013. Panda – confirmed as coming but not confirmed as having happened. Google says that no further discrete “updates” will happen; instead Panda will be rolled out along with normal algorithm changes.
• 22 May 2013. Penguin ‘2.0’ (or ‘#4’) update. 2.3% of English-language queries affected.
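If you keep a daily export of your Google organic visits, you can check those numbers against these dates programmatically rather than by eyeballing graphs. This is a minimal sketch, assuming a plain dict of date → visit count; the visit numbers and the seven-day comparison window are invented for illustration:

```python
from datetime import date, timedelta

def drop_after_update(daily_visits, update_day, window=7):
    """Fractional change in average daily visits across an update date."""
    before = [daily_visits[update_day - timedelta(days=i)] for i in range(1, window + 1)]
    after = [daily_visits[update_day + timedelta(days=i)] for i in range(1, window + 1)]
    avg_before = sum(before) / window
    avg_after = sum(after) / window
    return (avg_after - avg_before) / avg_before

# Fabricated example: a steady 1,000 visits/day, halving after 24 Feb 2011.
visits = {}
for offset in range(-10, 11):
    d = date(2011, 2, 24) + timedelta(days=offset)
    visits[d] = 1000 if offset <= 0 else 500

change = drop_after_update(visits, date(2011, 2, 24))
print(round(change, 2))  # -0.5: a 50% drop coinciding with the update date
```

A sudden step change aligned with one of the dates above is the signature to look for; a gradual decline is more likely something else.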


Has your site been Panda slapped?

At the time of writing, Panda is only hitting US results. Here’s how to use Google Analytics (GA) to find out if your site is affected.

If your site gets most of its search engine traffic from the US then you probably already know if you’ve been affected by Panda or not. With this guide you can see the details of the damage and learn how to analyze where problems might be.

If your site is not US-centric then follow the steps below to see if you will be affected when Panda rolls out across the world.

First go to your GA dashboard.

If your site is not US-centered then you might see something like the graph below and think all is well:

All OK on GA?

But dig deeper. Go to the Search Engines report in the Traffic Sources menu (and choose ‘non-paid’):

Google traffic on GA

Then click ‘Google’ to see Google-only traffic (see below):

Click the ‘Keyword’ column heading above the list of keywords (See this highlighted in green in the image below). This reveals a large sub menu (again see below) on which you click ‘Country/Territory’:

Country/Territory sub-menu on GA

Enter ‘United States’ into the filter at the bottom of the list of countries.

United States filter on GA

Press ‘Go’ and hope you don’t see this:

Google US non-paid organic visits

That’s more than a 50% drop in organic (non-paid) visits from Google US. Scared yet?

Alternatively, use advanced segments to see organic US Google visits

Using Google Analytics advanced segments will give you more power to analyze what’s happening. Here’s how:

Choose ‘Advanced Segments’ from the left hand menu and then ‘Create new segment’.

Configure with:

• ‘Medium’ Matches exactly ‘organic’


• ‘Country/Territory’ Matches exactly ‘United States’


• ‘Source’ Contains ‘google’

That looks like this:

Configuring GA advanced segments

Perhaps name that segment ‘G US organic’

Apply this segment to your GA reports and all the data you see will now be for this segment of visitors only. As you’ll see below, this allows you to look at which of your pages have fared best and worst from Panda.
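If you prefer to work on exported data, the same segment can be approximated in a few lines. A minimal sketch, assuming visit records exported as plain dicts (the field names here are illustrative, not GA’s actual API):

```python
# Invented sample of visit records, roughly what a GA export might contain.
visits = [
    {"source": "google", "medium": "organic", "country": "United States", "landing_page": "/articles/a"},
    {"source": "google", "medium": "cpc", "country": "United States", "landing_page": "/articles/a"},
    {"source": "google.co.uk", "medium": "organic", "country": "United Kingdom", "landing_page": "/forum/t1"},
    {"source": "bing", "medium": "organic", "country": "United States", "landing_page": "/articles/b"},
]

def g_us_organic(records):
    """Medium matches 'organic', country matches 'United States', source contains 'google'."""
    return [r for r in records
            if r["medium"] == "organic"
            and r["country"] == "United States"
            and "google" in r["source"]]

segment = g_us_organic(visits)
print(len(segment))  # 1: only the first record passes all three conditions
```

Note the three conditions mirror the segment configuration above: two exact matches and one ‘contains’ match.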

Before we do that, let’s explore what Google are trying to do.

What in the name of Google is going on?

The aims of Panda are noble: to remove poor quality sites from the top of Google’s results pages. Or as Matt Cutts, Google’s head of spam, puts it in a blog post announcing Panda:

“This update is designed to reduce rankings for low quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”

The last thing Google wants is searchers being unhappy with what they find. They might try another search engine if that happens.

Few people other than the low quality sites’ owners and their investors will have a problem with that.

But all major Google updates leave ‘collateral damage’ behind them: sites that just don’t match the target or deserve to be penalized. Google are aware of this and so have asked those with “a high quality site that has been negatively affected by this change” to let them know about it here.

So if you have a high quality site that’s been adversely affected by Panda Farmer then let Google know.

The site used as an example on this page is a high quality site hurt by Panda. Its core content is hundreds of long, in-depth specialist articles plus a Q&A-based forum for readers’ problems.

Perhaps the Q&A pages are the problem (those pages could look like thin content to Google’s robots). But I know of two similar sites in different markets that have also been hit and don’t have a Q&A-based forum. No, it won’t be that easy to work out why an innocent site has suffered.

What factors make a site vulnerable to Panda?

Google like to keep these things secret but the two engineers at the heart of Panda, Matt Cutts and Amit Singhal, gave us some strong clues in an interview with Wired.

Cutts and Singhal revealed their process which I’ll summarize as:

• Conduct qualitative research (that’s speaking with individuals and not a big questionnaire) to find out which of a sample of sites they considered to be low quality and why.

• Use the results to define low quality sites with the factors that Google can measure. This gives Google a mathematical definition of low quality.

If we start here, we can think of a number of factors that Google might be able to measure to define low quality, including:

• A high % of duplicate content. This might apply to a page, a site or both. If it’s a site measure then that might contribute to each page’s evaluation.

• A low amount of original content on a page or site.

• A high % (or number) of pages with a low amount of original content.

• A high number of inappropriate adverts (ones that don’t match the search queries a page does well for), especially high on the page.

• Page content (and page title tag) not matching the search queries a page does well for.

• Unnatural language on a page including heavy-handed on-page SEO (‘over-optimization’ to use a common oxymoron). Eg unnatural overuse of a word on a page.

• High bounce rate on page or site.

• Low visit times on page or site.

• Low % of users returning to a site.

• Low clickthrough % from Google’s results pages (for page or site).

• High % of boilerplate content (the same on every page).

• Low or no quality inbound links to a page or site (by count or %).

• Low or no mentions or links to a page or site in social media and from other sites.

If any of these factors is relevant to Panda, it is unlikely to count on its own.

Multiple factors will likely be required to earn ‘Panda points’ (and points do not mean prizes in this game). Panda points add up. Cross a threshold (the Panda Line) and all the pages on your site seem to be affected. This includes quality original pages being ranked well below useless scraper sites that have stolen your content.

Google have said that “low quality content on part of a site can impact a site’s ranking as a whole.”
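To make the ‘Panda points’ idea concrete, here is a toy model of a weighted, thresholded site score. The factor names, the weights and the threshold are pure speculation on my part, not anything Google has published:

```python
# Speculative weights for a handful of measurable quality signals,
# each signal scaled 0..1 (1 = worst).
FACTOR_WEIGHTS = {
    "duplicate_content_pct": 0.4,
    "thin_page_pct": 0.3,
    "ad_heavy_pct": 0.2,
    "high_bounce_rate": 0.1,
}
PANDA_LINE = 0.5  # invented threshold

def panda_score(site_signals):
    """Weighted sum of quality signals for a whole site."""
    return sum(FACTOR_WEIGHTS[k] * site_signals[k] for k in FACTOR_WEIGHTS)

site = {"duplicate_content_pct": 0.6, "thin_page_pct": 0.7,
        "ad_heavy_pct": 0.5, "high_bounce_rate": 0.4}
score = panda_score(site)
print(score > PANDA_LINE)  # True: in this model the whole site is demoted
```

The point of the model is the shape, not the numbers: no single signal pushes the site over the line, but several mediocre signals together do, and then every page pays.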

It’s important to define the difference between an algo change and a penalty.

A penalty must be served out if it has a time limit, or lifted if it is to be removed.

An algo change exists and its results will continue until it is changed, your site changes (or your site gets whitelisted).

Panda is an algo change but no ordinary one. It’s an algo change that works like a penalty because if your site crosses the Panda Line then the whole site is affected, quality pages too.

Panda is penalty by algo.

Thanks to Franz Enzenhofer for pointing out a misreading in an earlier version of this article of Matt Cutts use of the word ‘block’ in the Wired interview.

Is a Panda Slap applied site-wide or at the page level?

If a Panda Slap is site wide then all pages should experience a similar drop in Google organic traffic. On our example site, let’s use the ‘G US organic’ advanced segment to see if that is so …

Go to Content > Top Landing Pages. See below (remember, in this segment we are only looking at visits from organic searches on Google in the US so we have no need to restrict the GA report beyond ‘Landing pages’):

Top landing pages on GA

This report lists all 4,272 landing pages. To test if all pages are equally affected by Panda we can filter the report to show:

• Individual pages. Select a sample and look for exceptions to the visits drop shown above

• Types of pages that can be identified by shared strings in their URLs. Eg forum pages might all have /forum/ in their URLs.

Use the filter at the bottom of the report to do this. Eg:

Filter landing page URLs

I’ve done this on a few sites hit by Panda and I can say that some pages were hit more than others and a few did well out of Panda.

So Farmer Panda is at least to some degree working at the page level.
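The URL-pattern filtering above can also be sketched in code, assuming you have exported before-and-after visit counts per landing page (all URLs and numbers here are invented):

```python
# Invented landing-page data: Google US organic visits before and after Panda.
pages = [
    {"url": "/forum/thread-1", "before": 300, "after": 330},
    {"url": "/forum/thread-2", "before": 200, "after": 210},
    {"url": "/articles/long-guide", "before": 900, "after": 400},
    {"url": "/articles/other-guide", "before": 700, "after": 300},
]

def change_by_type(pages, pattern):
    """Percentage change in total visits for pages whose URL contains `pattern`."""
    matched = [p for p in pages if pattern in p["url"]]
    before = sum(p["before"] for p in matched)
    after = sum(p["after"] for p in matched)
    return round(100 * (after - before) / before, 1)

print(change_by_type(pages, "/forum/"))     # 8.0: forum pages slightly up
print(change_by_type(pages, "/articles/"))  # negative: article pages hit hard
```

Grouping by URL substring is exactly what the GA landing-page filter does; doing it offline just makes it easier to compare many page types at once.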

Find out what types of page have been hit on your site

If your site has been hit then use the filter on GA (as shown above) to find out which pages got hit most by Panda.

I found lots of pages with high quality, unique, in-depth (sometimes thousands of words long) articles that were hit much harder than average. So again, there are no simple answers here. But these did have more advertising on them than average for the sites concerned.

Some forum pages had significant increases in visits. These had long threads, a fair amount of advertising on them (including a pop-up) but less than some other pages.

On this site, I would try changing some of the advertising. In particular, there is a big ‘block’ of advertising that doesn’t feature on the forum pages.

That might not be enough, or have any effect at all. For example, on another site I’ve seen hit by Panda, all marketing was removed and no changes have followed (though more time might be needed).

Is a Panda penalty applied at the keyword level?

To find out if Panda is applied at the keyword level and not just to pages you can:

• Find a page that gets results for different keywords

• See if Panda has had different effects on traffic for those different keywords (but to the same page).

If it has then Panda is operating at the keyword level.

I’ve only seen a few examples of Panda having reduced visits to the same page with some keywords but not others. But they are the exception.
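Here is a rough way to run that check on exported data for a single page, with invented per-keyword visit counts. If every keyword falls by roughly the same share, the effect is page-level; if some collapse while others hold, it is keyword-level:

```python
# Invented per-keyword visits to one landing page, before and after an update.
keyword_visits = {
    "widget reviews":      {"before": 400, "after": 100},
    "best widget 2011":    {"before": 250, "after": 60},
    "widget repair guide": {"before": 150, "after": 40},
}

def per_keyword_change(data):
    """Fractional change in visits for each keyword."""
    return {kw: (v["after"] - v["before"]) / v["before"] for kw, v in data.items()}

changes = per_keyword_change(keyword_visits)
# If the spread between the best and worst keyword is small, the drop is uniform.
uniform = max(changes.values()) - min(changes.values()) < 0.15
print(uniform)  # True: every keyword fell by roughly the same share (page-level)
```

The 0.15 spread threshold is arbitrary; the point is simply to separate ‘everything fell together’ from ‘only some keywords fell’.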

The suggestion that Panda operates at the page and site level was supported when I searched Google US with a unique text string (in quotes) from an in-depth, original specialist article, indexed for nearly 10 years, that had dominated a niche in Google’s results for most of that time. I saw:

• 36 scraped versions of the article.

• Two showing above the page with the original.

• One of these being a low quality scrape on a low quality site.

• The other being a part scrape that credits and links back to the original.

• The original page has lost 75% of its organic US Google traffic since Panda.

• That traffic came from over 1,000 different keywords and, of those I tested, none had been spared.

What to do if you’ve been hit by a Panda

Google suggest:

“If you believe you’ve been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content.”

Let’s add a bit more to that, put it into practical actions and make a process …

• Find the pages and page types hit worst on your site.

• Isolate differences between those hit and those not.

• Test changing those factors on hit pages but use this method of analysis with caution because the pages hit most might not be the pages earning you the penalty.

• Make a list of your different types of pages. Eg, forum, quality article, low quality article, light category, quality category, product, blog post, etc. Put the list in a column in a spreadsheet and start building a table.

• Add columns for relevant factors like ‘lots of ads’, little content, some dupe, all dupe, etc and also number of pages and % drop in Google US organic visits. Fill in the values for each type of page.

• Look at how much of your site (% of pages) is taken up by your lowest quality pages and improve that.

• If you are scraping or otherwise copying other sites’ content, replace it with quality original content, or test removing some (or even all) of those pages (and adding 301s from them to relevant pages higher up your site’s hierarchy).

• If you have a large number of pages with dupe (of your own copy), weak or almost no content, improve them or remove (and 301) them or block them from Google with robots.txt.

• If you have lots of pages that dupe your own copy (eg, as happens with some content management systems and on a lot of ecommerce sites that build new URLs for ‘faceted’ pages) then add rel=canonical tags to the ‘duped’ pages. This stops Google seeing those pages as dupes.

• Edit any ‘over-optimized’ pages.

• Improve anything that might make the user’s experience better.

• Offer users more when they first enter a page. Eg, images, videos, attractive text and pages linking to your best, related editorial content.

• If possible, make your content’s language more accessible and more real.

• Promote your content on social media including Twitter and Facebook.

• Build your brand awareness across the web wherever you can.

• If you’re sure your site is ‘Google clean’ and worthy, let Google know about it but don’t expect this to have much effect.

• Make as many of these changes as you can at once in the hope of shaking off the penalty quickly. With editorial content improving, you can then add back any marketing you are missing, in steps, checking to see you don’t get slapped again.
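The page-type audit table suggested above can live in a few lines of code rather than a spreadsheet if you prefer. A sketch with invented page types, quality flags and numbers:

```python
# One row per page type: page count, quality flags, and % drop in
# Google US organic visits. All values are illustrative.
audit = [
    {"type": "forum",           "pages": 2100, "lots_of_ads": False, "thin": True,  "drop_pct": -5},
    {"type": "quality article", "pages": 450,  "lots_of_ads": True,  "thin": False, "drop_pct": -60},
    {"type": "light category",  "pages": 900,  "lots_of_ads": False, "thin": True,  "drop_pct": -40},
]

def worst_hit(rows):
    """Page types sorted from the biggest traffic drop to the smallest."""
    return sorted(rows, key=lambda r: r["drop_pct"])

def share_of_site(rows, **flags):
    """Fraction of all pages matching the given quality flags."""
    total = sum(r["pages"] for r in rows)
    matched = sum(r["pages"] for r in rows if all(r[k] == v for k, v in flags.items()))
    return matched / total

print(worst_hit(audit)[0]["type"])                # quality article
print(round(share_of_site(audit, thin=True), 2))  # 0.87: most of the site is thin
```

Two questions fall straight out of the table: which page types were hit hardest, and how much of the site your lowest quality pages make up; both feed directly into the prioritised fix list above.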

5 Steps To B2B Niche Marketing

Step one:

Use an online business-directory search wizard to generate a list of industries and types of companies. (There are other services online you can use too, and you can even do some research at your local library.)

Step two:

Make a long list of the types of business-to-business companies that interest you.

Don’t worry about why they interest you. It could be because you’ve worked in that industry before. It could be because your background and education match that industry. It could simply be because your gut says you might like working with those types of companies.

Don’t second-guess your choices. At least not yet. Just make a long, long list.

Step three:

Ask yourself, “Do I know anybody in any of these niche markets?”

For example, if you happen to have a lot of contacts in the office equipment industry, then that’s going to give you a serious advantage. You’ll be able to use those contacts to quickly get the word out about your B2B copywriting services and, hopefully, quickly land your first client.

Step four:

Ask yourself, “Will my experience or education give me an advantage in any of these niche markets?”

Ed Gandia asked himself that question when he started his B2B copywriting business many years ago. His background was in high-end software sales. So the software niche was the ideal fit for him. And, today, he’s one of the most successful B2B copywriters I know.

Step five:

Ask yourself, “Are there enough potential clients in these niche markets?”

Ideally, you need to be reasonably assured that there are at least 200 potential clients in the niche market you ultimately choose.

Once you complete these five steps, you’ll have whittled down your list to probably five or six possibilities. Now, do some more research into those niche markets. Check out a few websites. Get a sense of the type of projects you’ll likely be working on.

Then pick the niche market that makes the most sense to you … and jump in with both feet!

Like I said, there are dozens, if not hundreds, of niche markets up for grabs in the B2B market … niche markets just waiting for a writer who knows how to craft effective emails, website pages, case studies, white papers, ads, and other marketing communications for B2B companies.

Don’t know how to write B2B copy yet? Learn. Learn quickly. Then go out there and claim your niche!

Social Media Increases Business Referral 90% After Interaction

A new study conducted by the Internet Advertising Bureau has found that 90 per cent of consumers would recommend a brand to others after interacting with it on social media.

The study focused on FMCG brands Heinz, Kettle and Twinings and found that social media can drive ROI by driving brand sentiment, encouraging consumer engagement and increasing brand loyalty.

More than 4,500 survey responses were collected from each brand’s social media pages over a two-month period, supplemented by 800 interviews to inform the findings.

This showed that 4 out of 5 consumers would be more inclined to buy a brand after being exposed to its social media, with 83 per cent happy to trial the product in such circumstances.

The uplift in sentiment for each brand was measured as 22 per cent for Heinz, 17 per cent for Kettle and 19 per cent for Twinings, allowing the IAB to estimate that for every £1 spent on social media as much as £3.34 could be generated.

Kristin Brewe, the IAB’s director of marketing & communications, said: “The IAB study shows that, when trying to create deeper emotional connections with consumers, social media is an essential channel for brands. This isn’t surprising since social media is the only channel where it’s possible for brands and consumers to have meaningful two-way conversations, making the strength of connections that much stronger.”

Ian Ralph, director at Marketing Sciences, the agency that conducted the research, adds: “Our research shows that to create an emotional connection brands really need to provide clear, timely and, most important of all, relevant content that develop a conversation. Interestingly, we also found that brands really shouldn’t be afraid about having their products on show and of linking up their social media activities to their business objectives. Social media has the potential to turn brand customers into brand fans.

“By making people love, not just like your brand, you’re more likely to drive future purchases and increase sales.”

The 3 times people are happiest — you may be shocked

According to a new UK study involving Apple and the London School of Economics — reported by Hannah Thomas at Marie Claire — people are happiest when they are having sex, exercising, and visiting the theatre.

Sure, the having sex part is a no brainer. The exercise part is a bit surprising; people are often pretty happy when they’ve finished exercising — your feel-good hormones are coursing through your body and you have a great sense of accomplishment — but the during part can be quite a struggle. And the theatre? Hey, I like theatre — but I only get there a few times a year, and I imagine only very few people make it part of their regular routine.

The study, which is largely based on updates, via an app, by 45,000 iPhone users regarding levels of contentment, also determined that people are happiest during midday on Christmas. I guess this is while you’re still riding a high from opening a bunch of presents, but before the extended family has arrived? (No mention, though, of how non-observers of this particular holiday are feeling around this time.) People were also found to be very happy on bank holidays — a paid day off work is pretty universally seen as a treat — and generally happier on weekends.

And the unhappiest time? Around 8pm on January 31. No explanation for why this particular date was the least fun, but it’s easy to speculate about the gloom and isolation that can set in mid winter. Further, working, commuting to work, and standing in line were found to be the least happy-making activities. No mention, though, of one of my all-time, least happy moments: when you’re standing in line, during your commute to work, and someone cuts in front. It’s a triple threat, in the worst possible way.

Why Penguin 2.0 is Google’s Gift to the WWW

In late May, Google launched the much-anticipated update to their Penguin algorithm, dubbed Penguin 2.0. Now that the dust has settled, it’s clear the impact has been fairly significant, and that Google has effectively (but certainly not flawlessly) furthered their mission to identify and penalize webspam and black hat SEO practitioners. Ultimately, although it may be frustrating to experience the shifts and changes of the algorithm updates, content creators should be celebrating Penguin 2.0, as it wholly supports the creation of quality, user-friendly sites and content.

Need some tangible reasons to rejoice about the latest updates? Keep reading and prepare to become a Penguin fan.

How Penguin 2.0 Rewards the Good Guys

“White hat” SEO practitioners are those who are looking to improve the user experience. They work to create valuable content, a seamless user interface, and ultimately strive to manifest the best overall experience in their niche for each visitor. These are precisely the folks Google is looking to reward with Penguin 2.0. What’s good for search engines is also good for users, and that means Google will not rest if webspam continues to reach top ranking positions in their search results.

As a rule, white hat tactics also indicate good marketing strategies. If a company employs integrity and care in their SEO, they likely carry that attitude through all their business practices, using creativity to entice customers, not trickery. Although Google continues to be the search engine powerhouse, sites like Bing are providing healthy competition, motivating the leader to ensure their user experience is superior. That means highlighting the best sites available for every last search. And that means those professionals who have tirelessly worked to truly provide quality experiences to their users are getting more and more attention and well-deserved prominence.

Expertise Trumps Black Hat SEO

If you run a search for “best plastic surgeon in Hollywood,” you’re obviously looking for the most trustworthy and experienced professional you can find. To help determine this authority, Penguin 2.0 is giving more prominence to author rank, and helping to elevate bona fide experts in each field. Again, the best customer experience equals the best search engine; Google has become increasingly savvy in developing ways to identify who the experts are, and who might just be blowing smoke to garner page views.

In a recent video describing the Penguin 2.0 changes, Matt Cutts, Google’s search ambassador, had this to say about identifying experts:

“We’re doing a better job of detecting when someone is more of an authority on a specific space. You know, it could be medical. It could be travel. Whatever. And try[ing] to make sure that those rank a little more highly if you’re some sort of authority… we think might be a little more appropriate for users.”

While they obviously won’t reveal the specific methods to ascertain expertise, this is fabulous news for those marketers and content creators that truly are highly knowledgeable in their field. In essence, Google is looking to reward hard work and dedication. That’s a tough mission to argue with.

Rewards for Honest Link Building

Building a network of reputable and reliable link partners is perhaps the most challenging aspect of white hat SEO, and Google is working harder to reward best linking practices with better rankings. In many ways, this is the heart of Penguin 2.0. The best SEO professionals devote a significant amount of effort into attaining inbound links from relevant and respected websites, ideally those on a small list of approved sites that Google calls Hilltops.

A Hilltop is a site Google ranks as the best of the best, and gaining their recognition (in the form of a link) requires two key components. First, the content you offer must clearly be entertaining, solve a definite problem, or showcase some level of remarkable content. Secondly, the Hilltop site must have an obvious resonance with the business it links to. These two principles are near and dear to the best content marketers, so the compatibility is obvious. Hilltops hold a high level of trust in the eyes of Google, as each are operating under the same stringent quality ethics. If you also do not settle for a mediocre experience and can showcase that via your own site, court a Hilltop link and Google will aptly reward you.

Offset Any Negative Effects from Panda

Panda, Google’s other algorithm tour-de-force, has certainly ruffled some feathers with some more mysterious shifts. If you’ve followed content quality rules without engaging in webspam and you’ve still seen your rankings drop, Penguin might just help balance the scales. Best practices with off-site SEO tactics (like social mentions and link building) are some of the aspects that Penguin rewards, and an excellent effort in this arena may help you balance the dings of duplicate content, older posts, and other Panda frustrations. Even Matt Cutts acknowledges that Panda can and will be “softened.”

“We are looking at Panda and seeing if we can find some additional signals, and we think we’ve got some to help refine things for sites that are kind of in the border zone… if we can soften the effect a little bit for those sites that we believe have some additional signals of quality, that will help sites that were previously affected… to some degree [by Panda].”

Top Off-Page Tips for Maximum Penguin Benefits

We’ve already outlined that high quality content brimming with expert commentary and a sincere attention to a seamless user experience are the on-site tactics for high rankings. There are a gaggle of off-page signals that can help you increase your rankings through Penguin’s updated algorithms too.

Here are a few of the best practices:

1) Social mentions – These are becoming more and more critical. Engage your fans and followers on social sites like Facebook, LinkedIn, Pinterest, Reddit, and Twitter to share your news, like your content, and otherwise interact with your social offerings. And by all means, do not forget Google+ – you better believe Google notices when mentions are coming from their social hub.

2) News sites – A great PR strategy that generates buzz from any of the major news outlets is a huge boon. Positive mentions and articles on sites like The Huffington Post are extremely powerful.

3) Industry niche sites and blogs – Likewise, finding partnerships in the blogosphere or via other industry experts are also integral to good off-site practices. If none of the experts in your field are giving your company accolades, you lack the credibility Penguin is searching for. Reach out to those folks who are talking about your industry, create a link share, and give them reason to spread your message. Not only is this obviously a critical marketing strategy, but search engines love it too.

4) Print publications – Any offline content can be quoted and cited online, whether it appears in magazines, newspapers, or trade publications. If experts are talking about your site, online or off, Google wants to know.

Google’s mission is simple: to return the best results on the web for any keyword search. To reach this goal, Google is becoming craftier at rewarding the marketers and sites that are working the hardest to give the people what they want. For those content creators and site owners that have always followed rules of quality and integrity, Penguin 2.0 is a godsend. For those that continue to employ sneaky tactics with the hopes of flying under the radar and achieving top rankings, the glory days are over. It’s always good news when the good guys win the battle.

Thanks, Penguin 2.0, for getting us closer to that reality. You may not be flawless, but your intentions are golden.