As a marketing executive, you’re ultimately responsible for the health and performance of your website traffic, list growth, and revenue from new and existing customers. No doubt that’s a lot of responsibility, and “with great power comes great responsibility” (to quote Spider-Man). Many channels need your attention, but which one deserves more of it than most, given the proportion of traffic and revenue it drives? You guessed it: organic search. And let’s be real, you’re mainly concerned with Google. According to Comscore’s August report, it holds a 64 percent market share, with the next-closest competitor (Microsoft/Bing) coming in at 20 percent.

How confident are you that your company’s website(s) is minimizing the risk of penalties or traffic drops from Google’s algorithm updates? Are you at risk of losing 64 percent or more of your organic search traffic? Skip to the checklist and find out! Or read on for a more in-depth explanation.

Primary Google Algorithms of Concern

SEO companies don’t know many surefire, Google-confirmed details about its algorithm updates, but we do have a lot of well-educated assumptions based on case studies and real-world experience. While not ideal, this is enough to help ensure that we are following Google’s best practices. But which updates should you be most concerned with? Let’s approach this from the angle of Google’s recent primary algorithm updates that have impacted websites, both positively and negatively, at a large scale.

Google Panda Update

The Panda algorithm largely targets content quality at both a page and site level. This blog post, published by Google on the Webmaster Central Blog, is the most referenced article on the topic.

It asks publishers/website owners to gauge the quality of their site the way the Panda algorithm is (supposedly) gauging it. It all boils down to quality content that provides the natural readability, depth and professional appeal that Google suggests its searchers want when they land on a site from Google’s search results.

There are some specific questions Google asks of your site that you should be aware of:

– Does this article have spelling, stylistic or factual errors?
– Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
– Does the site have duplicate, overlapping or redundant articles on the same or similar topics with slightly different keyword variations?
– Does the article provide original content or information, original reporting, original research or original analysis?
– Does the page provide substantial value when compared to other pages in search results?
– How much quality control is done on content?
– Is the site a recognized authority on its topic?
– Are the articles short, unsubstantial or otherwise lacking in helpful specifics?
– Would users complain when they see pages from this site?

These points get at the heart of what many of us know to be the major targets of Google’s Panda algorithm update, specifically:

  • Duplicate content
  • Thin content
  • Poor grammar and spelling mistakes
  • Regurgitated information from other articles already in Google’s index
  • Overall site design, user experience and appeal
  • Perceived topical and domain authority

You must ensure your site is addressing these issues. My article on Thin & Duplicate Content for eCommerce Websites is an excellent follow-up read for digging further into these specific, common issues on websites suffering from Google’s Panda algorithm updates.

If you want more explanation of how Google goes about quantifying subjective elements like “page attractiveness,” check out this Whiteboard Friday from Rand Fishkin, What Deep Learning and Machine Learning Mean for the Future of SEO.


Google Penguin Update

The Penguin algorithm is believed to largely target link profile quality at the site level.

However, many SEO professionals don’t realize there is likely a content/engagement quality factor as well. This blog post, published by Google on the Webmaster Central Blog, is the most referenced article on the topic.

It asks publishers/website owners to gauge the quality of their online marketing efforts as they relate to their link profiles, user experience and content quality.

Here are two sentences from that article that appear to reveal Google’s motive for the Penguin update:


“In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes attempting to propel sites higher in rankings.”

There are two key phrases in that quote that make it clear Penguin is not only about links. “Link schemes” clearly relates to link building, while “keyword stuffing” relates to content. Furthermore, Google even offers a visual example of what keyword stuffing looks like on a page. Google calls that example extreme, and not every site penalized by the Penguin algorithm is quite as “blatant.” The point stands, though: unnatural keyword stuffing, to any degree, makes your site vulnerable.

That said, do not abandon keyword targeting altogether. Google actually supports it, since it makes sense to use the terminology your target audience is typing into Google’s search engine. The end goal is to send users to the most relevant search results (cough…cough…Knowledge Graph blows up that theory, but I digress). In this same article, Google says:

“Google has said before that search engine optimization, or SEO, can be positive and constructive—and we’re not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.”

Thus, you absolutely should still incorporate keyword research and on-page optimization best practices into your SEO and content strategy — just do it in a natural way. If you don’t know what a “natural way” is, consider hiring someone who does.

Other Google Algorithm Updates to Be Aware Of


Hummingbird Algorithm Update

This algorithm update was released in the fall of 2013 (read my not-so-thrilled rant post on it, in relation to Knowledge Graph) and is actually an update to the core algorithm.

It is designed to align Google’s search results more closely with the intent of users’ search queries (especially conversational queries) by focusing on the “meaning behind the words” (otherwise known as “semantic search”). What matters to you is having a strategy in place to answer your target audience’s top questions about your industry (and your brand) in an in-depth manner. You may notice that some pages on your site saw increases or declines in organic search traffic around the fall of 2013; it is possible Google is ranking them differently based on how well they truly answer the queries that were (and are) driving impressions and traffic to them in Google’s search results.

For more information, visit this post and this post from Search Engine Land.


Page Layout (aka “Top-Heavy”) Algorithm Update

This algorithm update has seen various iterations over the years, though not as frequently as Panda, and targets (penalizes) sites with too many ads above the fold. That can be a tough pill to swallow for sites that drive significant revenue from advertisements; however, it’s important for the overall health of the Web.

Let’s be real, most people don’t care about your ads (and many times they only click on them by accident), but you still need to make money. We get that. Without revenue, you wouldn’t be providing content to answer users’ queries in the first place. Just ensure you don’t have a slew of ads above the fold, pushing your content below the fold, and you should be OK.

For more information, see Google’s official blog post.


Pirate Algorithm Update

This algorithm update focuses on websites that generate too many DMCA complaints (see the Digital Millennium Copyright Act) to Google.

As long as you’re not consistently copying other publishers’ content into your own site and taking credit for it, you should be unaffected by this algorithm update. While you may be unaffected by the actual algorithm update, you very well could be affected by duplicate content, such as when other websites steal your content. So don’t dismiss this just yet.

If you’re confident you are not copying anyone else’s content, you may still want to run your website’s XML sitemap through Copyscape to see if any other website is stealing your content. If they are, submit DMCA complaints here.


Mobile-Friendly Algorithm Update

This algorithm update was dubbed “Mobilegeddon” by Search Engine Land. It was a very “sensational” update, since it was announced several weeks before Google implemented it.

When it launched, very few publishers saw improvements or declines in mobile rankings, regardless of whether they had mobile sites. There have been spotty reports of improvements or declines since, and the jury seems to still be out (at the time of this writing) as to whether it’s having an impact.

We’ve seen a number of non-mobile-friendly sites continue to be unaffected by this algorithm update. Regardless, Google confirmed in May that more searches (on its search engine) now take place on mobile devices than on desktop in 10 countries, including the United States. If you’re not mobile-friendly, you need to be — at least for user experience (and conversions).


(Unofficial) Google Quality Update

This was a very mysterious update that was released in the spring of 2015, covered extensively by Glenn Gabe (who dubbed it the Google “Phantom 2” update) and analyzed in great detail in this post and this post and this post.

The update appears to be very similar to Panda, and Google did not give us much detail.


Checklist for Determining Risk of SEO Penalties

Now that we’ve talked through the major algorithm updates in recent years, let’s move on to what really matters — your analysis of your company’s website. Below we’ve provided a high-level “checklist” of areas to evaluate for risk exposure. For a deeper, more complete analysis with step-by-step instructions, please download our Marketing Executive’s SEO Risk Assessment Workbook.



Content Quality

Google’s Panda, Penguin and Hummingbird algorithms each relate to content quality from a different angle. The following checklist will help you identify issues that should be obvious to an experienced copywriter, content marketer or SEO professional. Unfortunately, few people possess all the skills these professions require. This checklist is designed to help you, the marketing executive, bridge that knowledge gap and ensure your team can accurately report on the current status of these best practices.


√ Duplicate Content: Does your site have — either internally or externally — duplicated content? In other words, have your editors, webmasters and other administrators copied any significant amounts of content from other websites’ pages into pages on your site or between pages on your own website? For example, do you use manufacturer product descriptions? Do you share your product descriptions with price comparison shopping engines, Amazon, eBay or affiliates?

√ Thin Content: Do any of your website’s pages that are indexable to search engines have significantly thin content (i.e., roughly 100 words or fewer)? There is no exact threshold, so it will need to be a judgment call in light of different page types and your competitors’ content depth for similar page types.

√ Over-Optimized Content: Do any of your pages target keywords in an unnatural way? For example, repeating keywords in the meta data or URL, or using keywords so heavily in the body content that it’s obvious to your reader — and thus, Google’s algorithms.

√ Poorly Edited Content: Does your content consistently have spelling and grammatical errors?

√ Unengaging Content: Is your content missing attractive (and unique) imagery, video, internal links or calls to action to keep people on your website longer? Is it poorly designed?

√ Semantically Relevant Content: Is the content on your site answering all the specific questions your audience is asking, both directly and indirectly? And are you explaining it using the various phrases and terms they are using to find it?
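If you want a rough, automated first pass at the thin-content check above, a short script can strip tags and count the visible words on a page. Here is a minimal sketch in Python; the 100-word cutoff is illustrative, not a Google-confirmed threshold, and a real audit would run this across a full crawl of your indexable URLs:

```python
import re

THIN_THRESHOLD = 100  # illustrative cutoff in words, not an official Google number

def visible_word_count(html: str) -> int:
    """Rough word count of a page's visible text (script/style blocks and tags stripped)."""
    text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)  # drop script/style blocks
    text = re.sub(r"<[^>]+>", " ", text)                      # drop remaining tags
    return len(text.split())

def is_thin(html: str) -> bool:
    """Flag a page as potentially thin content."""
    return visible_word_count(html) < THIN_THRESHOLD
```

Treat the output as a triage list, not a verdict: a 90-word category page might be fine, while a 300-word page stuffed with boilerplate might not be.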


Link Profile

Just as with Google’s content quality-related algorithms, Google’s Penguin algorithm (partially) targets unnatural link profiles using learned patterns of abuse and mysterious thresholds. Matt Cutts, former Head of Web Spam at Google, has stated on his blog, “The objective is not to ‘make your links appear natural,’ the objective is that your links are natural.”

If Google had its way, no one would request or build links, paid or unpaid; links would simply be earned by having quality content, products or services. Google knows as well as we do that this never was, and never will be, a reality, and you’re going to build links regardless. It’s critical that you understand how to measure the risk of your current link profile and of any future links built to your website.

Google’s Penguin algorithm is notorious for being incredibly difficult to recover from. With only one or two updates per year, it can put a company out of business if it isn’t careful about its link building practices. Here is a checklist to help assess your risk.


√ Relevancy of linking domains and web pages: Back in 2012, an ex-Google employee was quoted as saying, “… Getting a link from a high PR page used to always be valuable, today it’s more the relevance of the site’s theme in regards to yours. Relevance is the new PR.” Despite this Aussie’s incorrect grammar, he gave us a very important clue to ensuring a natural and effective link profile in today’s SEO landscape: relevancy. How relevant to your industry are the domains linking to you? Not every linking website is going to be, or needs to be, relevant, but a very large portion should be. Don’t go out and disavow or request removal of branded links from The Wall Street Journal; instead, seek to build relationships with topically related publishers in your niche that you can earn links from.

√ Quantity and quality of linking domains: Google considers both the quantity and quality of the domains linking to your website, especially in relation to your competitors, when determining where to rank it. Always focus on quality over quantity, but keep in mind that you need to keep doing the things required to grow the quantity of your inbound links: create great content that people want to share, and be remarkable as a business.

√ Quantity and quality of linking C-blocks: What is a C-block, you ask? It’s the first three octets of an IPv4 address; domains whose addresses share those three octets sit on the same C-block (a /24 network), which suggests a close relationship between them. If many of your links share a C-block, that tips off Google that they probably come from the same network of websites, run by a single company or person. This is unnatural — unless you’re link building. Granted, you have every right to interlink your own domains, but are you doing it in a natural or a spammy manner? Trust your gut.

√ Over-Optimized Anchor Text: What is the quantity of money-keyword anchor text in your inbound link profile in relation to branded or generic anchor text? Many SEOs suggest that your money-keyword anchor text should never amount to more than 10-20 percent of your overall incoming links.

√ Social Activity: The jury is still out on how Google uses social signals in its ranking algorithm. Consider that social activity consistently appears among Moz’s Search Engine Ranking Factors, an aggregate of 150 professional SEO opinions. Google is paying attention to social, period. To what degree? We don’t know, and it doesn’t matter. You need to be consistently active on social; most would give priority to Facebook (maybe Google+), but spend your time where your audience is most often. Don’t you think Google knows which social media platforms, industry blogs and other publications are most popular in your niche?
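Two of the link-profile checks above lend themselves to quick back-of-the-envelope scripting: clustering linking IPs by C-block and computing your money-keyword anchor ratio. Here is a minimal sketch in Python; the inputs (linking IPs and anchor texts, e.g. from a link-export tool) are assumed, and resolving domains to IPs is left out:

```python
from collections import Counter

def c_block(ip: str) -> str:
    """First three octets of an IPv4 address -- the '/24' or 'C-block'."""
    return ".".join(ip.split(".")[:3])

def c_block_counts(linking_ips):
    """Count how many linking IPs fall into each C-block.

    Large clusters hint that many links come from a single network of sites.
    """
    return Counter(c_block(ip) for ip in linking_ips)

def money_anchor_ratio(anchor_texts, money_keywords):
    """Share of inbound-link anchors that exactly match a 'money' keyword."""
    money = {k.lower() for k in money_keywords}
    if not anchor_texts:
        return 0.0
    hits = sum(1 for a in anchor_texts if a.strip().lower() in money)
    return hits / len(anchor_texts)
```

If the money-anchor ratio creeps past the 10-20 percent range mentioned above, or one C-block accounts for a large slice of your links, that is worth a closer look.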



Technical SEO

Technical SEO is where most websites fall short. It also presents an opportunity for your company website to potentially benefit from future Panda updates and other algorithm updates released by Google that relate to the indexation of your website.

Algorithm updates aside, you need to ensure that Google can crawl your site efficiently and effectively in order to properly index your pages and apply link equity (PageRank) to them.

This list of technical SEO items will help ensure you’re hitting the key points that need to be addressed for technical SEO. It doesn’t stop here, but it’s a damn good start.


  1. XML sitemap: Many websites fall short when it comes to one of the most basic ways of ensuring that Google can crawl and index your content. Ensuring that your XML sitemap only includes indexable, quality content pages — without errors — is a critical first step in technical SEO.
  2. Robots.txt commands: Like sitemaps, we see many shortcomings, even incorrect commands, in /robots.txt files for many websites. Such issues cause improper crawling from search bots, limit your organic search traffic potential and potentially increase your chance of a penalty.
  3. Duplicate URL paths: Matt Cutts, mentioned above, stated in a webmaster video that 25-30 percent of the Web is duplicate content. Clearly, this is quite a challenge when indexing the Web, even for the world’s largest search engine. Are you making Google’s job of indexing and ranking your web pages easier by ensuring you only have single URL paths to each page of content on your website? Consider your tracking URLs, filtered/faceted URLs, etc.
  4. Low-quality utility pages: Are you preventing Google from indexing unnecessary page types on your website, such as internal search results, tag pages, etc.?
  5. Internal crawl errors and redirects: These degrade crawl efficiency and user experience, so they can contribute to a penalty.
  6. Internal linking structure: Watch for sitewide over-optimized anchor text, such as multiple keyword variations pointing at the same page.
  7. Mobile-Friendly: The Web has gone mobile, and Google has been telling us this for years. If you haven’t listened, you will see your traffic, engagement and conversion metrics drop — regardless of SEO.
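To make items 2 and 4 above concrete, here is an illustrative robots.txt; the paths are hypothetical examples, not recommendations for your specific site, and the Sitemap line points Google at the XML sitemap from item 1:

```
# robots.txt -- illustrative only; the paths are hypothetical
User-agent: *
Disallow: /search/     # internal search results (item 4)
Disallow: /tag/        # tag archive pages (item 4)

Sitemap: https://www.example.com/sitemap.xml
```

And for item 3, a canonical link element in each page’s head tells Google which single URL path should be indexed when tracking or faceted parameters create duplicates (again, the URL here is a made-up example):

```html
<link rel="canonical" href="https://www.example.com/widgets/blue-widget/">
```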


Clearly, there’s a lot more to SEO than many realize. All these checklist items propose areas of risk, or opportunity, depending on how well your company website addresses them. This is an important distinction to make. These perceived “risks” can actually be turned into “strengths,” which help you better compete with your industry’s competitors in search engines. Now is the time to ensure everyone in your department is on the same page regarding these important elements of SEO and reap the benefits of fixing or implementing them properly.

BONUS: What Our Fellow SEO Professionals Suggest

As a bonus, we decided to reach out to some of our notable colleagues to get their input on SEO items that present risks (or opportunities) to consider. They offer unique perspectives that complement this checklist well. We hope you find them valuable. Enjoy!

Rand Fishkin (Founder, Moz)

Rand Fishkin, founder of Moz, offered his opinion on the top three items he would recommend a marketing executive assess to gauge their company’s risk with Google algorithms or SEO threats in general. As always, he has a unique perspective on how to look at your SEO activities. We especially like his first point regarding pruning the number of low-value pages indexed by Google. Assuming they don’t provide value in any other way, you can delete/redirect them. However, if they do provide value from other traffic sources, we suggest setting them to “noindex,follow” via meta robots tags. Here is what Rand had to say. (Follow him on Twitter)
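For reference, the “noindex,follow” approach is a single meta tag in the page’s head:

```html
<!-- Keeps the page out of Google's index while still letting crawlers follow its links -->
<meta name="robots" content="noindex,follow">
```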

  1. Percent of your pages receiving >1 search visitor from Google each week/month. If you have a large number of pages with very few search visits, or if you start to see this metric slip on parts of your site, you may be at risk of any number of algorithms that penalize or devalue low-quality content.
  2. Google Webmaster Tools (now “Search Console”) warnings and crawl data/errors. Search Console/Webmaster Tools data is often curiously inaccurate at best, but the crawl errors and warnings are a bright spot that can surface significant issues in need of addressing. Unfortunately, it doesn’t cover nearly all of what you need, but tools like ScreamingFrog or Moz can help if a regular, robust site crawl is needed.
  3. Direct knowledge and influence over any proactive link acquisition tactics your organization or its consultants may be using. Neither Google nor Bing nor any of the third-party link data providers will comprehensively show all the links that come to your site. And, to be honest, no one has the time to review at that level of detail. But if you know about every proactive link technique being employed by your team or your consultants, you can dramatically lower, or even eliminate, the risk of being caught up in Google’s link-based penalties. So long as all your links are from high-quality sites and given editorially, you should be fine. Link spam created by others, intentional on their part or not, almost never adversely affects rankings.
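Rand’s first metric is straightforward to compute from an analytics export. Here is a minimal sketch in Python; the mapping of URL to organic search visits is an assumed input, and field names will vary by analytics tool:

```python
def pct_pages_with_search_visits(page_visits, min_visits=1):
    """Percent of tracked pages receiving at least `min_visits` organic search visits.

    `page_visits` maps URL -> organic search visits for the period, e.g. pulled
    from an analytics export (the data source here is an assumption).
    """
    if not page_visits:
        return 0.0
    hits = sum(1 for v in page_visits.values() if v >= min_visits)
    return 100.0 * hits / len(page_visits)
```

Track this monthly; a slide in the percentage on part of your site is the early-warning signal Rand describes.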


Glenn Gabe (Founder, G-Squared Interactive)

Glenn Gabe, founder of G-Squared Interactive, is an industry leader in Google algorithm analysis and penalty recovery. Based upon his experience helping companies with grave SEO situations, such as battling major algorithm updates and serious technical SEO problems, Glenn believes it’s critically important to have executive leadership heavily involved from the beginning. Sure, C-level executives aren’t the ones auditing and executing changes, but they can sure make or break an SEO initiative. Here are three things Glenn believes executives can do to gauge their risk with regard to Google algorithm updates and/or major SEO threats. (Follow him on Twitter)

  1. Don’t get blindsided, know your weaknesses: I can’t tell you how many times companies have called me after getting absolutely blindsided by an algorithm update. Yet after quickly checking out the site, I can easily spot serious SEO problems (either from a technical, content or links standpoint). By actively and continually auditing, crawling and analyzing your site from an SEO perspective, you can nip serious problems in the bud.
  2. Have the right team in place: When I think of the most successful SEO projects I’ve worked on, it’s hard to overlook the power of having the right team executing at a high level. And that’s not just the SEO team. I’m also referring to designers, developers, business analysts, digital marketers and project managers (PMs). By putting the right team in place, you have a much better chance at beating a serious SEO problem than randomly throwing some smart people in a room.
  3. Avoid red tape: Red tape can be a massive challenge for getting important SEO changes implemented efficiently. I’ve seen bureaucracy absolutely kill SEO efforts. Don’t let other parts of your organization push around SEO. Chances are they don’t understand the full impact that search can have on increasing sales, increasing awareness, fighting off the competition, etc. Don’t let important changes get pushed to “next quarter,” “end of year” or “sometime in the distant future.” SEO is not a nice-to-have. It’s incredibly important for most organizations. Stand up for SEO. That’s how you win.


Marie Haynes (Consultant/Founder, Marie Haynes Consulting Inc.)

Marie Haynes, founder of Marie Haynes Consulting Inc., offered some great insight about measuring the impact of algorithm updates on your organic Google traffic, as well as how to gauge your need for a link audit and how to view the Panda algorithm as an opportunity. Be sure to check out her huge list of possible and suspected algorithm changes to stay on top of the latest updates. (Follow her on Twitter)


It’s tough to narrow things down to three items when it comes to assessing risks, but here are my thoughts:


  1. At this time, my most useful tool when it comes to assessing the impact of major algorithm changes, such as Panda and Penguin, is Google Analytics. If we can look at the Google organic traffic and see a dip that coincides with a known or suspected refresh or update of either Panda or Penguin, then this is a good clue to help us determine where the problem areas are. However, this technique is likely to become less helpful as time goes by, as eventually Google plans to make both of these filters a baked-in part of the algorithm.
  2. I often see site owners who rush to get a link audit done, when perhaps the issues with the site go far beyond links. To assess whether a link audit is a good idea, I look for any of the following (if none of these are present, then there is a good chance that the site’s problems go beyond the link profile):
      • History of obviously low quality link building
      • History of large scale link building of any kind
      • A high presence of links anchored with keywords for which you are trying to rank
      • A drop in traffic that coincides with a known Penguin update or refresh
      • History of negative SEO or links as a result of pharma-hacks and the like (Although it is debatable how effective these are at lowering rankings.)
  3. My third tip involves how site owners can possibly benefit from the Panda algorithm. Having a technically sound site is incredibly important. I think every site should have a thorough site quality audit done every few years. Now, this next part is a bit of conjecture on my part, but I do believe that user experience is becoming more and more of a factor in the eyes of Panda. We could argue about how Google would determine which site was better from a user’s perspective. Many people have had great discussions about whether things like bounce rate, dwell time, form completions, etc. are ranking factors. But I like to look at the overall picture from Google’s perspective. Google’s goal is to show users the most helpful result for their query. If, for whatever reason, users are consistently preferring to hang out or order products on your competitor’s site, then Google wants to show that site first.

I think a lot can be gained by having a third party objectively review your competitors and give opinions on what they are doing better than you and how you could improve. If a competitor’s product page has helpful reviews, videos, awesome photographs and a guide on how to use the product, and your page has the standard product photo and stock description and that’s it, which page will users prefer? Which page will Google want to rank first?

It is my opinion that sites that figure out how to be as consistently useful as possible will get a leg up as Panda continues to find new ways to determine which sites are of the highest quality.