Thursday, October 9, 2014

Your Guide to Google's Panda 4.1 Algorithm

Google's Panda algorithm update has been shaking up the world of SEO since its first iteration back in 2011. Over the course of 26 confirmed updates and refreshes, the algorithm wizards at Google have been refining the content-focused search engine update and rolling their new features out gradually. Now, the 27th update has arrived, and it's significant enough to be called 4.1.
I've written this guide to summarize the most important elements of the Panda 4.1 update, including the motivation behind its release and the potential consequences for online businesses.

Understanding the Landscape

Before I get too deep in the mechanics of the new Panda update, it's important to recap the stages of development that have led to this point.

The first Panda update, known as Panda 1.0, was released in February of 2011, affecting nearly 12 percent of all search queries and making it one of the most significant algorithm changes in Google's history. Intended to weed out low-quality content and reward sites with high-quality, actively updated content, the Panda update radically shifted SEO strategies away from keyword-stuffing gimmicks and toward user-focused content strategy.

Following up on that update, Google released Panda 2.0 in April of 2011, affecting about 2 percent of all search queries. Several updates followed on a monthly basis, until the next major update--Panda 3.0--in October of 2011, affecting another 2 percent of all queries. Monthly updates resumed over the course of 2012, but in 2013, those updates slowed to a crawl, with the only officially announced updates hitting in January and March of 2013.

The next major update was in May of 2014, a rollout known as Panda 4.0, which affected approximately 7.5 percent of all English search queries, adding more features to reward high-quality content and detect instances of low-quality writing. Now, we're looking at the rollout of a major follow-up to that update: Panda 4.1.

Enter Panda 4.1

The rollout began on September 25, 2014, but according to an official Google+ update from Pierre Far, it was a slow, gradual one, planned to end in the first week of October. According to Far, the update continues the tradition of previous Panda updates by identifying low-quality content with more precision. Google has evidently added a few new signals of low quality to its algorithm--but of course, it's not making those signals public.

Depending on the geographic location of the site in question, Panda 4.1 is estimated to impact between 3 and 5 percent of search queries, making it more impactful than the minor monthly updates that occurred throughout 2011 and 2012, but less significant than the flagship 1.0, 2.0, 3.0, and 4.0 releases.

Good news and bad news

According to Pierre Far, the update should be good for small- to medium-sized businesses that prioritize the quality of their content. Ideally, the top search rankings should become populated with a more diverse range of site sizes. This means site owners who have been stepping up their efforts to produce high-quality content should be rewarded with higher rankings as a result of the update.

However, as with all Google updates, there's also a chance you'll drop in the rankings as a result of Panda 4.1. If any of the "new signals" that determine the quality of the content on your site tell Google that you're not up to its standards, you could suffer a drop--even if you're doing everything else right.

Who's affected?

Early analyses are already illustrating a clear pattern. According to Searchmetrics, the biggest winners of the update are news sites, content-based sites, and download portals. The logic is that because these sites are regularly updated with new, (presumably) high-quality information, they're being favored by whichever new ranking signals were added to the Panda code.

On the other hand, the sites that have been hit hardest so far appear to be lyrics sites, gaming sites, and some medical content sites. Put bluntly, these sites tend to have thin, repeated, or aggregated content--which, to Google, translates to unoriginal or low-quality work. Lyrics sites tend to feature the exact same content as their competitors (since song lyrics don't change from site to site), gaming sites don't have as much content as other platforms, and medical content sites often repackage existing content into aggregated stores of information.

Specific examples of sites that were hit hard include medterms.com, which saw a drop of almost 40 percent in its search visibility, as well as hallmark.com, office365.com, and eHow.

How to Recover if You've Been Hit by Panda 4.1

If you've noticed an increase in rankings, then congratulations--the Panda 4.1 update has been working in your favor! If, however, you've noticed your organic visits or your search visibility dropping over the course of the last couple of weeks, Panda 4.1 could be the culprit. It will take some time before you're able to recover fully, but you can start down that path with these steps:

1. Review and Renew Your Content Strategy

The best thing you can do is perform a content audit of your current site. Take a look at some of your competitors--preferably, ones who are ranking higher than you as of this update. What are their content strategies doing that yours is not? What is yours doing that theirs are not? Don't simply mimic their strategies. Instead, try to learn from them and incorporate some new ideas into your campaign.

How often are you updating your blog? How many ideas are original? What types of content appear to be the most useful, and are you providing those? How in-depth does your content get? These are all important questions to answer as you plan out the next few months of your campaign.

2. Delete Duplicate and Thin Content

From what we've seen so far, it looks like Panda 4.1 has hit content aggregators and duplicators the hardest. Straightforward plagiarism and content syndication have been dealt with already by earlier Panda updates, but Panda 4.1 has become more sophisticated. Are any of your content pieces basic rewordings of other articles? Are many of them quote-based, or reliant on outside sources? If so, it's best to get rid of them and focus on creating unique, original content.

What to Look for in the Future

This most recent update comes approximately four months after the last one we saw, back in May. Throughout 2012, Panda received near-monthly updates, but if this new spacing is any indication, Panda could be on a quarterly update schedule--at least for the time being. That would put the next update in December of 2014 or January of 2015, and it could carry an additional set of ranking signals.

It's certain that Google will continue to update their algorithm with more sophisticated signals and ranking formulas; it's only a matter of when. But Panda 4.1 is continuing a several-year tradition of rewarding high-quality original content, and it's likely that this trend will only continue. The best thing you can do is refine your content creation strategy and make sure your blog is as original, unique, valuable, and well-written as possible. Publish new, high-quality content regularly, and you'll put yourself in a perfect position for a boost when Panda 4.2--or whatever the next update is--starts rolling out.

Tuesday, September 30, 2014

A Guide To SEO For Entrepreneurs In Locally Serving Industries

For personal injury lawyers, chiropractors, dentists, and other practitioners in locally serving businesses, SEO plays a major part in attracting clients and achieving success. Unfortunately, you can mess up in a heartbeat if you aren’t careful. Let’s look at how to appropriately deal with SEO in ways that are pleasing to both search engines and your audience.
The Value of SEO for Locally Serving Industries

According to Lawyernomics, the second most popular way to find an attorney is via Google, Bing, or Yahoo. A healthy 21.9% of people start their search in the little search box on their browser’s home page. Another 10.5% look elsewhere on the internet. That means nearly one in three clients starts their search for an attorney online.

While those numbers are specifically geared toward lawyers, the statistics for other local industries are similar. Whether you’re a chiropractor, dentist, or some other practitioner, you can expect a large percentage of your new client traffic to come from online sources – specifically search engines.

The Difficulty of Understanding SEO

Sounds pretty easy, right? Invest heavily in SEO, and you’ll begin to see your business grow and expand. If only it were that simple. Like anything else in the business world, it takes hard work to be successful.

The trouble with tapping into SEO is that Google and other search engines are continually changing their algorithms and rules. What’s true today may not be true tomorrow. They are free to update as they please – and they do, quite frequently. As a result, it’s almost impossible for practitioners with full-time jobs to learn the best SEO practices in their free time. As soon as you learn how to do something, the rules change. This is where SEO professionals come into play.

Investing in Professional SEO Help

Conduct a search for SEO help and you’ll find millions of results. SEO is a booming industry, and people with knowledge of how it works are eager to find a place in the market. Unfortunately for entrepreneurs, lawyers, and others with little experience in the industry, it’s challenging to determine which SEOs are experts and which are salesmen.

As a credible law firm or business, you can’t afford to associate yourself with an SEO “professional” that doesn’t know the rules. You could (A) end up wasting lots of money, or (B) damage your online credibility. Often, it’s both.

One industry expert relates the SEO frontier to the “Wild West” era of American history – and there are a lot of good analogies to be gleaned from this comparison. Much like the western frontier was largely unknown in the 18th and 19th centuries, there is still a lot to learn about SEO. That means so-called “experts” can take advantage of this fact and advertise their “skills” without any real understanding of how SEO works. Just keep in mind that anyone can advertise; look for reviews and testimonials when selecting an SEO professional.

The Latest Algorithm: Google Hummingbird

When searching for SEO help, it’s valuable to understand some of the basics about how search engine optimization works. While you may not have the time to study the intricate details of the practice, a fundamental understanding of what’s allowed, what’s not, and how Google’s latest algorithm works will allow you to have productive conversations with SEO candidates.

Three years ago it was Google Panda. Google Penguin was released later, in 2012, and Google Hummingbird followed. Released in October 2013, Hummingbird is the most sophisticated algorithm to date. One of the most interesting and important elements to note is the focus it puts on semantic search and local results. For lawyers and locally serving practitioners, it’s important to understand what this means. Here are a few tips to keep in mind:

  • High-quality content. Regardless of what you’re told, high-quality content is still the most important aspect of SEO. Your practice needs to focus on its core practice areas and home in on semantically rich content.
  • Contact info matters. Because Hummingbird seeks to give searchers local results, your contact information needs to be accurate and appropriately displayed. This means setting up and maintaining social media profiles, including Google Plus.
  • Keywords have their place. When discussing what strategy an SEO professional will implement for your practice, pay attention to how they discuss keywords. While they are diminishing slightly in value, they still matter. There needs to be a healthy balance.

Banding Together for Knowledge and Power

For professionals in locally serving industries, it’s important to band together to preserve and build your online presence in an ever-changing world of SEO. Instead of attempting to handle it on your own, consider joining forces with industry peers to learn how to best attack the issue and gain internet visibility.

Take, for example, Michael Ehline of Ehline Law Firm PC. In an effort to teach and share information about SEO guidelines, search practices, and information, he started the Circle of Legal Trust. This group consists of attorneys, consumers, and search experts all focused on improving ethical internet practices and helping professionals improve their understanding of how to operate in an increasingly internet-based marketing world.

For those in other locally serving industries, there are similar groups and organizations. While it’s difficult to fully understand SEO without an extensive background in the industry, it’s easy to grasp the significance of maintaining a healthy presence online. By partnering with other professionals and groups, you can ensure your practice is in good hands.

Monday, September 29, 2014

Is Your Website Being Indexed Properly by Google?

One of the most common problems experienced when trying to rank in Google is that your website is not being indexed correctly. If this is the case, it means Google is failing to access your web pages and index your site’s content effectively.
To check whether your site is efficiently crawled and listed, you will need to log into your Google Webmaster Tools and check the “Google index” tab. There you will find the total number of pages the search engine has indexed. If you see a drop in the number of these pages, you are likely to experience a decrease in traffic levels.

Finding the Reason Behind Your Indexing Issues
If you’ve taken a look at your Webmaster Tools and it’s clear that not all of your pages are being found by Google’s crawlers, now is the time to take a closer look at the possible problems Google is experiencing with your website.

Does Your Site Have Crawler Errors?
To find out if Google is indexing your site fully, begin by heading to your Google Webmaster Tools dashboard and checking your Crawler Error messages. The most likely error message you will find is a 404 HTTP status code warning, which signals that the URL cannot be found.

Other crawling errors include:
  • Robots.txt – A poorly scripted robots.txt file can be detrimental to your Google indexing. This text file is a set of instructions telling search engine crawlers which parts of your website not to crawl. If it includes the lines “User-agent: *” followed by “Disallow: /”, it tells every crawler that reads it to ‘get lost’ – including Google (see the example after this list).
  • .htaccess – This invisible file can do nasty things if incorrectly configured on your site. Most FTP clients allow you to toggle hidden/seen files so that you can access it if required.
  • Meta Tags – If you have pages that aren’t being indexed, be sure they don’t have the following meta tag in the source code: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
  • Sitemaps – If you receive a Sitemaps crawling error, it means your website sitemap is not updating properly; your old sitemap is being repeatedly sent to Google instead. When you’ve tackled any issues signalled by Webmaster Tools, generate a fresh sitemap and re-submit it.
  • URL Parameters – Google allows the option to set URL parameters when it comes to dynamic links. However, incorrect configuration of these can result in pages that you do want picked up being dropped instead.
  • DNS or Connectivity issues – If Google’s spiders simply can’t reach your server, you may encounter a crawler error. This could happen for a variety of reasons, such as your host being down for maintenance or experiencing a glitch of its own.
  • Inherited Issues – If you have bought an old domain or moved your website to an old website’s location, it is possible the previous site had a Google penalty. This will inhibit indexing of the new site, and you will have to file a reconsideration request with Google.
  • If you are considering using a historic domain for your site, be sure to take a look at its history before purchasing. You can make use of the Internet Archive’s Wayback Machine to see pages that were previously hosted on your domain.
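
For reference, here is a minimal illustration of the robots.txt point above. The first version is the kind of file that locks every crawler out of the site; the second allows full crawling. The domain and sitemap URL are placeholders:

    # Version 1 - blocks ALL crawlers (including Googlebot) from the ENTIRE site.
    # Only use this intentionally, e.g. on a staging server.
    User-agent: *
    Disallow: /

    # Version 2 - an empty Disallow rule allows crawling of the whole site.
    # The Sitemap line is optional but helps crawlers find your pages.
    User-agent: *
    Disallow:
    Sitemap: http://www.example.com/sitemap.xml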


Does Your Site Have Syntax Errors or Structural Complications?
Google is very tolerant when it comes to HTML mark-up mistakes within webpages, but it is possible that syntax errors can prevent indexing (in extreme cases). Check your site’s HTML with the W3C’s HTML Validator to see a report of errors you need to correct.

Google advises you to make your site structure as logical as possible. Every page should be reachable from at least one text link. You can use a text browser, like Lynx, to look at your site much the same way the spiders see it. Remember, the parts of your site that use frames, JavaScript, Flash, session IDs, cookies, and DHTML may be missed by crawlers.

Does Your Site Have Inbound Links?
To be indexed by Google, your website needs to have at least one quality inbound link from another website already indexed in the search engine. This is a common reason many new websites take a while to be successfully indexed.

One way to create some quick links is to update social networks with your website URL or add a link on an existing related website that you own. Social media profiles that carry high weight include: Facebook pages, Twitter profiles, Google+ profiles/pages, LinkedIn profiles, YouTube channels, and Pinterest profiles.

Offsite content is another excellent way to build links that will help your site get indexed properly. Offsite content is content relevant to your site that is hosted elsewhere, such as guest posts on other blogs in your niche. Just keep in mind, you need to make sure these external sites are high quality, as links from ‘spammy’ sites will do your website harm instead of good. The best way to ensure your links are high quality is to make sure they are ‘natural links’: links that develop as part of the dynamic nature of the internet, where other sites link to content they find valuable.

See Google’s Webmaster Guidelines for a more in-depth understanding of what it considers these to be.

Has Google Penalized You?
One of the most difficult obstacles to proper indexation by Google is a Google penalty. There are a number of reasons why you might incur a penalty from Google, but if you do not deal with the issues it raises, you may be deindexed (removed from Google’s search results).

Avoid Google Penalties by Steering Clear of The Following Techniques:
  • Automatically generating content
  • Link schemes
  • Plagiarizing or duplicating content
  • Cloaking
  • Sneaky redirects
  • Hidden links & text
  • Doorway pages
  • Content scraping
  • Affiliate programs with little content value
  • Using irrelevant keywords
  • Pages that install trojans, viruses, & other adware
  • Abusing rich snippets
  • Automating queries to Google

Recovering from Google penalties requires hard work and due diligence on your part to remove offending links; you will need to submit a reconsideration request before your site is effectively indexed and ranked once more.

Fix Your Indexing
Most of these checks are quick and easy to make, so don’t let your SEO and link building efforts go to waste – make sure your website is indexed correctly by Google. It’s surprising how many websites make some of these small mistakes, which prevent their sites from being indexed correctly. In the end, it hurts their rankings, which hurts their traffic, which hurts their sales.

Friday, September 26, 2014

SEO: How to Identify Low Quality Links

Links are the lifeblood of organic search. But the quality of those links can boost or kill a site’s rankings. This article suggests methods to determine the quality of in-bound links to your site. At the end of the article, I’ve attached an Excel spreadsheet to download, to help you evaluate links to your site.
Importance of Links

Search engine algorithms have traditionally relied heavily on links as a measure of a site’s worthiness to rank. After all, links are, essentially, digital endorsements from the linking site as to the value of the site to which it is linking.

Google was founded on this concept of links indicating value. In addition to the relevance signals that other engines used in their algorithms, Google added PageRank, a method of calculating ranking value similar to the way that citations in the scientific community can indicate the value of a piece of research.

When site owners began creating artificial methods of increasing the number of links pointing to their sites to improve their rankings, the search engines retaliated with link quality measures. Google’s Penguin algorithm is one such algorithmic strike intended to remove the ranking benefit sites can derive from poor quality links.

What Makes a Low Quality Link?

Unfortunately, the definition of a poor quality link is murky. Poor quality links come from poor quality sites. Poor quality sites tend to break the guidelines set by the search engines. Those guidelines increasingly recommend that sites need to have unique content that real people would get real value from. That’s pretty subjective coming from companies (search engines) whose algorithms are based on rules and data.

The “unique” angle is easy to ascertain: If the content on a site is scraped, borrowed, or lightly repurposed it is not unique. If the site is essentially a mashup of information available from many other sources with no additional value added, it is not unique. Thus, if links come from a site that does not have unique content — i.e., a site considered low quality — those links would be low quality as well.

Search engines can identify unique content easily because they have records of every bit of content they’ve crawled. Comparing bits and bytes to find copies is just a matter of computing power and time. For site owners, it’s more difficult and requires manual review of individual sites.

There are other known indicators of low-quality sites as well, such as overabundance of ads at the top of the page, interlinking with low-quality sites, and presence of keyword stuffing and other spam tactics. Again, many of these indicators are difficult to analyze in any scalable fashion. They remain confusing to site owners.

In the absence of hard data to measure link and site quality in a scalable way, search engine optimization professionals can use a variety of data sources that may correlate with poor site quality. Examining those data sources together can identify which sites are likely to cause link quality issues for your site’s link profile.

Data such as Google toolbar PageRank, Alexa rankings, Google indexation and link counts, and other automatable data points are unreliable at best in determining quality. In most cases, I wouldn’t even bother looking at some of these data points. However, because link quality data and SEO performance metrics for other sites are not available publicly, we need to make do with what we can collect.

These data should be used to identify potential low-quality sites and links, but not as an immediate indicator of which sites to disavow or request link removal. As we all know, earning links is hard even when you have high quality content, especially for new sites. It’s very possible that some of the sites that look poor quality based on the data signals we’ll be collecting are really just new high-quality sites, or sites that haven’t done a good job of promoting themselves yet.

While a manual review is still the only way to determine site and link quality, these data points can help determine which sites should be flagged for manual review.

A couple of reports can provide a wealth of information to sort and correlate. Receiving poor marks in several of the data types could indicate a poor quality site.

In Google Webmaster Tools, Google reports the top 1,000 domains that link to pages on your site. “Links” refers to the total number of links that domain has created pointing to any page on your site. “Linked Pages” refers to the number of pages on your site that domain has linked to. So a domain may link to 10 pages on your site, but those links may appear on every page of its own site. If the linking site has 100 pages, that’s 1,000 “links” to 10 “linked pages.”
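
To make the distinction concrete, here is a minimal Python sketch using the hypothetical numbers above. The sitewide-link interpretation is simply the scenario just described: the same handful of links repeated on every page of the linking domain.

    # Hypothetical linking domain from the example above.
    linking_site_pages = 100   # pages on the domain that links to you
    linked_pages = 10          # distinct pages on YOUR site that it links to

    # If those 10 links sit in a sitewide element (footer, sidebar, blogroll),
    # every page of the linking site repeats them:
    total_links = linking_site_pages * linked_pages   # 1,000 "links"

    links_per_linked_page = total_links / linked_pages  # 100
    print(f'"Links": {total_links}, "Linked pages": {linked_pages}, '
          f'links per linked page: {links_per_linked_page:.0f}')

    # Many "links" per "linked page" suggests sitewide rather than editorial
    # links -- one of the red flags discussed below.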

You can also download a report that shows a large sample of the exact pages linking to your site. In some cases the links are from domains not listed in the Link Domain Report, so you may want to add the domains from this report as well.

Red flags. Generally, higher numbers of “links” and “linked pages” indicate that the domain is a poor-quality site.

A spreadsheet plugin such as SeoTools for Excel turns Excel into an SEO data collector, enabling you to enter formulas that gather data from various websites.

What to use. For link quality I typically use the following data points; a sketch for combining them into a manual-review shortlist follows the cautions below.
  • Home page Google PageRank. Shows Google toolbar PageRank, which is only updated every three months and may not be accurate, but it is useful as a relative comparison. Higher numbers are better.
  • Google indexation. The number of pages Google reports as indexed for the domain. The figure is widely believed to be a fraction of the actual number, but it’s useful as a relative comparison. It’s the same as doing a site:domain.com search. Higher numbers are better.
  • Google link count. The number of links pointing to a domain according to Google. Wildly underreported, but just barely useful as a relative comparison. Same as doing a link:domain.com search. Higher numbers are better.
  • Alexa Reach. The number of Alexa toolbar users that visit the domain in a day. Higher numbers are better.
  • Alexa Link Count. The number of links to the domain according to Alexa’s data. Higher numbers are better.
  • Wikipedia entries. The number of times the domain is mentioned in Wikipedia. Higher numbers are better.
  • Facebook Likes. The number of Facebook Likes for the domain. Higher numbers are better.
  • Twitter count. The number of Twitter mentions for the domain. Higher numbers are better.

Cautions. Every cell in the spreadsheet will execute a query to another server. If you have many rows of data, this plugin can cause Excel to stop responding, and you’ll have to force it to quit in your task manager. I recommend the following steps.
  • Turn on manual calculation in the Formulas menu: Formulas > Calculation > Calculate Manually. This prevents Excel from executing the formulas every time you press enter, and will save a lot of time and frustration. Formulas will only execute when you save the document or click Calculate Now in the aforementioned options menu.
  • Paste the formulas down one column at a time in groups of 50 to 100. It seems to respond better when the new formulas are all of the same type (only Alexa Reach data, for example) than if you try to execute multiple types of data queries at once.
  • Use Paste Special. When a set of data is complete, copy it and do a Paste Special right over the same cells. That removes the formulas so they don’t have to execute again. I’d leave the formulas in the top row so you don’t have to recreate them all if you need to add more domains later.
  • Use a PC if you can, because Apple computers tend to stall out more quickly with this plug-in.
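
Once the data above has been collected, a short script can turn it into a shortlist for manual review. The following is a minimal Python sketch, assuming you have exported the finished spreadsheet to a CSV called links.csv with the (hypothetical) column names shown, including a "domain" column; the bottom-quartile cutoff and the three-metric threshold are illustrative choices, not a definitive scoring model.

    import csv

    # Hypothetical column names -- adjust to match your exported spreadsheet.
    METRICS = ["pagerank", "google_indexation", "google_links",
               "alexa_reach", "wikipedia_entries", "facebook_likes"]

    def flag_for_review(rows, min_weak_metrics=3):
        """Flag domains that score poorly on several metrics at once.

        A domain earns one point for each metric where it falls in the
        bottom quartile of the sample; domains with several points are
        candidates for manual review, not automatic disavowal.
        """
        cutoffs = {}
        for m in METRICS:
            values = sorted(float(r.get(m) or 0) for r in rows)
            cutoffs[m] = values[len(values) // 4] if values else 0.0

        flagged = []
        for r in rows:
            weak = sum(1 for m in METRICS if float(r.get(m) or 0) <= cutoffs[m])
            if weak >= min_weak_metrics:
                flagged.append((r["domain"], weak))
        return sorted(flagged, key=lambda item: -item[1])

    with open("links.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    for domain, weak in flag_for_review(rows):
        print(f"{domain}: weak on {weak} of {len(METRICS)} metrics -- review manually")

The point is to combine several weak signals before anything is flagged, in keeping with the earlier caution that no single data point should trigger a disavow or link removal request.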

Manual Quality Review

If a site has high numbers in the Google Webmaster Tools reports and low numbers in the SEO Tools data, it should be manually checked to determine if it’s a poor quality site, sending poor-quality links your way. The following are the quality signals I use for manually reviewing link quality.
  • Trust. Would you visit this site again? Do you feel confident about buying from the site or relying on its advice? Would you recommend it to your friends? If not, it’s probably low quality.
  • Source. Is this site a source of unique information or products? Does this site pull all of its content from other sites via APIs? Is it scraping its content from other sites with or without a link back to the source site? Does it feel like something you could get from a thousand other sites? If so, it’s probably low quality.
  • Ad units in first view. How many paid ad units are visible when you load the page? More than one? Or if it’s only one, does it dominate the page? If you weren’t paying close attention would it be possible to confuse the ads with unpaid content? If so, it’s probably low quality.
  • Use Searchmetrics. Enter the domain in the Searchmetrics search box to get search and social visibility, rankings, competitors, and more. It’s free, with an option to subscribe for many more features. I’ve included this in the manual review section because you have to paste each domain in separately. It does, however, provide a balancing analytical approach to the subjective nature of manual review.

Finally, when reviewing sites manually, don’t bother clicking around the site to review multiple pages. If one page is poor quality, it’s likely that they all are. In particular, the home page of a site typically represents the quality of the entire site. Download this Excel spreadsheet to help organize and evaluate links to your site.

Thursday, September 25, 2014

Panda 4.1: Google’s 27th Panda Update Is Rolling Out

Google has announced that the latest version of its Panda Update, a filter designed to keep “thin” or poor content from ranking well, has been released.
Google said in a post on Google+ that a “slow rollout” began earlier this week and will continue into next week before being complete. Google said that, depending on location, about 3% to 5% of search queries will be affected.

Anything different about this latest release? Google says it’s supposed to be more precise and will allow more high-quality small and medium-sized sites to rank better. From the post:

Based on user (and webmaster!) feedback, we’ve been able to discover a few more signals to help Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice.
New Chance For Some; New Penalty For Others
The rollout means anyone who was penalized by Panda in the last update has a chance to emerge, if they made the right changes. So if you were hit by Panda and made alterations to your site, you’ll know by the end of next week whether those changes were good enough: if they were, you should see an increase in traffic.

The rollout also means that new sites not previously hit by Panda might get impacted. If you’ve seen a sudden traffic drop from Google this week, or note one in the coming days, then this latest Panda Update is likely to blame.

About That Number
Why are we calling it Panda 4.1? Well, Google itself called the last one Panda 4.0 and deemed it a major update. This isn’t as big of a change, so we’re going with Panda 4.1.

We actually prefer to number these updates in the order that they’ve happened, because trying to determine whether something is a “major” or “minor” Panda Update is imprecise and leads to numbering absurdities like having a Panda 3.92 Update.

But since Google called the last one Panda 4.0, we went with that name — and we’ll continue on with the old-fashioned numbering system unless it gets absurd again.

For the record, here’s the list of confirmed Panda Updates, with some of the major changes called out with their AKA (also known as) names:

Panda Update 1, AKA Panda 1.0, Feb. 24, 2011 (11.8% of queries; announced; English in US only)
Panda Update 2, AKA Panda 2.0, April 11, 2011 (2% of queries; announced; rolled out in English internationally)
Panda Update 3, May 10, 2011 (no change given; confirmed, not announced)
Panda Update 4, June 16, 2011 (no change given; confirmed, not announced)
Panda Update 5, July 23, 2011 (no change given; confirmed, not announced)
Panda Update 6, Aug. 12, 2011 (6-9% of queries in many non-English languages; announced)
Panda Update 7, Sept. 28, 2011 (no change given; confirmed, not announced)
Panda Update 8, AKA Panda 3.0, Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
Panda Update 9, Nov. 18, 2011 (less than 1% of queries; announced)
Panda Update 10, Jan. 18, 2012 (no change given; confirmed, not announced)
Panda Update 11, Feb. 27, 2012 (no change given; announced)
Panda Update 12, March 23, 2012 (about 1.6% of queries impacted; announced)
Panda Update 13, April 19, 2012 (no change given; belatedly revealed)
Panda Update 14, April 27, 2012 (no change given; confirmed; first update within days of another)
Panda Update 15, June 9, 2012 (1% of queries; belatedly announced)
Panda Update 16, June 25, 2012 (about 1% of queries; announced)
Panda Update 17, July 24, 2012 (about 1% of queries; announced)
Panda Update 18, Aug. 20, 2012 (about 1% of queries; belatedly announced)
Panda Update 19, Sept. 18, 2012 (less than 0.7% of queries; announced)
Panda Update 20, Sept. 27, 2012 (2.4% of English queries impacted; belatedly announced)
Panda Update 21, Nov. 5, 2012 (1.1% of English-language queries in US; 0.4% worldwide; confirmed, not announced)
Panda Update 22, Nov. 21, 2012 (0.8% of English queries were affected; confirmed, not announced)
Panda Update 23, Dec. 21, 2012 (1.3% of English queries were affected; confirmed, announced)
Panda Update 24, Jan. 22, 2013 (1.2% of English queries were affected; confirmed, announced)
Panda Update 25, March 15, 2013 (confirmed as coming; not confirmed as having happened)
Panda Update 26, AKA Panda 4.0, May 20, 2014 (7.5% of English queries were affected; confirmed, announced)
Panda Update 27, AKA Panda 4.1, Sept. 25, 2014 (3-5% of queries were affected; confirmed, announced)

The latest update comes four months after the last, which suggests that this might be a new quarterly cycle that we’re on. Panda had been updated on a roughly monthly basis during 2012. In 2013, most of the year saw no update at all.

Of course, there could have been unannounced releases of Panda. The list above includes only those that have been confirmed by Google.

Wednesday, September 24, 2014

How The Apple Watch Could Change The World Of Local SEO

Apple recently unveiled the iPhone 6, as well as its next gadget, which will arrive in stores in early 2015—the Apple Watch. Other companies have experimented with SmartWatches in the past, but Apple’s foray into wearable smart technology could mark the beginning of a new tech era and some radical changes for the world of local SEO.
If you represent a local business trying to boost online visibility for your brand, it’s time to start looking at how the Apple Watch is changing the rules and think about what you can do to stay ahead of your competition.

The Apple Watch: Bringing SmartWatches Back Into the Spotlight

SmartWatches have occasionally popped up on the market over the past few years, but none of them have really caught on with the public. Wearable technology, like SmartWatches and Google Glass, has generated significant interest, but only on a theoretical level thus far; the devices have generated word-of-mouth buzz, but sales and reviews have been lukewarm at best.

The Apple Watch seeks to change that landscape and bring wearable technology into public favor. Featuring the now-familiar voice-activated digital assistant Siri, Apple Pay, Apple Music, and surely thousands of downloadable apps, the Apple Watch is stepping ahead of its SmartWatch competitors.

The most popular feature of the new device, and the most significant for local SEO, is its new mapping feature. Rather than showing a map and speaking audible directions, like smartphones and older navigation systems, the SmartWatch will use a system known as “haptic feedback” to provide hands-free, eyes-free directions with directional buzzes. Time will tell how functional and practical this system is for navigation, but the early buzz seems to indicate overwhelming excitement for the new product. If Apple delivers the same level of quality in its SmartWatch as it has with its many generations of iPhone, it could be a true game changer.

Siri’s Relationship with Bing

Local search today is still dependent on search engines. Google is by far the most popular search engine, especially for local results, so most SEO campaigns cater specifically to Google. The popularity of smartphones keeps rising, leading many to perform searches on their mobile devices rather than on their home computer. The result of these trends is that search queries are changing; rather than typing search queries on a keyboard, users are speaking the search queries into their smartphones’ microphones.

With the dawn of the Apple Watch, Siri may accelerate this trend. The Apple Watch’s small screen and location on the wrist may make it more difficult to use your fingers to input data, encouraging users to speak search queries rather than type them.

Tell Siri to search for a “burger restaurant,” and she’ll populate a handful of local results. But currently, Siri uses Bing to populate that information. That means that local search marketers, in order to capture the attention of Apple Watch users, will need to adjust their strategies to focus on Bing Local results (instead of just relying on Google). Fortunately, many fundamental strategies will still apply—such as optimizing listings on local directories like Yelp—but Bing may soon see a surge in popularity due to Siri’s reliance on it.

Optimizing for Apple Maps

The Apple Watch will come with Apple Maps as its default navigation system. While many iPhone users have opted to use Google Maps on their devices instead, the Apple Watch could foster a new generation of Apple Maps users. That means local search marketers will need to take extra steps to ensure their businesses can be found on Apple Maps.

Apple Maps treats local businesses differently than its contemporaries. It doesn’t offer a “claim your business” style system, like Google does, that allows business owners to identify themselves and present accurate information for their directory. Apple Maps does provide an opportunity to report mistakes in listings, but this is not as accurate, transparent, or efficient as the similar system that Yelp! offers.

Apple Maps does pull at least some information from local directories and other information providers such as Yelp!, TomTom, Factual, and Localeze, so it’s possible to improve your listing on Apple Maps simply by updating your information on third party sites. This is already a best practice for local marketers, but it will take some extra effort to claim and update your information on some of the lesser-known third party local platforms.

“Super Local” Searches

Local search results are already impressive; Google can detect your general location when you search for, say, “Mexican restaurants,” and show you a list of Mexican restaurants near your current location (usually based on your IP address). While the notion is speculative for the time being, it seems reasonable that the onset of SmartWatch popularity could give rise to a new level of local search using GPS location information. Instead of focusing on results for a given query within a city, the SmartWatch could give you results within a given city block.

Again, this “super local” search is merely speculative at this point, but it pays to look to the future. Optimizing for a very specific crowd could eventually become more important than optimizing for a city or region.

Mobile Coupons and User Engagement

Mobile coupons have already become popular with smartphones, and interactive elements like QR codes have given smartphone users a chance to use their technology in real life for some kind of benefit (like a discount or more information). This trend will increase in sophistication as the Apple Watch arrives on the scene.

Users will demand even more immediacy, so if you can find a way to cater to those users faster than your competition, you’ll be on top of your local competitive market. While there are currently no details on specific offers local retailers can make to serve the Apple Watch crowd, it’s an idea to keep in the back of your mind as you rethink your local optimization strategy.

Overall, the fundamentals of local search will remain the same—ensure accurate information across all your local directories and give users an excellent mobile experience—but the Apple Watch will mark the beginning of a new series of trends. Business owners and marketers will have to spend more time optimizing for Bing and Apple Maps specifically, and will have to be prepared for the onset of super-specific local searches. Keep an eye out for more details about the Apple Watch as we get closer to its 2015 release.

Tuesday, September 23, 2014

Humanizing Your SEO Keywords

The Death Of Traditional SEO

Traditional SEO is dying and has actually been dying for quite some time now. Google, holding the lion’s share of the search market, has been on a quest to humanize big data. The company wants a world where our search technology values and prefers quality content over the “black hat SEO”-infused garbage that clutters the digital world.

You might not have noticed it. This change came in the form of penguins, hummingbirds, pandas, etc. The major algorithm updates Google has pushed over recent years have all focused the search engine’s sights on high-quality, human content. As a result of these updates, it has become harder and harder to rely on traditional SEO methods. In fact, many experts and practitioners have made radical changes to their SEO approach.

Whether this is all for the better is still up for debate. What isn’t up for debate, however, is the need to adapt.
The Death Of The Keyword

In the past, one “go-to” SEO technique was the keyword: a simple word or phrase to include in the content and metadata of a site. We did this to get noticed by search engine crawlers: add a keyword here and there in the body, title, meta tags, etc., and you’d be good. Unfortunately, this approach became abusive. It became too easy to create content that was effective for search crawlers, but not nearly as effective for human readers.

In Google’s string of algorithm updates, the preference for traditional keywords has decreased. In fact, Google has been known to penalize sites where this traditional SEO tactic has been used at the expense of quality content. Because of this change, a new approach is required to fill the gap left in the keyword’s wake.

Humanizing Keywords

It turns out you don’t have to radically change your approach to SEO keywords. Instead, you have to use keywords in a more human manner.

Dr. Andy Williams, author of “SEO 2014 & Beyond,” asks a simple question: “Does your article (or site) sound as if it was written by an expert?” The answer to this question is a litmus test for how human your content appears to Google. The more human, the better. The key lies in the choice of keywords, the variety of keywords, and how those keywords are used.

Niche Vocabulary

If you open up a textbook on any subject, you will find a natural list of terms and words that are often used when talking about that subject. In many ways, these words make up a unique language for the subject. To humanize your content, you have to identify and use this unique subject vocabulary.

  • Discover the Language – just like with traditional SEO keywords, the first step is to figure out what words and terms you should be using for your topic. If you are a subject matter expert writing in your expert field, then chances are you naturally know this niche vocabulary. If you aren’t familiar with the language, however, then some research will be required. Search for a list of common terms, or analyze a few other articles written on the same topic to see the language others use (a rough sketch of this step follows the list).
  • Use the Language – once you know the niche vocabulary, you have to use it in a natural way. The days of randomly seeding an article with loosely connected keywords are gone. Your content must be written in the way an expert would write or speak about it. The easier it is for your audience to read it, the more Google will likely value it.
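
As a rough illustration of the “Discover the Language” step, here is a minimal Python sketch that counts the terms a handful of sample articles on your topic keep repeating. The filenames and the tiny stop-word list are placeholders; treat the output as a starting point for your niche vocabulary, not a finished list.

    import re
    from collections import Counter

    # Placeholder filenames: a few articles on your topic, saved as plain text.
    SAMPLE_ARTICLES = ["article1.txt", "article2.txt", "article3.txt"]

    # A tiny stop-word list; a real one would be much longer (or use a library's list).
    STOP_WORDS = {
        "the", "and", "that", "this", "with", "your", "from", "have",
        "will", "when", "what", "they", "their", "about", "more", "you",
    }

    def niche_terms(paths, top_n=25):
        """Return the most frequent non-stop-word terms across the sample articles."""
        counts = Counter()
        for path in paths:
            with open(path, encoding="utf-8") as f:
                words = re.findall(r"[a-z]+", f.read().lower())
            counts.update(w for w in words if len(w) > 3 and w not in STOP_WORDS)
        return counts.most_common(top_n)

    if __name__ == "__main__":
        for term, freq in niche_terms(SAMPLE_ARTICLES):
            print(f"{term}: {freq}")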

The reason why all this works is simple: niche vocabulary is natural vocabulary. Before, keywords that might have been useful for SEO wouldn’t necessarily have been the most relevant to the subject. Conversely, terms that were most relevant to the topic might not have been the most obvious choices as traditional SEO keywords.

A Public Speaking Case Study

As a speech educator, I’m naturally writing articles about public speaking to share with the world. In the past, my traditional SEO keyword approach would have been straightforward: find a keyword generator, type in “public speaking”, and then pepper my content with the top five keywords that came up. Sometimes this approach would have resulted in a natural sounding article, but more often than not, my SEO trickery was blatant to human readers.

Given that I’ve been doing this whole “public speaking” thing for over 12 years, it’s safe to assume I’m pretty familiar with the niche vocabulary. Once I began to realize and embrace the changes that Google brought with its algorithm updates, the process of creating more optimized content actually became easier. All I had to do was write articles in the same language I would use elsewhere with real humans. Suddenly, less relevant keywords (e.g., “fear of public speaking”) that I used to mention many times in an article disappeared. Relevant terms (e.g., “delivery,” “attention grabber,” “ethos,” “topical organization”) that would never have come up in a keyword search took on more importance, because they related to the actual topic.

Embrace The Change


Niche vocabulary not only makes your content more search engine friendly, it also makes your life easier. It allows you to write in a way that is natural for you, your readers, and the search engines. So, the next time you’re tempted to use that keyword generator, take a breath and just write. Chances are you will create something more optimized with a lot less work.