Thursday, October 9, 2014

Your Guide to Google's Panda 4.1 Algorithm

Google's Panda algorithm update has been shaking up the world of SEO since its first iteration back in 2011. Over the course of 26 confirmed updates and refreshes, the algorithm wizards at Google have been refining the content-focused search engine update and rolling their new features out gradually. Now, the 27th update has arrived, and it's significant enough to be called 4.1.
I've written this guide to summarize the most important elements of the Panda 4.1 update, including the motivation behind its release and the potential consequences for online businesses.

Understanding the Landscape

Before I get too deep into the mechanics of the new Panda update, it's worth recapping the stages of development that have led to this point.

The first Panda update, known as Panda 1.0, was released in February of 2011, affecting nearly 12 percent of all search queries and making it one of the most significant algorithm changes in Google's history. Intended to weed out low-quality content and reward sites with high-quality, actively updated content, the Panda update radically shifted SEO strategies away from keyword-stuffing gimmicks and toward user-focused content.

Following up on that update, Google released Panda 2.0 in April of 2011, affecting about 2 percent of all search queries. Several updates followed on a monthly basis, until the next major update--Panda 3.0--in October of 2011, affecting another 2 percent of all queries. Monthly updates resumed over the course of 2012, but in 2013, those updates slowed to a crawl, with the only officially announced updates hitting in January and March of 2013.

The next major update was in May of 2014, a rollout known as Panda 4.0, which affected approximately 7.5 percent of all English search queries, adding more features to reward high-quality content and detect instances of low-quality writing. Now, we're looking at the rollout of a major follow-up to that update: Panda 4.1.

Enter Panda 4.1

The rollout began on September 25, 2014, but according to an official Google+ update from Pierre Far, it was a slow, gradual one, planned to end in the first week of October. According to Far, the update continues in the tradition of previous Panda updates by identifying low-quality content with more precision. Google has evidently added a few new signals of low quality to its algorithm--but of course, it's not making those signals public.

Depending on the geographic location of the site in question, Panda 4.1 is estimated to impact between 3 and 5 percent of search queries, establishing it as more impactful than the minor monthly updates that occurred throughout 2011 and 2012, but less significant than the flagship updates of 1.0, 2.0, 3.0, and 4.0.

Good news and bad news

According to Pierre Far, the update should be good for small- to medium-sized businesses that prioritize the quality of their content. Ideally, the top search results should become more diversely populated, with a wider range of site sizes represented. This means site owners who have been stepping up their efforts to produce more high-quality content should be rewarded with higher rankings as a result of the update.

However, as with all Google updates, there's also a chance to drop in the rankings as a result of Panda 4.1. If any of the "new signals" that determine the quality of the content on your site tell Google that you're not up to their standards, you could suffer a drop--even if you're doing everything else right.

Who's affected?

Early analyses are already illustrating a clear pattern. According to SearchMetrics, it looks like the biggest winners of the update are news sites, content-based sites, and download portals. The logic here is that because these sites are regularly updated with new, (presumably) high-quality information, they're getting an extra boost from whichever new ranking signals were added to the Panda code.

On the other hand, the sites that have been hit hardest so far appear to be lyric sites, gaming sites, and some medical content sites. Put bluntly, these sites tend to have thin, repeated, or aggregated content--which, to Google, translates to unoriginal or low-quality work. Lyrics sites tend to feature the exact same content as their competitors (since song lyrics don't change from site to site), gaming sites don't have as much content as other platforms, and medical content sites often repackage existing stores of content into aggregated collections of information.

Specific examples of sites that were hit hard include medterms.com, which saw its search visibility drop by almost 40 percent, as well as hallmark.com, office365.com, and eHow.

How to Recover if You've Been Hit by Panda 4.1

If you've noticed an increase in rankings, then congratulations--the Panda 4.1 update has been working in your favor! If, however, you've noticed your organic visits or your search visibility dropping over the course of the last couple weeks, Panda 4.1 could be the culprit. It will take some time before you're able to recover fully, but you can start that path with these steps:

1. Review and Renew Your Content Strategy

The best thing you can do is perform a content audit of your current site. Take a look at some of your competitors--preferably, some who are currently ranking higher than you as of this update. What are their content strategies doing that yours is not? What is yours doing that theirs are not? Don't simply mimic their strategies. Instead, try to learn from them and incorporate some new ideas into your campaign.

How often are you updating your blog? How many ideas are original? What types of content appear to be the most useful, and are you providing those? How in-depth does your content get? These are all important questions to answer as you plan out the next few months of your campaign.

2. Delete Duplicate and Thin Content

From what we've seen so far, it looks like Panda 4.1 has hit content aggregators and duplicators the hardest. Straightforward plagiarism and content syndication were dealt with by earlier Panda updates, but Panda 4.1 has become more sophisticated. Are any of your content sources basic rewordings of other articles? Are many of them quote-based, or reliant on outside sources? If so, it's best to get rid of them and focus on creating unique, original content. The sketch below shows one way to flag candidates for review automatically.
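
If you want to triage a large site before reading every page, a few lines of Python can flag likely offenders. This is a minimal sketch, not a replica of Google's detection: the page texts are hypothetical placeholders, and the thresholds are arbitrary starting points you would tune.

    import difflib
    from itertools import combinations

    # Hypothetical corpus: page URL -> extracted body text.
    pages = {
        "/blog/post-a": "How to choose running shoes for beginners, step by step.",
        "/blog/post-b": "Choosing running shoes for beginners, explained step by step.",
        "/blog/post-c": "Our store hours, directions, and contact details.",
    }

    THIN_WORDS = 300    # pages below this word count get flagged as thin
    SIMILARITY = 0.80   # pairs above this ratio get flagged as near-duplicates

    # Flag thin pages by raw word count.
    for url, text in pages.items():
        if len(text.split()) < THIN_WORDS:
            print(f"THIN: {url} ({len(text.split())} words)")

    # Flag near-duplicate pairs with a simple sequence-similarity ratio.
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        ratio = difflib.SequenceMatcher(None, text_a, text_b).ratio()
        if ratio > SIMILARITY:
            print(f"NEAR-DUPLICATE: {url_a} vs {url_b} ({ratio:.0%} similar)")

Anything the script flags still deserves a human read; short pages and legitimate quotations will trigger false positives.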

What to Look for in the Future

This most recent update comes approximately four months after the last one we saw, back in May. Throughout 2012, Panda was treated with near-monthly updates, but if this new spacing is any indication, Panda could be on a quarterly updating schedule--at least for the time being. That would put the next update in December of 2014 or January of 2015, and it could carry an additional set of ranking signals.

It's certain that Google will continue to update their algorithm with more sophisticated signals and ranking formulas; it's only a matter of when. But Panda 4.1 is continuing a several-year tradition of rewarding high-quality original content, and it's likely that this trend will only continue. The best thing you can do is refine your content creation strategy and make sure your blog is as original, unique, valuable, and well-written as possible. Publish new, high-quality content regularly, and you'll put yourself in a perfect position for a boost when Panda 4.2--or whatever the next update is--starts rolling out.

Tuesday, September 30, 2014

A Guide To SEO For Entrepreneurs In Locally Serving Industries

For personal injury lawyers, chiropractors, dentists, and other practitioners in locally serving businesses, SEO plays a major part in attracting clients and achieving success. Unfortunately, you can mess up in a heartbeat if you aren’t careful. Let’s look at how to appropriately deal with SEO in ways that are pleasing to both search engines and your audience.
The Value of SEO for Locally Serving Industries

According to Lawyernomics, the second most popular way to find an attorney is via Google, Bing, or Yahoo. A healthy 21.9% start their search in the little text box on their browser’s home page. Another 10.5% look elsewhere on the internet. That means nearly one in three clients is starting their search for an attorney online.

While those numbers are specifically geared towards lawyers, the statistics for other local industries are similar. Whether you’re a chiropractor, dentist, or some other practitioner, you can expect a large percentage of your new client traffic to come from online sources – specifically search engines.

The Difficulty of Understanding SEO

Sounds pretty easy, right? Invest heavily in SEO, and you’ll begin to see your business grow and expand. If only it were that simple. Like anything else in the business world, it takes hard work to be successful.

The trouble with tapping into SEO is that Google and other search engines are continually changing their algorithms and rules. What’s true today may not be true tomorrow. They are free to update as they please – and they do quite frequently. As a result, it’s almost impossible for practitioners with full-time jobs to learn the best SEO practices in their free time. As soon as you learn how to do something, the rules will change. This is where SEO professionals come into play.

Investing in Professional SEO Help

Conduct a search for SEO help and you’ll find millions of results. SEO is a booming industry, and people with knowledge of how it works are eager to find a place in the market. Unfortunately for entrepreneurs, lawyers, and others with little experience in the industry, it’s challenging to determine which SEOs are experts and which are salesmen.

As a credible law firm or business, you can’t afford to associate yourself with an SEO “professional” who doesn’t know the rules. You could (A) end up wasting lots of money, or (B) damage your online credibility. Often, it’s both.

One industry expert relates the SEO frontier to the “Wild West Era” of American history – and there are a lot of good analogies to be gleaned from this comparison. Much like the western frontier was largely unknown in the 18th and 19th centuries, there is still a lot to learn about SEO. That means so-called “experts” will take advantage of this fact and advertise their “skills” without any real understanding of how it works. Keep in mind that anyone can advertise; look for reviews and testimonials when selecting an SEO professional.

The Latest Algorithm: Google Hummingbird

When searching for SEO help, it’s valuable to understand some of the basics about how search engine optimization works. While you may not have the time to study the intricate details of the practice, a fundamental understanding of what’s legal, what’s not, and how Google’s latest algorithm works will allow you to have productive conversations with SEO candidates.

Three years ago it was Google Panda. Google Penguin followed in 2012, and then came Google Hummingbird. Released in October 2013, Hummingbird is the most sophisticated algorithm to date. One of the most interesting and important elements to note is the focus it puts on semantic search and local results. For lawyers and locally serving practitioners, it’s important to understand what this means. Here are a few tips to keep in mind:

  • High-quality content. Regardless of what you’re told, high-quality content is still the most important aspect of SEO. Your practice needs to focus on its core practice areas and home in on semantically pleasing content.
  • Contact info matters. Because Hummingbird seeks to give searchers local results, your contact information needs to be accurate and appropriately displayed. This means setting up and maintaining social media profiles, including Google Plus.
  • Keywords have their place. When discussing what strategy an SEO professional will implement for your practice, pay attention to how they discuss keywords. While they are diminishing slightly in value, they still matter. There needs to be a healthy balance.

Banding Together for Knowledge and Power

For professionals in locally serving industries, it’s important to band together to preserve and build your online presence in an ever-changing world of SEO. Instead of attempting to handle it on your own, consider joining forces with industry peers to learn how to best attack the issue and gain internet visibility.

Take, for example, Michael Ehline of Ehline Law Firm PC. In an effort to teach and share information about SEO guidelines, search practices, and information, he started the Circle of Legal Trust. This group consists of attorneys, consumers, and search experts all focused on improving ethical internet practices and helping professionals improve their understanding of how to operate in an increasingly internet-based marketing world.

For those in other locally serving industries, there are similar groups and organizations. While it’s difficult to fully understand SEO without an extensive background in the industry, it’s easy to grasp the significance of maintaining a healthy presence online. By partnering with other professionals and groups, you can ensure your practice is in good hands.

Monday, September 29, 2014

Is Your Website Being Indexed Properly by Google?

One of the most common problems experienced when trying to rank in Google is that your website is not being indexed correctly. If this is the case, it means Google is failing to access your web pages and index your site’s content effectively.
To check whether your site is efficiently crawled and listed, you will need to log into your Google Webmaster Tools and check the “Google index” tab. There you will find the total number of pages the search engine has indexed. If you see a drop in the number of these pages, you are likely to experience a decrease in traffic levels.

Finding the Reason Behind Your Indexing Issues
If you’ve taken a look at your Webmaster Tools and it’s clear that not all your pages are being found by Google’s crawlers, now is the time to take a closer look at the possible problems Google is experiencing with your website.

Does Your Site Have Crawler Errors?
To find out if Google is indexing your site fully, begin by heading to your Google Webmaster Tools dashboard and checking your Crawler Error messages. The most likely error message you will find is a 404 HTTP Status Code warning. It signals that the URL cannot be found.
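
You can also spot-check important URLs yourself. Here is a minimal sketch, assuming Python with the requests library installed; the URL list is a hypothetical placeholder:

    import requests

    # Hypothetical list of URLs to spot-check.
    urls = [
        "http://www.example.com/",
        "http://www.example.com/old-page",
    ]

    for url in urls:
        try:
            # HEAD fetches just the status code, without downloading the body.
            response = requests.head(url, allow_redirects=True, timeout=10)
            print(url, response.status_code)
        except requests.RequestException as exc:
            print(url, "connection error:", exc)  # e.g., DNS or server trouble

A 404 here matches the warning Webmaster Tools would show; connection errors point to the DNS or connectivity issues described below.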

Other crawling errors include:
  • Robots.txt – A poorly scripted robots.txt file can be detrimental to your Google indexing. This text file is a set of instructions telling a search engine crawler not to index parts of your website. If it includes a block such as “User-agent: *” followed by “Disallow: /”, it tells every crawler that encounters it to ‘get lost’ – including Google (see the example after this list).
  • .htaccess – This invisible file can do nasty things if incorrectly configured on your site. Most FTP clients allow you to toggle the display of hidden files so that you can access it if required.
  • Meta Tags – If you have pages that aren’t being indexed, be sure they don’t have the following meta tags in the source code: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
  • Sitemaps – If you receive a Sitemaps crawling error, it means your website sitemap is not updating properly; your old sitemap is being repeatedly sent to Google instead. When you’ve tackled any issues signalled by the Webmaster Tools, make sure you run a fresh sitemap and re-submit it.
  • URL Parameters – Google allows the option to set URL parameters when it comes to dynamic links. However, incorrect configuration of these can result in pages that you do want picked up being dropped instead.
  • DNS or Connectivity issues – If Google’s spiders simply can’t reach your server, you may encounter a crawler error. This could happen for a variety of reasons, such as your host being down for maintenance or suffering a glitch of its own.
  • Inherited Issues – If you have bought an old domain or moved your website to an old website’s location, it is possible the previous site had a Google penalty. This will inhibit indexing of the new site, and you will have to file a reconsideration request with Google.
  • If you are considering using a historic domain for your site, be sure to take a look at its history before purchasing. You can make use of the Internet Archive’s Wayback Machine to see pages that were previously hosted on your domain.
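
To make the robots.txt pitfall above concrete, here is a minimal illustration; the /private/ path is a hypothetical placeholder:

    # BAD: tells every crawler to stay out of the entire site
    User-agent: *
    Disallow: /

    # SAFER: blocks only a directory you genuinely want hidden
    User-agent: *
    Disallow: /private/

The first block is the ‘get lost’ directive described above; the second keeps crawlers out of one directory while leaving the rest of the site indexable.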


Does Your Site Have Syntax Errors or Structural Complications?
Google is very tolerant when it comes to HTML mark-up mistakes within webpages, but it is possible that syntax errors can prevent indexing (in extreme cases). Check your site’s HTML with the W3C’s HTML Validator to see a report of errors you need to correct.

Google advises you to make your site structure as logical as possible. Every page should be reachable from at least one static text link. You can use a text browser, like Lynx, to look at your site much the same way the spiders see it. Remember, the parts of your site that use frames, JavaScript, Flash, session IDs, cookies, and DHTML may be missed by crawlers.

Does Your Site Have Inbound Links?
To be indexed with Google, your website needs to have at least one quality inbound link from another website already indexed in the search engine. This is a common reason it takes a lot of new websites a while to be successfully indexed.

One way to create some quick links is to update social networks with your website URL or add a link on an existing related website that you own. Social media profiles that carry high weight include: Facebook pages, Twitter profiles, Google+ profiles/pages, LinkedIn profiles, YouTube channels, and Pinterest profiles.

Offsite content is another excellent way to build links that will help your site get indexed properly. Offsite content is content relevant to your site that is hosted elsewhere, such as guest posts on other blogs in your niche. Just keep in mind, you need to make sure these external sites are all high quality, as links from ‘spammy’ sites will do your website harm instead of good. The best way to keep your links high quality is to earn ‘natural links’: links that develop as part of the dynamic nature of the internet, where other sites link to content they find valuable.

See Google’s Webmaster Guidelines for a more in-depth understanding of what they consider these to be.

Has Google Penalized You?
One of the most difficult obstacles to proper indexation is a Google penalty. There are a number of reasons why you might incur a penalty from Google, but if you do not deal with the issues raised, you may be deindexed (removed from the search index).

Avoid Google Penalties by Steering Clear of The Following Techniques:
  • Automatically generating content
  • Link schemes
  • Plagiarizing or duplicating content
  • Cloaking
  • Sneaky redirects
  • Hidden links & text
  • Doorway pages
  • Content scraping
  • Affiliate programs with little content value
  • Using irrelevant keywords
  • Pages that install trojans, viruses, & other adware
  • Abusing rich snippets
  • Automating queries to Google

Recovering from Google penalties requires hard work and due diligence on your part to remove offending links; you will need to submit a reconsideration request before your site is effectively indexed and ranked once more.

Fix Your Indexing
Most of these checks are quick and easy to make, so don’t let your SEO and link building efforts go to waste – make sure your website is indexed correctly by Google. It’s surprising how many websites make small mistakes that prevent them from being indexed correctly. In the end, that hurts their rankings, which hurts their traffic, which hurts their sales.

Friday, September 26, 2014

SEO: How to Identify Low Quality Links

Links are the lifeblood of organic search. But the quality of those links can boost or kill a site’s rankings. This article suggests methods to determine the quality of in-bound links to your site. At the end of the article, I’ve attached an Excel spreadsheet to download, to help you evaluate links to your site.
Importance of Links

Search engine algorithms have traditionally relied heavily on links as a measure of a site’s worthiness to rank. After all, links are, essentially, digital endorsements from the linking site as to the value of the site to which it is linking.

Google was founded on this concept of links indicating value. In addition to the relevance signals that other engines used in their algorithms, Google added PageRank, a method of calculating ranking value similar to the way that citations in the scientific community can indicate the value of a piece of research.

When site owners began creating artificial methods of increasing the number of links pointing to their sites to improve their rankings, the search engines retaliated with link quality measures. Google’s Penguin algorithm is one such algorithmic strike intended to remove the ranking benefit sites can derive from poor quality links.

What Makes a Low Quality Link?

Unfortunately, the definition of a poor quality link is murky. Poor quality links come from poor quality sites. Poor quality sites tend to break the guidelines set by the search engines. Those guidelines increasingly recommend that sites need to have unique content that real people would get real value from. That’s pretty subjective coming from companies (search engines) whose algorithms are based on rules and data.

The “unique” angle is easy to ascertain: If the content on a site is scraped, borrowed, or lightly repurposed, it is not unique. If the site is essentially a mashup of information available from many other sources with no additional value added, it is not unique. Thus, if links come from a site that does not have unique content — i.e., a site considered low quality — those links would be low quality as well.

Search engines can identify unique content easily because they have records of every bit of content they’ve crawled. Comparing bits and bytes to find copies is just a matter of computing power and time. For site owners, it’s more difficult and requires manual review of individual sites.
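
To illustrate why copy detection is tractable at scale, here is a minimal Python sketch of one classic approach, comparing word “shingles.” This is an illustrative technique, not Google’s actual method, and the sample strings are hypothetical:

    def shingles(text, k=5):
        """Break text into overlapping k-word 'shingles'."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def jaccard(a, b):
        """Jaccard similarity of two shingle sets."""
        return len(a & b) / len(a | b) if (a or b) else 0.0

    original = "song lyrics do not change from site to site so the text is identical"
    copy = "song lyrics do not change from site to site and the text is identical"

    overlap = jaccard(shingles(original), shingles(copy))
    print(f"{overlap:.0%} shingle overlap")  # high overlap suggests copied content

Because shingle sets can be hashed and compared cheaply, checking a page against a large index really is, as noted above, mostly a matter of computing power and time.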

There are other known indicators of low-quality sites as well, such as overabundance of ads at the top of the page, interlinking with low-quality sites, and presence of keyword stuffing and other spam tactics. Again, many of these indicators are difficult to analyze in any scalable fashion. They remain confusing to site owners.

In the absence of hard data to measure link and site quality in a scalable way, search engine optimization professionals can use a variety of data sources that may correlate with poor site quality. Examining those data sources together can identify which sites are likely to cause link quality issues for your site’s link profile.

Data such as Google toolbar PageRank, Alexa rankings, Google indexation and link counts, and other automatable data are unreliable at best in determining quality. In most cases, I wouldn’t even bother looking at some of these data points. However, because link quality data and SEO performance metrics for other sites are not available publicly, we need to make do with what we can collect.

These data should be used to identify potential low-quality sites and links, but not as an immediate indicator of which sites to disavow or request link removal. As we all know, earning links is hard even when you have high quality content, especially for new sites. It’s very possible that some of the sites that look poor quality based on the data signals we’ll be collecting are really just new high-quality sites, or sites that haven’t done a good job of promoting themselves yet.

While a manual review is still the only way to determine site and link quality, these data points can help determine which sites should be flagged for manual review.

A couple of reports can provide a wealth of information to sort and correlate. Receiving poor marks in several of the data types could indicate a poor quality site.

In its Webmaster Tools link report, Google lists the top 1,000 domains that link to pages on your site. “Links” refers to the total number of links that domain has created pointing to any page on your site. “Linked Pages” refers to the number of pages that domain has linked to. So a domain may link to 10 pages on your site, but those links are on every page of their own site. If the linking site has 100 pages, that’s 1,000 “links” to 10 “linked pages.”

You can also download this report that shows a large sample of the exact pages linking to your site. In some cases the links are from domains not listed in the Link Domain Report, so you may want to add the domains from this report also.

Red flags. Generally, higher numbers of “links” and “linked pages” indicate that the domain is a poor-quality site. The sketch below shows one way to compute a simple red-flag ratio.
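
A sitewide footer or sidebar link produces many “links” to few “linked pages,” so the ratio between the two is a quick screen. A minimal Python sketch, with hypothetical report rows and an arbitrary threshold:

    # Hypothetical rows from the link-domains report:
    # (domain, total links to your site, distinct pages linked).
    report = [
        ("goodblog.example", 12, 8),
        ("spammydirectory.example", 1000, 10),  # 100 links per linked page
    ]

    for domain, links, linked_pages in report:
        ratio = links / linked_pages
        # Very high links-per-page ratios often mean sitewide footer or
        # sidebar links -- worth flagging for manual review, not auto-disavow.
        if ratio > 50:
            print(f"FLAG: {domain} ({links} links to {linked_pages} pages)")

As stressed throughout this article, a flag is a prompt for manual review, not proof of a bad link.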

A plugin such as SeoTools for Excel turns Excel into an SEO data collector, enabling you to enter formulas that gather data from various websites.

What to use. For link quality I typically use the following.
  • Home page Google PageRank. Shows Google toolbar PageRank, which is only updated every three months and may not show accurate data, but it is useful as a relative comparison. Higher numbers are better.
  • Google indexation. The number of pages Google reports as indexed for the domain. The figure reported by Google is widely believed to be a fraction of the actual number, but it’s useful as a relative comparison. It’s the same as doing a site:domain.com search. Higher numbers are better.
  • Google link count. The number of links pointing to a domain according to Google. Wildly underreported, but just barely useful as a relative comparison. Same as doing a link:domain.com search. Higher numbers are better.
  • Alexa Reach. The number of Alexa toolbar users that visit the domain in a day. Higher numbers are better.
  • Alexa Link Count. The number of links to the domain according to Alexa’s data. Higher numbers are better.
  • Wikipedia entries. The number of times the domain is mentioned in Wikipedia. Higher numbers are better.
  • Facebook Likes. The number of Facebook Likes for the domain. Higher numbers are better.
  • Twitter count. The number of Twitter mentions for the domain. Higher numbers are better.

Cautions. Every cell in the spreadsheet will execute a query to another server. If you have many rows of data, this plugin will cause Excel to not respond and you’ll have to force it to quit in your task manager. I recommend the following steps.
  • Turn on manual calculation in the Formulas menu: Formulas > Calculation > Calculate Manually. This prevents Excel from executing the formulas every time you press enter, and will save a lot of time and frustration. Formulas will only execute when you save the document or click Calculate Now in the aforementioned options menu.
  • Paste the formulas down one column at a time in groups of 50 to 100. It seems to respond better when the new formulas are all of the same type (only Alexa Reach data, for example) than if you try to execute multiple types of data queries at once.
  • Use Paste Special. When a set of data is complete, copy it and do a Paste Special right over the same cells. That removes the formulas so they don’t have to execute again. I’d leave the formulas in the top row so you don’t have to recreate them all if you need to add more domains later.
  • Use a PC if you can, because Apple computers tend to stall out more quickly with this plugin.
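
Once the metrics are collected, you can combine them into one composite rank to decide which domains to review first. A minimal sketch, assuming Python with pandas installed; the exported values are hypothetical:

    import pandas as pd

    # Hypothetical export of the collected metrics, one row per linking domain.
    df = pd.DataFrame({
        "domain":        ["siteA.example", "siteB.example", "siteC.example"],
        "pagerank":      [5, 0, 3],
        "indexed_pages": [12000, 40, 800],
        "alexa_reach":   [15000, 12, 900],
        "fb_likes":      [2300, 1, 150],
    })

    metrics = ["pagerank", "indexed_pages", "alexa_reach", "fb_likes"]

    # Rank each metric across domains (rank 1 = best raw value),
    # then average the ranks into one composite score.
    ranks = df[metrics].rank(ascending=False)
    df["composite_rank"] = ranks.mean(axis=1)

    # Highest composite rank = weakest across metrics; review those first.
    print(df.sort_values("composite_rank", ascending=False))

Receiving poor marks across several data types, as noted earlier, is a stronger signal than any single metric on its own.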

Manual Quality Review

If a site has high numbers in the Google Webmaster Tools reports and low numbers in the SEO Tools data, it should be manually checked to determine if it’s a poor quality site, sending poor-quality links your way. The following are the quality signals I use for manually reviewing link quality.
  • Trust. Would you visit this site again? Do you feel confident about buying from the site or relying on its advice? Would you recommend it to your friends? If not, it’s probably low quality.
  • Source. Is this site a source of unique information or products? Does this site pull all of its content from other sites via APIs? Is it scraping its content from other sites with or without a link back to the source site? Does it feel like something you could get from a thousand other sites? If so, it’s probably low quality.
  • Ad units in first view. How many paid ad units are visible when you load the page? More than one? Or if it’s only one, does it dominate the page? If you weren’t paying close attention would it be possible to confuse the ads with unpaid content? If so, it’s probably low quality.
  • Use Searchmetrics. Enter the domain in the Searchmetrics’ search box to get search and social visibility, rankings, competitors, and more. It’s free, with an option to subscribe for many more features. I’ve included this in the manual review section because you have to paste each domain in separately. It does, however, provide a balancing analytical approach to the subjective nature of manual review.

Finally, when reviewing sites manually, don’t bother clicking around the site to review multiple pages. If one page is poor quality it’s likely that they all are. In particular, the home page of a site typically represents the quality of the entire site. Download this Excel spreadsheet to help organize and evaluate links to your site.

Thursday, September 25, 2014

Panda 4.1: Google’s 27th Panda Update Is Rolling Out

Google has announced that the latest version of its Panda Update, a filter designed to keep “thin” or poor content from ranking well, has been released.
Google said in a post on Google+ that a “slow rollout” began earlier this week and will continue into next week before being complete. Google said that, depending on location, about 3% to 5% of search queries will be affected.

Anything different about this latest release? Google says it’s supposed to be more precise and will allow more high-quality small and medium-sized sites to rank better. From the post:

Based on user (and webmaster!) feedback, we’ve been able to discover a few more signals to help Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice.

New Chance For Some; New Penalty For Others
The rollout means anyone who was penalized by Panda in the last update has a chance to emerge, if they made the right changes. So if you were hit by Panda and made alterations to your site, you’ll know by the end of next week whether those changes were good enough: if they were, you should see an increase in traffic.

The rollout also means that new sites not previously hit by Panda might get impacted. If you’ve seen a sudden traffic drop from Google this week, or note one in the coming days, then this latest Panda Update is likely to blame.

About That Number
Why are we calling it Panda 4.1? Well, Google itself called the last one Panda 4.0 and deemed it a major update. This isn’t as big of a change, so we’re going with Panda 4.1.

We actually prefer to number these updates in the order that they’ve happened, because trying to determine whether something is a “major” or “minor” Panda Update is imprecise and leads to numbering absurdities like having a Panda 3.92 Update.

But since Google called the last one Panda 4.0, we went with that name — and we’ll continue on with the old-fashioned numbering system unless it gets absurd again.

For the record, here’s the list of confirmed Panda Updates, with some of the major changes called out with their AKA (also known as) names:

Panda Update 1, AKA Panda 1.0, Feb. 24, 2011 (11.8% of queries; announced; English in US only)
Panda Update 2, AKA Panda 2.0, April 11, 2011 (2% of queries; announced; rolled out in English internationally)
Panda Update 3, May 10, 2011 (no change given; confirmed, not announced)
Panda Update 4, June 16, 2011 (no change given; confirmed, not announced)
Panda Update 5, July 23, 2011 (no change given; confirmed, not announced)
Panda Update 6, Aug. 12, 2011 (6-9% of queries in many non-English languages; announced)
Panda Update 7, Sept. 28, 2011 (no change given; confirmed, not announced)
Panda Update 8, AKA Panda 3.0, Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
Panda Update 9, Nov. 18, 2011 (less than 1% of queries; announced)
Panda Update 10, Jan. 18, 2012 (no change given; confirmed, not announced)
Panda Update 11, Feb. 27, 2012 (no change given; announced)
Panda Update 12, March 23, 2012 (about 1.6% of queries impacted; announced)
Panda Update 13, April 19, 2012 (no change given; belatedly revealed)
Panda Update 14, April 27, 2012 (no change given; confirmed; first update within days of another)
Panda Update 15, June 9, 2012 (1% of queries; belatedly announced)
Panda Update 16, June 25, 2012 (about 1% of queries; announced)
Panda Update 17, July 24, 2012 (about 1% of queries; announced)
Panda Update 18, Aug. 20, 2012 (about 1% of queries; belatedly announced)
Panda Update 19, Sept. 18, 2012 (less than 0.7% of queries; announced)
Panda Update 20, Sept. 27, 2012 (2.4% of English queries impacted; belatedly announced)
Panda Update 21, Nov. 5, 2012 (1.1% of English-language queries in US; 0.4% worldwide; confirmed, not announced)
Panda Update 22, Nov. 21, 2012 (0.8% of English queries; confirmed, not announced)
Panda Update 23, Dec. 21, 2012 (1.3% of English queries; confirmed, announced)
Panda Update 24, Jan. 22, 2013 (1.2% of English queries; confirmed, announced)
Panda Update 25, March 15, 2013 (confirmed as coming; not confirmed as having happened)
Panda Update 26, AKA Panda 4.0, May 20, 2014 (7.5% of English queries; confirmed, announced)
Panda Update 27, AKA Panda 4.1, Sept. 25, 2014 (3-5% of queries; confirmed, announced)

The latest update comes four months after the last, which suggests that this might be a new quarterly cycle that we’re on. Panda had been updated on a roughly monthly basis during 2012. In 2013, most of the year saw no update at all.

Of course, there could have been unannounced Panda releases as well. The list above covers only those that have been confirmed by Google.

Wednesday, September 24, 2014

How The Apple Watch Could Change The World Of Local SEO

Apple recently unveiled the sixth generation of its iPhone, as well as its next gadget, which will arrive in stores in early 2015—the Apple Watch. Other companies have experimented with SmartWatches in the past, but Apple’s foray into wearable smart technology could mark the beginning of a new tech era and some radical changes for the world of local SEO.
If you represent a local business trying to boost online visibility for your brand, it’s time to start looking at how the Apple Watch is changing the rules and think about what you can do to stay ahead of your competition.

The Apple Watch: Bringing SmartWatches Back Into the Spotlight

SmartWatches have occasionally popped up on the market over the past few years, but none of them have really caught on with the public. Wearable technology, like SmartWatches and Google Glass, has generated significant interest, but only on a theoretical level thus far; the devices have generated word-of-mouth buzz, but sales and reviews have been lukewarm at best.

The Apple Watch seeks to change that landscape and bring wearable technology into public favor. Featuring the now-familiar voice-activated digital assistant Siri, Apple Pay, Apple Music, and surely thousands of downloadable apps, the Apple Watch is stepping ahead of its SmartWatch competitors.

The most popular feature of the new device, and the most significant for local SEO, is its new mapping feature. Rather than showing a map and speaking audible directions, like smartphones and older navigation systems, the SmartWatch will use a system known as “haptic feedback” to provide hands-free, eyes-free directions with directional buzzes. Time will tell how functional and practical this system is for navigation, but the early buzz seems to indicate overwhelming excitement for the new product. If Apple delivers the same level of quality in its SmartWatch as it has with its many generations of the iPhone, it could be a true game changer.

Siri’s Relationship with Bing

Local search today is still dependent on search engines. Google is by far the most popular search engine, especially for local results, so most SEO campaigns cater specifically to Google. The popularity of smartphones keeps rising, leading many to perform searches on their mobile devices rather than on their home computer. The result of these trends is that search queries are changing; rather than typing search queries on a keyboard, users are speaking the search queries into their smartphones’ microphones.

With the dawn of the Apple Watch, Siri may accelerate this trend. The Apple Watch’s small screen and location on the wrist may make it more difficult to use your fingers to input data, encouraging users to speak search queries rather than type them.

Tell Siri to search for a “burger restaurant,” and she’ll populate a handful of local results. But currently, Siri uses Bing to populate that information. That means that local search marketers, in order to capture the attention of Apple Watch users, will need to adjust their strategies to focus on Bing Local results (instead of just relying on Google). Fortunately, many fundamental strategies will still apply—such as optimizing listings on local directories like Yelp—but Bing may soon see a surge in popularity due to Siri’s reliance on it.

Optimizing for Apple Maps

The Apple Watch will come with Apple Maps as a default navigation system. While many iPhone users have opted to use Google Maps on their devices instead, the Apple Watch could foster a new generation of Apple Maps users. That means local search marketers will need to take extra steps to ensure their businesses can be found on Apple Maps.

Apple Maps treats local businesses differently than its contemporaries. It doesn’t offer a “claim your business” style system, like Google does, that allows business owners to identify themselves and present accurate information for their directory. Apple Maps does provide an opportunity to report mistakes in listings, but this is not as accurate, transparent, or efficient as the similar system that Yelp! offers.

Apple Maps does pull at least some information from local directories and other information providers such as Yelp!, TomTom, Factual, and Localeze, so it’s possible to improve your listing on Apple Maps simply by updating your information on third party sites. This is already a best practice for local marketers, but it will take some extra effort to claim and update your information on some of the lesser-known third party local platforms.

“Super Local” Searches

Local search results are already impressive; Google can detect your general location when you search for, say, “Mexican restaurants,” and show you a list of Mexican restaurants near your current location (usually based on your IP address). While the notion is speculative for the time being, it seems reasonable that the onset of SmartWatch popularity could give rise to a new level of local search using GPS location information. Instead of focusing on results for a given query within a city, the SmartWatch could give you results within a given city block.

Again, this “super local” search is merely speculative at this point, but it pays to look to the future. Optimizing for a very specific crowd could eventually become more important than optimizing for a city or region.

Mobile Coupons and User Engagement

Mobile coupons have already become popular with smartphones, and interactive elements like QR codes have given smartphone users a chance to use their technology in real life for some kind of benefit (like a discount or more information). This trend will increase in sophistication as the Apple Watch arrives on the scene.

Users will demand even more immediacy, so if you can find a way to cater to those users faster than your competition, you’ll be on top of your local competitive market. While there are currently no details on specific offers local retailers can make to serve the Apple Watch crowd, it’s an idea to keep in the back of your mind as you rethink your local optimization strategy.

Overall, the fundamentals of local search will remain the same—ensure accurate information across all your local directories and give users an excellent mobile experience—but the Apple Watch will mark the beginning of a new series of trends. Business owners and marketers will have to spend more time optimizing for Bing and Apple Maps specifically, and will have to be prepared for the onset of super-specific local searches. Keep an eye out for more details about the Apple Watch as we get closer to its 2015 release.

Tuesday, September 23, 2014

Humanizing Your SEO Keywords

The Death Of Traditional SEO

Traditional SEO is dying and has actually been dying for quite some time now. Google, holding the lion’s share of the search market, has been on a quest to humanize big data. The company wants a world where our search technology values and prefers quality content over the “black hat SEO” infused garbage that cluttered the digital world.

You might not have noticed it. This change came in the form of penguins, hummingbirds, pandas, etc. The major algorithm updates Google has pushed over recent years have all focused the search engine’s sights on high-quality, human content. With each update, it has become harder and harder to utilize traditional SEO methods. In fact, many experts and practitioners have made radical changes to their SEO approach.

Whether this is all for the better is still up for debate. What isn’t up for debate, however, is the need to adapt.
The Death Of The Keyword

In the past, one “go-to” SEO technique was the keyword: a simple word or phrase to include in the content and metadata of a site. We did this to get noticed by search engine crawlers: add a keyword here and there in the body, title, metatags, etc., and you’d be good. Unfortunately, this approach was widely abused. It became too easy to create content that was effective for search crawlers, but not nearly as effective for human readers.

In Google’s string of algorithm updates, the preference for traditional keywords has decreased. In fact, Google has been known to penalize sites where this traditional SEO tactic has been used at the loss of quality content. Because of this change, a new approach is required to fill the gap left in the keyword’s wake.

Humanizing Keywords

It turns out you don’t have to radically change your approach to SEO keywords. Instead, you have to use keywords in a more human manner.

Dr. Andy Williams, author of “SEO 2014 & Beyond,” asks a simple question: “Does your article (or site) sound as if it was written by an expert?” The answer to this question is a litmus test for how human your content appears to Google. The more human, the better. The key lies in the choice of keywords, the variety of keywords, and how these keywords are used.

Niche Vocabulary

If you open up a textbook on any subject, you will find a natural set of terms and words that are often used when talking about that subject. In many ways, these words make up a unique language for the subject. To humanize your content, you have to identify and use this unique subject vocabulary.

  • Discover the Language – just like with traditional SEO keywords, the first step is to figure out what words and terms you should be using for your topic. If you are a subject matter expert writing in your expert field, then chances are you naturally know this niche vocabulary. If you aren’t familiar with the language, however, then some research will be required. Search for a list of common terms, or analyze a few other articles written on the same topic, to see the language others use with a given subject (see the sketch after this list for one way to automate that research).
  • Use the Language – once you know the niche vocabulary, you have to use it in a natural way. The days of randomly seeding an article with loosely connected keywords are gone. Your content must be written in the way an expert would write or speak about it. The easier it is for your audience to read it, the more Google will likely value it.
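
One hedged way to mine that vocabulary from sample articles is TF-IDF scoring, which surfaces terms distinctive to a topic. A minimal sketch, assuming Python with scikit-learn installed; the sample articles are hypothetical snippets:

    from sklearn.feature_extraction.text import TfidfVectorizer

    # Hypothetical sample articles gathered while researching a topic.
    articles = [
        "A strong attention grabber sets up your thesis and preview.",
        "Work on delivery: vocal variety, eye contact, and gestures.",
        "Topical organization arranges main points by subject area.",
    ]

    # Score one- and two-word terms by TF-IDF, ignoring common stop words.
    vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
    tfidf = vectorizer.fit_transform(articles)

    # Sum scores across the sample and list the top terms.
    scores = tfidf.sum(axis=0).A1
    terms = vectorizer.get_feature_names_out()
    for term, score in sorted(zip(terms, scores), key=lambda t: -t[1])[:10]:
        print(f"{term}: {score:.2f}")

The output is only a starting point; a subject matter expert should still filter the list down to the terms real practitioners actually use.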

The reason why all this works is simple: niche vocabulary is natural vocabulary. Before, keywords that might have been useful for SEO wouldn’t necessarily have been the most relevant to the subject. Conversely, terms that were most relevant to the topic might not have been the most obvious choice as a traditional SEO keyword.

A Public Speaking Case Study

As a speech educator, I’m naturally writing articles about public speaking to share with the world. In the past, my traditional SEO keyword approach would have been straightforward: find a keyword generator, type in “public speaking”, and then pepper my content with the top five keywords that came up. Sometimes this approach would have resulted in a natural sounding article, but more often than not, my SEO trickery was blatant to human readers.

Given that I’ve been doing this whole “public speaking” thing for over 12 years, it’s safe to assume I’m pretty familiar with the niche vocabulary. Once I began to embrace the changes that Google brought with its algorithm updates, the process of making more optimized content actually became easier. All I had to do was write articles in the same language I would use elsewhere with real humans. Suddenly, less relevant keywords (e.g., “fear of public speaking”) that I used to mention many times in an article disappeared. Relevant terms (e.g., “delivery,” “attention grabber,” “ethos,” “topical organization”) that would never have come up in a keyword search took on more importance, because they related to the actual topic.

Embrace The Change

Niche vocabulary not only makes your content more search engine friendly, it also makes your life easier. It allows you to write in a way that is natural for you, your readers, and the search engines. So, the next time you’re tempted to use that keyword generator, take a breath and just write. Chances are you will make something more optimized with a lot less work.

Monday, September 22, 2014

You Need a Modern SEO Strategy. Here's How to Shape One.

You probably know that search engine optimization (or SEO) can help more people find your company's website when they search for businesses like yours online. But when’s the last time you ensured your SEO strategy is following best practices -- and getting the results your business really needs?
Here’s why you need a modern SEO strategy, along with tips on how to build a program that helps you gain tangible results (customers), not just a high ranking, from organic search.

You can use SEO to drive traffic to your business website. But if your visitors can’t find your site or contact you, then you haven’t added much value to your business. First make sure that your website is optimized to convince interested visitors to contact you. That means you need an updated, responsive website design.

If your site isn’t built for today’s best practices, it may not even show up in search results. And outdated sites that don’t provide a great user experience raise a red flag to potential customers. This is amplified if your business doesn’t have a mobile site now that Google warns users when a site is not optimized for mobile viewing. And with scores of consumers searching for local information via mobile, having a mobile site is simply not optional any longer.

The next step is to drive more conversions of site visitors to customers. A great, mobile-optimized design will help with that, but your site also needs to feature multiple contact methods like a phone number, email address, web form and chat so that visitors can reach out to you when they are ready -- and through the method they prefer.

When you have this trifecta of a modern, mobile, and conversion-optimized website, you’re on your way to converting more of the website traffic you get from SEO.

Creating content for search engine optimization.

Search engines love content that’s recent, relevant and authoritative. One of the most challenging aspects of modern SEO is creating content that’s relevant to your business.

Both foundational website content and fresh, timely content (like blog posts) focused around the main topics and keywords of your business can help searchers find your website. To create compelling, search-friendly content, start off with a topic or keyword list that’s relevant to your business, products or services and target location.

Then create relevant, ongoing content about those topics so that search engines will continually index your site in the search results. The most effective way to do this is to set up a blog and regularly post unique, informative posts that your audience will enjoy.

But customer-facing content isn’t the only material to focus on. Populate your pages’ metadata fields with important keywords and accurate descriptions of their content. In addition, including semantic markup (like that found at Schema.org) on your site will emphasize specific information about your business, like the name, business type and location. These back-end tactics can improve how quickly and easily search engines can match your site to specific, localized searches, helping it show up in organic search results.
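
To make the semantic markup idea concrete, here is a minimal example using Schema.org’s JSON-LD format for a local business; every business detail shown is a hypothetical placeholder:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Dental Care",
      "telephone": "+1-555-555-0123",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701"
      }
    }
    </script>

Placed in a page’s HTML, a block like this spells out the name, business type and location mentioned above in a form search engines can parse unambiguously.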

Using social media and local sites to drive traffic.

Modern SEO doesn’t end with attending to your website. You also need to develop your offsite presence on social media sites, local listings and directories that can help amplify your site's SEO. A presence on social media sites and local directories increases the amount of information that consumers can find about your business when they search.

That’s because social signals and properly configured listings (or “citations”) might influence how search engines surface your website in organic results. Making sure these pages are active and optimized for search can also help them appear in organic-search results for your company's branded or key terms.

Set up your company's profiles on top social media sites (like Facebook, Twitter and Google+) and within important local listings (like those appearing on Google, Bing, Yahoo!, Yelp and CitySearch), populate them with optimized, accurate content and link them back to your business website.

Not only are these third-party sites a key destination for local searchers, but these sources can also signal to search engines that your business website is authoritative and relevant, helping boost its visibility in the search results.

Getting a return on the investment.

Here’s your final test for a truly modern approach to SEO: Once you’re getting site visitors, converting them into customers is the next step in gaining a return on the SEO investment. This first means responding to and following up with your site’s visitors regularly (via phone and/or email) so that they actually become customers.

But you should also track visitors back to the marketing source and figure out whether they came to the site from your SEO efforts, so you know whether your marketing is working. This can be done through a manual or automated process, but it does require ongoing measurement and monitoring so you can see trends, view results and make adjustments to your program.
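
One common, simple way to attribute visits to a source is to tag campaign URLs with UTM parameters, the convention Google Analytics reads. A minimal Python sketch, with hypothetical page and campaign names:

    from urllib.parse import urlencode

    # Hypothetical landing page and campaign labels.
    landing_page = "http://www.example.com/services"
    params = {
        "utm_source": "newsletter",    # where the visitor came from
        "utm_medium": "email",         # the channel
        "utm_campaign": "fall-promo",  # the specific campaign
    }

    print(f"{landing_page}?{urlencode(params)}")
    # http://www.example.com/services?utm_source=newsletter&utm_medium=email&utm_campaign=fall-promo

Organic search visits are typically attributed automatically by analytics tools, so tagging is most useful for the email, social and referral campaigns you run alongside SEO.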

Wednesday, September 17, 2014

7 Things Everybody Needs to Know About Google Search Optimization

As one of the most visited websites around the globe, with over 12 billion searches sent through its engine per month and 67.5 percent of the U.S. search market, Google can play a huge role in the popularity of your own site, whether it's business-related or you're just trying to establish yourself as a presence and brand online.
SEO may seem like magic to an outsider, but it is a structured, measured practice of optimizing your pages (and website) to improve your find-ability in search engines. If done right, you can get listed at the top of the first page of results for your specific keywords. While nothing in life is really free, the listings shown in the organic, natural results will not ding your wallet each time a searcher clicks. (That's the model for paid advertising, like Google AdWords.)

Here are 7 things that everybody should know about Google search optimization.

1. Google Is Always Evolving

Google is constantly changing and evolving at high speed, with new software, patents, products, web properties, and corporate acquisitions accumulating steadily over the past decade.

"Among Google's goals: Improve the user experience by delivering relevant, fresh, quality content and, at the same time, crack down on those using questionable search engine optimization techniques to gain unjustified ranking position." - CIO.com

The search index itself also changes, and that alters how results are ranked. Due to these changes, it's impossible to ever completely comprehend the Google algorithms or make any sort of permanent change to your site that will always comply with Google's standards. Add to this the fact that Google's main business is paid advertising, and it naturally wants businesses and users to engage there.

2. Everybody Wants A Piece Of The Google Action

This might seem obvious. You know at the back of your mind that all of your competitors are also vying for that top spot on the search list.

"If your business has an active, well optimized and maintained website with high quality unique content, then you may find yourself a step ahead already." - SEJ

This isn't enough, however. If your company has no real online presence, has not built trust via quality links, or maintains a random selection of social media profiles, blogs, and widgets that provide little value, or if it tries to manipulate Google to acquire high positions unnaturally, you could be losing out. Your competitors may have already overtaken you, and your web traffic may be decreasing.

3. Quality Matters Most To Google

Believe it or not, one of the most important concepts that you must understand when it comes to Google is that their standards for the quality of content are very high.

Despite the millions of website operators who strive for better rankings within the search engine, the most important parties to Google are its users and the advertisers who pay to reach them, which means Google wants to surface content that satisfies readers and makes them come back again and again.

4. Structure Your Data When You Can

We know that although not everything needs to be structured and organized into some specific schema, Google prefers structured data wherever possible, as it's easier to analyze and requires less computing power. Google responds by greatly increasing your chances for visibility and traffic, especially when it knows the best data to present to readers, whether that's a product listing, a video, news, or an event announcement.

5. Don't Buy Links, Work For Them

It may be possible to build up the number of links back to your site by purchasing them in some way, but this is viewed as a negative behavior by Google.

The truth is that if you buy your links, they aren't likely a good representation of how honest your content is. Google is cracking down on this behavior and gives out penalties where it sees it occurring, even if you were not aware of it. You can earn links the hard way, through great quality information and products that people want to see; that content may then be shared naturally.

6. Get On Board With Google+

It may not be one of the most popular social media sites on the internet, but using Google+ does tend to correlate with search engine rankings, and your personal network will see what you share. Even if this tool only provides a minuscule difference in the grand scheme of things, every little bit helps in building your online presence.

7. Get Moving

The content of your site is the most important aspect that Google wants you to focus on for SEO benefit. Rankings will come over time, but there are hundreds of factors in the Google algorithm. Content and links are critical, but so is the speed at which your website and pages operate.

Success or failure of your website is ultimately measured in visibility, traffic, stickiness (how long visitors stay and if/when they come back), your message and positioning, and conversions (sales). With so many moving parts, make sure your content is readily available and that your pages load fast. Improve your site speed and make users and Google happy in the process.
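
For a rough first check of server responsiveness, a couple of lines of Python will do; the URL is a hypothetical placeholder, and a real audit should use a full tool such as Google's PageSpeed Insights:

    import requests

    url = "http://www.example.com/"  # hypothetical page to test

    response = requests.get(url, timeout=30)
    # 'elapsed' measures time from sending the request until the response
    # headers are parsed -- a rough proxy for server responsiveness, not
    # the full page render time a visitor experiences.
    print(f"{url} answered in {response.elapsed.total_seconds():.2f}s")

Render time also depends on images, scripts and third-party embeds, so treat this as a baseline rather than the whole picture.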

WHAT'S NEXT?
Before you add a second story to your house or construct a brand new home, be clear on where you are and what you want. For Google SEO, that means getting an SEO Audit.

Thursday, September 11, 2014

Don’t Overlook the Importance of Marketing and SEO

If you run a business, how much time and effort do you invest in marketing? Now, how much of that time and effort goes specifically toward search engine optimization (SEO)?
If the answer to one or both questions is little or even none, then you may well be running a company that is struggling to make ends meet. Among the reasons for this: very few people are getting the opportunity to see what your business, and specifically your website, has to offer.

If marketing and SEO haven’t received much, or any, emphasis at your company, turn that around sooner rather than later.

SEO Draws in More Web Traffic

Among the various reasons to have a strong SEO plan in place for your company:
  • SEO helps push strong content, which drives more traffic to your site
  • SEO and proper marketing work hand-in-hand to get your company website ranked more highly on search engines such as Google
  • SEO allows you to home in on specific keywords and links that are popular in your given industry.
In order to properly team up SEO and marketing, keep the following in mind:
  • Relevant content – If your site doesn’t have much to offer visitors, why would they come there in the first place? One of the ways to draw traffic to your site via SEO is to make sure you have a well-ranking blog to offer them. A stellar blog can help drive traffic your way by giving consumers informative posts (both in-house and guest-written) on your specific industry. Relevant content also raises your ranking on search engines, while mediocre and/or outdated content will lower it;
  • Promote content – Just having quality content on your site is not enough; market it to gain the most exposure. One of the best ways to do this is through social media. SM is a great marketing tool that allows you to put your site in front of countless eyes. While you do not necessarily have to post new content seven days a week, don’t just do a post here and there either. At the least, new content should be posted to your site and promoted every other day. Make sure your Facebook, Twitter, Google+ and other relevant SM icons are in an easy-to-locate position so readers can share the content;
  • Key in on keywords – Lastly, while your content should never read as spam, you should focus on relevant keywords that will strike a response. Using a Google Analytics report, you can zero in on which keywords are driving your traffic (a rough sketch of tallying such an export follows this list). By knowing just how and when consumers go in search of products and services, you can aim your campaigns to reach them sooner rather than later.
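For the keyword tallying mentioned above, a few lines of Python are enough to summarize an exported report. A minimal sketch, assuming a hypothetical keywords.csv export with "Keyword" and "Visits" columns; your analytics tool's actual column names may differ:

    import csv
    from collections import Counter

    # Hypothetical CSV export of organic keywords from your analytics
    # tool; the "Keyword" and "Visits" column names are assumptions.
    visits_by_keyword = Counter()
    with open("keywords.csv", newline="") as f:
        for row in csv.DictReader(f):
            visits_by_keyword[row["Keyword"].lower()] += int(row["Visits"])

    # The ten queries sending the most visits to your site.
    for keyword, visits in visits_by_keyword.most_common(10):
        print(f"{visits:6d}  {keyword}")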
 If your efforts up to now have been lackluster or even futile when it comes to SEO and marketing, finish out 2014 strong, and then kick off 2015 with the goal of turning your company and your brand into an SEO marketing machine.

Tuesday, September 9, 2014

Summer SEO News You Should Know About: HTTPS, Rich Snippets, More

Summer is coming to a close and with it the end of (yet) another memorable season in the world of SEO. When Matt Cutts, Google’s head of webspam, announced his three-month sabbatical, we all joked that no updates could possibly roll out without him at the helm.
In his absence we’ve heard rumors of a suspected Penguin update, encryption as a ranking signal, and the death of authorship. Here are some of the SEO tidbits you may have missed out on this summer.

HTTPS: Encryption for SEO

Security breaches on the web seem to be an everyday occurrence. Sites large and small are impacted by the nefarious activities of spammers and hackers. Increasing security measures on a site is imperative to protecting a company’s bottom line today, and of course the privacy of your customers’ information. Search engines have taken notice and want to provide a better experience online for their users. In August, Google announced it will begin using website encryption, HTTPS, as a ranking signal, giving preferential treatment to sites that encrypt data for their users. The change is estimated to affect less than 1% of global queries so far, but it has web developers scrambling to make the switch.

Encryption has always been a best practice, as it provides heightened security by encrypting a user’s session with an SSL digital certificate. For online banking and ecommerce sites encryption is common, but for other industries it’s less so. Implementation often involves expense and additional measures, which many companies have put off – until now.

A recent study shows that, to date, no ranking benefits have been observed as a result of this change. To be fair, Google admittedly did state it was a “very lightweight signal” affecting a small share of global queries. Some speculate that the rollout may not have gone into full effect yet; still others are calling Google’s bluff. Whichever camp you belong to, it’s clear that encryption is a focus for Google this summer and is likely to increase in importance over time as the privacy trend continues. Forewarned is forearmed.

Rich Snippets

Google’s Webmaster Trends Analyst John Mueller shocked the SEO community by announcing that the search giant is abandoning Google Authorship, a microdata markup attributing content to a particular author. Those in the know may have seen this coming: search results pages (SERPs) changed in July, when author bylines and images were stripped. Something was clearly up with this particular feature.

John Mueller’s announcement, seen here on Google+, explains that through testing Google discovered the information wasn’t as useful as once thought; in fact, it was distracting in some instances. Once thought to increase clickthrough rates, author snippets didn’t actually deliver significant improvement. Low adoption is likely another cause for the decision to get rid of authorship. While sites like The New York Times adopted the markup, the Washington Post (for another example) did not. Without widespread adoption, author snippets didn’t accurately reflect perceived value.

Google plans to continue exploring additional uses for rich snippets in SERPs, because rich snippets provide useful information for the end user. Event snippets are increasingly common, showing the date, location and time of an event directly in SERPs. Music album snippets, product snippets such as reviews, recipe snippets, and video snippets are just a few of the opportunities available. No doubt rich snippets are here to stay, although it’s unfortunate that some SEO experts continue to recommend using authorship snippets.

Pigeon

One particularly important Google update this summer was rolled out at the end of July. The update aimed to provide more accurate and relevant local search results, and is described as tying traditional web ranking signals more closely to local search signals. Search Engine Land took the liberty of naming it the Pigeon update (“Pigeon is the name we decided on because this is a local search update and pigeons tend to fly back home”).

While speculation about this update abounds, experts have noticed that Pigeon affects only English queries in the U.S. Moz has reported reduced numbers of 7-packs showing in local queries, with an increase in 3-packs. Experts have also seen the search radius reduced for most local queries and, most importantly, that local businesses seem to be favored over larger brands.

Pigeon may seem like a small change, but it’s certainly changed the SERPs for many verticals. As expected, its effects are still being monitored and analyzed. Best practices remain important for sites dependent on local traffic. This article discusses some great best practices for local search, including:

  1. Set up your own individual Google+ Local page and do it correctly: all information provided, images, a local phone number, the proper category and consistent name/address/phone number (NAP) information.
  2. Keep your NAP consistent throughout your website (About Us page and/or footer), and consistent on other online listings as well (a markup sketch follows this list).
  3. Encourage your customers to leave reviews of your business.
  4. Add geo-specific information to meta tags as appropriate, and optimize your site for geo queries.
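One way to make NAP details unambiguous to crawlers is schema.org LocalBusiness markup. Here is a minimal sketch that generates such a record as JSON-LD with Python's json module; the business details are entirely hypothetical placeholders:

    import json

    # Hypothetical business details -- use your real, consistent NAP data.
    business = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Example Coffee Roasters",
        "telephone": "+1-555-010-0000",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Main St",
            "addressLocality": "Phoenix",
            "addressRegion": "AZ",
            "postalCode": "85001",
        },
    }

    # Embed the output in your footer or About Us page inside a
    # <script type="application/ld+json"> tag.
    print(json.dumps(business, indent=2))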

It’s likely that you’ve been doing most things right with your local strategy, but Pigeon requires a second look to ensure you’re maintaining the best possible website for local rankings. Those trying to benefit from local search without actually being local to the area aren’t likely to benefit much longer. Consider evaluating your current local strategy and revamping it as needed based on Pigeon.

Penguin 3.0?

The rumored Penguin 3.0 update on the lips of many has yet to surface this summer. Barry Schwartz speculated that we would see an update over the Labor Day weekend, and Mozcast has been indicating fluctuations in the space for several weeks now. We continue to wait… and wait.

Protect your site from Penguin

Ensure you’re not a sitting duck for a penalty by looking for the telltale signs of a Penguin-prone site. Ask yourself these questions:

  1. Do you have over-optimized pages and backlinks with over-optimized commercial anchor text?
  2. Do you have a high number of total links but a low number of links from unique domains?
  3. Have you purchased links in the past or conducted gray-hat link building strategies such as reciprocal links, text link advertisements and link exchanges?
  4. Is your backlink profile diversified with a mixture of link types, authority and relevancy?

If you answered yes to any of #1-#3, you’re not sitting in a good position. If you answered yes to #4, you’re on the right track. Experts recommend reviewing your backlink profile regularly and understanding what a natural backlink profile looks like. Check out this Google Penguin Recovery Kit by Vertical Measures to learn more about recovery tips. To avoid future Penguin penalties, clean up past activities and ensure you’re earning authoritative, relevant and worthwhile links from here on out.
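Question #2 above is easy to quantify if you can export your backlinks (most link tools and Webmaster Tools offer a download). A rough sketch, assuming a hypothetical backlinks.txt file with one linking URL per line:

    from urllib.parse import urlparse

    # Hypothetical export: one linking URL per line, e.g. the
    # "Links to Your Site" download from Webmaster Tools.
    with open("backlinks.txt") as f:
        links = [line.strip() for line in f if line.strip()]

    domains = {urlparse(link).netloc.lower() for link in links}

    print(f"Total links:            {len(links)}")
    print(f"Unique linking domains: {len(domains)}")
    # A very low domains-to-links ratio can be a Penguin warning sign.
    print(f"Domains per link:       {len(domains) / max(len(links), 1):.2f}")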

Monday, September 1, 2014

SEO considerations when moving from HTTP to HTTPS

Back in March 2014, Matt Cutts gave the audience at SMX West a little tip that making a site secure (i.e. using Secure Sockets Layer, or SSL, encryption) was going to trend in 2014. Matt wanted sites that utilize SSL encryption to see a ranking boost within Google; however, at the time there were some people at Google who did not agree with him, nor did they want this to happen.

Well, only a couple of weeks ago Google announced that using SSL encryption will give sites a ranking boost within Google’s SERPs. Although away on leave, Matt Cutts did tweet about this new update on the 7th August 2014:

Matt Cutts tweet

In the post Matt tweeted, Google has published some clear guidelines on what it expects to see from a site using HTTPS (aka HTTP over Transport Layer Security, or TLS). Google also confirmed that, due to the positive response, it has made HTTPS a positive signal for ranking websites; however, it also states that this is a very “light-weight signal” that will affect less than 1% of global search queries. The secure signal will apparently carry less weight than other signals, such as a website containing high-quality content, but it may become a stronger signal in the future.

It’s still early days and although Google has given some guidelines on what they want to see from a website, there are a number of other aspects from an SEO perspective to take into account when moving your website from HTTP to HTTPS.

Tips when moving HTTP to HTTPS

Moving your website from HTTP to HTTPS is very much like migrating your website to a new URL structure, or even moving to a brand new domain. Past experience has taught us that a great deal can go wrong if things aren’t implemented correctly.

Google has given some guidelines on moving to HTTPS here and here, and Barry Schwartz over at Search Engine Roundtable covered the topic too. There are also a few other SEO aspects that you should take into consideration before you commit to moving your website to HTTPS.

Firstly, you need to choose the right level of certification (e.g. a 2,048-bit certificate) from an accredited/trusted provider. Once you’ve completed this step, there are a few other SEO considerations that will be important to migrating successfully:
  • Ensure all your internal links point to the new HTTPS URLs.
  • Ensure any external links and new social shares point to the new HTTPS URLs. If you’re still getting links to the old HTTP version of your website, Google can become confused and you won’t see the benefit that these new links could pass on to your website; Google won’t be able to decipher which is the most authoritative page deserving the higher ranking.
  • Ensure that no rel=canonical tags within your HTML point to the old HTTP version. Once you move over to HTTPS, these tags must be changed to the new HTTPS URLs, as this helps Googlebot understand which version of each page should be used to rank. Again, if you still point to the HTTP version, Google will be left confused over which page should be ranking in the SERPs.
  • Ensure that you’ve mapped out the new HTTPS URLs on a page-to-page level – you basically want an exact duplicate URL structure; the only thing changing is that ‘http://’ becomes ‘https://’.
  • Once you’ve got these in place, implement a permanent 301 redirect on a page level (a quick spot-check script follows this list). Do not 301 redirect everything to the home page, whether via a global or wildcard redirect, as this will kill all your rankings overnight.
  • Finally, watch your Webmaster Tools account after go-live and monitor for any issues Google may be having with your new HTTPS website.
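For the spot-check mentioned above, a short script can confirm that each old HTTP URL returns a single permanent redirect to its HTTPS twin. A minimal sketch, assuming the third-party requests library and hypothetical URLs:

    import requests  # third-party: pip install requests

    # Hypothetical pages to verify -- list one old HTTP URL per mapped page.
    old_urls = [
        "http://www.example.com/",
        "http://www.example.com/products/widget",
    ]

    for url in old_urls:
        # allow_redirects=False exposes the first hop, so we can confirm
        # it is a single permanent (301) redirect, not a 302 or a chain.
        response = requests.get(url, allow_redirects=False, timeout=10)
        location = response.headers.get("Location", "(none)")
        ok = response.status_code == 301 and location.startswith("https://")
        print(f"{'OK   ' if ok else 'CHECK'} {url} -> {response.status_code} {location}")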

Following these points will give your website the best chance of maintaining its current rankings. I say “best chance” because with Google, any major change to a website, even one done correctly, can still result in short- or long-term ranking drops or fluctuations, anything from a small drop of one or two places for a few days to major drops lasting weeks or even months. Recovering from a problematic change can take time, especially given Google’s re-crawl and re-indexation rates.

Here’s an example of a website that underwent a URL migration in mid-2013 and then a domain migration in early 2014, during which the above recommendations were not followed (N.B. this was a standard migration, not an HTTP-to-HTTPS migration, but the move is essentially the same, and as you can see, this particular website hasn’t yet fully recovered):

URL migration example

On a side note, and from a business perspective, it might be a good idea to implement this level of change during a quiet period of the year. If Christmas is a busy time of year for you, for example, then I’d recommend holding off on the change until the New Year. That way, if any major mistake is made, or if Google takes a while to update things, it won’t affect your revenue stream from Google too much. This also gives you a little more time if something does go wrong and you need to fix things whilst you wait for Google to re-index and rank the HTTPS URLs.

Reasons NOT to move to HTTPS

One other consideration: don’t make the move from HTTP to HTTPS if you already have an issue or penalty with Google in place. If you migrate whilst under a manual or algorithmic penalty, Google may think you’re trying to escape the penalty and lose even more trust in your website, making things even harder to recover from. I’d recommend fixing any existing issue Google is having with your website first, whether it’s link-based (a Penguin penalty) or content-based (a Panda penalty), and then making the move over to HTTPS.

Finally, if you think that moving your website to HTTPS is going to fix any existing issues, or that it will happen without any difficulty, then think again. In the eyes of Google this is a massive change to your website, and you have to be very careful to get things right to avoid damaging your rankings. As for any existing issues or penalties, Google will eventually figure things out and pass the existing penalty on to your new URL structure.

The important thing to remember is to treat the migration from HTTP to HTTPS as seriously as a URL or domain migration; if done wrong, it can have a detrimental effect on your organic visibility within Google. It’s also important to bear in mind the signals your website is sending to Google. If any signals around your HTTP URLs remain, or are created in the future, Google can become confused and may rank the wrong page. Help Googlebot find the new HTTPS pages on your website by keeping things simple; confusing signals can take Google a long time to figure out and update in its SERPs.

Finally, do expect some ranking issues: as mentioned earlier, even when done right, a site can hit some ranking turbulence while Google works out the change. Following these recommendations will give your website the best chance of holding its current position.

Just remember: if you don’t get this right, it can completely destroy your rankings within Google and take a lot of cleaning up on your part. If you’re unsure, or need further support moving your website from HTTP to HTTPS, get in touch with us and we’d be more than happy to help.