Tuesday, June 30, 2015

Advanced Local SEO Competition Analysis

Posted by Casey_Meraz

Competition in local search is fierce. While it's typical to do some surface level research on your competitors before entering a market, you can go much further down the SEO rabbit hole. In this article we will look at how you can find more competitors, pull their data, and use it to beat them in the search game.

Since there are plenty of resources out there on best practices, this guide will assume that you have already followed the best practices for your own listing and are looking for the little things that might make a big difference in putting you over your competition. So if you haven't already read how to perform the Ultimate Local SEO Audit or how to Find and Build Citations then you should probably start there.

Disclaimer: While it's important to mention that correlation does not mean causation, we can learn a lot by seeing what the competition has done.

Some of the benefits of conducting competitive research are:

  • You can really dive into your customers' market and understand it better.
  • You can figure out who your real customers are and better target them.
  • You can get an understanding of what your competitors have done that has been successful without re-inventing the wheel.

Once you isolate trends that seem to make a positive difference, you can create a hypothesis and test. This allows you to constantly be testing, finding out what works, and growing those positive elements while eliminating the things that don't produce results. Instead of making final decisions off of emotion, make your decisions off of the conversion data.

A good competition analysis will give you a strong insight into the market and allow you to test, succeed, or fail fast. The idea behind this process is to really get a strong snapshot of your competition at a glance to isolate factors you may be missing in your company's online presence.

Disclaimer 2: It's good to use competitors' ideas if they work, but don't make that your only strategy.

Before we get started

Below I will cover a process I commonly use for competition analysis. I have also created this Google Docs spreadsheet for you to follow along with and use for yourself. To make your own copy simply go to File > Make A Copy. (Don't ask me to add you as an owner please :)

Let's get started

1. Find out who your real competitors are

Whether you work internally or were hired as an outside resource to help with your client's SEO campaign, you probably have some idea of who the competition is in your space. Some companies may have good offline marketing but poor online marketing. If you're looking to be the best, it's a good idea to do your own research and see who you're up against.

In my experience it's always good to find and verify 5-10 online competitors in your space from a variety of sources. You can use tools for this or take the manual approach. Keep in mind that you have to screen the data tools give you with your own eye for accuracy.

How do you find your "real" competitors?

We're going to look at some tools you can use to find competitors here in a second, but keep in mind you want to record everything you find.

Make sure to capture the basic information for each competitor including their company name, location, and website. These tools will be useful at a later time. Record these in the "competitor research" tab of the spreadsheet.

Method 1: Standard Google searches for competitors

This is pointing out the obvious, but if you have a set of keywords you want to rank for, you can look for trends and see who is already ranking where you want to be. Don't limit this to just one or two keywords, instead get a broader list of the competitors out there.

To do this, simply come up with a list of several keywords you want to rank for and search for them in your geographic area. Make sure your Geographic preference is set correctly so you get accurate data.

  1. Collect a list of keywords
  2. Search Google to see which companies are ranking in the local pack
  3. Record a list of the companies' names and website URLs in the spreadsheet under the competitor research tab.

To start we're just going to collect the data and enter it into the spreadsheet. We will revisit this data shortly.

Outside of the basics, I always find it's good to see who else is out there. Since organic and local rankings are more closely tied together than ever, it's a good idea to use 3rd party tools to get some insight as to what else your website could be considered related to.

This can help provide hidden opportunities outside of the normal competition you likely look at most frequently.

Method 2: Use SEMRUSH.com

SEMRush is a pretty neat competitive analysis tool. While it is a paid program, they do offer a few free searches a day you can check out. It's limited, but it will show you 10 competitors based on keyword ranking data. It's useful for recording paid competition as well.

To use the tool, visit www.SEMRush.com and enter your website in the provided search box and hit search. Once the page loads, you simply have to scroll down to the area that says "main competitors". If you click the "view full report" option you'll be taken to a page with 10 competition URLs.

Put these URLs into the spreadsheet so we can track them later.

Method 3: Use SPYFU.com

This is a cool tool that will show your top 5 competitors in paid and organic search. Just like SEMRush, it's a paid tool that's easy to use. On the home page, you will see a box that loads where you can enter your URL. Once you hit search, a list of 5 websites will populate for free.

Enter these competitors into your spreadsheet for tracking.

Method 4: Use Crunchbase.com

This website is a goldmine of data if you're trying to learn about a startup. In addition to the basic information we're looking for, you can also find out things like how much money they've raised, staff members, past employee history, and so much more.

Crunchbase also works pretty similarly to the prior tools in the sense that you just enter your website URL and hit the search button. Once the page loads, you can scroll down the page to the competitors section for some data.

While Crunchbase is cool, it's not too useful for smaller companies as it doesn't seem to have too much data outside of the startup world.

Method 5: Check out Compete.com

This tool seems to have limited data for smaller websites but it's worth a shot. It can also be a little bit more high-level than I prefer, but you should still check it out.

To use the tool visit www.compete.com and enter the URL you want to examine in the box provided then hit search.

Click the "Find more sites like" box to get list of three related sites. Enter these in the provided spreadsheet.

Method 6: Use SimilarWeb.com

SimilarWeb provides a cool tool with a bunch of data to check out websites. After entering your information, you can scroll down to the similar sites section which will show websites it believes to be related.

The good news about SimilarWeb is that it seems to have data no matter how big or small your site is.


2. After you know who they are, mine their data

Now that we have a list of competitors, we can really do a deep dive to see who is ranking and what factors might be contributing to their success. To start, make sure to pick your top competitors from the spreadsheet and then look for and record the information below about each business on the Competitor Analysis tab.

You will want to pull this information from their Google My Business page.

If you know the company's name, it's pretty easy to find them just by searching the brand. You can add the geographic location if it's a multi-location business.

For example if I was searching for a Wendy's in Parker, Colorado, I could simply search this: "Wendy's Parker, CO" and it will pull up the location(s).

Make sure to take and record the following information from their local listings. Get the data from their Google My Business (Google + Page) and record it in the spreadsheet!

  1. Business name - Copy and paste the whole business name. Sometimes businesses keyword stuff a name or have a geographic modifier. It's important to account for this.
  2. Address - The full address of the business location. Although we can't do anything about its physical location, we will search using this information shortly.
  3. City, state, zip code - The city, state, and zip listed on the Google My Business listing.
  4. Phone number - Take the listing's primary number
  5. Phone number 2 - Take the listing's secondary number like an 800 number.
  6. Landing page URL - The one connected to their Google My Business listing.
    PRO TIP: The URL will display as the root domain, but click the link to see if it takes you to an internal landing page. This is essential!
  7. Number of categories - Does your listing have more or fewer categories than the competitor's listing?
  8. Categories in Google My Business
    You can find the categories by clicking on the main category of the listing. It will pop out a list of all of the categories the business is listed under. If you only see one after doing this, open your browser and go to View Source. If you do Ctrl+F you can search the page for "GCID" without the quotes. This will show you the categories they're listed under if you look through the HTML.
  9. Does the profile appear to be 100% complete?
  10. How many reviews do they have?
  11. Is their business name visible in Google Street View? Obviously there is not much we can do about this, but it's interesting especially considering some patents Bill Slawski was recently talking about.

Record this information on the spreadsheet. A sample is below.

What can we do with this data?

Since you've already optimized your own listing for best practices, we want to see if there are any particular trends that seem to be working better in a certain area. We can then create a hypothesis and test it to see if any gains or losses are made. While we can't isolate factors, we can get some insight into what's working as we make changes.

In my experience, examining trends is much easier when the data is side by side. You can easily pick out data that stands out from the rest.

3. Have a close(r) look at their landing pages

You already know the ins and outs of your landing page. Now let's look at each competitor's landing page individually. Let's look at the factors that carry the most weight and see if anything sticks out.

Record the following information into the spreadsheet and compare side by side with your company vs. the successful ones.

Page title of landing page
City present? - Is the city present in the landing page meta title?
State present? - Is the state present in the landing page meta title?
Major KW in title? Is there a major keyword in the landing page meta title?
Content length on landing page - Possibly minor but worth examining. Copy/paste the page copy into MS Word to get a word count.
H1 present? - Is the H1 tag present?
City in H1? - Does the H1 contain the city name?
State in H1? - Does the H1 have the state or abbreviation in the heading?
Keyword in H1? - Do they use a keyword in the H1?
Local business schema present? - Are they using schema? Find out using the Google structured data testing tool here. (A rough sketch of this markup appears after this list.)
Embedded map present? - Are they embedding a Google map?
GPS coordinates present? - Are they using GPS coordinates via schema or text?
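For reference, here's a minimal sketch of what LocalBusiness markup with GPS coordinates can look like. The business details below are made up for illustration, and this is just one way to express it (JSON-LD built from a Python dict), not necessarily how any given competitor has implemented it.

import json

# A minimal sketch of LocalBusiness markup with GPS coordinates.
# All business details below are made up for illustration.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Dental Office",
    "telephone": "+1-303-555-0100",
    "url": "https://www.example.com/denver-office/",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Denver",
        "addressRegion": "CO",
        "postalCode": "80202",
    },
    "geo": {
        "@type": "GeoCoordinates",
        "latitude": 39.7392,
        "longitude": -104.9903,
    },
}

# Wrap it in a JSON-LD script tag and paste it into the landing page <head>.
print('<script type="application/ld+json">')
print(json.dumps(local_business, indent=2))
print("</script>")

If a competitor's landing page carries something like this (check with the structured data testing tool), note it in the spreadsheet alongside the embedded map and coordinates columns.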


4. Off site: See what Google thinks is authoritative

Recently, I was having a conversation with a client who was super-excited about the efforts his staff was making. He proudly proclaimed that his office was building 10 new citations a day and added over 500 within the past couple of months!

His excitement freaked me out. As I suspected, when I asked to see his list, I saw a bunch of low quality directory sites that were passing little or no value. One way I could tell they were not really helping (besides the fact that some were NSFW websites), was that the citations or listings were not even indexed in Google.

I think it's a reasonable assumption that you should test to see what Google knows about your business. Whatever Google delivers about your brand, it's serving because it has the most relevance or authority in its eyes.

So how can we see what Google sees?

It's actually pretty simple. Just do a Google Search. One of the ways that I try to evaluate and see whether or not a citation website is authoritative enough is to take the competition's NAP and Google it. While you've probably done this many times before for citation earning, you can prioritize your efforts based off of what's recurring between top ranked competitor websites.

As you can see in the example below where I did a quick search for a competitor's dental office (by pasting his NAP in the search bar), I see that Google is associating this particular brand with websites like:

  1. The company's main website
  2. Whitepages
  3. Amazon Local (New)
  4. Rateadentist.com
  5. DentalNeighbor.com

Pro Tip: Amazon local is relatively new, but you can see that it's going to carry a citation benefit in local search. If your clients are willing, you should sign up for this.

Don't want to copy and paste the NAP in a variety of formats? Use Andrew Shotland's NAP Hunter to get your competitor's variants. This tool will easily open multiple window tabs in your browser and search for combinations of your competitor's NAP listings. It makes it easy and it's kind of fun.
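If you'd rather script the variants yourself, here's a rough sketch of the same idea: take a competitor's name, address, and phone and generate a handful of quoted search queries to run. The example NAP and format list are assumptions for illustration, not a description of how NAP Hunter works under the hood.

from itertools import product

# Made-up competitor NAP for illustration.
name = "Example Dental Office"
address_variants = ["123 Main St", "123 Main Street"]
phone_variants = ["(303) 555-0100", "303-555-0100", "3035550100"]

# Build quoted search queries for every combination of the variants.
queries = [
    f'"{name}" "{addr}" "{phone}"'
    for addr, phone in product(address_variants, phone_variants)
]

for q in queries:
    print(q)  # Paste each into Google and note which citation sites recur.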

5. Check important citations

With citations, I generally fall in the camp of quality over quantity. That being said, if you're just getting the same citations that everyone else has, that doesn't really set you apart, does it? I like to tell clients that the top citation sources are a must, but it's good to seek out opportunities and monitor what your competition does so you can keep up and stay ahead of the game.

You need to check the top citations and see where you're listed vs. your competition. Tools like Whitespark's local citation finder make it much easier to get a quick snapshot.

If you're looking to see which citations you should find and check, use these two resources below:

Just like in the example in the section above, you can find powerful hidden gems and also new website opportunities that arise from time to time.

Just because you did it once doesn't mean you should leave it alone

A common mistake I see is businesses thinking it's OK to just turn things off when they get to the top. That's a bad idea. If you're serious about online marketing, you know that someone is always out to get you. So in addition to tracking your brand mentions through Fresh Web Explorer, you also need to be tracking your competition at least once a month! The good news is that you can do this easily with Fresh Web Explorer from Moz.

So what should you set up in Fresh Web Explorer?

  • Your competitor's brand name - Monitor their mentions and see what type of marketing they're doing!
  • Your competitor's NAP - Easily find new citations they're going after
  • City+Industry+Keywords - Maybe there are some hidden gems outside of your competition you could go after!

Plus track anything else you can think of related to your brand. This will make the ongoing efforts a bit easier.

6. Figure out which citations have dofollow links

Did you know some citation sources have dofollow links, which means they pass link juice to your website? While these by themselves likely won't pass a lot of juice, it adds an incentive for you to be proactive with recording and promoting these listings.

When reviewing my competition's citations and links I use a simple Chrome plugin called NoFollow which simply highlights nofollow links on pages. It makes it super easy to see what's a follow vs. a nofollow link.

But what's the benefit of this? Let's say I have a citation on a city website, and its link to my site is a followed link. If it's an authority page that speaks highly of my business, it makes sense for me to point links at it from time to time. By earning links to these high-quality citation pages from websites other than your own, you pass link juice through their followed links to your own landing page. It's a pretty simple way of increasing the authority of your local landing pages.
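If you'd rather check a page without a browser plugin, a quick script can flag which links on a citation page are followed. This is just a sketch using requests and BeautifulSoup; the URL is a placeholder, not a real directory listing.

import requests
from bs4 import BeautifulSoup

# Placeholder citation page; swap in the page you want to review.
url = "https://www.example-directory.com/listing/example-dental-office"

html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    rel = a.get("rel") or []          # rel is a multi-valued attribute in bs4
    flag = "nofollow" if "nofollow" in rel else "follow"
    print(f"{flag:9} {a['href']}")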

7. Links, links, links

Since the Pigeon update almost a year ago, links have made a bigger impact in local search. You have to be earning links, and they need to be high-quality links to your website, especially your Google My Business landing page.


If the factors show you're on the same playing field as your competition except in domain authority or page authority, you know your primary focus needs to be links.

Now here is where the research gets interesting. Remember the data sources we pulled from earlier, like Compete, SpyFu, etc.? We're now going to get a bigger picture of the link profile because we did this extra work. We're not just going to look at the links that our competition in the pack has; we've branched out for more ideas, which will potentially pay off big in the long run.

What to do now

Now we want to take every domain we looked at when we started and run Open Site Explorer on each and every domain. Once we have these lists of links, we can then sort them out and go after the high quality ones that you don't already have.


Typically, when I'm doing this research I will export everything into Excel or Google Docs, combine them into one spreadsheet and then sort from highest authority to least authority. This way you can prioritize your road map and focus on the bigger fish.
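As a rough sketch, here's how that combine-and-sort step might look if you script it instead of doing it by hand. The file names and the "URL" / "Page Authority" column names are assumptions about what your link exports contain, so adjust them to match your actual CSVs.

import glob
import pandas as pd

# One CSV export of inbound links per competitor domain, e.g. links-competitor1.csv
frames = [pd.read_csv(path) for path in glob.glob("links-*.csv")]
all_links = pd.concat(frames, ignore_index=True)

# Drop duplicate linking URLs, then sort from highest authority to least.
all_links = (
    all_links.drop_duplicates(subset="URL")
             .sort_values("Page Authority", ascending=False)
)

all_links.to_csv("combined-link-prospects.csv", index=False)
print(all_links.head(20))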

Keep in mind that citations usually have links and some links have citations. If they have a lot of authority you should make sure you add both.

8. But what about user behavior?

If you feel like you've gone above and beyond your competition and yet you're not seeing the gains you want, there is more you have to look at. Sometimes as an SEO it's easy to get in a paradigm of just the technical or link side of things. But what about user behavior?


It's no secret that user behavior matters, and even some recent tests are showing promising data. If your users visit your site and then click back to the search results, it indicates that they didn't find what they were looking for. Through our own experiments we have seen listings in the SERPs jump a few positions in hours just based on user behavior.

So what does this mean for you?

You need to make sure your pages are answering users' queries as they land on your page, preferably above the fold. For example, if I'm looking for a haircut place and I land on your page, I might want to know the hours, pricing, or directions to your store. Making this information prominent is essential.

Make sure that if you're going to make these changes you test them. Come up with a hypothesis, test the results, and come to a conclusion or run another test based on the data. If you want to know more about your users, I say that you need to find out as much about them as humanly possible. Some services you can use for that are:

1. Inspectlet - Record user sessions and watch how they navigate your website. This awesome tool literally allows you to watch recorded user sessions. Check out their site.

2. LinkedIn Tracking Script - Although I admit it's a bit creepy, did you know that you can see the actual visitors to your website if they're logged into LinkedIn while browsing your website? You sure can. To do this complete the following steps:

1. Sign up for a LinkedIn Premium Account
2. Enter this code into the body of your website pages:

<img src="https://www.linkedin.com/profile/view?authToken=zRgB&authType=name&id=XXXXX" />


3. Replace the XXXXX with the account number of your profile. You can get this by logging into your profile page and grabbing the number that appears after viewid?=
4. Wait for the visitors to start showing up under "who's viewed your profile"

3. Google Analytics - Watch user behavior and gain insights as to what they were doing on your website.

Reviews

Speaking of user behavior, is your listing the only one without reviews? Does it have fewer or less favorable reviews? All of these are negative signals for user experience. Do your competitors have more positive reviews? If so, you need to work on getting more.


Meta descriptions

While this post was mainly geared towards local SEO as in Google My Business rankings, you have to consider that there are a lot of localized search queries that do not generate pack results. In these cases they're just standard organic listings.

If you've been deterred from adding these because Google sometimes picks its own meta descriptions or because they lack a direct ranking benefit, you need to check yourself before you wreck yourself. Seriously. Customers will make a decision on which listing to click on based on this information. If you're not thinking about optimizing these for user intent on the corresponding page, then you're just being lazy. Spend the time, increase CTR, and increase your rankings if you're serving great content.

Conclusion

The key to success here is realizing that this is a marathon and not a sprint. If you examine the competition in the top areas mentioned above and create a plan to overcome them, you will win long term. This, of course, also assumes you're not doing anything shady and are staying above board.

While there were many more things I could add to this article, I believe that if you put your focus on what's mentioned here you'll have the greatest success. Since I didn't talk too much about geo-tagged media in this article, I also included some other items to check in the spreadsheet under the competitor analysis tab.

Remember to actively monitor what those around you are doing and develop a pro-active plan to be successful for your clients.

What's the most creative thing you have seen a competitor do successfully in local search? I would love to hear about it in the comments below.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Videos on Facebook: Native vs YouTube. Which Wins? by @brentcsutoras

We've heard references alluding to the idea that Facebook was favoring native video uploads vs. third-party video embeds. Here's what we found out.

The post Videos on Facebook: Native vs YouTube. Which Wins? by @brentcsutoras appeared first on Search Engine Journal.


Sunday, June 28, 2015

Google Adds Pinterest, Vine, and More To New Mobile Search Carousel by @mattsouthern

Google announced an addition to its mobile search results that will give people quick access to what the company feels are searchers’ “favorite” websites.

The post Google Adds Pinterest, Vine, and More To New Mobile Search Carousel by @mattsouthern appeared first on Search Engine Journal.

Friday, June 26, 2015

How to Rid Your Website of Six Common Google Analytics Headaches

Posted by amandaecking

I've been in and out of Google Analytics (GA) for the past five or so years agency-side. I've seen three different code libraries, dozens of new different features and reports roll out, IP addresses stop being reported, and keywords not-so-subtly phased out of the free platform.

Analytics has been a focus of mine for the past year or so—mainly, making sure clients get their data right. Right now, our new focus is closed loop tracking, but that's a topic for another day. If you're using Google Analytics, and only Google Analytics for the majority of your website stats, or it's your primary vehicle for analysis, you need to make sure it's accurate.

Not having data pulling in or reporting properly is like building a house on a shaky foundation: It doesn't end well. Usually there are tears.

For some reason, a lot of people, including many of my clients, assume everything is tracking properly in Google Analytics... because Google. But it's not Google who sets up your analytics. People do that. And people are prone to make mistakes.

I'm going to go through six scenarios where issues are commonly encountered with Google Analytics.

I'll outline the remedy for each issue, and in the process, show you how to move forward with a diagnosis or resolution.

1. Self-referrals

This is probably one of the areas we're all familiar with. If you're seeing a lot of traffic from your own domain, there's likely a problem somewhere—or you need to extend the default session length in Google Analytics. (For example, if you have a lot of long videos or music clips and don't use event tracking; a website like TEDx or SoundCloud would be a good equivalent.)

Typically one of the first things I'll do to help diagnose the problem is include an advanced filter to show the full referrer string. You do this by creating a filter, as shown below:

Filter Type: Custom filter > Advanced
Field A: Hostname
Extract A: (.*)
Field B: Request URI
Extract B: (.*)
Output To: Request URI
Constructor: $A1$B1

You'll then start seeing the subdomains pulling in. Experience has shown me that if you have a separate subdomain hosted in another location (say, if you work with a separate company and they host and run your mobile site or your shopping cart), it gets treated by Google Analytics as a separate domain. Thus, you'll need to implement cross domain tracking. This way, you can narrow down whether or not it's one particular subdomain that's creating the self-referrals.

In this example below, we can see all the revenue is being reported to the booking engine (which ended up being cross domain issues) and their own site is the fourth largest traffic source:

[Image: self-referrals report showing the booking engine receiving the revenue]

It's also a good idea to check the browser and device reports to start narrowing down whether the issue is specific to a particular element. If it's not, keep digging. Look at pages pulling the self-referrals and go through the code with a fine-tooth comb, drilling down as much as you can.

2. Unusually low bounce rate

If you have a crazy-low bounce rate, it could be too good to be true. Unfortunately. An unusually low bounce rate could (and probably does) mean that at least some pages of your website have the same Google Analytics tracking code installed twice.

Take a look at your source code, or use Google Tag Assistant (though it does have known bugs) to see if you've got GA tracking code installed twice.

While I tell clients that having Google Analytics installed twice on the same page can lead to double the pageviews, I've not actually encountered that—I usually just say it to scare them into removing the duplicate implementation more quickly. Don't tell on me.

3. Iframes anywhere

I've heard directly from Google engineers and Google Analytics evangelists that Google Analytics does not play well with iframes, and that it will never play nice with this dinosaur technology.

If you track the iframe, you inflate your pageviews, plus you still aren't tracking everything with 100% clarity.

If you don't track across iframes, you lose the source/medium attribution and everything becomes a self-referral.

Damned if you do; damned if you don't.

My advice: Stop using iframes. They're Netscape-era technology anyway, with rainbow marquees and Comic Sans on top. Interestingly, and unfortunately, a number of booking engines (for hotels) and third-party carts (for ecommerce) still use iframes.

If you have any clients in those verticals, or if you're in the vertical yourself, check with your provider to see if they use iframes. Or you can check for yourself, by right-clicking as close as you can to the actual booking element:

[Image: right-clicking near the booking element to reveal the iframe]

There is no neat and tidy way to address iframes with Google Analytics, and usually iframes are not the only complicated element of setup you'll encounter. I spent eight months dealing with a website on a subfolder, which used iframes and had a cross domain booking system, and the best visibility I was able to get was about 80% on a good day.

Typically, I'd approach diagnosing iframes (if, for some reason, I had absolutely no access to viewing a website or talking to the techs) similarly to diagnosing self-referrals, as self-referrals are one of the biggest symptoms of iframe use.

4. Massive traffic jumps

Massive jumps in traffic don't typically just happen. (Unless, maybe, you're Geraldine.) There's always an explanation—a new campaign launched, you just turned on paid ads for the first time, you're using content amplification platforms, you're getting a ton of referrals from that recent press in The New York Times. And if you think it just happened, it's probably a technical glitch.

I've seen everything from inflated pageviews result from including tracking on iframes and unnecessary implementation of virtual pageviews, to not realizing the tracking code was installed on other microsites for the same property. Oops.

Usually I've seen this happen when the tracking code was somewhere it shouldn't be, so if you're investigating a situation of this nature, first confirm the Google Analytics code is only in the places it needs to be. Tools like Google Tag Assistant and Screaming Frog can be your BFFs in helping you figure this out.

Also, I suggest bribing the IT department with sugar (or booze) to see if they've changed anything lately.

5. Cross-domain tracking

I wish cross-domain tracking with Google Analytics out of the box didn't require any additional setup. But it does.

If you don't have it set up properly, things break down quickly, and can be quite difficult to untangle.

The older the GA library you're using, the harder it is. The easiest setup, by far, is Google Tag Manager with Universal Analytics. Hard-coded Universal Analytics is a bit more difficult because you have to implement autoLink manually and decorate forms, if you're using them (and you probably are). Beyond that, rather than try and deal with it, I say update your Google Analytics code. Then we can talk.

Where I've seen the most murkiness with tracking is when parts of cross domain tracking are implemented, but not all. For some reason, if allowLinker isn't included, or you forget to decorate all the forms, the cookies aren't passed between domains.

The absolute first place I would start with this would be confirming the cookies are all passing properly at all the right points, forms, links, and smoke signals. I'll usually use a combination of the Real Time report in Google Analytics, Google Tag Assistant, and GA debug to start testing this. Any debug tool you use will mean you're playing in the console, so get friendly with it.

6. Internal use of UTM strings

I've saved the best for last. Internal use of campaign tagging. We may think, oh, I use Google to tag my campaigns externally, and we've got this new promotion on site which we're using a banner ad for. That's a campaign. Why don't I tag it with a UTM string?

Step away from the keyboard now. Please.

When you tag internal links with UTM strings, you override the original source/medium. So that visitor who came in through your paid ad and then who clicks on the campaign banner has now been manually tagged. You lose the ability to track that they came through on the ad the moment they click on the tagged internal link. Their source and medium is now your internal campaign, not that paid ad you're spending gobs of money on and have to justify to your manager. See the problem?

I've seen at least three pretty spectacular instances of this in the past year, and a number of smaller instances of it. Annie Cushing also talks about the evils of internal UTM tags and the odd prevalence of it. (Oh, and if you haven't explored her blog, and the amazing spreadsheets she shares, please do.)

One clothing company I worked with tagged all of their homepage offers with UTM strings, which resulted in the loss of visibility for one-third of their audience: One million visits over the course of a year, and $2.1 million in lost revenue.

Let me say that again. One million visits, and $2.1 million. That couldn't be attributed to an external source/campaign/spend.

Another client I audited included campaign tagging on nearly every navigational element on their website. It still gives me nightmares.

If you want to see if you have any internal UTM strings, head straight to the Campaigns report in Acquisition in Google Analytics, and look for anything like "home" or "navigation" or any language you may use internally to refer to your website structure.

And if you want to see how users are moving through your website, go to the Flow reports. Or if you really, really, really want to know how many people click on that sidebar link, use event tracking. But please, for the love of all things holy (and to keep us analytics lovers from throwing our computers across the room), stop using UTM tagging on your internal links.
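If you want a second, more mechanical check, a small script over a handful of your own pages can flag internal links that carry UTM parameters. This is only a sketch; the domain and page list are placeholders, and a real audit would crawl your full sitemap.

from urllib.parse import urlparse, parse_qs

import requests
from bs4 import BeautifulSoup

SITE = "www.example.com"  # placeholder domain
pages = [f"https://{SITE}/", f"https://{SITE}/products/"]  # pages to spot-check

for page in pages:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        href = a["href"]
        parsed = urlparse(href)
        is_internal = parsed.netloc in ("", SITE)
        has_utm = any(k.startswith("utm_") for k in parse_qs(parsed.query))
        if is_internal and has_utm:
            print(f"Internal UTM link on {page}: {href}")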

Now breathe and smile

Odds are, your Google Analytics setup is fine. If you are seeing any of these issues, though, you have somewhere to start in diagnosing and addressing the data.

We've looked at six of the most common points of friction I've encountered with Google Analytics and how to start investigating them: self-referrals, bounce rate, iframes, traffic jumps, cross domain tracking and internal campaign tagging.

What common data integrity issues have you encountered with Google Analytics? What are your favorite tools to investigate?



Early Morning Tweets Get More Clicks, + More From Buffer’s Latest Study by @mattsouthern

Social software company Buffer has analyzed 4.8 million tweets to determine the best times to tweet according to your timezone.

The post Early Morning Tweets Get More Clicks, + More From Buffer’s Latest Study by @mattsouthern appeared first on Search Engine Journal.

Thursday, June 25, 2015

A Facebook Account is No Longer Required to Use Messenger by @mattsouthern

Facebook took the last remaining step toward truly making Messenger its own platform, announcing today that people will be able to use the service without a Facebook account.

The post A Facebook Account is No Longer Required to Use Messenger by @mattsouthern appeared first on Search Engine Journal.

Wednesday, June 24, 2015

Instagram Now Lets You Search By Location, Explore Real-Time Trends, + More by @mattsouthern

Instagram announced that it has added new search functionality to its iOS and Android apps that will allow users to find photos by location, discover real-time trending searches, and more.

The post Instagram Now Lets You Search By Location, Explore Real-Time Trends, + More by @mattsouthern appeared first on Search Engine Journal.

Why ccTLDs Should Not Be an Automatic Choice for International Websites

Posted by Liam_Curley

There are many articles on domain structure for international sites. Many, if not all, recommend the use of ccTLDs due to the geo signals they send to Google; but I’ve read very few articles that substantiate this type of claim with any research or evidence. Is this recommendation outdated? With every passing year, Google gets better at reading and setting geo signals. By introducing hreflang and improving Google Webmaster Tools (recently rebranded as Google Search Console) with regards to setting target countries, it’s so much easier to get geo signals right than it was a few years ago.

With the recent changes Google has been making, I am left questioning whether or not we really need ccTLDs to target other countries. Do they have a positive impact on rankings? If they don’t, why would you use them? If you can set geo signals via webmaster tools or hreflang tags, is it better to consolidate your link equity with one domain and separate everything with subfolders?
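For readers who haven't set this up before, here's a rough sketch of what geo-targeted subfolders annotated with hreflang look like on a single gTLD. The domain and locale list are made up for illustration; the point is simply that the geo signal lives in the annotations rather than in the domain.

# Locale subfolders on a single gTLD, annotated with hreflang.
# Domain and locales are made up for illustration.
domain = "https://www.example.com"
locales = {"en-us": "/", "en-gb": "/uk/", "en-au": "/au/", "it-it": "/it/"}

for hreflang, path in locales.items():
    print(f'<link rel="alternate" hreflang="{hreflang}" href="{domain}{path}" />')
# Add an x-default for users who match none of the locales.
print(f'<link rel="alternate" hreflang="x-default" href="{domain}/" />')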

I wanted to look at the market data concerning ccTLDs and their performance on different international versions of Google. I wanted to know whether ccTLDs demonstrated any tendency of outranking sites with gTLDs (as defined here) that had a greater DA or PA. If ccTLDs did demonstrate this trait, then perhaps there is merit in selecting them over subfolder structure. If not, and the ranking of websites on SERPs shows the general trend of order by DA/PA, then surely there is no reason to structure an international website with a ccTLD and the best option is to consolidate all links on one site and geo target the subfolders. I understand that there is more to this decision if we take into account the user's preference to interact with local domain websites. We'll touch on that point later. For now, I just want to focus on how Google seems to treat ccTLDs.

The SERP Research

The hypothesis

ccTLDs don’t supersede PA as a ranking signal. I believed that if I gathered a decent sample size, the general trend would show that ccTLDs didn’t tend to outrank sites with a gTLD and higher PA.

Local link ratio doesn’t correlate with high rankings. Rand’s research suggests local links have a positive impact on a sites ranking on local search engines. Does the ratio of local links correlate with a higher ranking? If they do, then this could lead us to believe that a consolidation of local links on a local ccTLD would support successful international SEO. If there is no correlation, then this would further support that there is little ranking benefit with this regard to using a ccTLD, as we can receive local links to a gTLD.

A local IP address doesn’t improve rankings. There still seems to be some opinion in the community that hosting a site on a local IP address will help rankings on local versions of Google.

Methodology

I wanted to gather data for competitive terms from several competitive markets. The first task was determining which markets to select. I made a decision based on the markets that have the highest B2C spend per digital consumer. I initially picked out the top 10, then selected five from those based on which sites I was able to work with (linguistically). The markets selected were: U.S., U.K., Canada, Australia, and Italy.

Next, I selected the keyword categories that I would use to analyze SERPs. I picked out the sectors based on the biggest digital B2C market sectors in the U.S. From the top 10, I selected five: clothes, toys and games, computer and consumer electronics, furniture and home furnishings, and auto parts.

Then, I decided to identify 10 keywords for each category in each market. Keywords were selected by inputting a broad keyword into AdWords for each category (say, "game"), filtering by search volume, and selecting the highest-volume entries that had an average AdWords suggested bid of higher than £0.05, which would provide terms with high search volume and commercial relevance.

This was done for each category in each market.
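Scripted, that keyword filter might look something like the sketch below. The file name and the column names mirror a typical AdWords keyword export and are assumptions; rename them to match whatever your export actually contains.

import pandas as pd

# One AdWords keyword export per category/market, e.g. keywords-toys-uk.csv
kw = pd.read_csv("keywords-toys-uk.csv")

# Keep commercially relevant terms, then take the ten highest-volume entries.
shortlist = (
    kw[kw["Suggested bid"] > 0.05]
      .sort_values("Avg. monthly searches", ascending=False)
      .head(10)
)
print(shortlist[["Keyword", "Avg. monthly searches", "Suggested bid"]])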

I collated data from the top 10 pages ranking for each SERP, giving me a total of 2,500 web pages to analyze. Searches were conducted for each keyword on the local version of Google (e.g., google.it) using the SEO Global Chrome extension from RedFly Marketing, allowing me to see the search results for a local user.

Analysis of data

Once the keywords were selected for each market, I collected the following data from each SERP:

  • Ranking position
  • URL
  • Domain structure
  • Domain authority
  • Page authority
  • Page title
  • IP address location
  • Local link ratio

From this information, I would also collect the following on each web page entry on the SERP:

  • Is there an exact keyword match in the domain?
  • Is there a partial keyword match in the domain?
  • Is the exact keyword used in the URL?
  • Is a broad keyword used in the page title?
  • Is an exact keyword used in the page title?

Each entry was given a yes or no for the questions above, which would allow me to compare domain performance on a like-for-like basis with regard to some of the basic on-page SEO elements.

Once this data was collected, I started to identify the following:

  • Whether the ccTLD was outranking a gTLD that had a higher PA
  • Whether the ccTLD was outranking a gTLD that had a higher PA, where both the ccTLD and gTLD in question had matching on-page SEO implementation for the keyword in question
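As a sketch of how that "outranking" check could be computed from a table of the fields above, you'd look, within each SERP, for ccTLDs sitting above at least one gTLD with a higher PA. The file and column names here are assumptions for illustration, not the actual dataset.

import pandas as pd

# One row per ranking web page; column names are assumptions for illustration.
serps = pd.read_csv("serp-data.csv")  # columns: keyword, position, domain_type, pa

outranking_flags = []
for keyword, group in serps.groupby("keyword"):
    gtlds = group[group["domain_type"] == "gTLD"]
    for _, row in group[group["domain_type"] == "ccTLD"].iterrows():
        # Does any gTLD sit below this ccTLD despite having a higher PA?
        beaten = gtlds[(gtlds["position"] > row["position"]) & (gtlds["pa"] > row["pa"])]
        outranking_flags.append(not beaten.empty)

share = sum(outranking_flags) / len(outranking_flags) if outranking_flags else 0.0
print(f"{share:.0%} of ccTLDs outrank a gTLD with a higher PA")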

Research limitations

Let’s start with the obligatory "correlation does not equal causation." Nothing discovered in this research will definitively prove or disprove ranking factors for international SEO. However, I believe that this kind of research does throw up interesting data, and any SEO trends and correlations discovered through this type of research can set us on our own path to research further and look for more concrete signals to prove or disprove these results.

I had a decision to make regarding whether to measure ccTLD ranking against gTLDs with a higher PA or a higher DA. I decided to go with PA, predominantly because I’m looking at the ranking performance of a page, not a website. DA has a direct impact on PA, but if we measured performance against DA, I think we’d be less likely to get a true picture (e.g., blogs on subdomains, and small sites with a keyword in the domain ranking with their home page).

The resources available for this research (i.e., me) meant there was a limit to the volume of SERPs and web pages analyzed. My limited linguistic skills meant I couldn’t analyze SERPs from a broader language base (e.g., Nordic and Japanese), and I could only collect data from the top 10 rankings for each SERP.

Also, ideally the data would have been drawn from the SERPs over one day. I collected the data manually. (I could have set up a crawl, but at the time I didn’t have the knowledge available to do that.) So, it was taken over the course of around six weeks.

Finally, I mentioned that I compare the rank of pages based on like-for-like on-page SEO. Due to time constraints, I was limited to a handful of what I deemed to be key on-page SEO signals. Therefore, it’s open to debate as to whether the signals I selected are the key signals for on-page SEO.

The results

[Infographic: ccTLD vs. gTLD research results]

Discussion

ccTLDs are not outranking gTLDs. Graphs 1 and 2 demonstrate that the majority of ccTLDs are not outranking gTLDs that have a higher PA. Graph 1 shows that 46 percent of ccTLDs reviewed outrank a gTLD with a higher PA. However, when we only count "outranking" as occurring when both the ccTLD and the gTLD have the same basic on-page SEO (e.g., keyword in title, URL, and/or domain), we see that the percentage of ccTLDs outranking gTLDs falls to 24 percent.

This information doesn’t definitively tell us whether or not a local ccTLD is a ranking factor in national SERPs, but it does indicate that it’s probably not a signal that generally outweighs PA. That being the case, from a purely SEO perspective (not considering online consumer psychology), a subfolder must be the best domain structure for the majority of international sites. Unless you or your client is a major brand with a large budget, the resources required to launch several ccTLDs and build enough authority for each to make them visible in their respective search engines makes a ccTLD an unwise selection.

A Local IP address doesn’t pack a punch. Again, this research can’t definitively determine whether an IP address does or doesn’t provide ranking signals for national SERPs, but Graph 5 suggests that if it does, the signals are weak. Of the 474 ccTLDs with a local IP address, only 19 percent were outranking a gTLD with a higher PA. This figure suggests that an IP address has little direct impact on rankings, even when combined with a local ccTLD. That said, it's worth checking out this article on IP host location from Richard Baxter, which presents a different finding.

A Local link ratio has no relationship with high local rankings. While Rand’s research indicates local links have an impact on local search results, a local link ratio doesn’t have a relationship with high rankings. There doesn’t appear to be a benefit of setting up a ccTLD to gain local links for an international market. Local links can be earned for any domain and any structure, whether ccTLD or subfolder.

Implications for international SEO

It is difficult to make an accurate, broad statement on best practice for international SEO. Every market is likely to be slightly different with regard to the way that users interact with content, as well as the way that search engines crawl and rank web pages. You also have to take into account that if you’re working with a client on SEO for different international markets, goals and resources will vary. Toys "R" Us does very well in the SERPs we analyzed with a ccTLD structure, but then they have the resources available to support multiple domains and earn local authority and PR for each domain.

The research looked at SERPs for five countries and 2,500 web pages. The results for each country did vary, and while analyzing 500 web pages for each country doesn’t represent a sufficient sample size to form a sound opinion on each, it does lead me to believe that the choice of whether to use a ccTLD or a gTLD for an international market could vary depending on the market in question. More information is available here on the data collected from each country. To summarize, here are the findings:

[Infographic: per-country SERP findings]

I’ve omitted the U.S. from the second table, as there were only two web pages with a ccTLD from the 500 analyzed. That confirms what many of us would have suspected or known: ccTLDs aren’t widely used in the U.S. With hindsight, it probably would have been more interesting to swap the U.S. with a different country for analysis.

The information above suggests that maybe there is some variation in how sites rank in different international search engines. It’s also interesting to note that ccTLDs are more popular in some markets than others, which could have an impact on the user relationship and interaction with a website depending on its domain structure.

Consumer psychology and ccTLDs

Let’s put aside what I’d consider to be some of the ranking implications behind a choice of domain structure. There’s another consideration to be made when it comes to selecting a domain structure for an international site: Does a local domain have a positive impact on consumer psychology and the choice of buying or browsing on one site over another?

As with the SEO argument for a ccTLD, there are plenty of articles and research that suggest consumers prefer to shop on an eCommerce site with a local domain rather than a generic domain (U.S. excluded). Eli Schwartz recently wrote an article summarizing research he’d conducted on the searcher perception of ccTLDs. The post provided some really interesting results. However, I didn’t necessarily agree with the approach taken with one of the questions put to respondents regarding eCommerce and the impact of ccTLDs on purchase decisions.

In the study, Eli asked each respondent this: “Of the links below, which is most likely to offer the most reliable express shipping to your home?” The respondent was then asked to select either a website with a .com domain, or one with a local ccTLD. The results are interesting, but if we’re looking for insight into eCommerce buying decisions, I think it’s a bit of a leading question. If you ask the respondent a question like this, and give them the choice of a local domain or a generic domain, they’re likely to pick the ccTLD. However, I don’t believe that this indicates that the ccTLD is used as an aid to make a purchase decision. It tells us that if you strip all other buying aids from the process and boil it down to a choice between one domain and another, the respondent selects the local domain. Real-life buying decisions don’t work like this.

Following on from my research on international rankings, I wanted to try and create a real life test environment where respondents pick one website over another to purchase a product.

Test 1 – Impact of domain structure when a consumer is browsing an ecommerce store

Using CrowdFlower and UsabilityHub, I created a test for U.K.-based respondents. First, the respondent was presented with the following information:

“You're looking to purchase a new laptop. You've done your research and found the make and model that you'd like to buy. You find this laptop on two eCommerce websites. Based on the page your about to view, which site would you buy the laptop from?”

The respondent was then presented with the following two eCommerce sites:

[Images: the Dabs.com and Laptops Direct product pages shown to respondents]

Both sell the same laptop with the same specification, same price, same delivery and same returns offer. The key difference between the two is that one is hosted on a .com domain and one is on a .co.uk. The design and layout for each is different, but I’ve attempted to create a real-life situation, and you’d never be choosing between two eCommerce stores with the same design.

Two hundred sixty-two respondents participated in the Dabs vs. Laptops Direct selection, and 174 of these respondents provided feedback on why they made their decision.

The results are as follows:

[Infographic: Dabs vs. Laptops Direct test results]

As you can see, none of the respondents selected either website due to the domain structure of the store. Choices were predominantly made on a preference for less ads or clutter, product information, usability, or branding. It seems clear to me that when the consumer is browsing an eCommerce site, the domain structure plays no part in their purchase decision. Although not tested here, localization indicators such as language, currency, delivery, and returns policy will arguably dictate whether or not you stand a chance of winning their business rather than the domain.

Test 2 – Impact of domain structure when consumer is browsing the SERPs

After I’d reviewed consumer decision-making while on the webpage, I wanted to see if ccTLDs were a genuine factor in consumer psychology on a SERP when the user is making their browsing decision.

In the next test, U.K. respondents were presented with the following text:

“You're looking to find an eCommerce site that sells car parts. You go to Google and search for 'car parts'. You see the following results page. Which website would you click on first?”

The respondents were presented with a SERP for "car parts", constructed so that one of the four organic results (the third) was a ccTLD. As you can see, the second organic result, a gTLD, contains U.K. within the domain:

[Image: the test SERP for "car parts"]

The following heat map shows the websites selected by the respondents:

[Heat map: respondent selections on the "car parts" SERP]

The 200 respondents were then asked to give a reason for their selection. The results are as follows:

[Infographic: reasons respondents gave for their SERP selection]

It does seem that a ccTLD can play a part in the browsing selection for a portion of the audience. Eleven percent of the respondents indicated they made their selection because the website was based in the U.K., although they don’t specify how they made that assumption (i.e., it could be the ccTLD, meta description, etc.). Five percent of the respondents specifically mentioned the local domain as the reason for their choice (although they seem to be mistaking autopartsuk.com for a U.K. domain). Seventeen percent of our respondents made the website selection based on their belief that the website was based in the U.K.

The research also shows how important the meta description is in the user-browsing decision, something that I think often gets overlooked by SEOs. In fact, 30 percent of our respondents indicated they made their selection based on information provided in the meta (mentioning things like free delivery, range of stock, and discounts). I think that when we get a website ranking for a really important keyword, SEOs can be a bit like the football (or soccer) team that’s just scored a goal. We’re so engulfed in the success of scoring that we switch off at kickoff, letting the other team score straight away. There is a danger that we think we’ve won when one of our web pages ranks well, when in fact that’s just part of the job. We still need to compete for the user’s attention once we’re on the SERP, and entice them to click on our website instead of the competitor's.

Do Google’s new ‘branded breadcrumbs’ change the significance of ccTLDs?

We’ve seen that a number of users make a SERP selection based on their assumption that the selected website is based locally. At present, the domain structure is used as a key indicator of a websites location. However, as part of the mobile algorithm update, Google’s announced a move from a URL display to a branded breadcrumb that will remove the domain structure from the SERP. On mobile, from a location perspective, the domain structure will no longer influence a users SERP selection. The 17 percent of respondents making the selection based on location will look for other information to aid their decision.

For now, on mobile at least, the SERPs present a level playing field for ccTLDs and gTLDs with regards to consumer psychology. The meta description is even more important in enticing the click.

Conclusions

For me, the research shows that choosing a ccTLD as the domain structure for an international site shouldn’t be the automatic decision that it seems to be for many. While further research is required, I don’t believe that a ccTLD domain structure has a big enough impact on rankings to warrant selecting this option over a subfolder, which allows us to consolidate links and boost DA and PA on all of our international content. We can geotarget subfolders via webmaster tools and hreflang tags, and as a local ccTLD doesn’t seem to supersede PA as a ranking factor, we should act accordingly and launch international sites with the highest PA possible (i.e., subfolders).

The research on consumer psychology does show that a ccTLD can have a positive impact on SERP user selections. However, meta descriptions can also be used to promote local service and delivery. The changes announced by Google for mobile SERPs will remove URLs from the selection equation, and we've seen that when a user is on a website, they pay little attention to the domain location.

While I feel this is the right advice for most brands, it’s probably not the right advice for all. If you’re working with a large brand, you might have the resources available to earn the marginal gains in every facet of what you do. If further research shows that ccTLDs do have some ranking impact, no matter how small, and that improves your ranking by one position for each keyword, then the impact could result in a significant amount of extra traffic if you’re working for a large eCommerce customer.

