Innovate not imitate!

Interested in the latest Growth hacks?

Welcome to our blog!

We want to help you start, manage, and grow your business using innovative strategies and implementation. We have a passion for helping businesses and companies of various sizes see the same success that we have achieved.

Our skill sets are wide and varied, spanning business strategy, marketing, and online strategy. An increasing number of companies are turning to the internet and online media as a means of maximising their marketing reach and exposure. This is a special area of focus for us, and we do more than simple SEO strategies.

See our website for more: www.innovatetoaccelerate.com

Thursday 31 March 2016

Google Changes Sign In Flow for Leaving Local Business Reviews

Google has changed the login flow for those users who are not signed into a Google account but are trying to leave a review for a local business in the local knowledge panel. And it seems the change was likely made to combat pop-up blockers.

Conrad O’Connell spotted the change and created this GIF of it in action.

[GIF: the new sign-in flow in action]

Google appears to be doing this to combat pop-up blockers. With many people taking the free Windows 10 upgrade, the issue may have become more common: Microsoft Edge, the default browser in Windows 10, comes with its pop-up blocker turned on by default, and many people likely aren’t aware it is there or how to turn it off.

I noticed that when I got this new overlay after clicking the “Write a review” button, a pop-up was also blocked. When I unblocked the pop-up, I saw this:

[Screenshot: the previously blocked sign-in pop-up]

When you close the popup, you see the overlay login.

[Screenshot: the overlay login shown after closing the pop-up]

So there is definitely the possibility this change was made to help users with a pop-up blocker who might not know they have one, for whom clicking the link would previously have appeared to do nothing.

It is a nice change for local businesses as well, making it easier for customers to leave a review.

I also wouldn’t be surprised if Google extends this change to other sign-in flows, if it finds that many users rely on the overlay because of pop-up blockers.




from The SEM Post http://ift.tt/1pNoptu
http://ift.tt/1UEPcWw via IFTTT

Google Showing Recipe Videos in Recipe Carousels in Search Results

Short recipe videos are becoming increasingly popular, especially on Facebook, where these videos – which can be viewed without sound thanks to subtitles – are constantly being shared. Google is taking advantage of the popularity of recipe videos and is now showcasing them in its recipe carousels too.

Here is an example:

[Screenshots: recipe videos appearing in Google's recipe carousel]

And while Google has been showcasing videos from Allrecipes for a short while, they are now also showing videos from other sites too.

[Screenshot: a recipe video from a site other than Allrecipes in the carousel]

Here are a few more examples which show other sites with the videos in the recipe carousel:

[Screenshots: videos from additional sites in the recipe carousel]

What is worth noting for recipe sites is that these videos do not have to be self-hosted in order to show up as a video in these recipe carousels.  Embedding a YouTube video onto the page can result in the video being attributed to the site in the carousel, rather than just YouTube.  For example, here is the embedded YouTube video on the Incredible Egg site:

[Screenshot: the embedded YouTube video on the Incredible Egg site]

And showing that it is a YouTube embed:

[Screenshot: the same video shown as a YouTube embed]

The video is another example of a Facebook-style video with no voice instruction and all instructions displayed as text on the video itself. There does not appear to be specific video markup on the page, though, so Google seems to be picking it up by recognizing the video embed.

But it also means that recipe websites could potentially embed videos they haven’t created and have them rank in these recipe carousels. This could open up SEO opportunities for recipe sites that don’t create videos, but it is hard to say whether it would come at the cost of a regular non-video result in the carousel. While many people will watch a silent recipe video on Facebook as they scroll past it, how many would prefer to watch a video for a recipe over one where they can read the ingredients and instructions, especially since it isn’t clear that clicking on a video recipe will also bring up that information?

While the AllRecipes videos seem to come up for multiple searches, I first noticed non-AllRecipes videos earlier this month.  And as of this morning, Google is showing many more non-AllRecipes video results in the mobile carousels.




from The SEM Post http://ift.tt/1MVKI5B
http://ift.tt/1q5jZ1O via IFTTT

Advertisers Can Opt Out of Any Bing Ads Annotation

Not happy with how some Bing Ads annotations appear with your ads in the search results? Or concerned about annotation overload? Bing now supports the ability for advertisers to opt out of any of the annotations.

If there are specific annotations you don’t want running with your ads, whether because you are uncomfortable with them for any reason or because you don’t feel they add value or fit your campaign strategy, you just need to contact your Bing Ads rep/team or Bing Ads customer support to turn them off.

You can find the annotation types here, although not all of them are included on the list, and annotations are often run on a test basis as well. It is not clear when Bing Ads began allowing individual opt-outs, but not even all reps seem to be aware of it.

Previously, Bing Ads advertisers would have to opt out of all the annotations if they didn’t want a particular one used in their ads. Normally, Bing will show various annotations if it feels they improve the quality or performance of the ads.

 




from The SEM Post http://ift.tt/1PHqsEO
http://ift.tt/1UuzsVE via IFTTT

Accelerated Mobile Pages (AMP) – What You Need to Know


Google’s Accelerated Mobile Pages, or AMP, is one of the hot topics SEOs have to face this year. AMP allows faster page load times even in areas with slow internet access. But this comes at a price: currently, the feature set AMP offers is limited and mainly suited to news-related websites. Nevertheless, to stay competitive, website owners should keep track of its development and check if and when they can build an AMP version of their website.

What is AMP?

AMP consists of a reduced set of HTML and JavaScript. By allowing only a predefined selection of commands and tags, websites running on AMP are supposed to be leaner and faster. Additionally, AMP uses a Content Delivery Network (CDN) provided by Google: static content like CSS, images and JavaScript is cached there and transferred via HTTP/2 to allow faster page load times.

Styles on AMP pages must be inline, and the CSS must not exceed 50 KB. The width and height of images and other page elements have to be declared in advance so that pages can be rendered faster.
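As a rough illustration of those constraints, here is a minimal Python sketch (standard library only; the URL is a placeholder) that fetches a page and flags oversized inline CSS or amp-img tags without declared dimensions. The parsing is deliberately naive and is no substitute for the official AMP validator.

```python
import re
import urllib.request

CSS_LIMIT_BYTES = 50 * 1024  # AMP caps author CSS at roughly 50 KB


def rough_amp_checks(url):
    """Very rough checks for two AMP constraints: inline CSS size and
    explicit image dimensions. Not a substitute for the AMP validator."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    problems = []

    # Author styles on an AMP page live in an inline <style amp-custom> block.
    styles = re.findall(r"<style[^>]*amp-custom[^>]*>(.*?)</style>", html, re.S)
    css_bytes = sum(len(s.encode("utf-8")) for s in styles)
    if css_bytes > CSS_LIMIT_BYTES:
        problems.append("inline CSS is %d bytes (limit ~%d)" % (css_bytes, CSS_LIMIT_BYTES))

    # Every <amp-img> is expected to declare its width and height up front.
    for tag in re.findall(r"<amp-img\b[^>]*>", html):
        if "width=" not in tag or "height=" not in tag:
            problems.append("amp-img without explicit dimensions: " + tag[:60])

    return problems or ["no obvious issues found"]


if __name__ == "__main__":
    # Placeholder URL; point this at one of your own AMP pages.
    for issue in rough_amp_checks("https://example.com/amp/article.html"):
        print(issue)
```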

AMP is open source and still in development by a large community. Google is supporting the AMP project and provides the Content Delivery Network to cache static content of the pages.

What are the advantages of AMP?

As mentioned above, the primary goal behind AMP is to accelerate page load times, especially for mobile pages. The growing number of mobile users and the still patchy coverage of fast internet access around the world are good arguments for implementing AMP.

AMP is suitable mainly for static, information-based websites like blogs or news sites, whereas it is much more difficult to build transactional pages or web applications with it. This is why most existing AMP sites are currently news websites; examples include the Daily News, the BBC, Frankfurter Allgemeine, The Guardian and The New York Times.

Analytics can be implemented easily on AMP websites. Standard tags support a range of analytics providers, and if more than one provider is used, website events only need to be defined once. For simple events, an amp-pixel tag can be used; for more complex events, there is the amp-analytics tag.
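To give a sense of the shape of that configuration, here is a hypothetical sketch (written in Python only to print the JSON) of the kind of config an amp-analytics element carries. The endpoint URL and event names are placeholders, and each analytics vendor documents its own settings.

```python
import json

# Illustrative amp-analytics configuration: one pageview request that fires
# when the page becomes visible. The collection endpoint is a placeholder.
config = {
    "requests": {
        "pageview": "https://analytics.example.com/collect?path=${canonicalPath}"
    },
    "triggers": {
        "trackPageview": {"on": "visible", "request": "pageview"}
    },
}

# This JSON would sit inside a <script type="application/json"> child of
# the <amp-analytics> element on the AMP page.
print(json.dumps(config, indent=2))
```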

Another advantage of AMP sites is the chance to appear in Google’s news carousel. For the past few days, Google has been showing selected websites in this prominent position together with an AMP label – provided they use structured data for articles.
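The article markup in question is schema.org Article or NewsArticle structured data embedded in the page as JSON-LD. Below is a minimal, hypothetical example (the headline, dates, author and publisher are placeholders, and Google's exact requirements may change), generated with Python simply to keep the JSON valid:

```python
import json

# Placeholder article data; in practice this would come from your CMS.
article = {
    "@context": "http://schema.org",
    "@type": "NewsArticle",
    "headline": "Accelerated Mobile Pages (AMP) - What You Need to Know",
    "image": ["https://example.com/images/amp-guide.jpg"],
    "datePublished": "2016-03-31T08:00:00+00:00",
    "dateModified": "2016-03-31T09:30:00+00:00",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "publisher": {
        "@type": "Organization",
        "name": "Example Publisher",
        "logo": {"@type": "ImageObject", "url": "https://example.com/logo.png"},
    },
}

# Embed the output inside a <script type="application/ld+json"> tag on the page.
print(json.dumps(article, indent=2))
```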

Are there any disadvantages of AMP?

In addition to its limited field of application (a focus on static, news-related websites), AMP has some further drawbacks. First of all, there is only limited support for ad networks. Although Google has now integrated more than 20 ad networks that can be used on AMP websites, other online ad providers are currently excluded.

The reason for this is that ads are one of the main factors that slow down a website. Google demands fast, asynchronous ads that play well with the rest of the AMP page. Not all existing ad networks meet these requirements yet. Among the already supported ad networks are AOL, AdSense (Google), DoubleClick (Google), Kargo, Moat, OpenX and Outbrain.

A second disadvantage concerns one of the main features of AMP: the limited instruction set. As a webmaster, you rely on the JavaScript commands and HTML tags AMP supports. This means that many functions that can normally be implemented on a website by standard JavaScript are not available with AMP. This is especially true for building app-like websites with complex user interactions.

As a third drawback, many see AMP’s dependency on Google as potentially problematic. Although Google emphasizes the project’s open source nature and the large development community working on AMP, it is clear that the project is run in Google’s interest, and Google controls the direction in which it is heading. Critics fear the end of independent web standards and the rise of company-driven directives on the web.

Does AMP fit every website?

Adoption for existing websites is one of the biggest problems AMP currently faces. Using AMP normally means maintaining two different versions of a website: the canonical HTML version and the AMP version. Fortunately, some CMSs offer support for creating AMP pages automatically.
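The two versions reference each other: the canonical page carries a link rel="amphtml" tag pointing at the AMP version, and the AMP page links back with rel="canonical". Here is a rough sketch of checking that pairing (the URL is a placeholder and the regex parsing is deliberately naive):

```python
import re
import urllib.request


def find_link_href(html, rel):
    """Return the href of the first <link rel="..."> tag found, or None."""
    pattern = r'<link[^>]*rel=["\']%s["\'][^>]*href=["\']([^"\']+)["\']' % re.escape(rel)
    match = re.search(pattern, html, re.I)
    return match.group(1) if match else None


def check_amp_pairing(canonical_url):
    canonical_html = urllib.request.urlopen(canonical_url).read().decode("utf-8", "replace")
    amp_url = find_link_href(canonical_html, "amphtml")
    if not amp_url:
        return "canonical page does not reference an AMP version"

    amp_html = urllib.request.urlopen(amp_url).read().decode("utf-8", "replace")
    back_ref = find_link_href(amp_html, "canonical")
    if back_ref != canonical_url:
        return "AMP page points back to %r, expected %r" % (back_ref, canonical_url)
    return "canonical and AMP versions reference each other"


if __name__ == "__main__":
    print(check_amp_pairing("https://example.com/article.html"))
```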

If you use a CMS like WordPress, there are AMP extensions which seem to work for most WordPress sites, although you need to make sure you are running the latest version of the plugin. Joomla doesn’t have official AMP support yet; there is a third-party plugin available, but it requires payment if you want more than the bare-bones version. Drupal has recently updated its own free module for AMP support.

If you are not a developer and have no in-depth knowledge of AMP, you will have to either wait for a proper plugin or hire a developer to do the necessary work for you.

It is in Google’s interest to help the makers of the biggest content management systems develop suitable plugins as fast as possible.

In addition, transactional websites like online shops will need AMP support in the medium to long term. It remains to be seen whether Google will cater to such websites as well.

Do AMP sites rank better?

Using AMP is not a ranking factor in itself – but page speed is. AMP normally means faster page load times, so indirectly it does influence rankings. Additionally, AMP only works via HTTPS, which is also a ranking factor. The combination of these technical improvements helps websites earn more appreciation from Google.

When is the best time to start with AMP?

Since AMP launched in February, more and more sites have been adding AMP support. While AMP results currently only display in the news carousel on mobile, blog entries can also appear in those carousels – something many site owners are unaware of.

Although AMP is only in the early phases of its rollout and will undergo many changes, you need time to experiment and test. This is especially true for big websites in competitive environments.

Owners of small or mid-size websites can wait for suitable, properly working extensions for the CMS they use. If you are using a proprietary system, you should think about switching to WordPress, Joomla, TYPO3 or the like. Depending on your industry or market, there may be no need to hurry: as long as the share of AMP websites is relatively low, you don’t need to fear being left behind. That said, adoption is already fairly high, especially in competitive news markets.

What can I do to keep myself updated about AMP?

If you want to get the latest news on AMP, check the AMP project’s website on a regular basis. Additionally, Google News Lab offers hangouts about AMP; people interested in the topic can submit questions in advance and get detailed answers during the hangout. More recently, Google added AMP support to its own Webmaster Help Forums, with dedicated AMP support employees responding to questions and issues.

You should also check AMP‘s GitHub page where you can find the latest documentation about the project.

What’s next?

As it did with mobile-friendliness last year, Google is pushing AMP hard. Sooner or later you will have to implement an AMP version to stay competitive, especially if you run a news-related website. As an alternative, you can look for other ways to make your website faster; one possibility is deploying a CDN to speed up the delivery of static website content.

It remains to be seen whether Google will extend AMP’s feature set to support transaction-based websites like online shops as well.

But we do know that Google plans to expand AMP to more countries as well as to other parts of the search results, with the possibility of it coming to AdWords as well.




from The SEM Post http://ift.tt/1TkDPkQ
http://ift.tt/1RN0elm via IFTTT

How are businesses using Google Posts?

Google has recently debuted a new feature giving individuals and organisations a new platform for communicating with the wider public.

Dubbed ‘Google Posts’ by most commentators – although Google has indicated that the feature does not have an official name yet – the new platform appears as a carousel of ‘card’ style updates within a search engine results page.

The feature was originally introduced as part of the U.S. presidential election campaign, as a way for candidates to deliver a personal message to the public via Google. Later on, some sharp-eyed searchers noticed that Posts had been rolled out to a very limited number of local businesses in the US, and was being given a prominent billing on search results pages.

Google has been fairly quiet about the new feature’s existence so far. Search Engine Land managed to confirm with the company that this was an official test, which has been rolled out to a “few dozen” local businesses.

Of these, only three seem to be widely known about: a day spa and massage therapist, a comic book store, and a jeweller’s specialising in engagement and wedding rings. So how are these lucky few making use of Google’s newest innovation, and what can we take away for when the feature is rolled out more widely in future?

Andrews Jewelers

Andrews Jewelers was the first business to be spotted using the new feature, by local search expert Mike Blumenthal as he searched for engagement rings in Buffalo, New York.

Searchers who enter the right keywords on Google.com are presented with a miniature carousel of the most recent few posts by the business in the search results page. Each has a time stamp, and a share button which allows the posts to be republished on social media platforms like Twitter, Facebook and (in weirdly meta fashion) Google+.

A screenshot showing the search results page for "engagement rings Buffalo". The fourth result down is Andrews Jewelers, below which is a carousel of shareable 'cards' containing one or more images and a snippet of text. The title above it reads 'Andrews Jewelers on Google'.

Although the posts are showcased prominently in search results, businesses using Google Posts don’t seem to be artificially boosted to the top of the results page. Rather, the Posts carousel will appear directly below the business’s highest entry on the search results page, whether it be the top search result or the fourth.

Clicking on the company’s logo takes you to a feed of published posts by that company which is presented on, in Blumenthal’s words, “a slimmed-down Plus like page”. The resemblance to Google+ (and the fact that well, it is a Google creation) has caused many to speculate that the new feature could eventually be phased in as a replacement for Google+, which has noticeably been stripped out of branded searches as of late.

As a retail business, Andrews Jewelers has made use of the strong visual element of Google Posts to show off its products, with eye-catching images of their diamond rings and custom designs.

A Google Posts update from Andrews Jewelers, showing an ornate silver ring from three different angles. The text relates that Andrews Jewelers has just finished this custom designed Masonic ring for a customer, which will be printed on a 3D printer and then cast in platinum.

Andrews Jewelers’ updates on Posts are slightly more evergreen compared with something like its Twitter feed, promoting longer-lasting content like a diamond-buying guide, a post on the importance of prong maintenance, and current trends and styles in the jewellery world.

The business is good at linking up its various different channels, drawing attention to five-star Google reviews and encouraging readers to visit its Google+ Page.

Most interestingly, company founder Andy Moquin published a longer piece to his business’s Posts page, addressing an “ongoing debate in the jewelry industry” about gemological laboratories and diamond grading.

I’m on the fence about whether Google Posts makes the best platform for this kind of piece. The no-frills platform interface makes text posts very readable, but without an image, a text post is only given a few lines of preview in the main feed, and is very easily overlooked between the more attention-grabbing visual posts.

A screenshot showing two large and prominent Google Posts updates with eye-catching images, and in between them, an almost overlooked snippet of text. I almost missed the text post here when scrolling through the feed.

Then again, when you’re one of the first businesses ever to make use of a new Google platform, anything is a good idea from a visibility standpoint. It’s cool to see businesses experimenting with different types of post on the new platform, and hopefully there’ll be plenty of room to find out what works when the feature is rolled out more widely.

Escape Pod Comics

Another local New York business, Escape Pod Comics in Huntington, NY, has also been spotted using Google Posts.

Escape Pod has a good mix of posts going on in its feed, using images, video, GIFs and text posts to promote the business, spotlight individual artists and highlight upcoming events. As a comic book retailer, it uses visual posts to great advantage, using GIFs to showcase a creator’s distinctive style ahead of a signing, or to show off products within the store every Wednesday (the day new comic books are released).

A Google Posts update showing a photograph of comic books on shelves in Escape Pod Comics, with a text explanation that every Wednesday in the store is New Comic Book Day, and these are some of the products that the store has on its shelves this NCBD.

The page features some cross-promotion of Escape Pod’s blog, as well as a post which makes use of a screenshot from the store’s Instagram. With posts dating back to 29th February, Escape Pod Comics is also our earliest adopter of the three Google Posts businesses, as the other two feeds both date back to 1st March.

I found it entertaining that, when searching for Escape Pod Comics on Google.com (Google Posts currently only appear in search results for Google.com and not any of the localised Googles), Google has already begun to make associations between searches for the different businesses who are using Google Posts.

A screenshot of a search for "escape pod comics". In the drop-down box of search suggestions below it, the top suggestion is "A healthy choice spa", one of the other businesses known to be testing Google Posts.

A Healthy Choice Spa

Our third Google Posts business is based in Lincoln, Nebraska, which promptly did away with my theory that the businesses Google has chosen to debut Posts might share some geographical link.

A Healthy Choice is our most frequent updater on Posts, often publishing two or three posts per day. The updates tend to be short and simple, containing just a single line of text, an image or GIF, and a link.

At this stage, it’s impossible to say whether updating more frequently on Google Posts could be an advantage, a disadvantage, or not really make a difference. In a situation where most or all businesses on Google have their own Posts feed, it’s conceivable that Google could boost the more active publishers higher up the search results, but this could also turn out not to be a factor at all.

Currently, the most it affects is the number of recent posts shown in the ‘carousel’ on the search results page, which seems to have a cut-off period of about one week: the more posts published in the past week, the more will be displayed in the carousel.

A screenshot of the ‘carousel‘ of recent posts published by A Healthy Choice Spa, each featuring an image and a simple text description such as "Find your happy place" with a link to the spa‘s website. The earliest is time stamped 6 days ago.

As well as updates promoting its business and the health benefits of massages, A Healthy Choice uses its feed to support and promote local events, history and causes. This isn’t unusual for social media, but when publishing to a platform which feeds directly into search results pages, I wonder if it’s such a good idea.

On the one hand, it looks good for a business to be seen promoting its local area, and building trust and respect with the nearby community is always important for local businesses.

On the other hand, with Posts from businesses appearing directly within search results, businesses might have to put on a search engine ‘hat’ and consider how to deliver the most useful information to searchers who are looking up their business on Google.

I can see it going either way, but as with most things, it will be up to businesses to refine what works when Google Posts is rolled out on a larger scale.

A Google Posts update from A Healthy Choice Spa, showing a conductor frozen in the middle of conducting an orchestra. Underneath, the text reads, "Supporting music for all our community", with a link.

What’s next for Posts?

It’s difficult to speculate too much about what Google plans for Posts, given that Google itself has kept so quiet about the whole project. None of the businesses taking part in the initial, experimental testing stages seems to have made an announcement about being approached or selected, leaving searchers to stumble across these early users by accident.

The homepage for Posts on Google definitely implies a wider implementation of the platform, calling it an “experimental new podium” which will allow people to “hear directly from the people and organisations [they] care about on Google.”

“Verified individuals and organisations can now communicate with text, images and videos directly on Google,” it proclaims. Anyone who is a “public figure or organisation” who would like to publish on Google can join the waiting list, though noticeably, the form doesn’t require anyone to specify why they are significant or even what organisation they represent in order to sign up.

A screenshot of the waitlist form from Google Posts. The form only has three fields: Name, Email and Additional Notes. The first two fields are marked by an asterisk as being compulsory; the third is not.

Based on what we’ve seen of Posts so far, Google’s new feature seems like a halfway house between a new social media outlet and a publishing platform. Certainly, the early adopters seem to be using it that way.

But I think there’s potential here for Posts to become something completely new altogether. As I mentioned before, the fact that Posts are published directly into search results means that publishers will have to bear searchers in mind as their audience, both in their presentation and in the information they provide.

The key thing setting Google Posts apart from social media (and blogging platforms) is the lack of interactivity. You can share posts, but not otherwise comment on or interact with them. That could always change in future, but I think it’s a statement of intent as to where Google is going with this feature.

Speculation about replacing Plus aside, I don’t think Google is setting out to create a self-contained social network with its own ecosystem, but something that extends more seamlessly from existing Google search. It’s not just another social network or another publishing platform – both of which Google already owns.

But combining elements of both with the huge ‘audience’ that Google (as the world’s most popular search engine) commands makes using Posts a very attractive prospect indeed, at least from a business perspective.

I think the next big consideration will be whether users find it beneficial, or whether it will be seen as just another level of clutter trying to draw their attention away from the information they’re searching for.



from SEO – Search Engine Watch http://ift.tt/1UW9hau
via IFTTT

Wednesday 30 March 2016

Here’s How to Supercharge Your Competitive Research Using a URL Profiler and Fusion Tables

Posted by Craig_Bradshaw

[Estimated read time: 19 minutes]

As digital marketers, the amount of data that we have to collect, process, and analyze is overwhelming. This is never more true than when we're looking into what competitors are doing from a link building perspective.

Thankfully, there are a few things we can do to make this job a little bit easier. In this post, I want to share with you the processes I use to supercharge my analysis of competitor backlinks. In this post, you'll learn:

  • How to use URL Profiler for bulk data collection
  • How to use fusion graphs to create powerful data visualizations
  • How to build an SEO profile of the competition using URL Profiler and fusion tables

Use URL Profiler for bulk data collection

Working agency-side, one of the first things I do for every new client is build a profile of their main competitors, including those who have a shared trading profile, as well as those in their top target categories.

The reason we do this is that it provides a top-level overview of the industry and how competitive it actually is. This allows us to pick our battles and prioritize the strategies that will help move the right needles. Most importantly, it’s a scalable, repeatable process for building links.

This isn't just useful for agencies. If you work in-house, you more than likely want to watch your competitors like a hawk in order to see what they're doing over the course of months and years.

In order to do this, you’re inevitably going to need to pull together a lot of data. You’ll probably have to use a range of many different tools and data points.

As it turns out, this sort of activity is where URL Profiler becomes very handy.

For those of you who are unfamiliar with URL Profiler, it's a bulk data tool that allows you to collect link and domain data from thousands of URLs all at once. As you can probably imagine, this makes it an extremely powerful tool for link prospecting and research.

URL Profiler is a brilliant tool built for SEOs, by SEOs. Since every SEO I know seems to love working with Excel, the output you get from URL Profiler is, inevitably, most handy in spreadsheet format.

Once you have all this amazing bulk data, you still need to be able to interpret it and drive actionable insights for yourself and your clients.

To paraphrase the great philosopher Ben Parker: with great data power comes great tedium. I’ll be the first to admit that data can be extremely boring at times. Don’t get me wrong: I love a good spreadsheet as much as I love good coffee (more on that later); but wherever possible, I’d much rather just have something give me the actionable insights I need.

This is where the power of data visualization comes into play.

Use fusion tables for powerful data visualization

Have you ever manually analyzed one million articles to see what impact content format and length have on shares and links? Have you ever manually checked the backlink profile of a domain that has over half a million links? Have you ever manually investigated the breakdown of clicks and impressions your site gets across devices? Didn’t think so.

Thanks to Buzzsumo & Moz, Majestic, Ahrefs, and the Google Search Console, we don’t have to; we just use the information they give us to drive our strategy and decision-making.

The reason these tools are so popular is they allow you to input your data and discern actionable insights. Unfortunately, as already mentioned, we can’t easily get any actionable insights from URL Profiler. This is where fusion tables become invaluable.

If you aren’t already familiar with fusion tables, then the time has come for you to get acquainted with them.

Back in 2012, Google rolled out an “experimental” version of their fusion tables web application. They did this to help you get more from your data and tell the story of what’s going on in your niche with less effort. It’s best to think of fusion tables as Google’s answer to big data.

There are plenty of examples of how people are using fusion tables to tell their stories with data. However, for the purpose of brevity, I only want to focus on one incredibly awesome feature of fusion tables — the network graph.



If fusion tables are Google’s answer to big data, then the network graph feature is definitely Google’s answer to Cerebro from X-Men.

I won’t go into too many details about what network graphs are (you can read more about them here), as I would much rather talk about their practical applications for competitive analysis.

Note: There is a fascinating post on The Moz Blog by Kelsey Libert about effective influencer marketing that uses network graphs to illustrate relationships. You should definitely check that post out.

I’d been using URL Profiler and fusion tables tools in isolation of each other for quite a while — and they each worked very well — before I figured out how to combine their strengths. The result is a process that combines the pure data collection power of URL Profiler with the actionable insights that fusion graphs provide.

I've outlined my process below. Hopefully, it will allow you to do something similar yourself.

Build a competitive SEO profile with URL Profiler and fusion tables

To make this process easier to follow, we'll pretend we're entering the caffeinated, yet delicious space of online coffee subscriptions. (I've chosen to use this particular niche in our example for no reason other than the fact that I love coffee.) Let’s call our hypothetical online coffee subscription company "Grindhaus."

Step 1: Assess your competition

We’ll start by looking at the single keyword "buy coffee online." A Google search (UK) gives us the top 10 that we’ll need to crack if we want to see any kind of organic progress. The first few results look like this:

[Screenshot: the top Google UK results for "buy coffee online"]

Step 2: Gather your data

However, we’ve already said that we want to scale up our analysis, and we want to see a large cross-section of the key competitors in our industry. Thankfully, there’s another free tool that comes in handy for this. The folks over at URL Profiler offer a number of free tools for Internet marketers, one of which is called the SERP Scraper. No prizes for guessing what it does: add in all the main categories and keywords you want to target and hit scrape.

[Screenshot: the URL Profiler SERP Scraper settings]

As you can see from the image above, you can do this for a specific keyword or set of keywords. You can also select which country-specific results you want to pull, as well as the total number of results you want for each query.

It should only take a minute or so to get the results of the scrape in a spreadsheet that looks something like this:

[Screenshot: the scraped search results in a spreadsheet]

In theory, these are the competitors we'll need to benchmark against in order for Grindhaus to see any sort of organic progress.

From here, we'll need to gather the backlink profiles for the companies listed in the spreadsheet one at a time. I prefer to use Majestic, but you can use any backlink crawling tool you like. You'll also need to do the same for your own domain, which will make it easier to see the domains you already have links from when it's time to perform your analysis.

After this is done, you will have a file for your own domain, as well as a file for each one of the competitors you want to investigate. I recommend investigating a minimum of five competitors in order to obtain a data set large enough to obtain useful insights from.

Next, what we need to do is clean up the data so that we have all the competitor link data in one big CSV file. I organize my data using a simple two-column format, as follows:

  • The first column contains the competitor being linked to. I've given this column the imaginative heading "Competitor."
  • The second column contains the domains that are linking to your competitors. I've labeled this column "URL" because this is the column header the URL Profiler tool recognizes as the column to pull metrics from.

Once you have done this, you should have a huge list of the referring domains for your competitors that looks something like this:

[Screenshot: the combined two-column list of competitors and their referring domains]
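If you would rather script this cleanup than do it by hand, a short sketch along the following lines collapses a folder of backlink exports into that two-column format. The folder layout, file naming and the referring-domain column header are assumptions, so adjust them to match your own backlink tool's export:

```python
import csv
import glob
import os

EXPORT_FOLDER = "exports"           # assumed: one export per competitor, e.g. exports/competitor-one.csv
DOMAIN_COLUMN = "Referring Domain"  # assumed header; match it to your backlink tool's CSV

with open("competitor_links.csv", "w", newline="") as out_file:
    writer = csv.writer(out_file)
    writer.writerow(["Competitor", "URL"])  # "URL" is the header URL Profiler looks for

    for path in glob.glob(os.path.join(EXPORT_FOLDER, "*.csv")):
        # Use the export's file name as the competitor label.
        competitor = os.path.splitext(os.path.basename(path))[0]
        with open(path, newline="") as in_file:
            for row in csv.DictReader(in_file):
                writer.writerow([competitor, row[DOMAIN_COLUMN]])
```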

This is where the fun begins.

Step 3: Gather even more data

Next, let's take every domain that is linking to one, some, or all of your competitors and run the full list through URL Profiler. Doing this will pull back all the metrics we want to see.

It's worth noting that you don’t need any additional paid tools or APIs to use URL Profiler, but you will have to set up a couple of API keys. I won’t go into detail here on how to do this, as there are already plenty of resources readily available explaining it, including here and here.

One of the added benefits of doing this through URL Profiler is that you can use its "Import and Merge" feature to append metrics to an existing CSV. Otherwise, you would have to do this by using some real Excel wizardry or by tediously copying and pasting extreme amounts of data to and from your clipboard.
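If you do end up appending the metrics yourself rather than using Import and Merge, pandas keeps the join to a few lines. The file and column names here are assumptions; the only requirement is a shared URL column:

```python
import pandas as pd

links = pd.read_csv("competitor_links.csv")       # columns: Competitor, URL
metrics = pd.read_csv("url_profiler_output.csv")  # URL plus the collected metrics

# A left join keeps every competitor/domain pair, even if a domain returned no metrics.
combined = links.merge(metrics, on="URL", how="left")
combined.to_csv("combined_competitor_data.csv", index=False)
```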

As I’ve already mentioned, URL Profiler allows me to extract both page-level and domain-level data. However, in this case, the domain metrics are what I’m really interested in, so we'll only examine these in detail here.

Majestic, Moz, and Ahrefs metrics

Typically, SEOs will pledge allegiance to one of these three big tools of the trade: Majestic, Moz, or Ahrefs. Thankfully, with URL Profiler, you can collect data from any or all of these tools. All you need to do is tick the corresponding boxes in the Domain Level Data selection area, as shown below.

[Screenshot: the Domain Level Data selection area in URL Profiler]

In most cases, the basic metrics for each of the tools will suffice. However, we also want to be able to assess the relevance of a potential link, so we'll also need Topical Trust Flow data from Majestic. To turn this on, go to Settings > Link Metrics using the top navigation and tick the “Include Topical Trust Flow metrics” box under the Majestic SEO option.

Doing this will allow us to see the three main topics of the links back to a particular domain. The first topic and its corresponding score will give us the clearest indication of what type of links are pointing back to the domain we're looking at.

In the case of our Grindhaus example, we'll most likely be looking for sites that scored highly in the “Recreation/Food” category. The reason we want to do this is because relevance is a key factor in link quality. If we're selling coffee, then links from health and fitness sites would be useful, relevant, and (more likely to be) natural. Links from engineering sites, on the other hand, would be pretty irrelevant, and would probably look unnatural if assessed by a Google quality rater.

Social data

Although the importance of social signals in SEO is heavily disputed, it's commonly agreed that social signals can give you a good idea of how popular a site is. Collecting this sort of information will help us to identify sites with a large social presence, which in theory will help to increase the reach of our brand and our content. In contrast, we can also use this information to filter out sites with a lack of social presence, as they're likely to be of low quality.

Social Shares

Ticking "Social Shares" will bring back social share counts for the site’s homepage. Specifically, it will give you the number of Facebook likes, Facebook shares, Facebook comments, Google plus-ones, LinkedIn shares, and Pinterest pins.

Social Accounts

Selecting "Social Accounts" will return the social profile URLs of any accounts that are linked via the domain. This will return data across the following social networks: Twitter, Google Plus, Facebook, LinkedIn, Pinterest, YouTube, and Instagram.

Traffic

In the same way that sites with strong social signals give us an indication of their relative popularity, the same can also be said for sites that have strong levels of organic traffic. Unfortunately, without having direct access to a domain’s actual traffic figures, the best we can do is use estimated traffic.

This is where the "SEMrush Rank" option comes into play, as this will give us SEMrush's estimation of organic traffic to any given domain, as well as the number of keywords it ranks for organically. It also gives us AdWords data, but that isn’t particularly useful for this exercise.

It's worth mentioning one more time that this is an estimate of organic traffic, not an actual figure. But it can give you a rough sense of relative traffic between the sites included in your research. Rand conducted an empirical study on traffic prediction accuracy back in June — well worth a read, in my opinion.

Indexation

One final thing we may want to look at is whether or not a domain is indexed by Google. If it hasn’t been indexed, then it's likely that Google has deindexed the site, suggesting that they don't trust that particular domain. The use of proxies for this feature is recommended, as it automatically queries Google in bulk, and Google is not particularly thrilled when you do this!

After you’ve selected all the metrics you want to collect for your list of URLs, hit "Run Profiler" and go make yourself a coffee while it runs. (I’d personally go with a nice flat white or a cortado.)

For particularly large lists of URLs, it can sometimes take a while, so it would probably be best to collect the data a day or two in advance of when you plan to do the analysis. For the example in this post, it took around three hours to pull back data for over 10,000 URLs, but I could have it running in the background while working on other things.

Step 4: Clean up your data

One of the downsides of collecting all of this delicious data is that there are invariably going to be columns we won’t need. Therefore, once you have your data, it's best to clean it up, as there's a limit on the number of columns you can have in a fusion table.

You'll only need the combined results tab from your URL Profiler output. So you can delete the results tab, which will allow you to re-save your file in CSV format.

Step 5: Create your new fusion table

Head on over to Google Drive, and then click New > More > Google Fusion Tables.

If you can’t see the "Google Fusion Tables" option, you'll have to select the "Connect More Apps" option and install Fusion Tables from there.

From here, it’s pretty straightforward. Simply upload your CSV file and you'll then be given a preview of what your table will look like.

Click "Next" and all your data should be imported into a new table faster than you can say "caffeine."

Step 6: Create a network graph

Once you have your massive table of data, you can create your network graph by clicking on the small red "+" sign next to the "Cards" tab at the top of your table. Choose "Add Chart" and you'll be presented with a range of chart options. The one we’re interested in is the network graph option:

[Screenshot: the network graph option in the chart picker]

Once you’ve selected this option, you'll then be asked to configure your network graph. We’re primarily interested in the link between our competition and their referring domains.

However, the relationship only goes in one direction: I, the referring website, give you, the retailer, a link. Thus the connection. Therefore, we should tick the "Link is directional" and "Color by columns" options to make it easier to distinguish between the two.

By default, the network graph is weighted by whatever is in the third column — in this case, it's Majestic CitationFlow, so our blue nodes are sized by how high the CitationFlow is for a referring domain. Almost instantly, you can spot the sites that are the most influential based on how many sites link to them.

This is where the real fun begins.

One interesting thing to do with this visualization that will save you a lot of time is to reduce the number of visible nodes. However, there's no science to this, so be careful you're not missing something.

As you increase the number of nodes shown, more and more blue links begin to appear. At around 2,000 nodes, it’ll start to become unresponsive. This is where the filter feature comes in handy, as you can filter out the sites that don’t meet your chosen quality thresholds, such as low Page Authority or a large number of outbound links.

So what does this tell us — other than there appears to be a relatively level playing field, which means there is a low barrier to entry for Grindhaus?

This visualization gives me a very clear picture of where my competition is getting their links from.

[Screenshot: the network graph filtered to referring domains with more than 100,000 social shares]

In the example above, I’ve used a filter to only show referring domains that have more than 100,000 social shares. This leaves me with 137 domains that I know have a strong social following that would definitely help me increase the reach of my content.

You can check out the complete fusion table and network graph here.

Step 7: Find your mutant characteristics

Remember how I compared network graphs to Google’s answer to Cerebro from X-Men? Well, this is where I actually explain what I meant.

For those of you that are unfamiliar with the X-Men universe, Cerebro is a device that amplifies the brainwaves of humans. Most notably, it allows telepaths to distinguish between humans and mutants by finding the presence of the X-gene in a mutant’s body.

Using network graphs, we can specify our own X-gene and use it to quickly find high-quality and relevant link opportunities. For example, we could include sites that have a Domain Authority greater than or equal to 50:

[Screenshot: the network graph filtered to domains with Domain Authority of 50 or more]

For Grindhaus, this filter finds 242 relevant nodes (from a total of 10,740 nodes). In theory, these are domains Google would potentially see as being more trustworthy and authoritative. Therefore, they should definitely be considered as potential link-building opportunities.

You should be able to see that there are some false positives in here, including Blogspot, Feedburner, and Google. However, these are outweighed by an abundance of extremely authoritative and relevant domains, including Men’s Health, GQ Magazine, and Vogue.co.uk.

Sites that have "Recreation/Food" as their primary Topical Trust Flow Topic:

[Screenshot: the network graph filtered to domains with "Recreation/Food" as their primary Topical Trust Flow Topic]

This filter finds 361 relevant nodes out of a total of 10,740 nodes, which all have "Recreation/Food" as their primary Topical Trust Flow Topic.

Looking at this example in more detail, we see that another cool feature of network graphs is that the nodes that have the most connections are always in the center of the graph. This means you can quickly identify the domains that link to more than one of your competitors, as indicated by the multiple yellow lines. This works in a similar way to Majestic’s "Click Hunter" feature and Moz’s "Link Intersect" tool.

However, you can do this on a much bigger scale, having a wider range of metrics at your fingertips.

[Screenshot: domains linking to multiple competitors, clustered at the center of the graph]

In this case, toomuchcoffee.com, coffeegeek.com, and beanhunter.com would be three domains I would definitely investigate further in order to see how I could get a link from them for my own company.

Sites that are estimated to get over 100,000 organic visits, weighted by social shares:

[Screenshot: the network graph filtered by estimated organic visits and weighted by homepage social shares]

For Grindhaus, this filter finds 174 relevant nodes out of 10,740, which are all estimated to receive more than 100,000 organic visits per month. However, I have also weighted these nodes by "Homepage Total Shares." This allows me to see the sites that have strong social followings and have also been estimated to receive considerable amounts of organic traffic (i.e., "estimorganic" traffic).
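The same thresholds used in these filters can also be replicated outside of Fusion Tables. Here is a quick pandas sketch (the column headers are assumptions based on typical URL Profiler output, so rename them to match your own export):

```python
import pandas as pd

data = pd.read_csv("combined_competitor_data.csv")

# Column names are assumptions; adjust them to whatever URL Profiler exported.
strong_social = data[data["Homepage Total Shares"] > 100000]
authoritative = data[data["Moz Domain Authority"] >= 50]
relevant = data[data["Topical Trust Flow Topic 1"] == "Recreation/Food"]

print(len(strong_social), "domains with more than 100,000 homepage social shares")
print(len(authoritative), "domains with Domain Authority of 50 or more")
print(len(relevant), "domains whose top topic is Recreation/Food")
```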

By quickly looking at this network graph, we can immediately see some authoritative news sites such as The Guardian, the BBC, and the Wall Street Journal near the center, as well as quite a few university sites (as denoted by the .ac.uk TLD).

Using this data, I would potentially look into reaching out to relevant editors and journalists to see if they’re planning on covering National Coffee Week and whether they’d be interested in a quote from Grindhaus on, say, coffee consumption trends.

For the university sites, I’d look at reaching out with a discount code to undergraduate students, or perhaps take it a bit more niche by offering samples to coffee societies on campus like this one.

This is barely scratching the surface of what you can do with competitor SEO data in a fusion table. SEOs and link builders will all have their own quality and relevance thresholds, and will also place a particular emphasis on certain variables, such as Domain Authority or total referring domains. This process lets you collect, process, and analyze your data however you see fit, allowing you to quickly find your most relevant sites to target for links.

Step 8: Publish and share your amazing visualization

Now that you have an amazing network graph, you can embed it in a webpage or blog post. You can also send a link by email or IM, which is perfect for sharing with other people in your team, or even for sharing with your clients so you can communicate the story of the work you’re undertaking more easily.

Note: Typically, I recommend repeating this process every three months.

Summary and caveats

Who said that competitive backlink research can't be fun? Aside from being able to collect huge amounts of data using URL Profiler, with network graphs you can also visualize the connections between your data in a simple, interactive map.

Hopefully, I’ve inspired you to go out and replicate this process for your own company or clients. Nothing would fill me with more joy than hearing tales of how this process has added an extra level of depth and scale to your competitive analysis, as well as given you favorable results.

However, I wouldn’t be worth my salt as a strategist if I didn’t end this post with a few caveats:

Caveat 1: Fusion tables are still classed as “experimental," so things won’t always run smoothly. The feature could also disappear altogether overnight, although my fingers (and toes) are crossed that it doesn’t.

Caveat 2: Hundreds of factors go into Google’s ranking algorithm, and this type of link analysis alone does not tell the full story. However, links are still seen as an incredibly important signal, which means that this type of analysis can give you a great foundation to build on.

Caveat 3: To shoehorn one last X-Men analogy in... using Cerebro can be extremely dangerous, and telepaths without well-trained, disciplined minds put themselves at great risk when attempting to use it. The same is true for competitive researchers. However, poor-quality link building won’t result in insanity, coma, permanent brain damage, or even death. The side effects are actually much worse!

In this age of penguins and penalties, links are all too often still treated as a commodity. I’m not saying you should go out and try to get every single link your competitors have. My emphasis is on quality over quantity. This is why I like to thoroughly qualify every single site I may want to try and get a link from. The job of doing competitive backlink research using this method is to assess every possible option and filter out the websites you don’t want links from. Everything that’s left is considered a potential target.

I’m genuinely very interested to hear your ideas on how else network graphs could be used in SEO circles. Please share them in the comments below.





from The Moz Blog http://ift.tt/1Y1vQbD
via IFTTT

Social Media Today