Innovate not imitate!

Interested in the latest Growth hacks?

Welcome to our blog!

We want to help you start/manage and grow your business using innovative strategies and implementation. We have a passion for helping businesses and companies of various sizes see the same success that we have achieved.

Our skillsets are wide and varied, from business strategy, marketing, to online strategy. An increasing number of companies are turning to the internet and online media as a means to maximising their marketing reach and exposure. This is special area of focus for us and we do more than simple SEO strategies.

See our website for more: www.innovatetoaccelerate.com

Friday 29 April 2016

Google Adds New “Instant Content” Feature in Search Results

Google has launched a new feature into beta where Google will publish content directly within their search results… content that is supplied by publishers. The content appears in a new, large section of the search results and can make a person or an organization stand out.

When publishers submit their content, Google is actually hosting this content on its own site, on a posts.google.com URL. The articles are currently limited to 14,400 characters. In addition to the text, up to ten images and/or videos can be included.

Here is how it appears in the search results:

google posts 1

It then leads to an expanded version of the teaser post:

google posts 2

And finally, clicking through takes you to the entire post:

google posts 3

It is a bit unusual that Google includes that middle page, which is essentially a zoomed-in version of what is showing on the search results page. Since Google is about usability and making search fast for its searchers, there could be a hidden purpose to that middle page.

Another unique aspect is that the content is essentially live and active for only 7 days.  After that, while the pages will still be available, they will no longer be served in the search results.

According to a report in WSJ, this feature doesn’t have an official name yet.

“We’re continuing to experiment with the look and feel of this feature, including exploring other potential use cases,” according to a statement from Google. A Google spokeswoman said the feature doesn’t yet have an official name.

Google also confirmed with WSJ that this is not AMP, this is something completely different, since Google is hosting this content rather than requiring the brand to do so.

If you are a verified individual or an organization, you can request access to the beta by joining the waitlist.

The post Google Adds New “Instant Content” Feature in Search Results appeared first on The SEM Post.



from The SEM Post http://ift.tt/26Am8mI
http://ift.tt/1Tj63bu via IFTTT

Google Play Now Showing When Apps Contain Ads

Google has expanded a popular feature to show Android users whether the app they are looking at contains in-app ads. This follows the recent addition of the in-app purchases note that shows up on some Android apps that have in-app purchases available.

A user on Reddit spotted the addition to the apps in the Google Play Store.  He shared the screenshot:

google play ads 2

This is great for users.  Many users get very annoyed when the app they use runs in-app ads, especially if it is a paid app.  So if someone is presented with multiple apps in a search, they might weigh in-app ads as a deciding factor in choosing an app to download.

Could this lead to a drop in installs for apps with ads?  It is definitely a possibility if it turns out that app searchers heavily prefer apps without in-app advertising.  So it could also mean that some app developers choose to remove ads because of it.

It does appear to be in the midst of rolling out, so not everyone has it enabled on their Android device yet.

Google even lists that their own YouTube app contains ads too.  So they aren’t giving themselves a pass when it comes to the fact they have ads.

google play ads youtube

There doesn’t seem to be a way to filter out apps depending on whether they have in-app advertising or not.

The post Google Play Now Showing When Apps Contain Ads appeared first on The SEM Post.



from The SEM Post http://ift.tt/1SOSHX3
http://ift.tt/24oUjM9 via IFTTT

8 Old School SEO Practices That Are No Longer Effective - Whiteboard Friday

Posted by randfish

[Estimated read time: 14 minutes]

Are you guilty of living in the past? Using methods that were once tried-and-true can be alluring, but it can also prove dangerous to your search strategy. In today's Whiteboard Friday, Rand spells out eight old school SEO practices that you should ditch in favor of more effective and modern alternatives.

8 Old School SEO Practices That Are No Longer Effective Whiteboard

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about some old school SEO practices that just don't work anymore and things with which we should replace them.

Let's start with the first one — keywords before clicks.

Look, I get the appeal here. The idea is that we've done a bunch of keyword research, now we're doing keyword targeting, and we can see that it might be important to target multiple keywords on the same page. So FYI, "pipe smoking," "tobacco smoking," "very dangerous for your health," not recommended by me or by Moz, but I thought it was a funny throwback keyword and so there you go. I do enjoy little implements even if I never use them.

So pipes, tobacco pipes, pipe smoking, wooden pipes, this is not going to draw anyone's click. You might think, "But it's good SEO, Rand. It's good to have all my keywords in my title element. I know that's an important part of SEO." Not anymore. It really is not anymore an important . . . well, let's put it this way. It's an important part of SEO, which is subsumed by wanting to draw the clicks. The user is searching, they're looking at the page, and what are they going to think when they see pipes tobacco, pipes, pipe smoking, wooden pipes? They have associations with that — spammy, sketchy, I don't want to click it — and we know, as SEOs, that Google is using click signals to help documents rank over time and to help websites rank over time.

So if they're judging this, you're going to fall in the rankings, versus a title like "Art of Piping: Studying Wooden Pipes for Every Price Range." Now, you're not just playing off the, "Yes, I am including some keywords in there. I have 'wooden' and 'pipes.' I have 'art of piping,' which is maybe my brand name." But I'm worried more about drawing the click, which is why I'm making this part of my message of "for every price range." I'm using the word "stunning" to draw people in. I'm saying, "Our collection is not the largest but the hand-selected best. You'll find unique pipes available nowhere else and always free, fast shipping."

I'm essentially trying to create a message, like I would for an AdWords ad, that is less focused on just having the raw keywords in there and more focused on drawing the click. This is a far more effective approach that we've seen over the last few years. It's probably been a good six or seven years that this has been vastly superior to this other approach.

Second one, heavy use of anchor text on internal links.

This used to be a practice that could have positive impacts on rankings. But what we've seen lately, especially the last few years, is that Google has discounted this and has actually even punished it where they feel like it's inappropriate or spammy, manipulative, overdone. We talked about this a little in our internal and external linking Whiteboard Friday a couple of weeks back.

In this case, my suggestion would be if the internal link is in the navigation, if it's in the footer, if it's in a sidebar, if it's inside content, and it is relevant and well-written and it flows well, has high usability, you're pretty safe. However, if it has low usability, if it looks sketchy or funny, if you're making the font small so as to hide it because it's really for search engines and not for searchers and users, now you're in a sketchy place. You might count on being discounted, penalized, or hurt at some point by Google.

Number three, pages for every keyword variant.

This is an SEO tactic that many folks are still pursuing today and that had been effective for a very long time. So the idea was basically if I have any variation of a keyword, I want a single page to target that because keyword targeting is such a precise art and technical science that I want to have the maximum capacity to target each keyword individually, even if it's only slightly different from another one. This still worked even up to four or five years ago, and in some cases, people were sacrificing usability because they saw it still worked.

Nowadays, Google has gotten so smart with upgrades like Hummingbird, obviously with RankBrain last year, that they've taken to a much more intent- and topic-matching model. So we don't want to do something like have four different pages, like unique hand-carved pipes, hand-carved pipes, hand-carved tobacco pipes, and hand-carved tobacco smoking pipes. By the way, these are all real searches that you'll find in Google Suggest or AdWords. But rather than taking all of these and having a separate page for each, I want one page targeting all of them. I might try and fit these keywords intelligently into the content, the headline, maybe the title, the meta description, those kinds of things. I'm sure I can find a good combination of these. But the intent for each of these searchers is the same, so I only want one page targeting them.

Number four — directories, paid links, etc.

Every single one of these link building, link acquisition techniques that I'm about to mention has either been directly penalized by Google or penalized as part of an update, or we've seen sites get hit hard for doing it. This is dangerous stuff, and you want to stay away from all of these at this point.

Directories, well, generic directories and SEO directories for sure. Article links, especially article blasts where you can push an article in and there's no editorial review. Guest content, depending on the editorial practices, the board might be a little different. Press releases, Google you saw penalized some press release websites. Well, it didn't penalize the press release website. Google said, "You know what? Your links don't count anymore, or we're going to discount them. We're not going to treat them the same."

Comment links, for obvious reasons, reciprocal link pages, those got penalized many years ago. Article spinners. Private link networks. You see "private" and "network," or you see "network," you should just generally run away. Private blog networks. Paid link networks. Fiverr or forum link buys.

You see advertised on all sorts of SEO forums, especially the more aggressive, sketchy ones, that a lot of folks are like, "Hey, for $99, we have this amazing package, and I'll show you all the people whose rankings it's increased, and they come from PageRank six," never mind that PageRank is totally defunct. Or worse, they use Moz. They'll say like, "Domain authority 60-plus websites." You know what, Moz is not perfect. Domain authority is not a perfect representation of the value you're going to get from these things. Anyone who's selling you links on a forum, you should be super skeptical. That's somewhat like someone going up to your house and being like, "Hey, I got this Ferrari in the yard here. You want to buy this?" That's my Jersey coming out.

Social link buys, anything like this, just say no people.

Number five, multiple microsites, separate domains, or separate domains with the same audience or topic target.

So this again used to be a very common SEO practice, where folks would say, "Hey, I'm going to split these up because I can get very micro targeted with my individual websites." They were often keyword-rich domain names like woodenpipes.com, and I've got handmadepipes.net, and I've got pipesofmexico.co versus I just have artofpiping.com, not that "piping" is necessarily the right word. Then it includes all of the content from all of these. The benefit here is that this is going to gain domain authority much faster and much better, and in a far greater fashion than any of these will.

Let's say that it was possible that there is no bias against the exact match domain names folks. We're happy to link to them, and you had just as much success branding each of these and earning links to each of these, and doing content marketing on each of these as you did on this one. But you split up your efforts a third, a third, a third. Guess what would happen? These would rank about a third as well as all the content would on here, which means the content on handmadepipes.net is not benefitting from the links and content on woodenpipes.com, and that sucks. You want to combine your efforts into one domain if you possibly can. This is one of the reasons we also recommend against subdomains and microsites, because putting all of your efforts into one place has the best shot at earning you the most rankings for all of the content you create.

Number six, exact and partial keyword match domain names in general.

It's the case like if I'm a consumer and I'm looking at domain names like woodenpipes.com, handmadepipes.net, uniquepipes.shop, hand-carved-pipes.co, the problem is that over time, over the last 15, 20 years of the Web, those types of domain names that don't sound like real brands, that are not in our memories and don't have positive associations with them, they're going to draw clicks away from you and towards your competitors who sound more credible, more competent, and more branded. For that reason alone, you should avoid them.

It's also that case that we've seen that these types of domains do much more poorly with link earning, with content marketing, with being able to have guest content accepted. People don't trust it. The same is true for public relations and getting press mentions. The press doesn't trust sites like these.

For those reasons, it's just a barrier. Even if you thought, "Hey, there's still keyword benefits to these," which there is a little bit because the anchor text that comes with them, that points to the site always includes the words and phrases you're going after. So there's a little bit of benefit, but it's far overwhelmed by the really frustrating speed bumps and roadblocks that you face when you have a domain like this.

Number seven — Using CPC or AdWords' "Competition" to determine the difficulty of ranking in organic or non-paid results

A lot of folks, when they're doing keyword research, for some reason still have this idea that using cost per click or AdWords' competition scores can help determine the difficulty of ranking in organic, non-paid results. This is totally wrong.

So see right here, I've got "hand-carved pipes" and "unique wooden pipes," and they have an AdWords CPC respectively of $3.80 and $5.50, and they have AdWords competition of medium and medium. That is in no way correlated necessarily with how difficult they'll be to rank for in the organic results. I could find, for example, that "unique wooden pipes" is actually easier or harder than "hand-carved pipes" to rank for in the organic SEO results. This really depends on: Who's in the competition set? What types of links do they have and social mentions do they have? How robust is their content? How much are they exciting visitors and drawing them in and serving them well? That sort of stuff is really hard to calculate here.

I like the keyword difficulty score that Moz uses. Some other tools have their own versions. Doctor Pete, I think, did a wonderful job of putting together a keyword difficulty score that's relatively comprehensive and well-thought through, uses a lot of the metrics about the domain and the page authority scores, and it compensates for a lot of other things, to look at a set of search results and say, "This is probably about how hard it's going to be," and whether it's harder or easier than some other keyword.

Number eight — Unfocused, non-strategic "linkbait"

Last one, some folks are still engaging in this, I think because content strategy, content marketing, and content as a whole has become a very hot topic and a point of investment. Many SEOs still invest in what I call "nonstrategic and unfocused link bait." The idea being if I can draw links to my website, it doesn't really matter if the content doesn't make people very happy or if it doesn't match and gel well with what's on my site. So you see a lot of these types of practices on sites that have nothing to do with it. Like, "Here are seven actors who one time wore too little clothing." That's an extreme example, but you get the idea if you ever look at the bottom ads for a lot of content stuff. It feels like pretty much all of them say that.

Versus on topic link bait or what I'd call high quality content that is likely to draw in links and attention, and create a positive branding association like, "Here's the popularity of pipes, cigarettes, electronic cigarettes, and cigars in the U.S. from 1950 to today." We've got the data over time and we've mapped that out. This is likely to earn a lot of links, press attention. People would check it out. They'd go, "Oh, when was it that electronic cigarettes started getting popular? Have pipes really fallen off? It feels like no one uses them anymore. I don't see them in public. When was that? Why was that? Can I go over time and see that dataset?" It's fundamentally interesting, and data journalism is, obviously, very hot right now.

So with these eight, hopefully you'll be able to switch from some old school SEO techniques that don't work so well to some new ways of thinking that will take your SEO results to a great place. And with that, we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog http://ift.tt/1N5jP4Q
via IFTTT

Thursday 28 April 2016

Google AdWords Expanded Headlines Now Live

Google has a new closed beta for AdWords advertisers that expands the length of titles and descriptions significantly.

In this new beta, descriptions can be up to 80 characters while the headline space is doubled.  Display URLs can also be further customized to show two directories instead of the usual single one.

Himanshu Sharma spotted an example of the expanded ads, which he posted to Twitter.

google expanded ads

With Google removing the sidebar ads, this means that ads are no longer restricted to the column width when it comes to ad copy.  This seems to be the first step they have taken to capitalize on the newly available space.

Ginny Marvin posted more details about the closed beta here, including a since deleted screenshot showing the style.

This is a closed beta, so if you are interested in gaining access, you need to contact your Google AdWords rep to see if it can be made available to you. But I suspect this is something that would be coming out of beta pretty soon.

The post Google AdWords Expanded Headlines Now Live appeared first on The SEM Post.



from The SEM Post http://ift.tt/1SuMAIm
http://ift.tt/1pJrfiU via IFTTT

Google Testing Both Lighter & Darker Blue Links in Search Results

At SMX West earlier this year, Paul Haahr commented that Google recently tested 41 different shades of blue to discover which one was best.  Well, it looks like their experimenting isn’t over, because in the past couple of days searchers have noticed both darker and lighter shades of blue being tested.

Here is the darker version Google is testing, side by side with the standard colors, which JamiePG shared on Twitter.  Notice that the purple color used to show recently clicked links is also darker.

google blue darker 1

 

Here is the lighter blue version that Phil Blackwell shared on Twitter.

google blue lighter 1

As Blackwell notes, this shade of blue is closer to the blue that Bing uses in its own search results.

Google is always testing in the search results, so them testing various shades of blue is not unusual.  But who has managed to spot the other 39 color variations? :)

 

The post Google Testing Both Lighter & Darker Blue Links in Search Results appeared first on The SEM Post.



from The SEM Post http://ift.tt/1rkfdhl
http://ift.tt/1N2kMuK via IFTTT

Google Adding Neighborhood Info in Local Knowledge Panel

Google is now adding neighborhood information to the local knowledge panel for some businesses.  Joy Hawkins was the first to notice the addition.  When neighborhood information is available, Google will show it prominently near the top of the local knowledge panel.

Here is how it appears in the search results:

google neighborhood 1

And here is a closer look at the knowledge panel.

google neighborhood 2

There is more information in Hawkins’ blog post on how to get neighborhood information showing in your local listing, if it isn’t showing up already.

Hawkins also noticed that it isn’t being used in the US, since neighborhoods aren’t an option in Map Maker there. Instead, for US listings, the neighborhood will only show on the Google Local Finder page and not in the knowledge panel that displays in the search results.

The post Google Adding Neighborhood Info in Local Knowledge Panel appeared first on The SEM Post.



from The SEM Post http://ift.tt/1Wssikk
http://ift.tt/1TfLyfQ via IFTTT

The Local SEO Agency’s Complete Guide to Client Discovery and Onboarding

Posted by MiriamEllis

Why proper onboarding matters

Imagine getting three months in on a Local SEO contract before realizing that your client’s storefront is really his cousin’s garage. From which he runs two other “legit” businesses he never mentioned. Or that he neglected to mention the reviews he bought last year. Worse yet, he doesn’t even know that buying reviews is a bad thing.

The story is equally bad if you’re diligently working to build quality unique content around a Chicago client’s business in Wicker Park but then realize their address (and customer base) is actually in neighboring Avondale.

What you don’t know will hurt you. And your clients.

A hallmark of the professional Local SEO department or agency is its dedication to getting off on the right foot with a new client by getting their data beautifully documented for the whole team from the start. At various times throughout the life of the contract, your teammates and staff from complementary departments will be needing to access different aspects of a client’s core NAP, known challenges, company history, and goals.

Having this information clearly recorded in shareable media is the key to both organization and collaboration, as well as being the best preventative measure against costly data-oriented mistakes. Clear and consistent data play vital roles in Local SEO. Information must not only be gathered, but carefully verified with the client.

This article will offer you a working Client Discovery Questionnaire, an Initial Discovery Phone Call Script, and a useful Location Data Spreadsheet that will be easy for any customer to fill out and for you to then use to get those listings up to date. You’re about to take your client discovery process to awesome new heights!

Why agencies don’t always get onboarding right

Lack of a clearly delineated, step-by-step onboarding process increases the potential for human error. Your agency’s Local SEO manager may be having allergies on Monday and simply forget to ask your new client if they have more than one website, if they’ve ever purchased reviews, or if they have direct access to their Google My Business listings. Or they could have that information and forget to share it when they jump to a new agency.

The outcomes of disorganized onboarding can range from minor hassles to disastrous mistakes.

Minor hassles would include having to make a number of follow-up phone calls to fill in holes in a spreadsheet that could have been taken care of in a single outreach. It’s inconvenient for all teammates when they have to scramble for missing data that should have been available at the outset of the project.

Disastrous mistakes can stem from a failure to fully gauge the details and scope of a client’s holdings. Suddenly, a medium-sized project can take on gigantic proportions when the agency learns that the client actually has 10 mini-sites with duplicate content on them, or 10 duplicate GMB listings, or a series of call tracking numbers around the web.

It’s extremely disheartening to discover a mountain of work you didn’t realize would need to be undertaken, and the agency can end up having to put in extra uncompensated time or return to the client to renegotiate the contract. It also leads to client dissatisfaction.

Setting correct client expectations is completely dependent on being able to properly gauge the scope of a project, so that you can provide an appropriate timeline, quote, and projected benchmarks. In Local, that comes down to documenting core business information, identifying past and present problems, and understanding which client goals are achievable. With the right tools and effective communication, your agency will be making a very successful start to what you want to be a very successful project.

Professional client discovery made simple

There’s a lot you want to learn about a new client up front, but asking (and answering) all those questions right away can be grueling. Not to mention information fatigue, which can make your client give shorter and shorter answers when they feel like they’ve spent enough time already. Meanwhile your brain reaches max capacity and you can’t use all that valuable information because you can’t remember it.

To prevent such a disaster, we recommend dividing your Local SEO discovery process into a questionnaire to nail down the basics, a follow-up phone call to help you feel out some trickier issues, and a CSV to gather the location data. And we’ve created templates to get you started...

Client Discovery Questionnaire

Use our Local SEO Client Discovery Questionnaire to understand your client’s history, current organization, and what other consultants they might also be working with. We’ve annotated each question in the Google Doc template to help you understand what you can learn and potential pitfalls to look out for.

If you want to make collecting and preserving your clients’ answers extra easy, use Google Forms to turn that questionnaire into a form like this:

You can even personalize the graphic, questions, and workflow to suit your brand.

Client Discovery Phone Script

Once you’ve received your client’s completed questionnaire and have had time to process the responses and do any necessary due diligence (like using our Check Listings tool to check how aggregators currently display their information), it’s time to follow up on the phone. Use our annotated Local SEO Client Discovery Phone Script to get you started.

local seo client discovery phone script

No form necessary this time, because you’ll be asking the client verbally. Be sure to pay attention to the client’s tone of voice as they answer and refer to the notes under each question to see what you might be in for.

Location Data CSV

Sometimes the hardest part of Local SEO is getting all the location info letter-perfect. Make that easier by having the client input all those details into your copy of the Location Data Spreadsheet.

local seo location data csv

Then use the File menu to download that document as a CSV.

You’ll want to proof this before uploading it to any data aggregators. If you’re working with Moz Local, the next step is an easy upload of your CSV. If you’re working with other services, you can always customize your data collection spreadsheet to meet their standards.

Keep up to date on any business moves or changes in hours by designing a data update form like this one from SEER and periodically reminding your client contact to use it.

Why mutual signals of commitment really matter

There are two sides to every successful client project: one half belongs to the agency and the other to the company it serves. The attention to detail your agency displays via clean, user-friendly forms and good phone sessions will signal your professionalism and commitment to doing quality work. At the same time, the willingness of the client to take the necessary time to fill out these documents and have these conversations signals their commitment to receiving value from their investment.

It’s not unusual for a new client to express some initial surprise when they realize how many questions you're asking them to answer. Past experience may even have led them to expect half-hearted, sloppy work from other SEO agencies. But, what you want to see is a willingness on their part to share everything they can about their company with you so that you can do your best work.

Anecdotally, I’ve fully refunded the down payments of a few incoming clients who claimed they couldn’t take the time to fill out my forms, because I detected in their unwillingness a lack of genuine commitment to success. These companies have, fortunately, been the exception rather than the rule for me, and likely will be for your agency, too.

It’s my hope that, with the right forms and a commitment to having important conversations with incoming clients at the outset, the work you undertake will make your Local team top agency and client heroes!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog http://ift.tt/1qWA4H8
via IFTTT

Wednesday 27 April 2016

Why Blog Comments Are Great for Google SEO… and Users

There has been a significant backlash against comments in the past couple of years, with many sites dispensing with comments completely, and trying to push those conversations to social media instead.  But is this a smart move?

Google has often referred to user generated content as a valuable quality signal, something that can be irreplaceable for some types of sites.  And while comments aren’t the “magic sauce” by any stretch of the imagination, in a time when SEOs are making their sites secure for the teeny, tiny ranking boost (yes, while we know a site being secure is a great thing, it wasn’t until that ranking boost was announced that being secure suddenly became important to SEOs), comments can play a part in a site’s overall quality and subsequent ranking.

Why Sites are Removing Comments

There are multiple reasons sites have given to remove comments… and not surprisingly, a big one is as a move to prevent Google from negatively impacting a site due to low quality comments.  This is the same reason many site owners state when removing other types of user generated content, such as forums or contributor articles.

But many operate under the assumption that user generated content equals low quality.  And while there certainly is a staggering amount of very poor quality UGC out there, there are many, many examples of sites based significantly on user generated content that are doing incredibly well.

Direct Benefits in Google

While we have known many types of user generated content can be a quality signal, it still gets a bad rap.  Gary Illyes from Google had a Twitter discussion during Pubcon last year about how UGC can definitely provide value… take Stack Overflow, for example.  And there is a huge difference between blog spam or crappy guest blog posts and thoughtful, high quality UGC, regardless of whether it takes the form of comments, forum posts or even contributed content.

But comments are definitely on the short end of the stick of late, and the perception that comments are low quality is a big reason.

I also asked Illyes for confirmation specifically about blog comments, since so many sites are removing them.

I then asked for a bit of clarification, specifically if comments could “help” in ways beyond simply the quality content part of the algo equation.

Of course, comments themselves are also a contributor to the good content part too. And being a visible sign of a thriving community earns bonus points for SEOs.

He also clarifies that this is not an ordered list, or “stack rank.”

John Mueller has also talked about comments being a positive signal as well, something we talked about extensively in our Panda Algo Guide.  And he also confirms Google views comments as part of the content.

That’s something where we essentially try to treat these comments as part of your content. So if these comments bring useful information in addition to the content that you’ve provided also on these pages, then that could be a really good addition to your website. It could really increase the value of your website overall. If the comments show that there’s a really engaged community behind there that encourages new users when they go to these pages to also comment, to go back directly to these pages, to recommend these pages to their friends, that could also be a really good thing.

You don’t have to worry about a handful of “great post!” type comments (checking, of course, that they aren’t also linking to a spam site) but if the majority of the comments are very short non-helpful comments, you might consider approving fewer of them.

And in case you are reverse engineering what Illyes said, Google doesn’t have any kind of negative impact on sites if they don’t have “community involvement.”

Content Value

It isn’t unusual that sometimes comments can be even more valuable than the actual content that led to the comment being contributed.  This is especially true in technical market areas.

Take for example this article on a recent Google patent at SEO by the Sea.  While there are some typical “Great post, Bill!” type of comments, there are also a large number of high quality and thoughtful comments with additional insight, differing opinions and related links that people found of interest.  If these were disabled, they would be scattered throughout private Facebook groups, hard-to-find Twitter threads and instant messaging conversations.

Also, comments can be considered high enough quality that Google will skip the main content on the page and pull from the comments for a featured snippet.  And that example is far from the only one, I have seen multiple others, and in most cases the content on the actual main part of the page wouldn’t have been suitable for the featured snippet.  So if any site with comment-based featured snippets decided to get rid of their comments, they would also lose those featured snippets too.

But as with anything, poor quality comments can negatively impact the value of that content.  You definitely need to be careful with what gets approved and what doesn’t.

And even if it only adds a slight increase to your site’s value in Google’s eyes, webmasters spend far more time agonizing over title tags, or even meta tags, than it takes to do some quick comment approvals each day.

Comments & the Panda Algo Connection

Just like great comments can boost a site, poor quality comments can have a detrimental effect on a site, especially where Panda is concerned.  So make sure you are not approving low quality or spammy comments on your site.

More from John Mueller:

When our quality algorithms go to your website, and they see that there’s some good content here on this page, but there’s some really bad or kind of low quality content on the bottom part of the page, then we kind of have to make a judgment call on these pages themselves and say, well, some good, some bad. Is this overwhelmingly bad? Is this overwhelmingly good? Where do we draw the line?

If you are worried you have been impacted by Panda, instead of knee-jerking to remove comments, take a read through the Panda Algo guide, and if you feel you are still suffering from the effects of Panda, go through comments to remove the low quality ones instead of axing comments completely.

Don’t forget that Panda is now on a rolling update, which makes it a bit easier for those who are gradually going through older posts that might have lower quality comments, so there isn’t a panic race to complete it before the next Panda update. That said, it can be a bit more difficult to know if removing low quality content is working, or if other things were impacting rankings.

But if you think your general ranking woes are related to comments, unless you have an abundance of very low quality comments that overwhelm the good quality ones, there are probably going to be other areas worth looking at more closely.  Not all ranking issues are attributed to Panda and Penguin, there are plenty of other ranking signals to be focused on too.

Comments & E-A-T

In the Google Quality Rater Guidelines, raters are asked to consider E-A-T, which is “expertise, authoritativeness and trustworthiness.” (we go into it in depth in our 2014 look at the guidelines, which introduced E-A-T.)

When you remove comments from your site, it becomes harder for visitors to vet the quality of the content. Often comments can reinforce the accuracy of the content in the articles, and raters can use this to try and gauge how trustworthy the content is.  Remove comments and you are losing a pretty strong signal of E-A-T.

But comments can play a role in raters determining if a page does have a high level of E-A-T.  While it won’t necessarily happen on every single article, or even most of them, some comments definitely say things that help a rater ascertain E-A-T on the page or site.

While the quality raters do not have a direct impact on the rankings of the sites they rate, it shows the types of sites Google wants to be ranking, so it is definitely a measured risk to disable comments.

User Engagement

Comments are also great for showing how popular a site is.  While share counts can be easily manipulated, it is much harder to fake legitimate comments on blog posts on a large scale.  So comments are often seen as a quick way to tell how popular a website is.

Comments also bring back users repeatedly.  Especially if the discussion is particularly good or there’s good-natured debate happening, people will check back for new comments.  So sites that remove comments can suddenly lose a ton of extra pageviews.

And even if a person isn’t actually engaging in comments, they too could visit again in the future to see newly added comments on a post they are interested in.

Is It Best For Your Users?

This is a key way to look at it.  If the comments are low quality and spammy, it will probably annoy users to see them.  And this is not the right kind of content to enhance your site in the eyes of Google.

There are often sites that disable comments and face a mini backlash from visitors who really enjoyed reading and writing comments. So make sure you aren’t sacrificing repeat visitors who may end up going to a competitor’s site instead.

Don’t forget that many people also view things like commenting on blog posts and forums as part of their social activity, especially those who might not get outside the house as often as others.  And these are often the ones who are most prolific… and they could be your site’s mini ambassadors too, who recommend your site whenever they can.  So again, are you going to alienate those users by dropping comments?

Google has often said that webmasters should do what’s best for your users, so really consider who removing comments is best for… you or the users?

Harder to Identify Issues

Even on high quality news sites, sometimes inaccurate articles are published, but when there are these inaccuracies, it often becomes quickly apparent in the comments with others saying variations of “uhhhh” and “I think this has some issues.  If you look at…”  But with comments gone, this becomes harder.

They can also be a great heads up for typos or other errors that a quick edit can fix.

How many times have you tweeted at a non-ecommerce site and gotten crickets in response?  If you are encouraging conversation on social media, then pay attention to those social media channels so when one of your visitors is trying to alert you to issues with your content, you actually see it.

Does Moving Comments Onto Social Media Actually Work?

Some felt that if they attempted to move comments off their site, by suggesting talking about posts on social media, it would stimulate more conversation, and hence more shares, on various social media channels.  So potentially those who comment via social media would be doing free promotion for their sites as well.

The downside of this is that many won’t take to social media if the option to leave a comment right on the content itself is unavailable.  Perhaps they only use their Twitter for business and the site in question was something more personal.  Or maybe they don’t want your site connected to their Facebook account just to leave a comment. Perhaps they follow few people and few people follow them… so it is a real question whether anyone would actually see that comment, even if it is extraordinarily insightful.

Sending Users Away

In a time when everyone analyzes bounce rates and tries to keep people on their site, sending users away to comment on an article elsewhere means they might make that comment and never return.  But perhaps if the site didn’t lead them to Twitter or Facebook, they would have read 4 more articles, signed up for a newsletter and purchased something.

While you don’t want to build a fortress around a visitor, you don’t want to usher them out the door either.

Adding or Re-Adding Comments

It can take time for visitors to start commenting, especially if they are familiar with the site being comment-free, or those visitors remember when the comments were shut off.  Encourage comments and respond to the comments as well.  You can even call on friends to help get the ball rolling, because sometimes it takes people seeing others are commenting before they jump in too.

If you are turning back on comments, check and see if your archive of previous comments is still in the database. Do a quick check to ensure they are as high quality as you remember (or do an audit if you are seeing lower quality comments you no longer want Google to see), and start showing them again.  It may take time before Google reindexes all the pages with their comments, especially for older content Google doesn’t crawl as regularly.

Bottom Line

Don’t be so quick to remove your comments because you think they bring down the quality of the site.  While low quality, spammy comments can bring down the quality, with moderation comments can increase the value of your site in Google’s eyes.

And while site owners should be focused on users, we all know that most prioritize Google over visitors.  But having good quality comments on blog posts benefits both, meaning you are providing value to your visitors, while also showing Google that those users are contributing as a “thriving community.”

Don’t fall into the trap of believing that all user generated content is bad.  Some user generated content is amazing and incredibly helpful… for both users and Google.

The post Why Blog Comments Are Great for Google SEO… and Users appeared first on The SEM Post.



from The SEM Post http://ift.tt/1VSeUrb
http://ift.tt/1SPm5Q7 via IFTTT

5 Content Marketing Ideas for May 2016

May is good for content marketers. The month includes many holidays and events, giving marketers compelling, timely topics to cover. Content marketing seeks to attract, engage, ...

from Practical Ecommerce » Marketing http://ift.tt/1SAsQjp
via IFTTT

Do 50% of adults really not recognise ads in search results?

Around half of adults are unable to recognise ads in Google’s search results, according to a survey. 

This surprising statistic comes from Ofcom’s Adults’ media use and attitudes report, released this month.

While I’ve seen studies suggesting that many people don’t know the difference between paid and organic results, that 50% could look at a set of results like those below and still not spot the ads seems bizarre.

paid and organic results

The stats

For Ofcom’s study, ‘adults who use search engines’ were shown a picture of the SERPs for ‘walking boots’.

This is what the SERP looks like now, but the study was carried out in 2015, so the shopping results were not there at that time. As the study says:

“Their attention was drawn to the first three results at the top of the list, which were distinguished by an orange box with the word ‘Ad’ written in it. They were then prompted with three options and asked whether any of these applied to these first three results.”

walking boots

The 1,328 survey respondents were allowed to select more than one answer so, for example, some may have said that the ads were both paid links and the best results.

Understanding of paid-for results returned by Google searches, among adults who use search engine websites or apps:

ofcom 1

To clarify the results, 60% identified them as paid links, while 49% identified them only as paid ads, i.e. they selected only the correct answer.

Ofcom also split the results out between newer and more established internet users. Newer users in this case are defined as those who first went online less than five years ago. There were 160 newer users surveyed, and 1,113 older users.

These are the responses to the same question as before, just split between established and newer users:

ofcom 2

In a nutshell: newer users were less likely to identify that the results with the yellow ad label were indeed paid results. 34% of newer and 51% of established users gave only the correct answer.

I asked Andrew Girdwood, Head of Media Technology at Cello Signal, about the findings. He was pretty surprised:

“I’ve closely followed the evolution of disclosure in search engine ads over the years. At one point the lines were blurred – Yahoo’s paid inclusion, for example, traded your money for some sort of organic search position. Those days, in Europe and America, are long gone. Regulators on both sides of the Atlantic watch closely.

The ad badge updates to Google’s paid search should have made it crystal clear the listing has been paid for. We’re talking about a bright yellow “Ad” label beside the result. How can you miss it? Searching for a competitive keyword? Google returns a whole column of Ad, Ad, Ad and Ad mentions. It leaps off the page to me.

It is just short of mind boggling that 50% of searchers in the UK can’t see the Ad disclosure. When Steve Krug published “Don’t Make Me Think” in 2000 to offer advice on web usability, I wonder if he had imagined an audience as digitally savvy and yet as web-blind as this.”

Other studies into PPC ads

I’ve looked at this issue before. In 2014, I reported on stats from UX firm Bunnyfoot, which found that 36% didn’t know that PPC ads were indeed ads (a previous study from the same firm produced a figure of 41%).

This was a relatively small sample – 103 people took UX tests with eye-tracking technology and were asked afterwards if they saw any ads.

With the help of Dan Barker, I carried out further tests on this using two separate polls of more than 2,000 UK internet users in total. We asked:

  1. Are people aware of the existence of ads on Google Search?
  2. Do they believe they click Google ads? And, if so, how frequently?

The results were very different to Ofcom’s, with just around 10% not seeing ads in Google results.

However, the very presence of the word ‘ad’ in the question perhaps implied to respondents that there are ads on Google, and gave them a clue about the answer.

There was another study by Varn earlier this year which produced a similar answer to that from Ofcom.

This time, 1,010 UK internet users were asked the following question. 50.6% couldn’t identify ads:

varn-blog-stats-main

It is tricky to devise the perfect test for this issue. If you ask users questions, there is the obvious temptation for them to second-guess the answer and say what they think is the right answer, rather than just answering honestly.

The Ofcom test, showing users the results and asking the question, seems sound enough to me. Also, several different studies have found a reasonably high percentage of people not recognising ads, so I can only conclude that there’s something in this.

Why can’t people see the ads?

This is the big question. As someone who has worked in digital for more than 10 years, it’s hard to imagine.

After all, there’s a pretty clear yellow ad label next to the results. You can hardly accuse Google of not disclosing the nature of the link.

However, Google has taken steps which some would interpret as reducing the visibility of ads. Remember, Google has an interest in increasing the number of clicks on its ads.

For example, PPC ads used to be shaded until a couple of years ago, though there were no ad labels.

PPC ads shaded

Recently, Google has experimented with green ad labels. The reason is unclear, but it could be a way to help the ad label blend in with the URL text. Or it could simply be one of a series of experiments to find the best performing format.

green ad labels

I suspect this is a similar thing to banner blindness, in which people have just become immune to, or have learned to ignore, the elements on the page that don’t interest them.

Indeed, plenty of eye-tracking studies have shown that users will simply not look at certain elements on a page. Could it be that users are looking at the top results and simply not seeing (or processing) the ‘ad’ label?

Whatever the reason, and whatever the exact proportion of search users who don’t recognise ads in Google, it seems clear that there is an issue here.



from SEO – Search Engine Watch http://ift.tt/1T4hPG9
via IFTTT

Turn Off Some Types of Notifications in Google Search Console

If you are tired of being notified about what you consider to be “unimportant” notifications from Search Console, Google has announced that you can now disable some types of notifications from being emailed to you.

Here are the ones you can opt out of:

gsc notifications

To opt out, there should be an option in the email notification you receive, and you turn off that specific notification there.

Interestingly, you cannot opt out of receiving each kind until you actually receive one.  So you will still have to receive the first one before you can opt out of it.

You also cannot mute the notices Google deems critical, such as manual actions, new owner alerts and notices of a site being hacked.

I would be wary about opting out of these, unless you are getting spammed with them.  These notices can be the first notification to a site owner that something is wrong.

Google has more about this option in their help files too.

The post Turn Off Some Types of Notifications in Google Search Console appeared first on The SEM Post.



from The SEM Post http://ift.tt/1SP8doM
http://ift.tt/1qTem6Q via IFTTT

Google: Best Practices for 301s in Large Htaccess Files

Someone raised an interesting question on Twitter today… what to do about older 301 redirects, especially when the redirect file begins to get a bit long and complicated.  Could they be deleted after a time?  Or should you keep them in their massive files, with the potential server slowdown for visitors?

Gary Illyes from Google said that once the new page is indexed, the redirect could be removed.  Of course, this is purely from a technical standpoint of when you can remove a 301. Sites do not need to keep 301 redirects up for eternity in order for Google to figure out that old pages should be matched with a specific brand new one.

However, it obviously goes well beyond that too.  If any of the older pages have incoming links or actual visitors still redirecting through them, then keeping those 301 redirects is the best practice.

Illyes followed up with clarification as well, from a best practices point of view:

Now depending on the size, having a super long .htaccess file can slow down the server… but again the size isn’t the only factor in the equation, because the server itself also plays a role, and a server’s performance can also vary greatly.

Bottom line, if you have a 301 redirect, you can remove it once Google has crawled it and matched up the old page with the new one.  But if you can leave it indefinitely, that is the best route to go, especially if you need to redirect linking signals or actual visitors to the new page.  Then you never have to worry about losing those older ranking signals or landing visitors onto a 404 page rather than the page they wanted to go to.
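If you do decide to prune rules from a long .htaccess, it is worth first confirming that each old URL still returns a 301 and noting where it points (and checking your logs for traffic to it). Here is a rough sketch of that check in TypeScript; it assumes Node 18+ for the built-in fetch, and the URLs are placeholders rather than anything from the article:

// Sketch: before removing 301 rules, confirm each old URL still redirects
// and record where it points. Anything that does not come back as a 301
// deserves a closer look before you touch the rule.
const oldUrls: string[] = [
  "https://www.example.com/old-page",
  "https://www.example.com/old-category/old-post",
];

async function auditRedirect(url: string): Promise<void> {
  const res = await fetch(url, { redirect: "manual" });
  const target = res.headers.get("location");
  if (res.status === 301 && target) {
    console.log(`${url} -> ${target} (301 still in place)`);
  } else {
    console.warn(`${url} returned ${res.status}; check links and traffic before removing this rule`);
  }
}

(async () => {
  for (const url of oldUrls) {
    await auditRedirect(url);
  }
})();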

The post Google: Best Practices for 301s in Large Htaccess Files appeared first on The SEM Post.



from The SEM Post http://ift.tt/1SP5pbm
http://ift.tt/1QA5fwQ via IFTTT

Measuring Content: You’re Doing it Wrong

Posted by MatthewBarby

The traditional ways of measuring the success or failure of content are broken. We can’t just rely on metrics like the number of pageviews/visits or bounce rate to determine whether what we’re creating has performed well.

“The primary thing we look for with news is impact, not traffic,” says Jonah Peretti, Founder of BuzzFeed. One of the ways that BuzzFeed have mastered this is with the development of their proprietary analytics platform, POUND.

POUND enables BuzzFeed to predict the potential reach of a story based on its content, understand how effective specific promotions are based on the downstream sharing and traffic, and power A/B tests — and that’s just a few examples.

Just because you’ve managed to get more eyeballs onto your content doesn’t mean it’s actually achieved anything. If that were the case then I’d just take a few hundred dollars and buy some paid StumbleUpon traffic every time.

Yeah, I’d generate traffic, but it’s highly unlikely to result in me achieving some of my actual business goals. Not only that, but I’d have no real indication of whether my content was satisfying the needs of my visitors.

The scary thing is that the majority of content marketing campaigns are measured this way. I hear statements like “it’s too difficult to measure the performance of individual pieces of content” far too often. The reality is that it’s pretty easy to measure content marketing campaigns on a micro level — a lot of the time people don’t want to do it.

Engagement over entrances

Within any commercial content marketing campaign that you’re running, measurement should be business goal-centric. By that I mean that you should be determining the overall success of your campaign based on the achievement of core business goals.

If your primary business goal is to generate 300 leads each month from the content that you’re publishing, you’ll need to have a reporting mechanism in place to track this information.

On a more micro-level, you’ll want to be tracking and using engagement metrics to enable you to influence the achievement of your business goals. In my opinion, all content campaigns should have robust, engagement-driven reporting behind them.

Total Time Reading (TTR)

One metric that Medium uses, which I think adds a lot more value than pageviews, is "Total Time Reading (TTR)." This is a cumulative metric that quantifies the total number of minutes spent reading a piece of content. For example, if I had 10 visitors to one of my blog articles and they each stayed reading the article for 1 minute each, the total reading time would be 10 minutes.

“We measure every user interaction with every post. Most of this is done by periodically recording scroll positions. We pipe this data into our data warehouse, where offline processing aggregates the time spent reading (or our best guess of it): we infer when a reader started reading, when they paused, and when they stopped altogether. The methodology allows us to correct for periods of inactivity (such as having a post open in a different tab, walking the dog, or checking your phone).” (source)

This is more powerful than pageviews alone because it factors in how engaged your readers are, giving a more accurate picture of how much attention a piece of content actually received. You could have an article with 1,000 pageviews that has a greater TTR than one with 10,000 pageviews.
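To make the idea concrete, here is a minimal Python sketch of how per-visit reading time could be rolled up into a TTR figure. The event format, field names, and numbers are invented for illustration only; Medium's real pipeline (scroll sampling, inactivity correction, offline aggregation) is far more involved.

    # Hypothetical per-visit events: seconds we believe each visitor actually
    # spent reading, with inactivity already stripped out upstream.
    events = [
        {"post": "guide-to-seo", "seconds_read": 60},
        {"post": "guide-to-seo", "seconds_read": 45},
        {"post": "short-news-item", "seconds_read": 20},
    ]

    def total_time_reading(events):
        """Sum reading time per post and return TTR in minutes."""
        totals = {}
        for event in events:
            totals[event["post"]] = totals.get(event["post"], 0) + event["seconds_read"]
        return {post: seconds / 60 for post, seconds in totals.items()}

    print(total_time_reading(events))
    # {'guide-to-seo': 1.75, 'short-news-item': 0.33...}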

Scroll depth & time on page

A related and simpler metric to acquire is average time on page (available within Google Analytics), which gives a general indication of how long visitors are staying on a page. Combining this with ‘scroll depth’ (i.e. how far down the page a visitor has scrolled) helps paint a better picture of how ‘engaged’ your visitors are. You’ll be able to answer questions like the following:

“How much of this article are my visitors actually reading?”

“Is the length of my content putting visitors off?”

“Are my readers remaining on the page for a long time?”

Having the answers to these questions is really important when it comes to determining which types of content are resonating most with your visitors; the short sketch below shows one way of turning those two numbers into a quick flag.
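If you pull average time on page and scroll depth out of your analytics, one simple way to act on them is to flag mismatches, such as pages that hold visitors for a while but are rarely scrolled very far. Here is a rough Python sketch of that idea; the thresholds, field names, and sample figures are all arbitrary assumptions, so tune them to your own data.

    # Hypothetical per-page averages exported from your analytics tool.
    pages = [
        {"url": "/long-guide", "avg_time_on_page": 240, "avg_scroll_depth": 0.35},
        {"url": "/short-post", "avg_time_on_page": 50, "avg_scroll_depth": 0.90},
    ]

    def engagement_flag(page):
        """Crude heuristic: long dwell time but shallow scrolling suggests the
        length or structure of the content may be putting visitors off."""
        if page["avg_time_on_page"] > 120 and page["avg_scroll_depth"] < 0.5:
            return "partially read - consider a summary or a shorter version"
        if page["avg_scroll_depth"] >= 0.75:
            return "read most of the way through"
        return "needs a closer look"

    for page in pages:
        print(page["url"], "->", engagement_flag(page))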

Social Lift

BuzzFeed’s “Social Lift” metric is a particularly good way of understanding the ‘virality’ of your content (you can see this when you publish a post to BuzzFeed). BuzzFeed calculates “Social Lift” as follows:

(Social Views / Seed Views) + 1

Social Views: Traffic that’s come from outside BuzzFeed; for example, referral traffic, email, social media, etc.

Seed Views: Owned traffic that’s come from within the BuzzFeed platform; e.g. from appearing in BuzzFeed’s newsfeed.

BuzzFeed Social Lift

This is a great metric to use when you’re a platform publisher as it helps separate out traffic that’s coming from outside of the properties that you own, thus determining its "viral potential."

There are ways to use this kind of approach within your own content marketing campaigns (without being a huge publisher platform) to get a better idea of your content's "viral potential."

One simple calculation can just involve the following:

(social shares / pageviews) + 1

This simple stat can be used to determine which content is likely to perform better on social media, and as a result it will enable you to prioritize certain content over others for paid social promotion. The higher the score, the higher its "viral potential." This is exactly what BuzzFeed does to understand which pieces of content they should put more weight behind from a very early stage.

You can even take this a step further by replacing pageviews with TTR, giving a more representative view of how sharing relates to actual engagement.
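As a rough illustration (all figures invented), here is how that score could be calculated and used to rank posts in Python; swapping the engagement field from pageviews to TTR minutes is a one-argument change.

    posts = [
        {"title": "Post A", "social_shares": 800, "pageviews": 10000},
        {"title": "Post B", "social_shares": 400, "pageviews": 2000},
    ]

    def viral_potential(post, engagement_field="pageviews"):
        """(social shares / engagement metric) + 1; pass a TTR field instead of
        pageviews to score sharing against actual reading time."""
        return post["social_shares"] / post[engagement_field] + 1

    for post in sorted(posts, key=viral_potential, reverse=True):
        print(post["title"], round(viral_potential(post), 2))
    # Post B scores 1.2 against Post A's 1.08, so it gets priority for paid promotion.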

The bottom line

Alongside tracking "viral potential" and TTR, you’ll want to know how your content is performing against your bottom line. For most businesses, that’s the main reason why they’re creating content.

This isn’t always easy and a lot of people get this wrong by looking for a silver bullet that doesn’t exist. Every sales process is different, but let’s look at the typical process that we have at HubSpot for our free CRM product:

  1. Visitor comes through to our blog content from organic search.
  2. Visitor clicks on a CTA within the blog post.
  3. Visitor downloads a gated offer in exchange for their email address and other data.
  4. Prospect goes into a nurturing workflow.
  5. Prospect goes through to a BOFU (bottom-of-funnel) landing page and signs up for the CRM.
  6. Registered user activates and invites in members of their team.

This is a simple process, but it can still be tricky sometimes to get a dollar value on each piece of content we produce. To do this, you’ve got to understand what the value of a visitor is, and this is done by working backwards through the process.

The first question to answer is, “what’s the lifetime value (LTV) of an activated user?” In other words, “how much will this customer spend in their lifetime with us?”

For e-commerce businesses, you should be able to get this information by analyzing historical sales data: take the average order value and multiply it by the average number of orders an individual will place with you over their lifetime.

For the purposes of this example, let’s say each of our activated CRM users has an LTV of $100. It’s now time to work backwards from that figure (all the below figures are theoretical)…

Question 1: “What’s the conversion rate of new CRM activations from our email workflow(s)?”

Answer 1: “5%”

Question 2: “What percentage of visitors download our gated offers after coming through to the blog content?”

Answer 2: “3%”

Knowing this would help me to start putting a monetary value against each visitor to the blog content, as well as each lead (someone that downloads a gated offer).

Let’s say we generate 500,000 visitors to our blog content each month. Using the average conversion rates from above, we’d convert 15,000 of those into email leads. From there we’d nurture 750 of them into activated CRM users. Multiply that by the LTV of a CRM user ($100) and we’ve got $75,000 (again, these figures are all just made up).

Using this final figure of $75,000, we could work backwards to understand the value of a single visitor to our blog content:

($75,000 / 500,000)

Single Visitor Value: $0.15

We can do the same for email leads using the following calculation:

($75,000 / 15,000)

Individual Lead Value: $5.00

Knowing these figures will help you determine the bottom-line value of each piece of content you produce, as well as calculate a rough return on investment (ROI) figure.
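Putting those same made-up figures into a short Python script makes the working-backwards step explicit:

    monthly_visitors = 500_000
    visitor_to_lead_rate = 0.03    # blog visitors who download a gated offer
    lead_to_customer_rate = 0.05   # email leads who become activated CRM users
    customer_ltv = 100             # lifetime value of an activated user, in dollars

    leads = monthly_visitors * visitor_to_lead_rate        # 15,000
    customers = leads * lead_to_customer_rate              # 750
    total_value = customers * customer_ltv                 # 75,000

    visitor_value = total_value / monthly_visitors         # 0.15
    lead_value = total_value / leads                       # 5.00

    print(f"Visitor value: ${visitor_value:.2f}, lead value: ${lead_value:.2f}")
    # Visitor value: $0.15, lead value: $5.00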

Let’s say one of the blog posts we created to encourage CRM signups generated 500 new email leads; we’d see a $2,500 return. We could then evaluate the cost of producing that blog post (let’s say it took 6 hours at $100 per hour, so $600) to calculate an ROI figure of roughly 317%.

ROI in its simplest form is calculated as:

((return - investment) / investment) * 100
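Applied to the blog post example above (500 leads at a $5.00 lead value, 6 hours of production at $100 per hour), a quick Python check of the formula looks like this:

    return_generated = 500 * 5.00   # $2,500 from 500 new email leads
    investment = 6 * 100            # $600 to produce the post

    roi = (return_generated - investment) / investment * 100
    print(f"ROI: {roi:.1f}%")       # ROI: 316.7%, the roughly 317% figure above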

You don’t necessarily need to follow these figures religiously when it comes to content performance on a broader level, especially when you consider that some content just doesn’t have the primary goal of lead generation. That said, for the content that does have this goal, it makes sense to pay attention to this.

The link between engagement and ROI

So far I’ve talked about two very different forms of measurement:

  1. Engagement
  2. Return on investment

What you’ll want to avoid is treating these as isolated variables. Return on investment metrics (for example, lead conversion rate) are heavily influenced by engagement metrics such as TTR.

The key is to understand exactly which engagement metrics have the greatest impact on your ROI. This way you can use engagement metrics to form the basis of your optimization tests in order to make the biggest impact on your bottom line.

Let’s take the following scenario that I faced within my own blog as an example…

The average length of the content across my website is around 5,000 words. Some of my content far surpasses 10,000 words in length, taking an estimated hour to read (my recent SEO tips guide is a perfect example). As a result, the bounce rate on my content is quite high, especially from mobile visitors.

Keeping people engaged within a 10,000-word article when they haven’t got a lot of time on their hands is a challenge. Needless to say, it makes it even more difficult to ensure my CTAs (aimed at newsletter subscriptions) stand out.

From some testing, I found that adding my CTAs closer to the top of my content helped improve conversion rates. The main issue I needed to tackle was how to keep people on the page for longer, even when they’re in a hurry.

To do this, I worked on the following solution: give visitors a concise summary of the blog post that takes under 30 seconds to read. Once they’ve read this, show them a CTA that will give them something to read in more detail in their own time.

All this involved was the addition of a "Summary" button at the top of my blog post that, when clicked, hides the content and displays a short summary with a custom CTA.

Showing Custom Summaries

This has not only helped reduce the number of people bouncing from my long-form content, but has also increased the number of subscribers generated from that content, all whilst improving the user experience (which is a pretty rare combination).

I thought more of you might find this a useful feature for your own websites, so I packaged it up as a free WordPress plugin that you can download here.

Final thoughts

The above is just one example of how to impact the ROI of your content by improving engagement. My advice is to get a robust measurement process in place so that you can first identify opportunities, and then run experiments to take advantage of them.

More than anything, I'd recommend that you take a step back and re-evaluate the way you're measuring your content campaigns to see whether what you're doing really aligns with the fundamental goals of your business. You can invest in endless tools that help you measure things better, but if the core metrics you're measuring against are wrong, then it's all for nothing.





from The Moz Blog http://ift.tt/1NScG2x
via IFTTT

Social Media Today