Innovate not imitate!

Interested in the latest Growth hacks?

Welcome to our blog!

We want to help you start, manage, and grow your business using innovative strategies and implementation. We have a passion for helping businesses and companies of various sizes see the same success that we have achieved.

Our skillsets are wide and varied, ranging from business strategy and marketing to online strategy. An increasing number of companies are turning to the internet and online media as a means of maximising their marketing reach and exposure. This is a special area of focus for us, and we do more than simple SEO strategies.

See our website for more: www.innovatetoaccelerate.com

Monday 30 April 2018

Faster, Fresher, Better: Announcing Link Explorer, Moz's New Link Building Tool

Posted by SarahBird

More link data. Fresher link data. Faster link data.

Today, I’m delighted to share that after eons of hard work, blood, sweat, tears, and love, Moz is taking a major step forward on our commitment to provide the best SEO tools money can buy.

We’ve rebuilt our link technology from the ground up and the data is now broadly available throughout Moz tools. It’s bigger, fresher, and much, much faster than our legacy link tech. And we’re just getting started! The best way to quickly understand the potential power of our revolutionary new link tech is to play with the beta of our Link Explorer.

Introducing Link Explorer, the newest addition to the Moz toolset!

We’ve heard your frustrations with Open Site Explorer and we know that you want more from Moz and your link building tools. OSE has done more than put in its time. Groundbreaking when it launched in 2008, it’s worked long and hard to bring link data to the masses. It deserves the honor of a graceful retirement.

OSE represents our past; the new Link Explorer is our fast, innovative, ambitious future.

Here are some of my favorite things about the Link Explorer beta:

  • It’s 20x larger and 30x fresher than OSE (RIP)
  • Despite its huge index size, the app is lightning fast! I can’t stand waiting so this might be my number-one fav improvement.
  • We’re introducing Link Tracking Lists to make managing your link building efforts a breeze. Sometimes the simple things make the biggest difference, like when they started making vans with doors on each side. You’ll never go back.
  • Link Explorer includes historic data, a painful gap in OSE. Studying your gained/lost linking domains is fast and easy.
  • The new UX surfaces competitive insights much more quickly
  • Increasing the size and freshness of the index improved the quality of Domain Authority and Spam Score. Voilà.

All this, and we’re only in beta.

Dive into your link data now!

Here’s a deeper dive into my favorites:

#1: The sheer size, quality, and speed of it all

We’re committed to data quality. Here are some ways that shows up in the Moz tools:

  • When we collect rankings, we evaluate the natural first page of rankings to ensure that the placement and content of featured snippets and other SERP features are correctly situated (errors that can creep in when rankings are collected in 50- or 100-page batches). This is more expensive, but we think the tradeoff is worth it.
  • We were the first to build a hybrid search volume model using clickstream data. We still believe our model is the most accurate.
  • Our SERP corpus, which powers Keywords by Site, is completely refreshed every two weeks. We actively update up to 15 million of the keywords each month to remove keywords that are no longer being searched and replace them with trending keywords and terms. This helps keep our keyword data set fresh and relevant.

The new Link Explorer index extends this commitment to data quality. OSE wasn’t cutting it and we’re thrilled to unleash this new tech.

Link Explorer is over 20x larger and 30x fresher than our legacy link index. Bonus points: the underlying technology is very cost-efficient, making it much less expensive for us to scale over time. This frees up resources to focus on feature delivery. BOOM!

One of my top pet peeves is waiting. I feel physical pain while waiting in lines and for apps to load. I can’t stand growing old waiting for a page to load (amirite?).

The new Link Explorer app is delightfully, impossibly fast. It’s like magic. That’s how link research should be. Magical.

#2: Historical data showing discovered and lost linking domains

If you’re a visual person, this report gives you an immediate idea of how your link building efforts are going. A spike you weren't expecting could be a sign of spam network monkey business. Deep-dive effortlessly on the links you lost and gained so you can spend your valuable time doing thoughtful, human outreach.
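Under the hood, a gained/lost report like this amounts to a set difference between two index snapshots of linking root domains. Here's a minimal sketch with hypothetical snapshot data (not Moz's actual implementation):

```python
# Compare two snapshots of linking root domains to find gains and losses.
# The snapshot contents are hypothetical, for illustration only.
last_month = {"example.com", "news.example.org", "blog.example.net"}
this_month = {"example.com", "blog.example.net", "newpartner.example.io"}

discovered = this_month - last_month  # domains that started linking
lost = last_month - this_month        # domains that stopped linking

print(sorted(discovered))  # ['newpartner.example.io']
print(sorted(lost))        # ['news.example.org']
```

An unexpected spike would show up here as a sudden jump in `discovered`, which is exactly the kind of signal worth deep-diving on.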

#3: Link Tracking Lists

Folks, this is a big one. Throw out (at least one of... ha. ha.) those unwieldy spreadsheets and get on board with Link Tracking Lists, because these are the future. Have you been chasing a link from a particular site? Wondering if your outreach emails have borne fruit yet? Want to know if you’ve successfully placed a link, and how you’re linking? Link Tracking Lists cut out a huge time-suck when it comes to checking back on which of your target sites have actually linked back to you.

Why announce the beta today?

We’re sharing this now for a few reasons:

  • The new Link Explorer data and app have been available in beta to a limited audience. Even with a quiet, narrow release, the SEO community has been talking about it and asking good questions about our plans. Now that the Link Explorer beta is in broad release throughout all of Moz products and the broader Moz audience can play with it, we’re expecting even more curiosity and excitement.
  • If you’re relying on our legacy link technology, this is further notice to shift your applications and reporting to the new-and-improved tech. OSE will be retired soon! We’re making it easier for API customers to get the new data by providing a translation layer for the legacy API.
  • We want and need your feedback. We are committed to building the very best link building tool on the planet. You can expect us to invest heavily here. We need your help to guide our efforts and help us make the most impactful tradeoffs. This is your invitation to shape our roadmap.

Today’s release of our new Link Explorer technology is a revolution in Moz tools, not an evolution. We’ve made a major leap forward in our link index technology that delivers a ton of immediate value to Moz customers and the broader Moz Community.

Even though there are impactful improvements around the corner, this ambitious beta stands on its own two feet. OSE wasn’t cutting it and we’re proud of this new, fledgling tech.

What’s on the horizon for Link Explorer?

We’ve got even more features coming in the weeks and months ahead. Please let us know if we’re on the right track.

  • Link Building Assistant: a way to quickly identify new link acquisition opportunities
  • A more accurate and useful Link Intersect feature
  • Link Alerts to notify you when you get a link from a URL you were tracking in a list
  • Changes to how we count redirects: Currently we don't count links to a redirect as links to the target of the redirect (that's a lot of redirects), but we have this planned for the future.
  • Significantly scaling up our crawling to further improve freshness and size
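The redirect change in that list is worth sketching: counting a link to a redirecting URL as a link to the redirect's final target means resolving redirect chains before aggregating. The URLs and redirect map below are invented, and this is not Moz's implementation:

```python
# Resolve redirect chains so a link to a redirecting URL is credited
# to the redirect's final target. The redirect map is hypothetical.
redirects = {
    "http://old.example.com/page": "http://www.example.com/page",
    "http://www.example.com/page": "https://www.example.com/page",
}

def resolve(url, redirects, max_hops=10):
    """Follow the redirect map until a final URL, a cycle, or the hop limit."""
    seen = set()
    while url in redirects and url not in seen and len(seen) < max_hops:
        seen.add(url)
        url = redirects[url]
    return url

# A link pointing at the old URL is counted against the final HTTPS target.
print(resolve("http://old.example.com/page", redirects))
```

The `seen` set guards against redirect loops, which are common enough in the wild that any crawler-side resolution needs it.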

Go forth, and explore:

Try the new Link Explorer!

Tomorrow Russ Jones will be sharing a post that discusses the importance of quality metrics when it comes to a link index, and don’t miss our pinned Q&A post answering questions about Domain Authority and Page Authority changes or our FAQ in the Help Hub.

We’ll be releasing early and often. Watch this space, and don’t hold back your feedback. Help us shape the future of Links at Moz. We’re listening!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog https://ift.tt/2HDk5L5
via IFTTT

Criteria for Google Mobile First Indexing is Matching Content Not Mobile Friendliness

Google has officially begun sending out email notices to site owners confirming their change from desktop indexing to mobile first indexing. And we do have some clarification on whether mobile friendliness is a criterion for sites changing over. Some people had speculated that a way to prevent mobile first indexing was to have a […]

The post Criteria for Google Mobile First Indexing is Matching Content Not Mobile Friendliness appeared first on The SEM Post.



from The SEM Post https://ift.tt/2JBWFm2
via IFTTT

Consumers lose trust in businesses with inaccurate NAP

Over the years we’ve seen the importance of the humble business listing change. While citations were once considered key link sources and their accuracy a contributing ranking factor for local search, today their impact has waned somewhat.

However, as Moz’s most recent Local Search Ranking Factors survey found, NAP (Name, Address, Phone number) details in business listings and online directories are still considered fourth most important for ranking in the local pack and fifth most important in localized organic rankings.

But not everything is about rankings: accurate citations are still a foundation tactic for any business, as they increase online visibility by placing businesses in the listings and directories in which potential customers are looking for them.

That’s if they are accurate. What happens if they’re not?

Recent research by BrightLocal, in the Local Citations Trust Report 2018, sought to answer this question by polling 1000 US consumers on how they feel and behave when they come across inaccurate business information online.

Consumers lose trust in businesses with incorrect or inconsistent NAP information

Today, trust in business and institutions in the US is at an all-time low, and it’s the responsibility of every business owner to make a difference in any way they can – even if it’s something as seemingly small as ensuring business name, location, and contact number information are reliable and consistent online.

According to the BrightLocal research, 80% of respondents felt that seeing incorrect or inconsistent contact details and/or business names online would make them lose trust in a business. With consumer trust in business being such a critical part of the buyer’s journey, this is obviously of great concern.
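Part of guarding against that loss of trust is mechanical: before comparing your listings across directories, normalize the NAP fields so cosmetic formatting differences (phone punctuation, capitalization) don't read as real mismatches. Here's a minimal sketch with hypothetical listing data:

```python
# Normalize NAP (Name, Address, Phone number) records before comparing
# listings, so formatting noise doesn't mask real inconsistencies.
import re

def normalize_phone(phone):
    """Keep digits only, so '(555) 010-1234' matches '555-010-1234'."""
    return re.sub(r"\D", "", phone)

def normalize(record):
    return (record["name"].strip().lower(),
            record["address"].strip().lower(),
            normalize_phone(record["phone"]))

website = {"name": "Acme Plumbing", "address": "12 Main St", "phone": "(555) 010-1234"}
directory = {"name": "acme plumbing", "address": "12 Main St", "phone": "555-010-1234"}

print(normalize(website) == normalize(directory))  # True: the listings agree
```

Records that still differ after normalization are the genuine inconsistencies worth fixing first.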

Of course, this is only an issue if businesses are actually uploading inaccurate information to online directories.

Thirty percent found inaccurate business information online

If you thought inaccurate citation data wasn’t a problem, think again. Not only have 30% of consumers found inaccurate business information online in the last 12 months, but a shocking 36% have also ended up calling incorrect phone numbers for businesses as a result of this inaccurate information. Add to this the fact that 22% of respondents went to the trouble of visiting a business only to find it was not located where online information suggested it was, and you start to see a troubling picture of lost business.

These experiences aren’t just confined to incorrect NAP, though. Nearly one quarter of consumers have visited a business too early or too late owing to incorrect opening times displayed online. Ultimately, this means that businesses with inaccurate citation data are likely to be missing out on a great deal of custom.

Let’s say a consumer has found a business’ address online and gone to that location, only to find the business is nowhere to be seen. What happens next?

Forty-one percent would not use a business if they couldn’t find it straight away

Although it’s encouraging to see that 59% of people would persist in their search for a business if they couldn’t find it (either by calling or by looking elsewhere online for the address), many aren’t quite so patient.

Almost one third (29%) of consumers said they would try to find another business online or nearby, and 12% would give up completely. Obviously, the likelihood of the latter, more extreme reaction depends on how urgent the need for the business was, and also how far the consumer had to travel, but this data still suggests that businesses with inaccurate citation data online risk losing out.

As the research found, 22% have visited an incorrect address, and with only 29% of these people seeking out an alternative business, we can get an idea of how much business is being lost to competitors as a result of inaccurate location data.

Just imagine: you do all that great work and spend all your marketing budget encouraging someone to use your business, and you succeed, only to lose out to a local rival owing to something as simple as inaccurate citation data. Marketers tend to be very good at looking at the big picture, but it’s little details like this that result in lost business, even when marketing activity has otherwise succeeded.

It’s worth pointing out, too, that men answering this question seemed to be far more likely to give up their search for a business completely. Eighteen percent of those who identified as male in the survey said they would abandon their search. We live in a far more fast-paced world than ever before and immediate gratification is paramount, so I’d strongly recommend that businesses with a primarily male customer base get their citations in order, lest they face the lost custom of a frustrated customer.

Ninety-three percent of consumers are frustrated to find incorrect information in online directories

Frustration is an unpredictable emotion that can result in a range of reactions depending on the state and personality of the person experiencing it. As we’ve seen, once frustrated by incorrect business information, consumers could calmly persist with contacting the business (providing the contact number is accurate), look for another business, or quit their search entirely.

The BrightLocal research found that a huge 93% of consumers agree that finding incorrect information in online directories “frustrates” them. These are consumers with a strong intent to buy, as they’ve already searched for a business like yours and picked yours due to any number of factors. Even if they do choose to use your business after all, their first experience with it involves frustration. If you’re providing incorrect or inaccurate information online, you’re going to have to hope that your product or service is spectacular to avoid an overall negative experience.

Sixty-eight percent of consumers would stop using a local business after finding incorrect information online

After ploughing through incorrect information and coming up empty-handed, almost 70% of consumers said they would stop using a business as a result. This includes the quite literal cessation of business use due to not being able to find or call them, as well as deciding not to use a business because of diminished trust caused by inaccurate online information.

Businesses must have accurate and consistent citations to avoid losing customers

If you run a business or manage a client with incorrect business listings information, you are at high risk of missing out on swathes of new customers.

All your marketing, visibility, and brand awareness efforts are for naught if potential customers can’t find your business. You risk frustrating them, and in some cases completely wasting their time. First impressions are paramount, and creating accurate citations is one of the most important ways to ensure you’re building consumer trust from the off. If not, then I’m sure your competitor in the next neighborhood would be happy to take the business.



from SEO – Search Engine Watch https://ift.tt/2raM0rD
via IFTTT

Google Sending Mobile First Indexing Enabled Notices via Google Search Console

It seems Google has finally started sending out emails to site owners informing them that their sites have been switched over to mobile first indexing. The emails, sent via Google Search Console, advise site owners on changes they can expect to see now that their sites are being indexed by the mobile Googlebot. The emails […]

The post Google Sending Mobile First Indexing Enabled Notices via Google Search Console appeared first on The SEM Post.



from The SEM Post https://ift.tt/2HG4AhE
via IFTTT

Friday 27 April 2018

Content for Answers: The Inverted Pyramid - Whiteboard Friday

Posted by Dr-Pete

If you've been searching for a quick hack to write content for featured snippets, this isn't the article for you. But if you're looking for lasting results and a smart tactic to increase your chances of winning a snippet, you're definitely in the right place.

Borrowed from journalism, the inverted pyramid method of writing can help you craft intentional, compelling, rich content that will help you rank for multiple queries and win more than one snippet at a time. Learn how in this Whiteboard Friday starring the one and only Dr. Pete!

Content for Answers

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans, Dr. Pete here. I'm the Marketing Scientist at Moz and visiting you from not-so-sunny Chicago in the Seattle office. We've talked a lot in the last couple years in my blog posts and such about featured snippets.

So these are answers that kind of cross with organic. So it's an answer box, but you get the attribution and the link. Britney has done some great Whiteboard Fridays, the last couple, about how you do research for featured snippets and how you look for good questions to answer. But I want to talk about something that we don't cover very much, which is how to write content for answers.

The inverted pyramid style of content writing

It's tough, because I'm a content marketer and I don't like to think that there's a trick to content. I'm afraid to give people the kind of tricks that would have them run off and write lousy, thin content. But there is a technique that works that I think has been very effective for featured snippets for writing for questions and answers. It comes from the world of journalism, which gives me a little more faith in its credibility. So I want to talk to you about that today. That's called the inverted pyramid.

Content for Answers

1. Start with the lead

It looks something like this. When you write a story as a journalist, you start with the lead. You lead with the lead. So if we have a story like "Penguins Rob a Bank," which would be a strange story, we want to put that right out front. That's interesting. Penguins rob a bank, that's all you need to know. The thing about it is, and this is true back to print, especially when we had to buy each newspaper. We weren't subscribers. But definitely on the web, you have to get people's attention quickly. You have to draw them in. You have to have that headline.

2. Go into the details

So leading with the lead is all about pulling them in to see if they're interested and grabbing their attention. The inverted pyramid, then you get into the smaller pieces. Then you get to the details. You might talk about how many penguins were there and what bank did they rob and how much money did they take.

3. Move to the context

Then you're going to move to the context. That might be the history of penguin crime in America and penguin ties to the mafia and what does this say about penguin culture and what are we going to do about this. So then it gets into kind of the speculation and the value add that you as an expert might have.

How does this apply to answering questions for SEO?

So how does this apply to answering questions in an SEO context?

Content for Answers

Lead with the answer, get into the details and data, then address the sub-questions.

Well, what you can do is lead with the answer. If somebody's asked you a question, you have that snippet, go straight to the summary of the answer. Tell them what they want to know and then get into the details and get into the data. Add those things that give you credibility and that show your expertise. Then you can talk about context.

But I think what's interesting with answers — and I'll talk about this in a minute — is getting into these sub-questions, talking about if you have a very big, broad question, that's going to divide up into a lot of follow-ups. People who are interested are going to want to know about those follow-ups. So go ahead and answer those.

If I win a featured snippet, will people click on my answer? Should I give everything away?

Content for Answers

So I think there's a fear we have. What if we answer the question and Google puts it in that box? Here's the question and that's the query. It shows the answer. Are people going to click? What's going to happen? Should we be giving everything away? Yes, I think, and there are a couple reasons.

Questions that can be very easily answered should be avoided

First, I want you to be careful. Britney has gotten into some of this. This is a separate topic on its own. You don't always want to answer questions that can be very easily answered. We've already seen that with the Knowledge Graph. Google says something like time and date or a fact about a person, anything that can come from that Knowledge Graph. "How tall was Abraham Lincoln?" That's answered and done, and they're already replacing those answers.

Answer how-to questions and questions with rich context instead

So you want to answer the kinds of things, the how-to questions and the why questions that have a rich enough context to get people interested. In those cases, I don't think you have to be afraid to give that away, and I'm going to tell you why. This is more of a UX perspective. If somebody asks this question and they see that little teaser of your answer and it's credible, they're going to click through.

"Giving away" the answer builds your credibility and earns more qualified visitors

Content for Answers

So here you've got the penguin. He's flushed with cash. He's looking for money to spend. We're not going to worry about the ethics of how he got his money. You don't know. It's okay. Then he's going to click through to your link. You know you have your branding and hopefully it looks professional, Pyramid Inc., and he sees that question again and he sees that answer again.

Giving the searcher a "scent trail" builds trust

If you're afraid that that's repetitive, I think the good thing about that is this gives him what we call a scent trail. He can see that, "You know what? Yes, this is the page I meant to click on. This is relevant. I'm in the right place." Then you get to the details, and then you get to the data and you give this trail of credibility that gives them more to go after and shows your expertise.

People who want an easy answer aren't the kind of visitors that convert

I think the good thing about that is we're so afraid to give something away because then somebody might not click. But the kind of people who just wanted that answer and clicked, they're not the kind of people that are going to convert. They're not qualified leads. So these people that see this and see it as credible and want to go read more, they're the qualified leads. They're the kind of people that are going to give you that money.

So I don't think we should be afraid of this. Don't give away the easy answers. I think if you're in the easy answer business, you're in trouble right now anyway, to be honest. That's a tough topic. But give them something that guides them to the path of your answer and gives them more information.

How does this tactic work in the real world?

Thin content isn't credible.

Content for Answers

So I'm going to talk about how that looks in a more real context. My fear is this. Don't take this and run off and say write a bunch of pages that are just a question and a paragraph and a ton of thin content and answering hundreds and hundreds of questions. I think that can really look thin to Google. So you don't want pages that are like question, answer, buy my stuff. It doesn't look credible. You're not going to convert. I think those pages are going to look thin to Google, and you're going to end up spinning out many, many hundreds of them. I've seen people do that.

Use the inverted pyramid to build richer content and lead to your CTA

Content for Answers

What I'd like to see you do is craft this kind of question page. This is something that takes a fair amount of time and effort. You have that question. You lead with that answer. You're at the top of the pyramid. Get into the details. Get into the things that people who are really interested in this would want to know and let them build up to that. Then get into data. If you have original data, if you have something you can contribute that no one else can, that's great.

Then go ahead and answer those sub-questions, because the people who are really interested in that question will have follow-ups. If you're the person who can answer that follow-up, that makes for a very, very credible piece of content, and not just something that can rank for this snippet, but something that really is useful for anybody who finds it in any way.

So I think this is great content to have. Then if you want some kind of call to action, like a "Learn More," that's contextual, I think this is a page that will attract qualified leads and convert.

Moz's example: What is a Title Tag?

So I want to give you an example. This is something we've used a lot on Moz in the Learning Center. So, obviously, we have the Moz blog, but we also have these permanent pages that answer kind of the big questions that people always have. So we have one on the title tag, obviously a big topic in SEO.

Content for Answers

Here's what this page looks like. So we go right to the question: What is a title tag? We give the answer: A title tag is an HTML element that does this and this and is useful for SEO, etc. Right there in the paragraph. That's in the featured snippet. That's okay. If that's all someone wants to know and they see that Moz answered that, great, no problem.

But naturally, the people who ask that question, they really want to know: What does this do? What's it good for? How does it help my SEO? How do I write one? So we dug in and we ended up combining three or four pieces of content into one large piece of content, and we get into some pretty rich things. So we have a preview tool that's been popular. We give a code sample. We show how it might look in HTML. It gives it kind of a visual richness. Then we start to get into these sub-questions. Why are title tags important? How do I write a good title tag?
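The kind of code sample that page gives can be illustrated with a small checker: extract a page's title tag and flag likely SERP truncation. This is a hypothetical sketch using Python's standard-library parser, not the Moz preview tool; the ~60-character guideline is a common rule of thumb, not a hard limit:

```python
# Extract a page's <title> and flag it if it risks truncation in SERPs.
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = "<html><head><title>What is a Title Tag? - Moz</title></head></html>"
p = TitleParser()
p.feed(html)
print(p.title)             # What is a Title Tag? - Moz
print(len(p.title) <= 60)  # True: unlikely to be truncated
```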

One page can gain the ability to rank for hundreds of questions and phrases

What's interesting, because I think sometimes people want to split up all the questions because they're afraid that they have to have one question per page, what's interesting is that I think looked the other day, this was ranking in our 40 million keyword set for over 200 phrases, over 200 questions. So it's ranking for things like "what is a title tag," but it's also ranking for things like "how do I write a good title tag." So you don't have to be afraid of that. If this is a rich, solid piece of content that people are going to, you're going to rank for these sub-questions, in many cases, and you're going to get featured snippets for those as well.

Then, when people have gotten through all of this, we can give them something like, "Hey, Moz has some of these tools. You can help write richer title tags. We can check your title tags. Why don't you try a free 30-day trial?" Obviously, we're experimenting with that, and you don't want to push too hard, but this becomes a very rich piece of content. We can answer multiple questions, and you actually have multiple opportunities to get featured snippets.

So I think this inverted pyramid technique is legitimate. I think it can help you write good content that's a win-win. It's good for SEO. It's good for your visitors, and it will hopefully help you land some featured snippets.

So I'd love to hear about what kind of questions you're writing content for, how you can break that up, how you can answer that, and I'd love to discuss that with you. So we'll see you in the comments. Thank you.

Video transcription by Speechpad.com





from The Moz Blog https://ift.tt/2FmsNHB
via IFTTT

Thursday 26 April 2018

Win a Ticket + Lodging to MozCon 2018!

Posted by ErinMcCaul

Have you been wanting to come to MozCon but just can’t swing the budget? Want to take a selfie with Roger, meet like-minded friends at our afterparties, and learn from leading industry experts? I’m thrilled to announce that you can do it all by winning a free ticket to join us at MozCon this July!

Those front-row seats look awfully cushy.

I’m one of the behind-the-scenes house elves who helps make MozCon happen, and I’m here to tell you everything you need to know about entering to win!

To enter, just submit a unique piece of content telling us why we should send you to MozCon by Sunday May 6th at 5pm PDT. Make sure your entry is both original and creative — the Moz staff will review all submissions and vote on the winner! If you’re chosen, we’ll pick up the tab for your registration and accommodations at the Grand Hyatt. You’ll also have a reserved VIP seat in our front row, and an invite to mix and mingle at our pre-event MozCon speakers’ dinner!

Without further ado, here’s the scoop:

Step 1: Create!

Create a unique, compelling piece of content telling us why you want to come to MozCon. Past ideas have included:

  • Drawings
  • Videos (must be one minute or less)
  • Blog posts
  • Original songs
  • Books
  • Slide decks
  • Anything else you can cook up!

Don’t feel limited by these examples. Is this the year we’ll see a Lego Roger stop-motion film, a MozCon-inspired show tune, or Roger-themed sugar cookies? The sky's the limit, my friends! (But think hard about trying your hand at those cookies.)

Step 2: Submit!

Once you're ready to throw your hat in the ring, tweet us a link @Moz and use the hashtag #MozConVIP by Sunday May 6th at 5pm PDT. Make sure to follow the instructions, and include your name and email address somewhere easily visible within your content. To keep things fair, there will be no exceptions to the rules. We need to be able to contact you if you're our lucky winner!

Let’s recap:

  • The submission deadline is Sunday May 6th at 5pm PDT.
  • Mozzers will vote on all the entries based on the creativity and uniqueness of the content.
  • We'll announce the winning entry from @Moz via Twitter on Friday, May 11. You must be able to attend MozCon, July 9–11, 2018, in Seattle. Prizes are non-transferable.
  • All submissions must adhere to the MozCon Code of Conduct.
  • Content is void where prohibited by law.
  • The value of the prize will be reported for tax purposes as required by law; the winner will receive an IRS form 1099 at the end of the calendar year and a copy of such form will be filed with the IRS. The winner is solely responsible for reporting and paying any and all applicable taxes related to the prizes and paying any expenses associated with any prize which are not specifically provided for in the official rules.

Our lucky winner will receive:

  • A free ticket to MozCon 2018, including optional VIP front-row seating and an invitation to our speakers’ dinner (valued at $1,500+)
  • Accommodations with a suite upgrade at the Grand Hyatt from July 8–12, 2018 (valued at $1,300+)

Alright, that’s a wrap. I can’t wait to see what you folks come up with! Happy creating!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog https://ift.tt/2vP6RWc
via IFTTT

Switching to AMP Won’t Negate Other SEO & Google Algo Issues on a Site

AMP is an interesting case. For many site owners, using AMP can result in a lift in traffic, but this is because AMP gets priority placement, such as the carousel placements. It doesn’t specifically get an additional ranking boost outside of the regular mobile-friendly one. But an interesting question came […]

The post Switching to AMP Won’t Negate Other SEO & Google Algo Issues on a Site appeared first on The SEM Post.



from The SEM Post https://ift.tt/2HsiUy8
https://ift.tt/eA8V8J via IFTTT

Why campaign structure is the killer competitive advantage

When pitching to new clients, business is rarely won on the ability to tactically execute. It’s usually won by talking about how we will use data, or how we can deliver better than our competitors with whatever the latest thing might be.

There is nothing wrong with that approach, as you would expect any agency worth their salt to know best practices. However, all too often best practices are overlooked. The most effective change you can make when taking over a new campaign (and even after managing one for a while) is revisiting campaign structure. It may be the most basic detail of all, but it is critically important.

Google rewards relevancy

As you know, Google changed the game when it introduced Quality Score into the bidding equation, and this remains a huge factor today. If you consider that you can’t control the number of advertisers in the auction, and you can’t stop someone from having deeper pockets than you, then how can you beat them? Quality Score.

Quality Score is heavily tied to campaign structure. The way in which keywords in the same ad group relate to one another and follow a common structure is paramount. Creating more ad groups may feel like a lot of extra work when the differences do not seem substantial, but the pay-off is worth it. For example, if you are a retailer putting all shirts in one ad group, consider breaking them down into specific types (sweatshirts, t-shirts, tank tops, etc.). Also, sending keywords to their own specific landing pages increases relevancy, which is rewarded with higher Quality Scores.

Ad copy and ad extensions

Linked with relevancy is the ability to write ad copy that is clearly linked to the keywords in the ad groups. Again, this might seem obvious, but it is something that many don’t take the time to appreciate. Ad copy that is more directly related to your keywords will increase relevancy and consumer response rates (i.e. CTR).

In addition, a number of ad extensions can be impacted by campaign structure. AdWords decides when these show based on two factors:

  • When the extension is expected to improve your performance
  • When your ad’s position and Ad Rank are high enough for extensions to show

Ad Rank has a big ad quality component that substantiates the value of a strong campaign structure.

Campaign structure determines how settings are used

As a result of the way AdWords is set up, certain decisions can only be made at each of the three levels (campaign, ad group, keyword). For example, geo-targeting and budgets can only be controlled at the campaign level, while ad copy is controlled at the ad group level. These levers are critical to success, and campaign structure is how you ensure they can be used effectively. For example, if you have a high-volume keyword in an ad group with lower-volume keywords, the high-volume keyword may suffocate the smaller ones and limit their exposure within the campaign. A key structural decision in this instance might be to break it out into its own campaign so you can more easily control the budget.
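To make the three levels concrete, here is a minimal sketch of how settings attach to each level. This is an illustrative data model only, not the AdWords API; all names and figures are invented:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Keyword:
    text: str
    max_cpc: float          # bids can be set per keyword

@dataclass
class AdGroup:
    name: str
    ad_copy: str            # ad copy is controlled at the ad group level
    keywords: List[Keyword] = field(default_factory=list)

@dataclass
class Campaign:
    name: str
    daily_budget: float     # budget is controlled only at the campaign level
    geo_target: str         # geo-targeting is also campaign-level
    ad_groups: List[AdGroup] = field(default_factory=list)

# Breaking a high-volume keyword out into its own campaign gives it
# its own budget, so it cannot starve lower-volume keywords:
shirts = Campaign("Shirts", daily_budget=50.0, geo_target="US", ad_groups=[
    AdGroup("T-Shirts", "Soft cotton t-shirts", [Keyword("t-shirts", 0.8)]),
])
hero = Campaign("Shirts - Hero Keyword", daily_budget=150.0, geo_target="US", ad_groups=[
    AdGroup("Sweatshirts", "Cozy sweatshirts", [Keyword("sweatshirts", 1.2)]),
])
```

Because the budget lives on the campaign object, the broken-out keyword's spend can be tuned without touching the rest of the account.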

Conclusions: you’re never done

Campaign structure is something that should be revisited again and again over time. The people who are managing a campaign change, consumer behavior changes, websites change, and AdWords evolves its policies. All these factors and many others require that the structure is revisited. I recommend that you have a strategic campaign structure review annually and ensure that it aligns with your performance and strategy.

You will of course have the urge to not spend any time thinking about campaign structure, because it can be tedious and you will assume you did a good job at the outset. However, over the course of time you will add and delete keywords, and update and test ad copy/landing pages. These decisions erode the original intent behind your strategy. Revisiting the structure will ensure that best practices are followed and, even if you don’t make any changes, reaffirm the decisions you have historically made. Campaign structure is the secret weapon that will help you beat your competition–without having to increase bids or your total budget.

 



from SEO – Search Engine Watch https://ift.tt/2KiA66X
via IFTTT

Wednesday 25 April 2018

Google: Major Redesigns & Issues With Search Traffic Loss

Some of the changes made by the site, which already had millions of pages, included a redesign as well as splitting up many millions of pages – including the homepage – into many smaller individual pages.  On average, each of these longer static HTML pages was split into roughly four new pages, a main page […]

The post Google: Major Redesigns & Issues With Search Traffic Loss appeared first on The SEM Post.



from The SEM Post https://ift.tt/2vLr7b9
https://ift.tt/eA8V8J via IFTTT

How We Got a 32% Organic Traffic Boost from 4 On-Page SEO Changes [Case Study]

Posted by WallStreetOasis.com

My name is Patrick Curtis, and I'm the founder and CEO of Wall Street Oasis, an online community focused on careers in finance. Founded in 2006, the site now sees over 2 million visits per month.

User-generated content and long-tail organic traffic is what has built our business and community over the last 12+ years. But what happens if you wake up one day and realize that your growth has suddenly stopped? This is what happened to us back in November 2012.

In this case study, I’ll highlight two of our main SEO problems as a large forum with over 200,000 URLs, then describe two solutions that finally helped us regain our growth trajectory — almost five years later.

Two main problems

1. Algorithm change impacts

Ever since November 2012, Google’s algo changes have seemed to hurt many online forums like ours. Even though our traffic didn’t decline, our growth dropped to the single-digit percentages. No matter what we tried, we couldn’t break through our “plateau of pain” (I call it that because it was a painful ~5 years of trying).

Plateau of pain: no double-digit growth from late 2012 onward

2. Quality of user-generated content

Related to the first problem, 99% of our content is user-generated (UGC), which means the quality is mixed (to put it kindly). Like most forum-based sites, some of our members create incredible pieces of content, but a meaningful percentage of our content is also admittedly thin and/or low-quality.

How could we deal with over 200,000 pieces of content efficiently and try to optimize them without going bankrupt? How could we “clean the cruft” when there was just so much of it?

Fighting back: Two solutions (and one statistical analysis to show how it worked)

1. "Merge and Purge" project

Our goal was to consolidate weaker “children” URLs into stronger “master” URLs to utilize some of the valuable content Google was ignoring and to make the user experience better.

For example, instead of having ~20 discussions on a specific topic (each with an average of around two to three comments) across twelve years, we would consolidate many of those discussions into the strongest two or three URLs (each with around 20–30 comments), leading to a much better user experience with less need to search and jump around the site.

Changes included taking the original post and comments from a “child” URL and merging them into the “master” URL, unpublishing the child URL, removing the child from sitemap, and adding a 301 redirect to the master.
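The bookkeeping for those steps can be sketched as a small script: given a mapping of child URLs to their master URL, emit 301 redirect rules and prune the children from the sitemap. The URL paths and Apache-style rule syntax below are illustrative assumptions, not WSO's actual implementation:

```python
def merge_children(children_to_master, sitemap_urls):
    # One permanent redirect per unpublished child URL.
    redirects = [
        f"Redirect 301 {child} {master}"
        for child, master in children_to_master.items()
    ]
    # Unpublished children must also leave the sitemap.
    pruned_sitemap = [u for u in sitemap_urls if u not in children_to_master]
    return redirects, pruned_sitemap

# Hypothetical example paths:
mapping = {
    "/forums/why-ib-2009": "/forums/why-investment-banking",
    "/forums/why-ib-2011": "/forums/why-investment-banking",
}
rules, sitemap = merge_children(
    mapping,
    ["/forums/why-investment-banking", "/forums/why-ib-2009", "/forums/why-ib-2011"],
)
```

At WSO's scale (tens of thousands of merges), generating the redirect rules programmatically like this is far less error-prone than writing them by hand.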

Below is an example of how it looked when we merged a child into our popular Why Investment Banking discussion. We highlighted the original child post as a Related Topic with a blue border and included the original post date to help avoid confusion:

Highlighting a related topic child post

This was a massive project that involved some complex Excel sorting, but after 18 months and about $50,000 invested (27,418 children merged into 8,515 masters to date), the user experience, site architecture, and organization are much better.

Initial analysis suggests that the percentage gain from merging weak children URLs into stronger masters has given us a boost of ~10–15% in organic search traffic.

2. The Content Optimization Team

The goal of this initiative was to take the top landing pages that already existed on Wall Street Oasis and make sure that they were both higher quality and optimized for SEO. What does that mean, exactly, and how did we execute it?

We needed a dedicated team with some baseline industry knowledge. To that end, we formed a team of five interns from the community, since they were already familiar with the common topics.

We looked at the top ~200 URLs over the previous 90 days (by organic landing page traffic) and listed them out in a spreadsheet:

Spreadsheet of organic traffic to URLs

We held five main hypotheses of what we believed would boost organic traffic before we started this project:

  1. Longer content with subtitles: Increasing the length of the content and adding relevant H2 and H3 subtitles to give the reader more detailed and useful information in an organized fashion.
  2. Changing the H1 so that it matched more high-volume keywords using Moz’s Keyword Explorer.
  3. Changing the URL so that it also was a better match to high-volume and relevant keywords.
  4. Adding a relevant image or graphic to help break up large “walls of text” and enrich the content.
  5. Adding a relevant video similar to the graphic, but also to help increase time on page and enrich the content around the topic.

We tracked all five of these changes across all 200 URLs (see image above). After a statistical analysis, we learned that four of them helped our organic search traffic and one actually hurt.

Summary of results from our statistical analysis

  • Increasing the length of the articles and adding relevant subtitles (H2s, H3s, and H4s) to help organize the content gives an average boost to organic traffic of 14%
  • Improving the title or H1 of the URLs yields a 9% increase on average
  • Changing the URL decreased traffic on average by 38% (this was a smaller sample size — we stopped doing this early on for obvious reasons)
  • Including a relevant video increases the organic traffic by 4% on average, while putting an image up increases it by 5% on average.

Overall, the boost to organic traffic — should we continue to make these four changes (and avoid changing the URL) — is 32% on average.
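The analysis above amounts to averaging the observed traffic lift across URLs that share a given change flag. Here is a minimal sketch of that computation; the row data and lift figures are invented for illustration and are not WSO's actual dataset:

```python
def average_lift(rows, change):
    """Mean traffic lift across tracked URLs where `change` was applied."""
    lifts = [r["lift"] for r in rows if change in r["changes"]]
    return sum(lifts) / len(lifts) if lifts else 0.0

# Hypothetical tracking rows (one per URL in the spreadsheet):
tracked = [
    {"url": "/page-a", "changes": {"longer_content", "new_h1"}, "lift": 0.20},
    {"url": "/page-b", "changes": {"longer_content"}, "lift": 0.08},
    {"url": "/page-c", "changes": {"changed_url"}, "lift": -0.38},
]

print(average_lift(tracked, "longer_content"))  # mean of 0.20 and 0.08
```

A real analysis would also control for URLs receiving multiple changes at once, which is why a proper statistical model (rather than raw averages) was needed to separate the effect of each change.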

Key takeaway:

Over half of that gain (~18%) comes from changes that require a minimal investment of time. For teams trying to optimize on-page SEO across a large number of pages, we recommend focusing on the top landing pages first and easy wins before deciding if further investment is warranted.

We hope this case study of our on-page SEO efforts was interesting, and I’m happy to answer any questions you have in the comments!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog https://ift.tt/2HpCATj
via IFTTT

Tuesday 24 April 2018

Google: Dividing One Page Into Multiple Pages Often Dilutes Value for SEO

Google has long advocated that having strong high quality pages is much better for Google search than many lower quality pages.  But in the quest to get more pages indexed in Google, some site owners are going for quantity over quality, and sometimes sacrificing the parts of the site that are high quality. For some […]

The post Google: Dividing One Page Into Multiple Pages Often Dilutes Value for SEO appeared first on The SEM Post.



from The SEM Post https://ift.tt/2FdonTj
https://ift.tt/eA8V8J via IFTTT

How to select the best caching solution for WordPress

There is no denying that a website succeeds or fails largely on its loading speed: the faster, the better. Forty-seven percent of consumers expect a web page to load in two seconds or less, which is quite a task to accomplish as a new website owner.

Interestingly, even a single second of delay in page response can result in a 7% reduction in conversions, and Google’s algorithms favour fast-loading websites in the form of search engine rankings. With so much at stake with regard to your website’s loading time, the pain is real. So, how does one make sure that WordPress websites are fast to load?

Caching is an efficient solution

Caching creates and keeps a static version of your website and serves it to a visitor when they access your website for the second time or more. It enhances your site’s user experience by swiftly presenting the static version without any delay.

Without caching, that delay occurs because every website element (posts, sliders, headers, CSS files, JavaScript, images, videos, and so on) must be downloaded by the browser on each visit. With caching in place, your website is always ready to deliver a cached, static version quickly.

If you are new to website creation, how do you implement caching on your website? What are the ways and means? Are there tools that can help you do it?

To start, you must test your website’s speed using speed-monitoring tools.

These tools are a great way to figure out anything that might not be going well on your website’s backend when it comes to loading time and similar issues. Since WordPress websites have their share of downtime owing to a number of factors, you cannot afford to be laid back about the performance and WordPress security of your digital property. If you would like to learn about striking a balance between your WordPress site’s security and its performance, you can read more here.

WordPress caching can be broadly divided into two types:

  1. Browser caching: Reducing the load on the server is a great way of optimizing your website’s speed, and that is what browser caching does. It reduces the number of requests per page, so your website loads faster.
  2. Server caching: Used by websites with spiky traffic, server caching stores data on the server itself so that cached pages can be served without being rebuilt on every request.
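On the browser-caching side, the server simply attaches headers telling returning visitors to reuse files from their own browser cache instead of re-requesting them. A minimal sketch of building those headers (the `max-age` value is arbitrary):

```python
import time
from email.utils import formatdate

def cache_headers(max_age_seconds):
    """HTTP response headers instructing the browser to cache a resource."""
    return {
        "Cache-Control": f"public, max-age={max_age_seconds}",
        # Legacy Expires header for older clients: an absolute HTTP-date.
        "Expires": formatdate(time.time() + max_age_seconds, usegmt=True),
    }

headers = cache_headers(86400)   # e.g. cache static assets for one day
```

Caching plugins that modify your `.htaccess` file are effectively configuring Apache to emit headers like these for your static assets.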

The best plugins to incorporate caching onto your WordPress site

You can choose from the following list of plugins to manage caching on your WordPress website.

  1. WP Super Cache

Total number of downloads: 2+ million

One of the best caching plugins in the WordPress repository, WP Super Cache is a great cache management plugin. Generating static HTML files for your WordPress website, the plugin serves cached files in three ways, which are based on speed. It employs methods like Apache mod_rewrite and a modification of your .htaccess file to serve supercached static HTML files.

 

  2. W3 Total Cache

Total number of downloads: 1+ million

Highly recommended by web hosts and developers, this plugin has dominated the WordPress caching market for a number of years. By employing browser caching, it renders pages quickly, which reduces page load time and in turn garners more page views and increased time on site.

A great plugin in itself, W3 Total Cache contributes to improvements in your site’s SEO, offers content delivery network (CDN) integration, and improves the overall user experience on the WordPress site.

  3. WP Fastest Cache

Total number of downloads: 600,000+

The plugin serves the usual caching functions, offers SSL and CDN support, allows cache timeouts for specific pages, and lets you enable or disable caching for mobile devices and logged-in users. Available in over 18 languages, the plugin does not require you to modify the .htaccess file and is pretty simple to set up. However, it does not currently support WordPress Multisite, though the developers are reportedly working towards introducing this. Their premium version also has much more to offer.

  4. Cache Enabler

Total number of downloads: 40,000+

Working to improve the performance of your website, the Cache Enabler plugin offers WordPress Multisite support. Its disk cache engine is efficient and fast, and the plugin is easy to set up. One of its unique features is its ability to create two cached files: a plain HTML version and a gzipped (pre-compressed) version. It also lets you clear the cache either manually or automatically.

  5. Hummingbird Page Speed Optimization

Total number of downloads: 10,000+

A great speed optimization caching plugin by WPMU Dev, the Hummingbird plugin features file compression, minification and full-page, browser and Gravatar caching. It also provides performance reports for your WordPress site so that you can maintain its speed. Its scanning feature keeps a check on files that might be slowing your site and provides probable fixes.

NOTE: While caching is great, you will also need to implement other efforts if you really want to increase your website’s speed. Some of the things that you can easily do are:

  • Invest in a reliable web hosting service and choose a hosting plan that suits the size of your business website.
  • Get a CDN service to serve visitors in various geographic locations without making them wait too long for the server to fetch your site data.
  • Declutter your website’s database, and uninstall plugins and themes that you no longer need.
  • Always use a WordPress theme that has been optimized for speed.

Conclusions

Website speed matters, and caching is one of the easiest ways to achieve a fast-loading site. Since your site’s speed directly affects user experience, the traffic it drives, and your search engine rankings, you should direct your efforts into making sure that your website impresses its visitors with unmatched speed.

Lucy Barret is an experienced Web Developer and passionate blogger, currently working at WPCodingDev. 



from SEO – Search Engine Watch https://ift.tt/2Kc594f
via IFTTT

The SEO Quick Fix: Competitor Keywords, Redirect Chains, and Duplicate Content, Oh My!

Posted by ErinMcCaul

I have an eight-month-old baby. As a mom, my time is at a premium, and I’ve come to appreciate functionalities I didn’t know existed in things I already pay for. My HBONow subscription has Game of Thrones AND Sesame Street? Fantastic! Overnight diapers can save me a trip to the tiny airplane bathroom on a quick flight? Sweet! Oxiclean keeps my towels fluffy and vanquishes baby poop stains? Flip my pancakes!

Moz Pro isn’t just a tool for link building, or keyword research, or on-page SEO, or crawling your site. It does all those things and a little bit more, simplifying your SEO work and saving time. And if you’ve run into an SEO task you’re not sure how to tackle, it’s possible that a tool you need is right here just waiting to be found! It’s in this spirit that we’ve revived our SEO Quick Fix videos. These 2–3 minute Mozzer-led tutorials are meant to help you get the most out of our tools, and offer simple solutions to common SEO problems.

Take Moz Pro for a spin!

Today we’ll focus on a few Keyword Explorer and Site Crawl tips. I hope these knowledge nuggets bring you the joy I experienced the moment I realized my son doesn’t care whether I read him The Name of the Wind or Goodnight Moon.

Let’s dive in!

Fix #1 - Keyword Explorer: Finding keyword suggestions that are questions

Search queries all have intent (“when to give my baby water” was a hot Google search at my house recently). Here’s the good news: Research shows that if you’re already ranking in the top ten positions, providing the best answers to specific questions can earn you a coveted Featured Snippet!

Featured snippet example

In this video, April from our Customer Success Team will show you how to pull a list of keyword phrases that cover the who, what, where, when, why, and how of all the related topics for keywords you’re already ranking for. Here’s the rub. Different questions call for different Featured Snippet formats. For example, “how” and “have” questions tend to result in list-based snippets, while “which” questions often result in tables. When you’re crafting your content, be mindful of the type of question you’re targeting and format accordingly.

Looking for more resources? Once you’ve got your list, check out AJ Ghergich’s article on the Moz Blog for some in-depth insight on formatting and optimizing your snippets. High five!


Fix #2 - Site Crawl: Optimize the content on your site

Sometimes if I find a really good pair of pants, I buy two (I mean, it’s really hard to find good pants). In this case duplicates are good, but the rules of pants don’t always apply to content. Chiaryn is here to teach you how to use Site Crawl to identify duplicate content and titles, and uncover opportunities to help customers and bots find more relevant content on your site.

When reviewing your duplicate content, keep a few things in mind:

  • Does this page provide value to visitors?
  • Title tags are meant to give searchers a taste of what your content is about, and meant to help bots understand and categorize your content. You want your title tags to be relevant and unique to your content.
  • If pages with different content have the same title tag, re-write your tags to make them more relevant to your page content. Use our Title Tag Preview tool to help out.
  • Thin content isn’t always a bad thing, but it’s still a good opportunity to make sure your page is performing as expected — and update it as necessary with meaningful content.
  • Check out Jo Cameron’s post about How to Turn Low-Value Content Into Neatly Organized Opportunities for more snazzy tips on duplicate content and Site Crawl!

Fix #3 - Keyword Explorer: Identify your competitors’ top keywords

Cozily nestled under a few clicks, Keyword Explorer holds the keys to a competitive research sweet spot. By isolating the ranking keywords you have in common with your competitors, you can pinpoint their weak spots and discover keywords that are low-hanging fruit — phrases you have the content and authority to rank for that, with a little attention, could do even better. In this video, Janisha shows you how targeting a competitor’s low-ranking keywords can earn you a top spot in the SERPs.

Finding competitors' keywords: A Venn diagram

Check out all that overlapped opportunity!

For a few more tips along this line, check out Hayley Sherman’s post, How to Use Keyword Explorer to Identify Competitive Keyword Opportunities.


Fix #4 - Site Crawl: Identify and fix redirect chains

Redirects are a handy way to get a visitor from a page they try to land on, to the page you want them to land on. Redirect chains, however, are redirects gone wrong. They look something like this: URL A redirects to URL B, URL B redirects to URL C… and so on and so forth.

These redirect chains can negatively impact your rankings, slow your site load times, and make it hard for crawlers to properly index your site.

Meghan from our Help team is here to show you how to find redirect chains, understand where they currently exist, and help you cut a few of those pesky middle redirects.
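Cutting the middle redirects amounts to collapsing each chain: if A points to B and B points to C, rewrite A to point straight at C. A minimal sketch of that flattening over a redirect map (the URLs are hypothetical):

```python
def flatten_redirects(redirects):
    """Point every redirect source directly at its final destination."""
    def final(url, seen=()):
        # Stop at the chain's end, or bail out if we detect a loop.
        if url not in redirects or url in seen:
            return url
        return final(redirects[url], seen + (url,))
    return {src: final(dst) for src, dst in redirects.items()}

chain = {"/a": "/b", "/b": "/c", "/c": "/d"}
flat = flatten_redirects(chain)   # every source now points straight at /d
```

Applying the flattened map to your server config means each visitor (and each crawler) makes at most one redirect hop.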

Looking for a few other redirect resources? I’ve got you covered:


Alright friends, that’s a wrap! Like the end of The Last Jedi, you might not be ready for this post to be over. Fear not! Our blog editor liked my jokes so much that she's promised to harp on me to write more blog posts. So, I need your help! Find yourself facing an SEO snafu that doesn’t seem to have a straightforward fix? Let me know in the comments. I might know a Moz tool that can help, and you might inspire another Quick Fix post!

Get a free month of Moz Pro

If you’re still interested in checking out more solutions, here’s a list of some of my favorite resources:

Stay cool!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog https://ift.tt/2HURJsS
via IFTTT

Monday 23 April 2018

ROPO: 2018's Most Important Multichannel Digital Marketing Report

Posted by RobBeirne

Digital marketers have always had one drum they loudly beat in front of traditional advertising channels: "We can measure what we do better than you." Now, we weren't embellishing the truth or anything — we can measure digital advertising performance at a much more granular level than we can traditional advertising. But it's not perfect. Multichannel digital marketing teams always have one niggling thought that keeps them awake at night: online activity is driving in-store sales and we can't claim any credit for it.

Offline sales are happening. Sure enough, we're seeing online shopping become more and more popular, but even so, you’ll never see 100% of your sales being made online if you're a multichannel retailer. Whether it’s a dress that needs to be tried on or a TV you want to measure up before you buy, in-store purchases are going nowhere. But it's more important than ever to make sure you don't underestimate the impact your online advertising has on offline sales.

ROPO (Research Online, Purchase Offline) has plagued multichannel retailers for years: awareness and hot leads are generated online, but the customers convert in-store.

There is one other problem hampering many multichannel businesses: viewing their online store as "just another store" and, in many cases, the store managers themselves considering the website to be a competitor.

In this article, I'll show you how we've improvised to create a ROPO report for DID Electrical, an Irish electrical and home appliance multichannel retailer, to provide greater insight into their customers' multichannel journey and how this affected their business.

What is ROPO reporting?

Offline conversions are a massive blind spot for digital marketers. It's the same as someone else taking credit for your work: your online ads are definitely influencing shoppers who complete their purchase offline, but we can't prove it. Or at least we couldn't prove it — until now.

ROPO reporting (Research Online Purchase Offline) allows multichannel retailers to see what volume of in-store sales have been influenced by online ads. Facebook has trail-blazed in this area of reporting, leaving Google in their wake and scrambling to keep up. I know this well, because I work on Wolfgang's PPC team and gaze enviously at the ROPO reporting abilities of our social team. Working with DID, we created a robust way to measure the offline value of both PPC and SEO activity online.

To create a ROPO report, multichannel retailers must have a digital touch point in-store. This isn't as complicated as it sounds and can be something like an e-receipt or warranty system where you email customers. This gives you the customer data that you'll need to match offline conversions with your online advertising activity.

As I mentioned earlier, Facebook makes this nice and simple. You take the data gathered in-store, upload it to Facebook, and they will match as many people as possible. Our social team is generally seeing a 50% match rate between the data gathered in-store and Facebook users who've seen our ads. You can watch two of my colleagues, Alan and Roisin, discussing social ROPO reporting in an episode of our new video series, Wolfgang Bites:
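Facebook's matching happens inside its own systems, but the idea behind a match rate is simple enough to sketch: normalize the emails gathered in-store and compare them against the set of emails attached to ad-exposed accounts. This is an illustrative approximation only (real uploads are hashed, and matching uses more signals than email); all addresses below are invented:

```python
def normalize(email):
    # Trim whitespace and lowercase so trivially different forms match.
    return email.strip().lower()

def match_rate(in_store_emails, ad_platform_emails):
    """Share of in-store customers found among ad-exposed accounts."""
    store = {normalize(e) for e in in_store_emails}
    platform = {normalize(e) for e in ad_platform_emails}
    if not store:
        return 0.0
    return len(store & platform) / len(store)

rate = match_rate(
    ["Ann@example.com", "bob@example.com ", "carl@example.com", "dee@example.com"],
    ["ann@example.com", "bob@example.com"],
)
print(rate)   # → 0.5, i.e. a 50% match rate
```

A 50% match rate like the one our social team sees means the true ROPO effect is at least double what the matched sales alone show.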

Clearly, ROPO reporting is potentially very powerful for social media marketers, but Google doesn't yet provide a way for me to simply upload offline conversion data and match it against people who've seen my ads (though they have said that this is coming for Google AdWords). Wouldn't this be a really boring article for people working in SEO and PPC if I just ended things there?

Google ROPO reporting

DID Electrical were a perfect business to develop a ROPO report for. Founded back in 1968 (happy 50th birthday guys!), a year before tech was advanced enough to put man on the moon, DID strives to "understand the needs of each and every one of their customers." DID have an innovative approach to multichannel retail, which is great for ROPO reporting because they're already offering e-receipts to customers purchasing goods for over €100. Better still, the email delivering the e-receipts also has a link to a dedicated competition. This sits on a hidden landing page, so the only visitors to this page are customers receiving e-receipts.

They were nearly set for ROPO reporting, but there was just one extra step needed. In Google Analytics, we set the unique competition landing page URL as a goal, allowing us to reverse-engineer customer journeys and uncover the extent of Google PPC and SEO's influence over in-store sales. Before I unveil the results, a few caveats.

The ROPO under-report

Despite our best efforts to track offline conversions, I can't say ROPO reporting reflects 100% of all in-store sales influenced by digital ads. In the past, we've been open about the difficulties in tracking both offline conversions and cross-device conversions. For example, when running a social ROPO report, customers might give a different email in-store from the one attached to their Facebook account. For an SEO or PPC ROPO report, the customer might click a search ad on a work computer but then open their e-receipt on their smartphone. Unfortunately, due to the nature of the beast, ROPO reporting just isn't 100% accurate, but it does give an incredible indication of online's influence over offline sales.

I expect to see improved reporting coming down the line from Google, and they're definitely working on a ROPO reporting solution like Facebook's upload system. While our approach to ROPO reporting does shine a light on the offline conversion blind spot, it's entirely likely that digital advertising's influence goes far beyond these (still mightily impressive) results.

It’s also important to note that this method isn’t intended to give an exact figure for every ROPO sale, but instead gives us an excellent idea of the proportion of offline sales impacted by our online activities. By applying these proportions to overall business figures, we can work out a robust estimate for metrics like offline ROI.
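Applying the proportions works out as simple arithmetic. Here is a sketch of an offline ROI estimate; every figure below is invented for illustration and is not DID's data:

```python
def estimated_offline_roi(total_offline_revenue, influenced_share,
                          margin, ad_spend):
    """Rough offline ROI: profit on ad-influenced offline sales vs. spend."""
    influenced_revenue = total_offline_revenue * influenced_share
    profit = influenced_revenue * margin
    return (profit - ad_spend) / ad_spend

# e.g. if 1 in 8 offline sales were influenced by PPC clicks:
roi = estimated_offline_roi(
    total_offline_revenue=1_000_000,   # hypothetical in-store revenue
    influenced_share=1 / 8,
    margin=0.25,                       # hypothetical gross margin
    ad_spend=20_000,                   # hypothetical PPC spend
)
print(round(roi, 4))   # → 0.5625
```

Because the match rate undercounts (see the caveats above), an estimate built this way is a floor, not a ceiling, on online's real contribution.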

Results from ROPO reporting

I'm going to divide the results of this ROPO reporting innovation into three sections:

  1. PPC Results
  2. SEO Results
  3. Business Results

1. PPC results of ROPO reporting

First of all, we found 47% of offline customers had visited the DID Electrical website prior to visiting the store and making a purchase. This alone was an incredible insight into consumer behavior to offer the team at DID. We went even further and determined that 1 in 8 measurable offline sales were influenced by an AdWords click.

2. SEO results of ROPO reporting

This method of ROPO reporting also means we can check the value of an organic click-through using the same reverse-engineering we used for PPC clicks. Based on the same data set, we discovered 1 in 5 purchases made in-store were made by customers who visited the DID site through an organic click prior to visiting the store.

3. Business results of ROPO reporting

ROPO reporting proved to be a great solution to DID's need for clarity around the website's position in the multichannel experience. With at least 47% of offline shoppers visiting the site before purchasing, 1 in 8 offline sales influenced by AdWords, and 1 in 5 by SEO, DID could now show the impact online was having on in-store sales. Internally, the website is no longer viewed as just another store, but as the hub linking everything together for an improved customer experience.

Following the deeper understanding of multichannel retail offered by ROPO reporting, DID was also able to rebalance their budget allocations between digital and traditional channels more efficiently. These insights have enabled them to justify moving more of their marketing budget online: digital will make up 50% more of their overall marketing budget in 2018!

Getting started with ROPO reporting

If you're a digital marketer within a multichannel retailer and you want to get started with ROPO reporting, the key factor is your in-store digital touchpoint. This is the bridge between your online advertising and offline conversion data. If you're not offering e-receipts already, now is the time to start considering them as they played a critical role in DID’s ROPO strategy.

ROPO Cheat Guide (or quick reference)

If you're a multichannel retailer and this all sounds tantalizing, here’s the customer journey which ROPO measures:

  1. Customer researches online using your website
  2. Customer makes purchase in your brick-and-mortar store
  3. Customer agrees to receive an e-receipt or warranty delivered to their email address
  4. Customer clicks a competition link in the communication they receive
  5. This action is captured in your Google Analytics as a custom goal completion
  6. You can now calculate ROAS (Return On Ad Spend)

The two critical steps here are the digital touchpoint in your physical stores and the incentive for the customer's post-conversion communication click. Once you have this touchpoint and interaction, measuring Facebook's social ROPO is a simple file upload, and using what I've shown you above, you'll be able to measure the ROPO impact of PPC and SEO too.
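The measurement step in that journey can be sketched in a few lines. The records below are invented for illustration: each one stands for a measurable in-store sale (an e-receipt holder who later clicked the competition link), together with the online channels that the customer's Google Analytics history shows they touched before buying.

```python
# Made-up sale records for illustration -- in practice these come from
# reverse-engineering journeys behind the hidden competition-page goal.
offline_sales = [
    {"receipt_id": 1, "channels": {"paid_search"}},
    {"receipt_id": 2, "channels": {"organic"}},
    {"receipt_id": 3, "channels": {"paid_search", "organic"}},
    {"receipt_id": 4, "channels": set()},  # in-store sale, no tracked online visit
    {"receipt_id": 5, "channels": {"organic"}},
]

def influenced_share(sales, channel):
    """Fraction of measurable offline sales touched by a given online channel."""
    influenced = sum(1 for sale in sales if channel in sale["channels"])
    return influenced / len(sales)

print(f"PPC-influenced: {influenced_share(offline_sales, 'paid_search'):.0%}")
print(f"SEO-influenced: {influenced_share(offline_sales, 'organic'):.0%}")
```

These per-channel shares are the same kind of proportions reported above (the 47%, 1 in 8, and 1 in 5 figures), just computed on a toy data set.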

If you do have any questions, pop them into the comments below. I have some questions too and it would be great to hear what you all think:

  • If you're a multichannel retailer, are you in a position to start ROPO reporting?
  • Does your company view your website as a hub for all stores or just another store (or even a competitor to the physical stores)?
  • Have you seen a shift in marketing spend towards digital?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog https://ift.tt/2qRvs7P
via IFTTT

Friday 20 April 2018

How to force Google to recrawl your website

If you have launched a new website, updated a single page on your existing domain, or altered many pages and/or the structure of your site, you will likely want Google to display your latest content in its SERPs.

Google’s crawlers are pretty good at their job. Take a new article on a big domain, for example: the search engine will crawl and index it quickly, thanks to the natural traffic and links from around the web which alert its algorithms to the new content.

For most sites, however, it is good practice to give Google a little assistance with its indexing job.

Each of the following official Google tools can achieve this, and each is more or less suitable depending on whether you want Google to recrawl a single page or a larger part of your site.

It is also important to note two things before we start:

  1. None of these tools can force Google to start indexing your site immediately. You do have to be patient.
  2. Quicker and more comprehensive indexing of your site will occur if your content is fresh, original, useful, easy to navigate, and being linked to from elsewhere on the web. These tools can’t guarantee Google will deem your site indexable. And they shouldn’t be used as an alternative to publishing content which is adding value to the internet ecosystem.

Fetch as Google

Google’s Fetch tool is the most logical starting point for getting your great new site or content indexed.

First, you need a Google account in order to have a Google Webmaster account. From there you will be prompted to ‘add a property’, which you will then have to verify. This is all very straightforward if you have not yet done it.

Once you have the relevant property listed in your Webmaster Tools account, you can then ‘perform a fetch’ on any URL related to that property. If your site is fetchable (you can also check if it is displaying correctly) you can then submit for it to be added to Google’s index.

This tool allows you to submit either a single URL (‘Crawl only this URL’) or the selected URL plus any pages it links to directly (‘Crawl this URL and its direct links’). Both of these requests come with their own limits: 10 for the former option and 2 for the latter.

Add URL

You might also have heard of Google’s Add URL tool.


Think of this as a slower, simpler version of the Fetch tool above, without its functionality and versatility. But it still exists, so it seems it is still worth adding your URL to if you can.

You can also use Add URL with just a Google account, rather than adding and verifying a property to Webmaster Tools. Simply add your URL and click to assure the service you aren’t a robot!

Add a Sitemap

If you have amended many pages on a domain or changed the structure of the site, adding a sitemap to Google is the best option.

Like Fetch As Google, you need to add a sitemap via the Webmaster search console.

[See our post Sitemaps & SEO: An introductory guide if you are in the dark about what sitemaps are].

Once you have generated or built a sitemap: on Webmaster Tools, select the domain on which it appears, select ‘crawl’/’sitemaps’/’add/test sitemap’, type in its URL (typically the domain URL appended with /sitemap.xml) and ‘submit’.
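If you don't already have a sitemap to submit, generating a minimal one is straightforward. Here's a sketch using only the Python standard library; the example.com URLs are placeholders for your own pages, and the output follows the sitemaps.org protocol that Google expects:

```python
# Minimal sitemap generator -- URLs below are hypothetical placeholders.
from datetime import date
from xml.sax.saxutils import escape

urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/blog/latest-post/",
]

def build_sitemap(urls):
    """Return a minimal sitemap.xml string per the sitemaps.org protocol."""
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n"
        f"    <lastmod>{date.today():%Y-%m-%d}</lastmod>\n  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>\n"
    )

# Write the file so it is reachable at yourdomain.com/sitemap.xml.
with open("sitemap.xml", "w") as f:
    f.write(build_sitemap(urls))
```

Upload the resulting file to your web root, then submit its URL via the console as described above.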

As I pointed out in the introduction to this post…

Google is pretty good at crawling and indexing the web but giving the spiders as much assistance with their job as possible makes for quicker and cleaner SEO.

Simply having your property added to Webmaster Tools, running Google Analytics, and then using the above tools form the foundation for getting your site noticed by the search giant.

But good, useful content on a well-designed, usable site is what really gets you visible: to Google and, most importantly, to humans.




from SEO – Search Engine Watch https://ift.tt/2HBqMwn
via IFTTT
