Innovate not imitate!

Interested in the latest Growth hacks?

Welcome to our blog!

We want to help you start, manage, and grow your business using innovative strategies and implementation. We have a passion for helping businesses and companies of various sizes see the same success that we have achieved.

Our skillsets are wide and varied, spanning business strategy, marketing, and online strategy. An increasing number of companies are turning to the internet and online media to maximise their marketing reach and exposure. This is a special area of focus for us, and we do more than simple SEO strategies.

See our website for more: www.innovatetoaccelerate.com

Wednesday 31 May 2017

Use Fetch & Render for Most Accurate Google View of Webpage

A question came up on Twitter regarding how Googlebot sees a page and whether the fetch and render within Google Search Console is the most accurate or if using the cache view from within the Google search results is more accurate. John Mueller answered, and he said that fetch and render will be the closest.  […]

The post Use Fetch & Render for Most Accurate Google View of Webpage appeared first on The SEM Post.



from The SEM Post http://ift.tt/2smihdD via IFTTT

Optimizing AngularJS Single-Page Applications for Googlebot Crawlers

Posted by jrridley

It’s almost certain that you’ve encountered AngularJS on the web somewhere, even if you weren’t aware of it at the time. Here’s a list of just a few sites using Angular:

  • Upwork.com
  • Freelancer.com
  • Udemy.com
  • Youtube.com

Any of those look familiar? If so, it’s because AngularJS is taking over the Internet. There’s a good reason for that: AngularJS, ReactJS, and similar frameworks make for a better user and developer experience on a site. For background, AngularJS and ReactJS are part of a web design movement called single-page applications, or SPAs. While a traditional website loads each individual page as the user navigates the site, including calls to the server and cache, loading resources, and rendering the page, SPAs cut out much of the back-end activity by loading the entire site when a user first lands on a page. Instead of loading a new page each time you click on a link, the site dynamically updates a single HTML page as the user interacts with the site.

[Image c/o Microsoft]

Why is this movement taking over the Internet? With SPAs, users are treated to a screaming fast site through which they can navigate almost instantaneously, while developers have a template that allows them to customize, test, and optimize pages seamlessly and efficiently. AngularJS and ReactJS use advanced JavaScript templates to render the site, which means the HTML/CSS page speed overhead is almost nothing. All site activity runs behind the scenes, out of view of the user.

Unfortunately, anyone who’s tried performing SEO on an Angular or React site knows that the site activity is hidden from more than just site visitors: it’s also hidden from web crawlers. Crawlers like Googlebot rely heavily on HTML/CSS data to render and interpret the content on a site. When that HTML content is hidden behind website scripts, crawlers have no website content to index and serve in search results.

Of course, Google claims they can crawl JavaScript (and SEOs have tested and supported this claim), but even if that is true, Googlebot still struggles to crawl sites built on a SPA framework. One of the first issues we encountered when a client approached us with an Angular site was that nothing beyond the homepage was appearing in the SERPs. ScreamingFrog crawls uncovered the homepage and a handful of other JavaScript resources, and that was it.

[Screenshot: ScreamingFrog crawl returning only the homepage and a few JavaScript resources]

Another common issue is recording Google Analytics data. Think about it: Analytics data is tracked by recording pageviews every time a user navigates to a page. How can you track site analytics when there’s no HTML response to trigger a pageview?

After working with several clients on their SPA websites, we’ve developed a process for performing SEO on those sites. By using this process, we’ve not only enabled SPA sites to be indexed by search engines, but even to rank on the first page for keywords.

5-step solution to SEO for AngularJS

  1. Make a list of all pages on the site
  2. Install Prerender
  3. “Fetch as Google”
  4. Configure Analytics
  5. Recrawl the site

1) Make a list of all pages on your site

If this sounds like a long and tedious process, that’s because it definitely can be. For some sites, this will be as easy as exporting the XML sitemap for the site. For other sites, especially those with hundreds or thousands of pages, creating a comprehensive list of all the pages on the site can take hours or days. However, I cannot emphasize enough how helpful this step has been for us. Having an index of all pages on the site gives you a guide to reference and consult as you work on getting your site indexed. It’s almost impossible to predict every issue that you’re going to encounter with an SPA, and if you don’t have an all-inclusive list of content to reference throughout your SEO optimization, it’s highly likely you’ll leave some part of the site un-indexed by search engines inadvertently.

One solution that might enable you to streamline this process is to divide content into directories instead of individual pages. For example, if you know that you have a list of storeroom pages, include your /storeroom/ directory and make a note of how many pages that includes. Or if you have an e-commerce site, make a note of how many products you have in each shopping category and compile your list that way (though if you have an e-commerce site, I hope for your own sake you have a master list of products somewhere). Regardless of what you do to make this step less time-consuming, make sure you have a full list before continuing to step 2.
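If your site exposes an XML sitemap, that's often the quickest seed for this list. Here's a minimal Node.js sketch of pulling the URLs out of one; the sitemap address is a placeholder, and a sitemap index file would need one extra level of fetching:

```javascript
// Minimal sketch: extract page URLs from an XML sitemap (Node.js 18+, built-in fetch).
// 'https://www.example.com/sitemap.xml' is a placeholder; swap in your own sitemap URL.
const SITEMAP_URL = 'https://www.example.com/sitemap.xml';

async function listSitemapUrls(sitemapUrl) {
  const response = await fetch(sitemapUrl);
  const xml = await response.text();

  // Grab every <loc>...</loc> value; good enough for a flat sitemap,
  // but a sitemap index (<sitemapindex>) would need one more round of fetching.
  return [...xml.matchAll(/<loc>\s*([^<]+?)\s*<\/loc>/g)].map(m => m[1]);
}

listSitemapUrls(SITEMAP_URL)
  .then(urls => {
    console.log(`Found ${urls.length} URLs`);
    urls.forEach(url => console.log(url));
  })
  .catch(err => console.error('Could not read sitemap:', err));
```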

2) Install Prerender

Prerender is going to be your best friend when performing SEO for SPAs. Prerender is a service that will render your website in a virtual browser, then serve the static HTML content to web crawlers. From an SEO standpoint, this is as good a solution as you can hope for: users still get the fast, dynamic SPA experience while search engine crawlers can identify indexable content for search results.

Prerender’s pricing varies based on the size of your site and the freshness of the cache served to Google. Smaller sites (up to 250 pages) can use Prerender for free, while larger sites (or sites that update constantly) may need to pay as much as $200+/month. However, having an indexable version of your site that enables you to attract customers through organic search is invaluable. This is where that list you compiled in step 1 comes into play: if you can prioritize what sections of your site need to be served to search engines, or with what frequency, you may be able to save a little bit of money each month while still achieving SEO progress.
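For a sense of what "installing Prerender" can look like in practice, here's a rough sketch for a Node/Express stack using the prerender-node middleware; the token is a placeholder, and other stacks typically use an equivalent nginx or Apache rewrite rule instead:

```javascript
// Sketch of serving prerendered HTML to crawlers with Express + prerender-node.
// Assumes a prerender.io account; 'YOUR_PRERENDER_TOKEN' is a placeholder.
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// The middleware checks the user agent (Googlebot, Bingbot, etc.) and, for crawlers,
// proxies the request to the Prerender service, which returns static rendered HTML.
// Regular visitors fall through to the normal SPA below.
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

// Everything else serves the Angular app shell as usual.
app.use(express.static('dist'));
app.get('*', (req, res) => res.sendFile(`${__dirname}/dist/index.html`));

app.listen(3000, () => console.log('SPA with prerendering on port 3000'));
```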

3) "Fetch as Google"

Within Google Search Console is an incredibly useful feature called “Fetch as Google.” “Fetch as Google” allows you to enter a URL from your site and fetch it as Googlebot would during a crawl. “Fetch” returns the HTTP response from the page, which includes a full download of the page source code as Googlebot sees it. “Fetch and Render” will return the HTTP response and will also provide a screenshot of the page as Googlebot saw it and as a site visitor would see it.

This has powerful applications for AngularJS sites. Even with Prerender installed, you may find that Google is still only partially displaying your website, or it may be omitting key features of your site that are helpful to users. Plugging the URL into “Fetch as Google” will let you review how your site appears to search engines and what further steps you may need to take to optimize your keyword rankings. Additionally, after requesting a “Fetch” or “Fetch and Render,” you have the option to “Request Indexing” for that page, which can be a handy catalyst for getting your site to appear in search results.

4) Configure Google Analytics (or Google Tag Manager)

As I mentioned above, SPAs can have serious trouble with recording Google Analytics data since they don’t track pageviews the way a standard website does. Instead of the traditional Google Analytics tracking code, you’ll need to install Analytics through some kind of alternative method.

One method that works well is to use the Angulartics plugin. Angulartics replaces standard pageview events with virtual pageview tracking, which tracks the entire user navigation across your application. Since SPAs dynamically load HTML content, these virtual pageviews are recorded based on user interactions with the site, which ultimately tracks the same user behavior as you would through traditional Analytics. Other people have found success using Google Tag Manager “History Change” triggers or other innovative methods, which are perfectly acceptable implementations. As long as your Google Analytics tracking records user interactions instead of conventional pageviews, your Analytics configuration should suffice.
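As a rough sketch of the Angulartics route on an AngularJS 1.x app (the app name here is illustrative; check the plugin's current docs for the exact setup):

```javascript
// Sketch: virtual pageview tracking in an AngularJS 1.x app via the Angulartics plugin.
// Assumes angular.js, angulartics.js and angulartics-google-analytics.js are loaded,
// along with the standard analytics.js snippet *minus* its automatic
// ga('send', 'pageview') call - Angulartics fires the pageview itself on route changes.
angular.module('myApp', [
  'ngRoute',
  'angulartics',                  // core virtual-pageview plumbing
  'angulartics.google.analytics'  // adapter that forwards hits to ga()
]);
// From here, every route change is recorded as a pageview in Google Analytics,
// so reports reflect user navigation even though only one HTML page is ever loaded.
```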

5) Recrawl the site

After working through steps 1–4, you’re going to want to crawl the site yourself to find those errors that not even Googlebot was anticipating. One issue we discovered early with a client was that after installing Prerender, our crawlers were still running into a spider trap:

As you can probably tell, there were not actually 150,000 pages on that particular site. Our crawlers just found a recursive loop that kept generating longer and longer URL strings for the site content. This is something we would not have found in Google Search Console or Analytics. SPAs are notorious for causing tedious, inexplicable issues that you’ll only uncover by crawling the site yourself. Even if you follow the steps above and take as many precautions as possible, I can still almost guarantee you will come across a unique issue that can only be diagnosed through a crawl.

If you’ve come across any of these unique issues, let me know in the comments! I’d love to hear what other issues people have encountered with SPAs.

Results

As I mentioned earlier in the article, the process outlined above has enabled us to not only get client sites indexed, but even to get those sites ranking on the first page for various keywords. Here’s an example of the keyword progress we made for one client with an AngularJS site:

Also, the organic traffic growth for that client over the course of seven months:

All of this goes to show that although SEO for SPAs can be tedious, laborious, and troublesome, it is not impossible. Follow the steps above, and you can have SEO success with your single-page app website.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog http://ift.tt/2qyzWlH
via IFTTT

Tuesday 30 May 2017

Google Updates Quality Rater Guidelines for YMYL & Offensive Results

Google has made another update to the Google Quality Rater Guidelines, coming only a couple of months after the last release, which saw major changes made to target fake news, science denial and clickbait, among other things. This update sees Google bring added clarification to specific areas of the guidelines and other minor changes.  The […]

The post Google Updates Quality Rater Guidelines for YMYL & Offensive Results appeared first on The SEM Post.



from The SEM Post http://ift.tt/2r8kpaP via IFTTT

No, Paid Search Audiences Won’t Replace Keywords

Posted by PPCKirk

I have been chewing on a keyword vs. audience targeting post for roughly two years now. In that time we have seen audience targeting grow in popularity (as expected) and depth.

“Popularity” is somewhat of an understatement here. I would go so far as to say that I've heard it lauded in messianic-like “thy kingdom come, thy will be done” reverential awe by some paid search marketers, as if paid search were lacking a heartbeat before the life-giving audience targeting had arrived and 1-2-3-clear’ed it into relevance.

However, I would argue that despite audience targeting’s popularity (and understandable success), we have also seen some of its weaknesses revealed. It turns out it’s not quite the heroic, rescue-the-captives targeting method paid searchers had hoped it would be.

The purpose of this post is to argue against the notion that audience targeting can replace the keyword in paid search.

Now, before we get into the throes of keyword philosophy, I’d like to reduce the number of angry comments this post receives by acknowledging a crucial point.

It is not my intention in any way to set up a false dichotomy. Yes, I believe the keyword is still the most valuable form of targeting for a paid search marketer, but I also believe that audience targeting can play a valuable complementary role in search bidding.

In fact, as I think about it, I would argue that I am writing this post in response to what I have heard become a false dichotomy. That is, that audience targeting is better than keyword targeting and will eventually replace it.

I disagree with this idea vehemently, as I will demonstrate in the rest of this article.

One seasoned (age, not steak) traditional marketer’s point of view

The best illustration I've heard on the core weakness of audience targeting was from an older traditional marketer who has probably never accessed the Keyword Planner in his life.

“I have two teenage daughters,” he revealed, with no small amount of pride.

“They are within 18 months of each other, so in age demographic targeting they are the same person.”

“They are both young women, so in gender demographic targeting they are the same person.”

“They are both my daughters in my care, so in income demographic targeting they are the same person.”

“They are both living in my house, so in geographical targeting they are the same person.”

“They share the same friends, so in social targeting they are the same person.”

“However, in terms of personality, they couldn’t be more different. One is artistic and enjoys heels and dresses and makeup. The other loves the outdoors and sports, and spends her time in blue jeans and sneakers.”

If an audience-targeting marketer selling spring dresses saw them in his marketing list, he would (1) see two older high school girls with the same income in the same geographical area, (2) assume they are both interested in what he has to sell, and (3) only make one sale.

The problem isn’t with his targeting; the problem is that not everyone forced into an audience persona box will fit.

In September of 2015, Aaron Levy (a brilliant marketing mind; go follow him) wrote a fabulously under-shared post revealing these weaknesses in another way: What You Think You Know About Your Customers’ Persona is Wrong

In this article, Aaron first bravely broaches the subject of audience targeting by describing how it is far from the exact science we all have hoped it to be. He noted a few ways that audience targeting can be erroneous, and even *gasp* used data to formulate his conclusions.

It’s OK to question audience targeting — really!

Let me be clear: I believe audience targeting is popular because there genuinely is value in it (it's amazing data to have… when it's accurate!). The insights we can get about personas, which we can then use to power our ads, are quite amazing and powerful.

So, why the heck am I droning on about audience targeting weaknesses? Well, I’m trying to set you up for something. I’m trying to get us to admit that audience targeting itself has some weaknesses and isn’t the savior of all digital marketing that some make it out to be, and that there is a tried-and-true form of targeting that fits well with demographic targeting but is not replaced by it. It is a targeting method that we paid searchers have used joyfully and successfully for years now.

It is the keyword.

Whereas audience targeting chafes under the law of averages (i.e., “at some point, someone in my demographically targeted list has to actually be interested in what I am selling”), keyword targeting shines by revealing individual user intent.

Keyword targeting does something an audience can never, ever, ever do...

Keywords: Personal intent powerhouses

A keyword is still my favorite form of targeting in paid search because it reveals individual, personal, and temporal intent. Those aren’t just three buzzwords I pulled out of the air because I needed to stretch this already obesely-long post out further. They are intentional, and worth exploring.

Individual

A keyword is such a powerful targeting method because it is written (or spoken!) by a single person. I mean, let’s be honest, it’s rare to have more than one person huddled around the computer shouting at it. Keywords are generally from the mind of one individual, and because of that they have frightening potential.

Remember, audience targeting is based on assumptions. That is, you're taking a group of people who “probably” think the same way in a certain area, but does that mean they cannot have unique tastes? For instance, one person preferring to buy sneakers and another preferring to buy heels?

Keyword targeting is demographic-blind.

It doesn’t care who you are, where you’re from, what you did, as long as you love me… err, I mean, it doesn’t care about your demographic, just about what you're individually interested in.

Personal

The next aspect of keywords powering their targeting awesomeness is that they reveal personal intent. Whereas the “individual” aspect of keyword targeting narrows our targeting from a group of people to a single person, the “personal” aspect of keyword targeting goes into the very mind of that individual.

Don’t you wish there was a way to market to people in which you could truly discern the intentions of their hearts? Wouldn’t that be a powerful method of targeting? Well, yes — and that is keyword targeting!

Think about it: a keyword is a form of communication. It is a person typing or telling you what is on their mind. For a split second, in their search, you and they are as connected through communication as Alexander Graham Bell and Thomas Watson on the first phone call. That person is revealing to you what's on her mind, and that's a power which cannot be underestimated.

When a person tells Google they want to know “how does someone earn a black belt,” that is telling your client — the Jumping Judo Janes of Jordan — this person genuinely wants to learn more about their services and they can display an ad that matches that intent (Ready for that Black Belt? It’s Not Hard, Let Us Help!). Paid search keywords officiate the wedding of personal intent with advertising in a way that previous marketers could only dream of. We aren’t finding random people we think might be interested based upon where they live. We are responding to a person telling us they are interested.

Temporal

The final aspect of keyword targeting that cannot be underestimated is the temporal one. Anyone worth their salt in marketing can tell you “timing is everything”. With keyword targeting, the timing is inseparable from the intent. When is this person interested in learning about your Judo classes? At the time they are searching, NOW!

You are not blasting your ads into your users’ lives, interrupting them as they go about their business or family time, hoping to jumpstart their interest by distracting them from their activities. You are responding to their query, at the very time they are interested in learning more.

Timing. Is. Everything.

The situation settles into stickiness

Thus, to summarize: a “search” is done when an individual reveals his/her personal intent with communication (keywords/queries) at a specific time. Because of that, I maintain that keyword targeting trumps audience targeting in paid search.

Paid search is an evolving industry, but it is still “search,” which requires communication, which requires words (until that time when the emoji takes over the English language, but that’s okay because the rioting in the streets will have gotten us first).

Of course, we would be remiss in ignoring some legitimate questions which inevitably arise. As ideal as the outline I've laid out before you sounds, you're probably beginning to formulate something like the following four questions.

  • What about low search volume keywords?
  • What if the search engines kill keyword targeting?
  • What if IoT monsters kill search engines?
  • What about social ads?

We’ll close by discussing each of these four questions.

Low search volume terms (LSVs)

Low search volume keywords stink like poo (excuse the rather strong language there). I’m not sure if there is any data on this out there (if so, please share it below), but I have run into low search volume terms far more in the past year than when I first started managing PPC campaigns in 2010.

I don’t know all the reasons for this; perhaps it’s worth another blog post, but the reality is it’s getting harder to be creative and target high-value long-tail keywords when so many are getting shut off due to low search volume.

This seems like a fairly smooth way being paved for Google/Bing to eventually “take over” (i.e., “automate for our good”) keyword targeting, at the very least for SMBs (small-medium businesses) where LSVs can be a significant problem. In this instance, the keyword would still be around, it just wouldn’t be managed by us PPCers directly. Boo.

Search engine decrees

I’ve already addressed the power search engines have here, but I will be the first to admit that, as much as I like keyword targeting and as much as I have hopefully proven how valuable it is, it still would be a fairly easy thing for Google or Bing to kill off completely. Major boo.

Since paid search relies on keywords and queries and language to work, I imagine this would look more like an automated solution (think DSAs and shopping), in which they make keyword targeting into a dynamic system that works in conjunction with audience targeting.

While this was about a year and a half ago, it is worth noting that at Hero Conference in London, Bing Ads’ ebullient Tor Crockett did make the public statement that Bing at the time had no plans to sunset the keyword as a bidding option. We can only hope this sentiment remains, and transfers over to Google as well.

But Internet of Things (IoT) Frankenstein devices!

Finally, it could be that search engines won’t be around forever. Perhaps this will look like IoT devices such as Alexa that incorporate some level of search into them, but pull traffic away from using Google/Bing search bars. As an example of this in real life, you don’t need to ask Google where to find (queries, keywords, communication, search) the best price on laundry detergent if you can just push the Dash button, or your smart washing machine can just order you more without a search effort.


On the other hand, I still believe we're a long way off from this in the same way that the freak-out over mobile devices killing personal computers has slowed down. That is, we still utilize our computers for education & work (even if personal usage revolves around tablets and mobile devices and IoT freaks-of-nature… smart toasters anyone?) and our mobile devices for queries on the go. Computers are still a primary source of search in terms of work and education as well as more intensive personal activities (vacation planning, for instance), and thus computers still rely heavily on search. Mobile devices are still heavily query-centered for various tasks, especially as voice search (still query-centered!) kicks in harder.

The social effect

Social is its own animal in a way, which is why I believe it already has and will continue to have an effect on search and keywords (though not in a terribly worrisome way). Social definitely pulls a level of traffic from search, specifically in product queries. “Who has used this dishwasher before, any other recommendations?” Social ads are exploding in popularity as well, in large part because they are working. People are purchasing more than they ever have from social ads, and marketers are rushing to be there for them.

The flip side of this: a social and paid search comparison is apples-to-oranges. There are different motivations and purposes for using search engines and querying your friends.

Audience targeting works great in a social setting since that social network has phenomenally accurate and specific targeting for individuals, but it is the rare individual curious about the ideal condom to purchase who queries his family and friends on Facebook. There will always be elements of social and search that are unique and valuable in their own way, and audience targeting for social and keyword targeting for search complement those unique elements of each.

Idealism incarnate

Thus, it is my belief that as long as we have search, we will still have keywords and keyword targeting will be the best way to target — as long as costs remain low enough to be realistic for budgets and the search engines don’t kill keyword bidding for an automated solution.

Don’t give up, the keyword is not dead. Stay focused, and carry on with your match types!

I want to close by re-acknowledging the crucial point I opened with.

It has not been my intention in any way to set up a false dichotomy. In fact, as I think about it, I would argue that I am writing this in response to what I have heard become a false dichotomy. That is, that audience targeting is better than keyword targeting and will eventually replace it…

I believe the keyword is still the most valuable form of targeting for a paid search marketer, but I also believe that audience demographics can play a valuable complementary role in bidding.

A prime example that we already use is remarketing lists for search ads, in which we can layer on remarketing audiences in both Google and Bing into our search queries. Wouldn’t it be amazing if we could someday do this with massive amounts of audience data? I've said this before, but were Bing Ads to use its LinkedIn acquisition to allow us to layer on LinkedIn audiences into our current keyword framework, the B2B angels would surely rejoice over us (Bing has responded, by the way, that something is in the works!).

Either way, I hope I've demonstrated that far from being on its deathbed, the keyword is still the most essential tool in the paid search marketer’s toolbox.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog http://ift.tt/2qBdBzf
via IFTTT

Monday 29 May 2017

Optimizing for voice assistants: Why actions speak louder than words

“Hey Siri, remind me to invent you in 30 years”

In 1987, Apple came up with the idea of a “Knowledge Navigator”. You can see the full video here, but it’s a concept that’s remarkably – and perhaps, not coincidentally – similar to our modern smart device assistants, Siri among them.

Its features included a talking screen, reacting to vocal commands to provide information and sort calendars.

In theory, we’re there, 30 years later – though the reality doesn’t always quite match up to the dream.

Even when it does work, voice hasn’t always been exactly what people were looking for. The thing most adults said they wish their voice search systems could do was find their keys (though teens said they most wished it could send them pizza).

Although we’re getting to the stage where that’s possible now, the majority of developments in voice have been voice search – talking to your phone to find out information.

Showing search results for “Why can’t you understand me, you stupid phone”

But while talking to a device can be a better experience than playing around with a virtual keyboard on a phone or a physical one on a computer, there are two major issues with voice search.

The first is that it’s still clunky. Half the time you have to repeat yourself in order to be understood, particularly if the word you’re trying to get across is slang or an abbreviation of some sort, which is to say, the default sort of language you’d think would be fitting for “conversational” search.

It doesn’t feel smooth, and it doesn’t feel effortless – and that pretty much removes the point of it.

The other is that it simply doesn’t add value. A voice search isn’t achieving anything you couldn’t do by simply typing in the same thing.

But recently, we’ve seen developments to the voice control industry, starting with Alexa. At this point, everyone’s familiar with the Echo and its younger sibling, the Echo Dot – it’s been in adverts, our friends have it, maybe we have it ourselves.

The Alexa devices were among Amazon’s best-selling products in 2016, especially around Christmas, and the trend doesn’t show significant signs of slowing. But if we’ve had Siri since 2011, why is Alexa picking up so much traction now?

The answer is that it’s not voice search. It’s voice commands. Alexa is more exciting and satisfying for users because it provides an action – you speak to it and something happens. You now can order a pizza – or an Uber, or a dollhouse.

That’s what people have been wanting from their devices – the ability to control the world around them by talking to it, not just have an alternative to a keyboard.

Ultimately, the commands are more personal. You can go on a website and order a pizza, and you can customise it and pay for it and it’ll show up, but talking to Alexa is akin to saying to your friend “Order a pizza?” (Except Alexa won’t stop mid-phone call to ask you what the other topping you wanted was).

Where the majority of mobile voice commands are used for search, Alexa’s use cases are dominated by home control – 34% of users have Alexa play music, just under 31% get her to play with the lights, and 24.5% use it as a timer.

While Siri and the Google Voice Search system are both examples of narrow AI like the Echo, they make much more limited use of its capabilities – compared to Alexa, Google is not OK, and Siri can say goodbye.

“OK Google – who would win in a fight, you or Alexa?”

Alexa’s success has put Google into catch-up mode, and they have been making some progress in the form of Google Home. Early reviews suggest that it might actually be the better product – but it lacks the market momentum of the Amazon product, and it seems unlikely that the sales will be on an even footing for a while yet.

However, Google does have the advantage of some high-end technology, namely Alphabet DeepMind.

DeepMind itself is the company name, but the more familiar connection is the technology the company produces. DeepMind are responsible for the program AlphaGo that beat the world’s foremost Go player 4 – 1, as well as a neural network that can learn how to play video games with the same approach as humans do.

DeepMind can offer Google systems their machine learning experience – which means that Google Home’s technology might have more room to start leaning towards Deep AI in the future. Your device will be able to start adapting itself to your needs – just don’t ask it to open the pod bay doors.

“Watson – what wine would you recommend with this?”

The other major contender in the AI race has only just started dipping into the B2C commercial market, and not nearly to the same scale as Alexa or Google Home.

IBM Watson has, however, won Jeopardy!, as well as found places in healthcare, teaching, and weather forecasting – essentially, absorbing a great deal of information and adapting it for different uses.

Watson is now used by The North Face, for example, to offer contextual shopping through conversational search. Users answer questions, and Watson suggests products based on the answers.

Likewise, Bear Naked uses Watson to “taste test” their customized granola system for the user, so once you’ve designed your meal, it can tell you if you might want to cut back on the chocolate chips.

AI is a competitive market – and it’s a market synergizing with conversational and voice search to bring us ever closer to the computer from Star Trek, and even beyond it.

For now, however, narrow AI is the market – and that means optimizing sites for it.

SE-OK Google

Voice search means that people are searching much more conversationally than they used to. The best way to accommodate that in your SEO strategy is to give more attention to your long-tail keywords, especially the questions.

Questions are opportunities best met with in-depth, mobile-friendly guides that offer information to your customers and clients.

But this also applies when it comes to using apps in the way that Alexa and Google Home do. People aren’t just making voice searches now – they’re also making voice commands.

With that in mind, to rank for some of these long-tail keywords, you need to start optimizing for action phrases and Google-approved AI commands like “search for [KEYWORD] on [APP]”, as well as carefully managing your API, if you have one. And it is worth having one, in order that you can integrate fully with these new devices.

You can break down the structure of common questions in your industry to optimize your long-tail keywords for devices.

You’ll also need to look into deep-linking to optimize your apps for search. Deep-linking allows searchers to see listings from an app directly on search, and open the app from those search rankings, making for a smoother user experience.

Search results show your app data and link directly into the app

This is only going to become more important over time – Google have just announced that they’re opening up their technology, “Instant Apps”, to all developers.

Instant Apps mean that if the user doesn’t have the app, it can “stream” the page from the app anyway. It’s not a stretch to imagine that before long Alexa won’t need Skills to complete commands – so long as you’ve properly set up your API to work with search.

Siri, likewise, already has SiriKit, which allows developers to build markup into their apps that Siri can read.

“Alexa – What’s the Best Way to Deal with AI?”

Voice search is a growing part of the search industry. But it’s not where the biggest opportunity lies.

Rather, companies should be focusing on integrating voice actions into their strategy – by deep-linking their apps, ranking for long-tail keyword questions, and making sure that everything they want a customer to be able to do, the customer can do with their voice.



from SEO – Search Engine Watch http://ift.tt/2s6ArRi
via IFTTT

Evidence of the Surprising State of JavaScript Indexing

Posted by willcritchlow

Back when I started in this industry, it was standard advice to tell our clients that the search engines couldn’t execute JavaScript (JS), and anything that relied on JS would be effectively invisible and never appear in the index. Over the years, that has changed gradually, from early work-arounds (such as the horrible escaped fragment approach my colleague Rob wrote about back in 2010) to the actual execution of JS in the indexing pipeline that we see today, at least at Google.

In this article, I want to explore some things we've seen about JS indexing behavior in the wild and in controlled tests and share some tentative conclusions I've drawn about how it must be working.

A brief introduction to JS indexing

At its most basic, the idea behind JavaScript-enabled indexing is to get closer to the search engine seeing the page as the user sees it. Most users browse with JavaScript enabled, and many sites either fail without it or are severely limited. While traditional indexing considers just the raw HTML source received from the server, users typically see a page rendered based on the DOM (Document Object Model) which can be modified by JavaScript running in their web browser. JS-enabled indexing considers all content in the rendered DOM, not just that which appears in the raw HTML.

There are some complexities even in this basic definition (answers in brackets as I understand them):

  • What about JavaScript that requests additional content from the server? (This will generally be included, subject to timeout limits)
  • What about JavaScript that executes some time after the page loads? (This will generally only be indexed up to some time limit, possibly in the region of 5 seconds)
  • What about JavaScript that executes on some user interaction such as scrolling or clicking? (This will generally not be included)
  • What about JavaScript in external files rather than in-line? (This will generally be included, as long as those external files are not blocked from the robot — though see the caveat in experiments below)

For more on the technical details, I recommend my ex-colleague Justin’s writing on the subject.

A high-level overview of my view of JavaScript best practices

Despite the incredible work-arounds of the past (which always seemed like more effort than graceful degradation to me), the “right” answer has existed since at least 2012, with the introduction of PushState. Rob wrote about this one, too. Back then, however, it was pretty clunky and manual, and it required a concerted effort to ensure that the URL was updated in the user’s browser for each view that should be considered a “page,” that the server could return full HTML for those pages in response to new requests for each URL, and that the back button was handled correctly by your JavaScript.

Along the way, in my opinion, too many sites got distracted by a separate prerendering step. This is an approach that does the equivalent of running a headless browser to generate static HTML pages that include any changes made by JavaScript on page load, then serving those snapshots instead of the JS-reliant page in response to requests from bots. It typically treats bots differently, in a way that Google tolerates, as long as the snapshots do represent the user experience. In my opinion, this approach is a poor compromise that's too susceptible to silent failures and falling out of date. We've seen a bunch of sites suffer traffic drops due to serving Googlebot broken experiences that were not immediately detected because no regular users saw the prerendered pages.

These days, if you need or want JS-enhanced functionality, more of the top frameworks have the ability to work the way Rob described in 2012, which is now called isomorphic (roughly meaning “the same”).

Isomorphic JavaScript serves HTML that corresponds to the rendered DOM for each URL, and updates the URL for each “view” that should exist as a separate page as the content is updated via JS. With this implementation, there is actually no need to render the page to index basic content, as it's served in response to any fresh request.
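To make that concrete, here's a hedged sketch of the client-side half of the pattern using the standard History API; the /api/views endpoint and renderView() helper are invented for the example, and the assumption is that the server can also return full HTML for the same URLs on a fresh request:

```javascript
// Sketch of the client-side half of an isomorphic setup using the History API.
// Assumes the server can also return full, rendered HTML for these URLs directly;
// renderView() stands in for whatever your framework uses to swap the view content.
async function loadView(path) {
  const response = await fetch(`/api/views${path}`); // hypothetical JSON endpoint
  renderView(await response.json());                 // update the DOM for the new "page"
}

async function navigate(path) {
  await loadView(path);
  history.pushState({ path }, '', path); // keep the address bar in sync with the view
}

// Handle back/forward so history behaves like a traditional site
// (re-render only; don't push a new entry).
window.addEventListener('popstate', event => {
  if (event.state && event.state.path) {
    loadView(event.state.path);
  }
});

// e.g. called from a click handler on an internal link:
// navigate('/products/42');
```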

I was fascinated by this piece of research published recently — you should go and read the whole study. In particular, you should watch this video (recommended in the post) in which the speaker — who is an Angular developer and evangelist — emphasizes the need for an isomorphic approach:

Resources for auditing JavaScript

If you work in SEO, you will increasingly find yourself called upon to figure out whether a particular implementation is correct (hopefully on a staging/development server before it’s deployed live, but who are we kidding? You’ll be doing this live, too).

To do that, here are some resources I’ve found useful:

Some surprising/interesting results

There are likely to be timeouts on JavaScript execution

I already linked above to the ScreamingFrog post that mentions experiments they have done to measure the timeout Google uses to determine when to stop executing JavaScript (they found a limit of around 5 seconds).

It may be more complicated than that, however. This segment of a thread is interesting. It's from a Hacker News user who goes by the username KMag and who claims to have worked at Google on the JS execution part of the indexing pipeline from 2006–2010. It’s in relation to another user speculating that Google would not care about content loaded “async” (i.e. asynchronously — in other words, loaded as part of new HTTP requests that are triggered in the background while assets continue to download):

“Actually, we did care about this content. I'm not at liberty to explain the details, but we did execute setTimeouts up to some time limit.

If they're smart, they actually make the exact timeout a function of a HMAC of the loaded source, to make it very difficult to experiment around, find the exact limits, and fool the indexing system. Back in 2010, it was still a fixed time limit.”

What that means is that although it was initially a fixed timeout, he’s speculating (or possibly sharing without directly doing so) that timeouts are programmatically determined (presumably based on page importance and JavaScript reliance) and that they may be tied to the exact source code (the reference to “HMAC” is to do with a technical mechanism for spotting if the page has changed).

It matters how your JS is executed

I referenced this recent study earlier. In it, the author found:

Inline vs. External vs. Bundled JavaScript makes a huge difference for Googlebot

The charts at the end show the extent to which popular JavaScript frameworks perform differently depending on how they're called, with a range of performance from passing every test to failing almost every test. For example here’s the chart for Angular:

[Chart from the study: Angular’s results across the inline, external, and bundled JavaScript tests]

It’s definitely worth reading the whole thing and reviewing the performance of the different frameworks. There's more evidence of Google saving computing resources in some areas, as well as surprising results between different frameworks.

CRO tests are getting indexed

When we first started seeing JavaScript-based split-testing platforms designed for testing changes aimed at improving conversion rate (CRO = conversion rate optimization), their inline changes to individual pages were invisible to the search engines. As Google in particular has moved up the JavaScript competency ladder through executing simple inline JS to more complex JS in external files, we are now seeing some CRO-platform-created changes being indexed. A simplified version of what’s happening (sketched in code just after this list) is:

  • For users:
    • CRO platforms typically take a visitor to a page, check for the existence of a cookie, and if there isn’t one, randomly assign the visitor to group A or group B
    • Based on either the cookie value or the new assignment, the user is either served the page unchanged, or sees a version that is modified in their browser by JavaScript loaded from the CRO platform’s CDN (content delivery network)
    • A cookie is then set to make sure that the user sees the same version if they revisit that page later
  • For Googlebot:
    • The reliance on external JavaScript used to prevent both the bucketing and the inline changes from being indexed
    • With external JavaScript now being loaded, and with many of these inline changes being made using standard libraries (such as JQuery), Google is able to index the variant and hence we see CRO experiments sometimes being indexed
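Here's that visitor-side flow as a rough code sketch; the cookie name, bucket labels, and applyVariantB() helper are invented for illustration:

```javascript
// Rough sketch of the visitor-side bucketing flow described above.
// The cookie name, group labels, and applyVariantB() are invented for illustration;
// real CRO platforms handle this inside the JS they serve from their CDN.
const COOKIE_NAME = 'cro_bucket';

function getBucket() {
  const match = document.cookie.match(new RegExp('(?:^|;\\s*)' + COOKIE_NAME + '=([^;]+)'));
  if (match) {
    return match[1]; // returning visitor: keep showing the version they saw before
  }
  const bucket = Math.random() < 0.5 ? 'A' : 'B';        // new visitor: random assignment
  document.cookie = COOKIE_NAME + '=' + bucket +
    '; path=/; max-age=' + 60 * 60 * 24 * 30;            // remember it for ~30 days
  return bucket;
}

if (getBucket() === 'B') {
  // Variant group: modify the page in the browser (often via a standard library such
  // as jQuery). Because this external JS now gets executed by Googlebot, the modified
  // content can end up in the index.
  applyVariantB(); // hypothetical function applying the test's DOM changes
}
```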

I might have expected the platforms to block their JS with robots.txt, but at least the main platforms I’ve looked at don't do that. With Google being sympathetic towards testing, however, this shouldn’t be a major problem — just something to be aware of as you build out your user-facing CRO tests. All the more reason for your UX and SEO teams to work closely together and communicate well.

Split tests show SEO improvements from removing a reliance on JS

Although we would like to do a lot more to test the actual real-world impact of relying on JavaScript, we do have some early results. At the end of last week I published a post outlining the uplift we saw from removing a site’s reliance on JS to display content and links on category pages.

[Chart: additional organic sessions following the ODN split test]

A simple test that removed the need for JavaScript on 50% of pages showed a >6% uplift in organic traffic — worth thousands of extra sessions a month. While we haven’t proven that JavaScript is always bad, nor understood the exact mechanism at work here, we have opened up a new avenue for exploration, and at least shown that it’s not a settled matter. To my mind, it highlights the importance of testing. It’s obviously our belief in the importance of SEO split-testing that led to us investing so much in the development of the ODN platform over the last 18 months or so.

Conclusion: How JavaScript indexing might work from a systems perspective

Based on all of the information we can piece together from the external behavior of the search results, public comments from Googlers, tests and experiments, and first principles, here’s how I think JavaScript indexing is working at Google at the moment: I think there is a separate queue for JS-enabled rendering, because the computational cost of trying to run JavaScript over the entire web is unnecessary given the lack of a need for it on many, many pages. In detail, I think:

  • Googlebot crawls and caches HTML and core resources regularly
  • Heuristics (and probably machine learning) are used to prioritize JavaScript rendering for each page:
    • Some pages are indexed with no JS execution. There are many pages that can probably be easily identified as not needing rendering, and others which are such a low priority that it isn’t worth the computing resources.
    • Some pages get immediate rendering – or possibly immediate basic/regular indexing, along with high-priority rendering. This would enable the immediate indexation of pages in news results or other QDF results, but also allow pages that rely heavily on JS to get updated indexation when the rendering completes.
    • Many pages are rendered async in a separate process/queue from both crawling and regular indexing, thereby adding the page to the index for new words and phrases found only in the JS-rendered version when rendering completes, in addition to the words and phrases found in the unrendered version indexed initially.
  • The JS rendering also, in addition to adding pages to the index:
    • May make modifications to the link graph
    • May add new URLs to the discovery/crawling queue for Googlebot

The idea of JavaScript rendering as a distinct and separate part of the indexing pipeline is backed up by this quote from KMag, who I mentioned previously for his contributions to this HN thread (direct link) [emphasis mine]:

“I was working on the lightweight high-performance JavaScript interpretation system that sandboxed pretty much just a JS engine and a DOM implementation that we could run on every web page on the index. Most of my work was trying to improve the fidelity of the system. My code analyzed every web page in the index.

Towards the end of my time there, there was someone in Mountain View working on a heavier, higher-fidelity system that sandboxed much more of a browser, and they were trying to improve performance so they could use it on a higher percentage of the index.”

This was the situation in 2010. It seems likely that they have moved a long way towards the headless browser in all cases, but I’m skeptical about whether it would be worth their while to render every page they crawl with JavaScript given the expense of doing so and the fact that a large percentage of pages do not change substantially when you do.

My best guess is that they're using a combination of trying to figure out the need for JavaScript execution on a given page, coupled with trust/authority metrics to decide whether (and with what priority) to render a page with JS.

Run a test, get publicity

I have a hypothesis that I would love to see someone test: That it’s possible to get a page indexed and ranking for a nonsense word contained in the served HTML, but not initially ranking for a different nonsense word added via JavaScript; then, to see the JS get indexed some period of time later and rank for both nonsense words. If you want to run that test, let me know the results — I’d be happy to publicize them.
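For anyone who wants to run it, the JavaScript half of such a test page is tiny: the served HTML contains one nonsense word in its static markup, and a script appends a different one after load. Both nonsense words below are invented placeholders:

```javascript
// Sketch of the test page's script. The served HTML already contains the first
// nonsense word (e.g. "blorptangle") in its static markup; this script adds a
// second, different nonsense word ("quindlewarp") once the page has loaded.
// Both words are invented placeholders - pick your own unique strings.
document.addEventListener('DOMContentLoaded', () => {
  const marker = document.createElement('p');
  marker.textContent = 'JS-added marker: quindlewarp';
  document.body.appendChild(marker);
});
// Then check the SERPs over time: ranking for "blorptangle" but not "quindlewarp"
// suggests the page was indexed before rendering; ranking for both suggests the
// JS-rendered version has since been picked up.
```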


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog http://ift.tt/2qybjAP
via IFTTT

Friday 26 May 2017

Receive Guest Blog Post Spam Requests? Report Them to Google

Google reminded site owners about the perils of link building via articles, such as guest blogging.  But one part of their blog post seems to have passed unnoticed – the request Google made that site owners file spam reports for any “Post my article” requests they receive. Many sites engage in large scale “post […]

The post Receive Guest Blog Post Spam Requests? Report Them to Google appeared first on The SEM Post.



from The SEM Post http://ift.tt/2qWDD37 via IFTTT

Google Warns Against Link Building Via Articles & Guest Posts

Google has issued a new warning to site owners about utilizing spammy articles as a link building technique.  While this is a reminder, and not a new policy, it is targeting specific types of link building by site owners utilizing various article publishing tactics, such as guest blogging. The blog post also calls out a […]

The post Google Warns Against Link Building Via Articles & Guest Posts appeared first on The SEM Post.



from The SEM Post http://ift.tt/2r5cDiF via IFTTT

Should SEOs Care About Internal Links? - Whiteboard Friday

Posted by randfish

Internal links are one of those essential SEO items you have to get right to avoid getting them really wrong. Rand shares 18 tips to help inform your strategy, going into detail about their attributes, internal vs. external links, ideal link structures, and much, much more in this edition of Whiteboard Friday.

Should SEOs Care About Internal Links?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat a little bit about internal links and internal link structures. Now, it is not the most exciting thing in the SEO world, but it's something that you have to get right and getting it wrong can actually cause lots of problems.

Attributes of internal links

So let's start by talking about some of the things that are true about internal links. Internal links, when I say that phrase, what I mean is a link that exists on a website, let's say ABC.com here, that is linking to a page on the same website, so over here, linking to another page on ABC.com. We'll do /A and /B. This is actually my shipping routes page. So you can see I'm linking from A to B with the anchor text "shipping routes."

The idea of an internal link is really initially to drive visitors from one place to another, to show them where they need to go to navigate from one spot on your site to another spot. They're different from external links only in that, in the HTML code, you're pointing to the same fundamental root domain. In the initial early versions of the internet, that didn't matter all that much, but for SEO, it matters quite a bit because external links are treated very differently from internal links. That is not to say, however, that internal links have no power or no ability to change rankings, to change crawling patterns and to change how a search engine views your site. That's what we need to chat about.



1. Anchor text is something that can be considered. The search engines have generally minimized its importance, but it's certainly something that's in there for internal links.

2. The location on the page actually matters quite a bit, just as it does with external links. Internal links, it's almost more so in that navigation and footers specifically have attributes around internal links that can be problematic.

Those are essentially when Google in particular sees manipulation in the internal link structure, specifically things like you've stuffed anchor text into all of the internal links trying to get this shipping routes page ranking by putting a little link down here in the footer of every single page and then pointing over here trying to game and manipulate us, they hate that. In fact, there is an algorithmic penalty for that kind of stuff, and we can see it very directly.



We've actually run tests where we've observed that jamming this type of anchor text-rich links into footers or into navigation and then removing it gets a site indexed, well let's not say indexed, let's say ranking well and then ranking poorly when you do it. Google reverses that penalty pretty quickly too, which is nice. So if you are not ranking well and you're like, "Oh no, Rand, I've been doing a lot of that," maybe take it away. Your rankings might come right back. That's great.



3. The link target matters obviously from one place to another.

4. The importance of the linking page, this is actually a big one with internal links. So it is generally the case that if a page on your website has lots of external links pointing to it, it gains authority and it has more ability to sort of generate a little bit, not nearly as much as external links, but a little bit of ranking power and influence by linking to other pages. So if you have two very well-linked pages on your site, you should make sure to link out from those to pages on your site that a) need it and b) are actually useful for your users. That's another signal we'll talk about.



5. The relevance of the link, so pointing to my shipping routes page from a page about other types of shipping information, totally great. Pointing to it from my dog food page, well, it doesn't make great sense. Unless I'm talking about shipping routes of dog food specifically, it seems like it's lacking some of that context, and search engines can pick up on that as well.

6. The first link on the page. So this matters mostly in terms of the anchor text, just as it does for external links. Basically, if you are linking in a bunch of different places to this page from this one, Google will usually, at least in all of our experiments so far, count the first anchor text only. So if I have six different links to this and the first link says "Click here," "Click here" is the anchor text that Google is going to apply, not "Click here" and "shipping routes" and "shipping." Those subsequent links won't matter as much.

7. Then the type of link matters too. Obviously, I would recommend that you keep it in the HTML link format rather than trying to do something fancy with JavaScript. Even though Google can technically follow those, it looks to us like they're not treated with quite the same authority and ranking influence. Text is slightly, slightly better than images in our testing, although that testing is a few years old at this point. So maybe image links are treated exactly the same. Either way, do make sure you have that. If you're doing image links, by the way, remember that the alt attribute of that image is what becomes the anchor text of that link.

Internal versus external links

A. External links usually give more authority and ranking ability.

That shouldn't be surprising. An external link is like a vote from an independent, hopefully independent, hopefully editorially given website to your website saying, "This is a good place for you to go for this type of information." On your own site, it's like a vote for yourself, so engines don't treat it the same.

B. Anchor text of internal links generally have less influence.

So, as we mentioned, me pointing to my page with the phrase that I want to rank for isn't necessarily a bad thing, but I shouldn't do it in a manipulative way. I shouldn't do it in a way that's going to look spammy or sketchy to visitors, because if visitors stop clicking around my site or engaging with it or they bounce more, I will definitely lose ranking influence much faster than if I simply make those links credible and usable and useful to visitors. Besides, the anchor text of internal links is not as powerful anyway.



C. A lack of internal links can seriously hamper a page's ability to get crawled + ranked.

It is, however, the case that a lack of internal links, like an orphan page that doesn't have many internal or any internal links from the rest of its website, that can really hamper a page's ability to rank. Sometimes it will happen. External links will point to a page. You'll see that page in your analytics or in a report about your links from Moz or Ahrefs or Majestic, and then you go, "Oh my gosh, I'm not linking to that page at all from anywhere else on my site." That's a bad idea. Don't do that. That is definitely problematic.

D. It's still the case, by the way, that, broadly speaking, pages with more links on them will send less link value per link.

So, essentially, you remember the original PageRank formula from Google. It said basically like, "Oh, well, if there are five links, send one-fifth of the PageRank power to each of those, and if there are four links, send one-fourth." Obviously, one-fourth is bigger than one-fifth. So taking away that fifth link could mean that each of the four pages that you've linked to get a little bit more ranking authority and influence in the original PageRank algorithm.

Look, PageRank is old, very, very old at this point, but at least the theories behind it are not completely gone. So it is the case that a page with tons and tons of links on it tends to send out less authority and influence per link than a page with few links on it, which is why it can definitely pay to do some spring cleaning on your website and clear out any rubbish pages or rubbish links, ones that visitors don't want, that search engines don't want, that you don't care about. Clearing that up can actually have a positive influence. We've seen that on a number of websites where they've cleaned up their information architecture, whittled down their links to just the stuff that matters the most and the pages that matter the most, and then seen increased rankings across the board from all sorts of positive signals: user engagement signals, link signals, context signals that help the engines rank them better.
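To make that arithmetic concrete, here's a toy illustration of the original per-link split. This is just a sketch of the old PageRank intuition, not how Google weighs links today:

    def value_per_link(page_rank: float, outbound_links: int) -> float:
        # In the original PageRank idea, a page splits the value it can pass
        # evenly among its outgoing links.
        return page_rank / outbound_links if outbound_links else 0.0

    print(value_per_link(1.0, 5))  # 0.2  -> five links, one-fifth each
    print(value_per_link(1.0, 4))  # 0.25 -> trim one link and the rest get a bit more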

E. Internal link flow (aka PR sculpting) is rarely effective, and usually has only mild effects... BUT a little of the right internal linking can go a long way.

Then finally, I do want to point out what was previously called PageRank sculpting (you've probably heard of it in the SEO world). This was a practice that, I'd say from maybe 2002 or 2003 to about 2008 or 2009, had this life where there would be panel discussions about PageRank sculpting, all these examples of how to do it, and software that would crawl your site and show you the ideal PageRank sculpting system to use and which pages to link to and not.



When PageRank was the dominant algorithm inside of Google's ranking system, yeah, it was the case that PageRank sculpting could have some real effect. These days, that is dramatically reduced. It's not entirely gone, because of some of these other principles we've talked about: having lots of links on a page for no particularly good reason is generally bad and can have harmful effects, while having a few carefully chosen ones has good effects. But most of the time, optimizing internal linking beyond a certain point is not very valuable, not a great value add.

But a little of what I'm calling the right internal linking, that's what we're going to talk about, can go a long way. For example, if you have those orphan pages or pages that are clearly the next step in a process or that users want and they cannot find them or engines can't find them through the link structure, it's bad. Fixing that can have a positive impact.


Ideal internal link structures

So ideally, in an internal linking structure, you want something like this. This is a very rough illustration. The homepage has maybe 100 links on it to internal pages. One hop away from that, you've got your 100 different pages of whatever it is, subcategories or category pages, places that can get folks deeper into your website. Then from there, each of those has maybe a maximum of 100 unique links, and they get you 2 hops away from the homepage, which takes you to 10,000 pages that do the same thing.



I. No page should be more than 3 link "hops" away from another (on most small-to-medium sites).

Now, the idea behind this is that basically in one, two, three hops, three links away from the homepage and three links away from any page on the site, I can get to up to a million pages. So when you talk about, "How many clicks do I have to get? How far away is this in terms of link distance from any other page on the site?" a great internal linking structure should be able to get you there in three or fewer link hops. If it's a lot more, you might have an internal linking structure that's really creating sort of these long pathways of forcing you to click before you can ever reach something, and that is not ideal, which is why it can make very good sense to build smart categories and subcategories to help people get in there.

I'll give you the most basic example in the world, a traditional blog. In order to reach any post that was published two years ago, I've got to click Next, Next, Next, Next, Next, Next through all this pagination until I finally get there. Or if I've done a really good job with my categories and my subcategories, I can click on the category of that blog post and I can find it very quickly in a list of the last 50 blog posts in that particular category, great, or by author or by tag, however you're doing your navigation.
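One way to check the three-hop rule on a small site is a breadth-first crawl starting from the homepage. Here's a minimal sketch, assuming the requests and beautifulsoup4 packages are installed, a placeholder homepage URL, same-host links only, and a small page budget. (At roughly 100 links per page, three hops can reach about 100 x 100 x 100 = one million pages, which is why shallow structures scale so well.)

    # Minimal breadth-first crawl to measure how many link "hops" each internal
    # page sits from the homepage. Placeholder URL; only same-host links followed.
    from collections import deque
    from urllib.parse import urljoin, urlparse
    import requests
    from bs4 import BeautifulSoup

    HOME = "https://www.example.com/"   # hypothetical homepage
    MAX_PAGES = 500                     # safety budget for the sketch

    host = urlparse(HOME).netloc
    depth = {HOME: 0}
    queue = deque([HOME])

    while queue and len(depth) < MAX_PAGES:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in depth:
                depth[link] = depth[url] + 1
                queue.append(link)

    for url, hops in sorted(depth.items(), key=lambda kv: kv[1]):
        flag = "  <-- deeper than 3 hops" if hops > 3 else ""
        print(f"{hops} hops: {url}{flag}")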



II. Pages should contain links that visitors will find relevant and useful.

If no one ever clicks on a link, that is a bad signal for your site, and it is a bad signal for Google as well. And I don't just mean literally no one ever. If very, very few people ever click it, and many of those who do hit the back button because it wasn't what they wanted, that's also a bad sign.

III. Just as no two pages should be targeting the same keyword or searcher intent, likewise no two links should be using the same anchor text to point to different pages. Canonicalize!

For example, if over here I had a shipping routes link that pointed to this page, and then another shipping routes link, same anchor text, pointing to a separate page, page C, why am I doing that? Why am I creating competition between my own two pages? Why am I having two things that serve the same function, or at least appear to visitors and search engines to serve the same function? I should canonicalize those. Canonicalize those links, canonicalize those pages. If a page is serving the same intent and keywords, keep it together.
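A simple way to spot this kind of competition is to group your internal links by anchor text and flag any anchor that points at more than one URL. A sketch, using a tiny hypothetical list of (anchor text, target URL) pairs in place of real crawl data:

    # Sketch: flag anchor texts that point to more than one URL, a sign that two
    # pages may be competing for the same intent. The `links` list is hypothetical;
    # in practice you'd feed in pairs collected from a crawl.
    from collections import defaultdict

    links = [
        ("shipping routes", "https://www.example.com/shipping-routes"),
        ("shipping routes", "https://www.example.com/routes-for-shipping"),
        ("dog food", "https://www.example.com/dog-food"),
    ]

    targets_by_anchor = defaultdict(set)
    for anchor, target in links:
        targets_by_anchor[anchor.lower().strip()].add(target)

    for anchor, targets in targets_by_anchor.items():
        if len(targets) > 1:
            print(f"'{anchor}' points to {len(targets)} different pages: {sorted(targets)}")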

IV. Limit use of rel="nofollow" to UGC or specific untrusted external links. It won't help your internal link flow efforts for SEO.

Rel="nofollow" was sort of the classic way that people had been doing PageRank sculpting that we talked about earlier here. I would strongly recommend against using it for that purpose. Google said that they've put in some preventative measures so that rel="nofollow" links sort of do this leaking PageRank thing, as they call it. I wouldn't stress too much about that, but I certainly wouldn't use rel="nofollow."

What I would do is if I'm trying to do internal link sculpting, I would just do careful curation of the links and pages that I've got. That is the best way to help your internal link flow. That's things like...



V. Removing low-value content, low-engagement content and creating internal links that people actually do want. That is going to give you the best results.

VI. Don't orphan! Make sure pages that matter have links to (and from) them. Last but not least, there should never be an orphan. There should never be a page with no links to it, and certainly there should never be a well-linked-to page that isn't linking back out to portions of your site that are of interest or value to visitors and to Google.
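One way to catch orphans is to compare the URLs in your XML sitemap against the set of URLs you actually found as internal link targets during a crawl. A rough sketch, assuming the requests package is installed and using placeholder values for the sitemap URL and the crawled link set:

    # Sketch: find orphan pages by diffing the XML sitemap against the URLs
    # discovered as internal link targets during a crawl. Placeholder values.
    import xml.etree.ElementTree as ET
    import requests

    SITEMAP = "https://www.example.com/sitemap.xml"   # hypothetical
    linked_urls = {                                    # hypothetical crawl output
        "https://www.example.com/",
        "https://www.example.com/shipping-routes",
    }

    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
    sitemap_urls = {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

    for url in sorted(sitemap_urls - linked_urls):
        print("No internal links found to:", url)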

So following these practices, I think you can do some awesome internal link analysis, internal link optimization and help your SEO efforts and the value visitors get from your site. We'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog http://ift.tt/2rFxn0U
via IFTTT

Thursday 25 May 2017

Taming the local search beast in a post-Possum and Fred world

It’s estimated that 46 percent of all searches performed on Google have a local intent, and the Map Pack appears for 93 percent of these.

In September 2016 Google unveiled a new local search algorithm, dubbed Possum, and it pretty much went unnoticed in comparison to the real-time Penguin update released in the same month.

In short, Possum made it harder for businesses to fake being in locations that they’re not (through the likes of virtual offices), as well as tackling Google My Business spam.

Possum, however, isn’t a “single” algorithm update, as it affected both localized search results and the Map Pack, which of course are two separate algorithms, both triggered by search queries that are interpreted as having a local intent.

The Google “Fred” update, which hit SERPs back in March, has also had an impact on local search, much like the Phantom updates before it.

A lot of local SERPs are extremely spammy, full of websites that have been built cheaply, with location names liberally applied to every menu link and keyword on the page, such as this home page sidebar menu:

This, of course, is only a snapshot of the page – the menu and tile icons go on for a lot longer. Spam like this still ranks on page one, because Google still has to provide results to its users.

Take advantage of the market conditions

A lot of locally-focused websites aren’t built by agencies; the vast majority tend to be self-built or built by bedroom-level developers who can churn out a full website for £300 (or less).

Some verticals have seen some significant online investment in recent years, while others lag behind considerably. By investing in a good website and avoiding the same spammy tactics of your competitors, you can create a powerful resource offering user value that Google will appreciate.

Directory submissions and citations

Just to be clear, I’m not talking about just backlinks. Recent studies have shown that citations with a consistent NAP (Name, Address & Phone number) are important to both local algorithms.

There is no magic number to how many directory submissions you should have, but they need to be relevant.

I’ve worked on local campaigns in the UK where they have been previously submitted to directories in Vietnam, Thailand and Australia. Yes, it’s a backlink, but it’s not relevant in the slightest.

Think local with your directories, and exhaust those before moving on to national ones. The number of local directories should also outweigh the national ones where possible. Done properly, this is a manual process; to ensure quality, it can’t be automated.
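If you're collecting citations by hand or exporting them from a citation tool, even a crude script can flag NAP inconsistencies worth cleaning up. A sketch with hypothetical business details and deliberately simple normalisation (real listings usually need fuzzier matching):

    # Sketch: a crude NAP consistency check across citations. All business
    # details below are made up for illustration.
    import re

    def normalise(value: str) -> str:
        return re.sub(r"[^a-z0-9]", "", value.lower())

    master = {"name": "Acme Plumbing Ltd", "address": "1 High Street, Leeds LS1 1AA", "phone": "0113 496 0000"}
    citations = {
        "yell.com": {"name": "Acme Plumbing", "address": "1 High St, Leeds LS1 1AA", "phone": "0113 4960000"},
        "google_my_business": {"name": "Acme Plumbing Ltd", "address": "1 High Street, Leeds LS1 1AA", "phone": "0113 496 0000"},
    }

    for source, nap in citations.items():
        mismatches = [field for field in master
                      if normalise(nap[field]) != normalise(master[field])]
        status = "OK" if not mismatches else f"check {', '.join(mismatches)}"
        print(f"{source}: {status}")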

Reviews

Review volume, velocity and diversity factors are important, and in my opinion, they’re going to become more important in the coming months – particularly following the recent release of verified customer reviews for online businesses.

In Google’s Search Quality Evaluator Guidelines, the evaluators are instructed to research a website/brand’s online reputation from external sources in order to assess the quality of the website.

This is why getting reviews on your Google My Business listing, Facebook pages, Yell and TripAdvisor, along with positive tweets, is all great. Having testimonials and reviews on your website is great for users, but you wouldn’t publish bad reviews on your own website, would you?

Google accepts that negative reviews appear, but as long as the good outweighs the bad, you shouldn’t have anything to worry about. If you do get a negative review, demonstrate your customer service and respond to it. You can set up Google Alerts to monitor your brand and flag up any external reviews.

Screenshot of Amazon reviews for a product, averaging 4.7 of 5 stars.

Google My Business & Bing Places

Believe it or not, Google My Business is considered a directory, as is Bing Places. It’s important that you have one if you’re a local business, and that you’ve optimised it correctly. This means using the correct business name, address and phone number (keep your NAP as consistent as possible), choosing an appropriate category, and including a thorough description.

LocalBusiness structured data mark-up

Structured data mark-up (or schema) is an addition to a website’s code that enables Google’s RankBrain (and other AI algorithms from other search engines) to better understand a website’s context by providing it with additional information.

Not all websites are currently utilizing this schema (or any schema), and Google wants you to use it.

If you don’t have developer resource to hand and you’re not a coder, you can use Google’s Data Highlighter to mark up content – you will, however, need a verified Google Search Console property to make this work.
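If you do have a little developer help, generating the LocalBusiness JSON-LD is straightforward. A sketch with placeholder business details; which properties you include should match your own listing, and it's worth validating the output with Google's structured data testing tool before deploying:

    # Sketch: generate a LocalBusiness JSON-LD snippet to paste into a page's
    # <head>. All field values below are placeholders.
    import json

    local_business = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Acme Plumbing Ltd",
        "telephone": "0113 496 0000",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "1 High Street",
            "addressLocality": "Leeds",
            "postalCode": "LS1 1AA",
            "addressCountry": "GB",
        },
        "url": "https://www.example.com/",
        "openingHours": "Mo-Fr 09:00-17:30",
    }

    print('<script type="application/ld+json">')
    print(json.dumps(local_business, indent=2))
    print("</script>")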

Other considerations

As well as focusing locally, you also need to consider other ranking factors, such as SERP click-through rates.

Optimizing your meta title and description to appeal to local users can have a huge impact on click-through rates, and the change could be as simple as including the phone number in the title tag.

You also need to be on HTTPS and have a secure website. Getting hacked, suffering an SQL injection or having malware placed on your site can seriously damage your reputation within Google and take a long, long time to recover from.



from SEO – Search Engine Watch http://ift.tt/2qfTifb
via IFTTT

Wednesday 24 May 2017

It's Here: The Finalized MozCon 2017 Agenda

Posted by ronell-smith

That sound you hear is the coming together of MozCon 2017.

[You can hear that, right? It's not just me.]

With less than two months to go, most of the nuts and bolts of the event have been fastened together to create what looks to be one of the strongest MozCons in history. Yeah, that's saying a lot, but once you've perused the speakers' lineup, we're sure you'll agree.

MozCon has a rich tradition of bringing together the best and brightest minds in digital marketing, creating a place for individuals across the globe to learn from top-notch speakers, network, share ideas, and learn about the tools, services, and tactics they can put to use in their work and their business.

As a bonus, attendees also get to enjoy lots of snacks, coffee and lots and lots of bacon.

Also, this year we'll offer pre-MozCon SEO workshops on Sunday, July 16. Keep reading for more info.

You will, however, need a ticket to attend the event, so you might want to take care of that sooner rather than later, since it always sells out:

Buy my MozCon 2017 ticket!

Now for the meaty details you've been waiting for.

The MozCon 2017 Agenda

Monday


08:00–09:00am
Breakfast


Rand Fishkin

09:00–09:20am
Welcome to MozCon 2017

Rand Fishkin, Wizard of Moz
@randfish

Rand Fishkin is the founder and former CEO of Moz, co-author of a pair of books on SEO, and co-founder of Inbound.org. Rand's an un-save-able addict of all things content, search, and social on the web.


09:20–10:05am
How to Get Big Links

Lisa Myers, Verve Search
@LisaDMyers

Everyone wants links and coverage from sites such as New York Times, the Wall Street Journal, and the BBC, but very few achieve it. This is how we cracked it. Over and over.

Lisa is the founder and CEO of award-winning SEO agency Verve Search and founder of Womeninsearch.net. Feminist, mother of two, and modern-day shield maiden.



10:05–10:35am
Data-Driven Design

Oli Gardner, Unbounce
@oligardner

Data-Driven Design (3D) is an actionable, evidence-based framework for creating websites & landing pages that will increase your leads, sales, and customers. In this session you’ll learn how to use the latest industry conversion data to inform copywriting and design decisions that impact conversions. Additionally, I’ll share a new methodology for prioritizing your marketing optimization that will show you which pages are awesome (leave them alone), which pages aren’t (massive ROI potential here), and help you develop a common language that your teams of marketers, designers, and copywriters can use to work better together to collectively increase your conversion rates.

Oli, founder of Unbounce, is on a mission to rid the world of marketing mediocrity by using data-informed copywriting, design, interaction, and psychology to create a more delightful experience for marketers and customers alike.


10:35–11:05am
AM Break


11:10–11:30am
How to Write Customer-Driven Copy That Converts

Joel Klettke, Business Casual Copywriting & Case Study Buddy
@JoelKlettke

If you want to write copy that converts, you need to get into your customers' heads. But how do you do that? How do you know which pain points you need to address, features customers care about, or benefits your audience needs to hear? Marketers are sick and tired of hearing "it depends." I'll give the audience a practical framework for writing customer-driven copy that any business can apply.

Joel is a freelance conversion copywriter and strategist for Business Casual Copywriting. He also owns and runs Case Study Buddy, a done-for-you case studies service.


11:30–11:50am
What We Learned From Reddit & How It Can Help Your Brand Take Content Marketing to the Next Level

Daniel Russell, Go Fish Digital
@dnlRussell

It almost seems too good to be true — online forums where people automatically segment themselves into different markets and demographics and then vote on what content they like best. These forums, including Reddit, are treasure troves of content ideas. I'll share actionable insights from three case studies that demonstrate how your marketing can benefit from content on Reddit.

Daniel is a director at Go Fish Digital whose work has hit the front page of Reddit, earned the #1 spot on YouTube, and been featured in Entrepreneur, Inc., The Washington Post, WSJ, and Fast Company.


11:50am–12:10pm
How to Build an SEO-Intent-Based Framework for Any Business

Kathryn Cunningham, Adept Marketing
@kac4509

Everyone knows intent behind the search matters. In e-commerce, intent is somewhat easy to see. B2B, or better yet healthcare, isn't quite as easy. Matching persona intent to keywords requires a bit more thought. I will cover how to find intent modifiers during keyword research, how to organize those modifiers into the search funnel, and how to quickly find unique universal results at different levels of the search funnel to utilize.

Kathryn is an SEO consultant for Adept Marketing, although to many of her office mates she is known as the Excel nerd.


12:10–01:40pm
Lunch


01:45–02:30pm
Size Doesn't Matter: Great Content by Teams of One

Ian Lurie, Portent, Inc.
@portentint

Feel the energy surge through your veins as you gain content creation powers THE LIKES OF WHICH YOU HAVE NEVER EXPERIENCED... Or, just learn a process for creating great content when it's just you and your little teeny team. Because size doesn't matter.

Ian Lurie is founder, CEO, and nerdiest marketing nerd at Portent, a digital marketing agency he started in the Cretaceous era, aka 1995. Ian's meandering career includes marketing copywriting, expert dungeon master, bike messenger-ing, and office temp worker.



02:30–03:00pm
The Tie That Binds: Why Email is Key to Maximizing Marketing ROI

Justine Jordan, Litmus
@meladorri

If nailing the omnichannel experience (whatever that means!) is key to getting more traffic and converting more leads, what happens if we have our channel priorities out of order? Justine will show you how email — far from being an old-school afterthought — is core to hitting marketing goals, building lifetime value, and making customers happy.

Justine is obsessed with helping marketers create, test, and send better email. Named 2015 Email Marketer Thought Leader of the Year, she is strangely passionate about email marketing, hates being called a spammer, and still gets nervous when pressing send.


03:00–03:30pm
PM Break



03:35–04:05pm

How to Be a Happy Marketer: Survive the Content Crisis and Drive Results by Mastering Your Customer’s Transformational Journey

Tara-Nicholle Nelson, Transformational Consumer Insights
@taranicholle

Branded content is way up, but customer engagement with that content is plummeting. This whole scene makes it hard to get up in the morning, as a marketer. But there's a new path beyond the epidemic of disengagement and, at the end of it, your brand and your content become regular stops along your customer's everyday journey.

Tara-Nicholle Nelson is the CEO of Transformational Consumer Insights, the former VP of Marketing for MyFitnessPal, and author of the Transformational Consumer.


04:05–04:50pm
Thinking Smaller: Optimizing for the New Wave of Social Video Platforms

Phil Nottingham, Wistia
@philnottingham

SnapChat, Facebook, Twitter, Instagram, Periscope... the list goes on. All social networks are now video platforms, but it's hard to know where to invest. In this session, Phil will be giving you all the tips and tricks for what to make, how to get your content in front of the right audiences, and how to get the most value from the investment you're making in social video.

Phil Nottingham is a strategist who believes in the power of creative video content to improve the way companies speak to their customers, and regularly speaks around the world about video strategy, SEO, and technical marketing.


07:00–10:00pm
Monday Night #MozCrawl

The Monday night pub crawl is back.

For the uninitiated, "pub crawl" is not meant to convey what you do after a night of drinking.

Rather, during the MozCon pub crawl, attendees visit some of the best bars in Seattle.

(Each stop is sponsored by a trusted partner; You'll need to bring your MozCon badge for free drinks and light appetizers. You'll also need your US ID or passport.)

More deets to follow.


Tuesday


08:00–09:00am
Breakfast



09:05–09:50am
I'd Rather Be Thanked Than Ranked

Wil Reynolds, Seer Interactive
@wilreynolds

Ego and assumptions led me to choose the wrong keywords for my own site — yeah, me, Wil Reynolds, Mr. RCS. How did I spend three years optimizing my site and building links to finally crack the top three for six critical keywords, only to find out that I wasted all that time? However, in spite of targeting the wrong words, Seer grew the business. In this presentation, I'll show you the mistakes I made and share the approaches that can help you build content that gets you thanked.

A former teacher with a knack for advising, he’s been helping Fortune 500 companies develop SEO strategies since 1999. Today, Seer is home to over 100 employees across Philadelphia and San Diego.


09:50–10:35am
Winning Value Propositions for Crawlers and Consumers

Dawn Anderson, Move It Marketing/Manchester Metropolitan University
@dawnieando

In an evolving mobile-first web, we can utilize preempting solutions to create winning value propositions, which are designed to attract and satisfy search engine crawlers and keep consumers happy. I'll outline a strategy and share tactics that help ensure increased organic reach, in addition to highlighting smart ways to view data, intent, consumer choice theory, and crawl optimization.

Dawn Anderson is an International and Technical SEO Consultant, Director of Move It Marketing, and a lecturer at Manchester Metropolitan University.


10:35–11:05am
AM Break


11:10–11:15am
MozCon Ignite Preview


11:15–11:35am
More Than SEO: 3 Ways To Prove UX Matters Too

Matthew Edgar, Elementive
@MatthewEdgarCO

Great SEO is increasingly dependent on having a website with a great user experience. To make your user experience great requires carefully tracking what people do so that you always know where to improve. But what do you track? In this 15-minute talk, I’ll cover three effective and advanced ways to use event tracking in Google Analytics to understand a website's user experience.

Matthew is a web analytics and technical marketing consultant at Elementive.


11:35–11:55am
A Site Migration: Redirects, Resources, & Reflection

Jayna Grassel, Dick's Sporting Goods
@jaynagrassel

Site. Migration. No two words elicit more fear, joy, or excitement in a digital marketer. When the idea was shared three years ago, the company was excited. They dreamed of new features and efficiency. But as SEOs, we knew better. We knew there would be midnight strategy sessions with IT. More UAT environments than we could track. Deadlines, requirements, and compromises forged through hallway chats. ...The result was a stable transition with minimal dips in traffic. What we didn't know, however, was the amount of cross-functional coordination that was required to pull it off.

Jayna is the SEO manager at Dick's Sporting Goods and is the unofficial world's second-fastest crocheter.


11:55am–12:15pm
The 8 Paid Promotion Tactics That Will Get You To Quit Organic Traffic

Kane Jamison, Content Harmony
@kanejamison

Digital marketers are ignoring huge opportunities to promote their content through paid channels, and I want to give them the tools to get started. How many brands out there are spending $500+ on a blog post, then moving on to the next one before that post has been seen by 500 people, or even 50? For some reason, everyone thinks about Outbrain and native ads when we talk about paid content distribution, but the real opportunity is in highly targeted paid social.

Kane is the founder of Content Harmony, a content marketing agency based here in Seattle. The Content Harmony team specializes in full funnel content marketing and content promotion.


12:15–01:45pm
Lunch


01:50–02:20pm
Marketing in a Conversational World: How to Get Discovered, Delight Your Customers and Earn the Conversion

Purna Virji, Microsoft
@purnavirji

Capturing and keeping attention is one of the hardest parts of our job today. Fact: It's just going to get harder with the advent of new technology and conversational interfaces. In the brave new world we're stepping into, the key questions are: How do we get discovered? How can we delight our audiences? And how can we grow revenue for our clients? Come to this session to learn how to make your marketing and advertising efforts something people are going to want to consume.

Named by PPC Hero as the #1 most influential PPC expert in the world, Purna specializes in SEM, SEO, and future search trends. She is a popular global keynote speaker and columnist, an avid traveler, aspiring top chef, and amateur knitter.



02:20–02:50pm
Up and to the Right: Growing Traffic, Conversions, & Revenue

Matthew Barby, HubSpot
@matthewbarby

So many of the case studies that document how a company has grown from 0 to X forget to mention that the solutions they found are applicable to their specific scenario and won't work for everyone. This falls into the dangerous category of bad advice for generic problems. Instead of building up a list of other companies' tactics, marketers need to understand how to diagnose and solve problems across their entire funnel. Illustrated with real-world examples, I'll be talking you through the process that I take to come up with ideas that none of my competitors are thinking of.

Matt, who heads up user acquisition at HubSpot, is an award-winning blogger, startup advisor, and a lecturer.



02:50–03:20pm
How to Operationalize Growth for Maximum Revenue

Joanna Lord, ClassPass
@JoannaLord

Joanna will walk through tactical ways to organize your team, build system foundations, and create processes that fuel growth across the company. You'll hear how to coordinate with product, engineering, CX, and sales to ensure you're maximizing your opportunity to acquire, retain, and monetize your customers.

Joanna is the CMO of ClassPass, the world's leading fitness membership. Prior to that she was VP of Marketing at Porch and CMO of BigDoor. She is a global keynote and digital evangelist. Joanna is a recognized thought leader in digital marketing and a startup mentor.


03:20–03:50pm
PM Break


03:55–04:25pm
Analytics to Drive Optimization & Personalization

Krista Seiden, Google
@kristaseiden

Getting the most out of your optimization efforts means understanding the data you’re collecting, from analytics implementation, to report setup, to analysis techniques. In this session, Krista walks you through several tips for using analytics data to empower your optimization efforts, and then takes it further to show you how to up-level your efforts to take advantage of personalization from mass scale all the way down to individual user actions.

Krista Seiden is the Analytics Advocate for Google, advocating for all things data, web, mobile, optimization, and more. Keynote speaker, practitioner, writer on Analytics and Optimization, and passionate supporter of #WomenInAnalytics.


04:25–05:10pm
Facing the Future: 5 Simple Tactics for 5 Scary Changes

Dr. Pete Meyers, Moz
@dr_pete

We've seen big changes to SEO recently, from an explosion in SERP features to RankBrain to voice search. These fundamental changes to organic search marketing can be daunting, and it's hard to know where to get started. Dr. Pete will walk you through five big changes and five tactics for coping with those changes today.

Dr. Peter J. Meyers (aka "Dr. Pete") is Marketing Scientist for Seattle-based Moz, where he works with the marketing and data science teams on product research and data-driven content.


07:00–10:00pm
MozCon Ignite

Join us for an evening of networking and passion-talks. Laugh, cheer, and be inspired as your peers share their 5-minute talks about their hobbies, passion projects, and life lessons.

Be sure to bring your MozCon badge.


Wednesday


09:00–10:00am
Breakfast


10:05–10:50am
The Truth About Mobile-First Indexing

Cindy Krum, MobileMoxie, LLC
@suzzicks

Mobile-first design has been a best practice for a while, and Google is finally about to support it with mobile-first indexing. But mobile-first design and mobile-first indexing are not the same thing. Mobile-first indexing is about cross-device accessibility of information, to help integrate digital assistants and web-enabled devices that don’t even have browsers to achieve Google’s larger goals. Learn how mobile-first indexing will give digital marketers their first real swing at influencing Google’s new AI (Artificial Intelligence) landscape. Marketers who embrace an accurate understanding of mobile-first indexing could see a huge first-mover advantage, similar to the early days of the web, and we all need to be prepared.

Cindy, the CEO and Founder of MobileMoxie, LLC, is the author of Mobile Marketing: Finding Your Customers No Matter Where They Are. She brings fresh and creative ideas to her clients, and regularly speaks at US and international digital marketing events.



10:50–11:20am
Powerful Brands Have Communities

Tara Reed, Apps Without Code
@TaraReed_

You are laser-focused on user growth. Meanwhile, you're neglecting a gold mine of existing customers who desperately want to be part of your brand's community. Tara Reed shares how to use communities, gamification, and membership content to grow your revenue.

Tara Reed is a tech entrepreneur & marketer. After running marketing initiatives at Google, Foursquare, & Microsoft, Tara branched out to launch her own apps & startups. Today, Tara helps people implement cutting-edge marketing into their businesses.


11:20–11:50am
AM Break


11:55am–12:25pm

From Anchor to Asset: How Agencies Can Wisely Create Data-Driven Content

Heather Physioc, VML
@HeatherPhysioc

Creative agencies are complicated and messy, often embracing chaos instead of process, and focusing exclusively on one-time campaign creative instead of continuous web content creation. Campaign creative can be costly, and not sustainable for most large brands. How can creative shops produce data-driven streams of high-quality content for the web that stays true to its creative roots — but faster, cheaper, and continuously? I'll show you how.

Heather is director of Organic Search at global digital ad agency VML, which performs search engine optimization services for multinational brands like Hill's Pet Nutrition, Electrolux/Frigidaire, Bridgestone, EXPRESS, and Wendy's.


12:25–12:55pm
5 Secrets: How to Execute Lean SEO to Increase Qualified Leads

Britney Muller, Moz
@BritneyMuller

I invite you to steal some of the ideas I've gleaned from managing SEO for the behemoth bad-ass Moz.com. Learn what it takes to move the needle on qualified leads, execute quick wins, and keep your head above water. I'll go over my biggest Moz.com successes, failures, tests, and lessons.

Britney is a Minnesota native who moved to Colorado to fulfill a dream of being a snowboard bum! After 50+ days on the mountain her first season, she got stir-crazy and taught herself how to program, then found her way into SEO while writing for a local realtor.


12:55–02:25pm
Lunch


02:30–03:15pm
SEO Experimentation for Big-Time Results

Stephanie Chang, Etsy
@stephpchang

One of the biggest business hurdles any brand faces is how to prioritize and validate SEO recommendations. This presentation describes an SEO experimentation framework you can use to effectively test how changes made to your pages affect SEO performance.

Stephanie currently leads the Global Acquisition & Retention Marketing teams at Etsy. Previously, she was a Senior Consultant at Distilled.



03:15–03:45pm
Reverse-Engineer Google's Research to Serve Up the Best, Most Relevant Content for Your Audience

Rob Bucci, STAT Search Analytics
@STATrob

The SERP is the front-end to Google's multi-billion dollar consumer research machine. They know what searchers want. In this data-heavy talk, Rob will teach you how to uncover what Google already knows about what web searchers are looking for. Using this knowledge, you can deliver the right content to the right searchers at the right time, every time.

Rob loves the challenge of staying ahead of the changes Google makes to their SERPs. When not working, you can usually find him hiking up a mountain, falling down a ski slope, or splashing around in the ocean.


03:45–04:15pm
PM Break


04:20–05:05pm
Inside the Googling Mind: An SEO's Guide to Winning Clicks, Hearts, & Rankings in the Years Ahead

Rand Fishkin, Founder of Moz, doer of SEO, feminist
@randfish

Searcher behavior, intent, and satisfaction are on the verge of overtaking classic SEO inputs (keywords, links, on-page, etc). In this presentation, Rand will examine the shift that behavioral signals have caused, and list the step-by-step process to build a strategy that can thrive long-term in Google's new reality.

Rand Fishkin is the founder and former CEO of Moz, co-author of a pair of books on SEO, and co-founder of Inbound.org. Rand's an un-save-able addict of all things content, search, and social on the web.


07:00–11:30pm
MozCon Bash

Join us at Garage Billiards for an evening of networking, billiards, bowling, and karaoke with MozCon friends new and old. Don't forget to bring your MozCon badge and US ID or passport.


Additional Pre-MozCon Sunday Workshops


12:30pm–5:05pm
SEO Intensive

Offered as 75-minute sessions, the five workshops will be taught by Mozzers Rand Fishkin, Britney Muller, Brian Childs, Russ Jones, and Dr. Pete. Topics include The 10 Jobs of SEO-focused Content, Keyword Targeting for RankBrain and Beyond, and Risk-Averse Link Building at Scale, among others.

These workshops are separate from MozCon; you'll need a ticket to attend them.


Amped up for a talk or ten? Curious about new methods? Excited to learn? Get your ticket before they sell out:

Snag my ticket to MozCon 2017!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog http://ift.tt/2qXxH8w
via IFTTT

Social Media Today