Innovate not imitate!

Interested in the latest Growth hacks?

Welcome to our blog!

We want to help you start, manage, and grow your business using innovative strategies and implementation. We have a passion for helping businesses and companies of various sizes see the same success that we have achieved.

Our skill sets are wide and varied, spanning business strategy, marketing, and online strategy. An increasing number of companies are turning to the internet and online media as a means of maximising their marketing reach and exposure. This is a special area of focus for us, and we do more than simple SEO strategies.

See our website for more: www.innovatetoaccelerate.com

Friday 31 January 2020

Four common Google Analytics myths busted

Google Analytics is a powerful tool that’s unprecedented in its ability to measure your website’s performance. The data it gathers are invaluable to you as a marketer: they can give you a clear view of the decisions you need to make to benefit your brand. Data, however, are just numbers and graphs. On their own, they cannot tell a story. It’s your job as a marketer to deduce that story through sound and unbiased analysis, and not to fall for Google Analytics myths.

If Google Analytics terms and data confuse you more than they enlighten you, this article will help you understand four Google Analytics and SEO-related myths you need to avoid.

How do I use Google Analytics?

Business owners use Google Analytics (GA) to see what they’re doing right, in terms of getting quality traffic to their sites. If you’re a business owner hoping to expand your presence in online spheres, you’ll need analytics to measure your success.

With the use of metrics, Google Analytics tracks who visits your site, how long they stay, what device they’re using, and what link brought them there. With these data, you can discover how to improve your online marketing and SEO strategies.

Google Analytics basics

At first, it may seem like Google Analytics is serving you raw data that are too complicated to digest. Learning to speak the analytics language, though, is easier than you think. Below are some basic terms to help you better understand the data reported by Google Analytics:

Pageviews

Pageviews are the total number of times users have viewed a page on your site. This includes instances in which users refresh the page or jump to another page and promptly come back to the page they had just left. This metric highlights which pages are most popular.

Visits/Sessions

A session covers the time a user spends on your website, regardless of whether they navigate a single page or several. Sessions are limited to a 30-minute inactivity window. This means that if a user stays on the site for 30 minutes without interacting with the page, the session ends. If they leave the site and come back within 30 minutes, though, it is still counted as the same session.

Average session duration refers to the average time users spent on your site. Pages per session, on the other hand, is the average number of pages that users view on your site within a single session.
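
To make the 30-minute rule concrete, here is a minimal Python sketch of the counting idea (an illustration of inactivity-based sessionization, not Google's actual implementation):

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def count_sessions(hit_times):
    """Group a user's hit timestamps into sessions.

    A new session starts whenever the gap since the previous
    hit exceeds the 30-minute inactivity timeout.
    """
    sessions = 0
    last_hit = None
    for hit in sorted(hit_times):
        if last_hit is None or hit - last_hit > SESSION_TIMEOUT:
            sessions += 1  # inactivity gap too long: a new session begins
        last_hit = hit
    return sessions

hits = [datetime(2020, 1, 31, 9, 0),
        datetime(2020, 1, 31, 9, 10),   # 10 minutes later: same session
        datetime(2020, 1, 31, 10, 0)]   # 50-minute gap: a second session
print(count_sessions(hits))  # 2
```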

Time on Page

This refers to the average time users spend on a page on your site, which can help you determine which pages users typically spend longer on. The clock starts the second a pageview is counted and stops when the next pageview begins.

Traffic

Traffic refers to the number of people accessing your website. It comes from a traffic source: any place users come from before they are led to your pages.

Traffic is classified into direct and referral. Direct traffic comes from users typing the full URL themselves or following a URL they were given directly, without searching for it. Referral traffic arrives through links on other sites, such as search results or social media.

Unique Pageviews

A unique pageview is counted when a page is viewed at least once by a user in a single session; repeat visits to that page within the same session are not counted again. For example, if a user navigates the whole site in one session and returns to the original page three times, the unique pageview count is still one, not three.
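
The distinction is easy to see in a toy example, assuming a simple log of (session, page) hits:

```python
from collections import Counter

# A log of (session_id, page) hits: one session, /home viewed three times
hits = [("s1", "/home"), ("s1", "/about"), ("s1", "/home"),
        ("s1", "/pricing"), ("s1", "/home")]

# Pageviews count every hit; unique pageviews count each page
# at most once per session
pageviews = Counter(page for _, page in hits)
unique_pageviews = Counter(page for _, page in set(hits))

print(pageviews["/home"])         # 3
print(unique_pageviews["/home"])  # 1
```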

Unique Visitors

When a user visits your site for the first time, a unique visitor and a new visit are counted. Google Analytics uses cookies to determine this. If the same user comes back to the site on the same browser and device, only a new visit is counted. But if that user deletes their cookies or accesses the site through a different browser or device, they may be falsely counted as another unique visitor.

Hits

Hits are interactions or requests made to a site, including pageviews, events, and transactions. A group of hits makes up a session and is used to gauge a user’s engagement with the website.

Clicks

Clicks are measured by the number of clicks you get from search engine results. Click-through rate (CTR) is the total number of clicks divided by impressions, the number of times your page appears in users’ search results. If CTR is dropping, consider writing titles and meta descriptions that capture your users’ attention better.
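
As a quick worked example of that formula (the numbers are hypothetical):

```python
def click_through_rate(clicks, impressions):
    """CTR = clicks / impressions, expressed as a percentage."""
    return 100.0 * clicks / impressions if impressions else 0.0

# e.g. 120 clicks from 4,000 appearances in search results
print(f"{click_through_rate(120, 4000):.1f}%")  # 3.0%
```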

Events

Events are actions users take on a site, such as clicking buttons to reach other pages or downloading files. Here you are looking at what kind of content encourages users to interact with the page, thereby triggering an event.

Bounce rate

Bounce rate refers to the share of single-page sessions, in which a user lands on a page and exits quickly without interacting with any element on it. A high bounce rate can mean either that users swiftly found what they were looking for or that they did not find the content on the page interesting enough to stay longer and engage.

Goals

You can input goals in your Google Analytics account to track user interactions on your site. These interactions include submitting a report, subscribing to your newsletter, or downloading files. If the user performs an event that you’ve identified as a goal, Analytics counts this as a conversion.

Four common Google Analytics myths debunked

Now that you have an overview of Google Analytics terms, below are four common misconceptions surrounding those terms and how to avoid them as a marketer.

1. The more traffic that goes to your site, the better

The myth

Generally, you’d want more people to visit your site. A huge volume of visits, though, won’t matter if it doesn’t turn into conversions. Even if thousands of people flock to your webpages each day, if they don’t take the desired actions your SEO campaign is aiming for, these visits won’t provide any benefit to your site.

The truth

A good SEO strategy is built upon making sure that once you’ve garnered a pageview, the quality of your content drives the user to the desired action, such as subscribing to a newsletter.

Keyword research can help make sure that you use the right terms to get you a higher ranking on SERPs. The material on your site, however, is also crucial in satisfying your users’ queries, enough to get a conversion.

2. Users need to spend more time on webpages

The myth

Users spending a few quick seconds on your page is not entirely bad. This may mean that these users are looking for quick, precise answers. Quality SEO delivers this to them through well-placed keywords and concise content. Hence, if they quickly get the answers they need, they tend to leave the site immediately.

The truth

Quality SEO content ensures that your material is written in such a way that it invites users to learn more about the subject, which can be seen when they are led to another page on your site. This leads them one step closer to taking the desired action on your site.

3. The number of unique visitors is an accurate metric to measure audience traffic

The myth

An upsurge of unique visitors on your page doesn’t necessarily mean that your audience is growing. Unique visitors are measured by cookies, which Google uses to determine whether it’s a user’s first time on a site. The same user accessing the same page through a different browser, or a browser whose cookies have been cleared, is counted as a unique visitor too.

The truth

If you’re looking to study your audience, it’s not enough to look at how many of them go to your page. You can refer to the Audience > Demographics tab to see who is navigating your site and which marketing links directed them there. With this information, you can determine which types of content gather the most traffic and through which avenues it arrives, such as SERPs or social media posts.

4. Traffic reports are enough to tell if your campaign is successful

The myth

Looking at traffic reports alone is not enough to determine whether your SEO campaign is successful or your keyword research paid off. Although at first it seems as though heavy traffic signals an effective online marketing strategy, traffic only captures the quantitative aspect of your campaign and dismisses the qualitative side.

The truth

Make the most of all the reports in GA; together, they show how your campaign is going. Reports are valuable for addressing issues comprehensively instead of nitpicking a single aspect of a campaign because, for instance, one report suggests it’s not doing its job.

These points will help you clear the air when it comes to Google Analytics and help you correctly derive insights.

The post Four common Google Analytics myths busted appeared first on Search Engine Watch.




SEO for 2020 - Whiteboard Friday

Posted by BritneyMuller

It's a brand-new decade, rich with all the promise of a fresh start and new beginnings. But does that mean you should be doing anything different with regard to your SEO?

In this Whiteboard Friday, our Senior SEO Scientist Britney Muller offers a seventeen-point checklist of things you ought to keep in mind for executing on modern, effective SEO. You'll encounter both old favorites (optimizing title tags, anyone?) and cutting-edge ideas to power your search strategy from this year on into the future.


Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we are talking about SEO in 2020. What does that look like? How have things changed?

Do we need to be optimizing for favicons and BERT? We definitely don't. But here are some of the things that I feel we should be keeping an eye on. 

☑ Cover your bases with foundational SEO

Titles, metas, headers, alt text, site speed, robots.txt, site maps, UX, CRO, Analytics, etc.

Covering your bases with foundational SEO will continue to be incredibly important in 2020: basic things like title tags, meta descriptions, alt text, all of the SEO 101 things.

There have been some conversations in the industry lately about alt text and things of that nature. When Google is getting so good at figuring out and knowing what's in an image, why would we necessarily need to continue providing alt text?

But you have to remember we need to continue to make the web an accessible place, and so for accessibility purposes we should absolutely continue to do those things. But I do highly suggest you check out Google's Vision API and play around with that to see how good they've actually gotten. It's pretty cool.

☑ Schema markup

FAQ, Breadcrumbs, News, Business Info, etc.

Schema markup will continue to be really important, FAQ schema, breadcrumbs, business info. The News schema that now is occurring in voice results is really interesting. I think we will see this space continue to grow, and you can definitely leverage those different markup types for your website. 
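
As an illustration, FAQ markup is a JSON-LD object embedded in the page. Here is a minimal sketch that builds one following the schema.org FAQPage structure; the question and answer text are placeholders:

```python
import json

def faq_jsonld(qa_pairs):
    """Build a schema.org FAQPage JSON-LD block from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }

markup = faq_jsonld([("What is Whiteboard Friday?",
                      "A weekly video series about SEO from Moz.")])
# Embed the output in the page inside:
# <script type="application/ld+json"> ... </script>
print(json.dumps(markup, indent=2))
```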

☑ Research what matters for your industry!

Just to keep in mind, there's going to be a lot of articles and research and information coming at you about where things are going, what you should do to prepare, and I want you to take a strategic stance on your industry and what's important in your space.

While I might suggest page speed is going to be really important in 2020, is it for your industry? We should still worry about these things and still continue to improve them. But if you're able to take a clearer look at ranking factors and what appears to be a factor for your specific space, you can better prioritize your fixes and leverage industry information to help you focus.

☑ National SERPs will no longer be reliable

You need to be acquiring localized SERPs and rankings.

This has been the case for a while. We need to localize search results and rankings to get an accurate and clear picture of what's going on in search results. I was going to put E-A-T here and then kind of cross it off.

A lot of people feel E-A-T is a huge factor moving forward. For the purposes of this post: it's always been a factor. It's been that way for the last ten-plus years, and we need to continue doing that stuff despite these various updates. I think it's always been important, and it will continue to be so. 

☑ Write good and useful content for people

While you can't optimize for BERT, you can write better for NLP.

This helps optimize your text for natural language processing. It helps make it more accessible and friendly for BERT. While you can't necessarily optimize for something like BERT, you can just write really great content that people are looking for.

☑ Understand and fulfill searcher intent, and keep in mind that there's oftentimes multi-intent

One thing to think about in this space is that we've kind of gone from very, very specific keywords to this richer understanding of, okay, what is the intent behind these keywords? How can we organize that and provide even better value and content to our visitors? 

One way to go about that is to consider that Google houses the world's data. They know what people are searching for when they look for a particular thing in search. So put your detective glasses on and examine what it is that they're showing for a particular keyword.

Is there a common theme throughout the pages? Tailor your content and your intent to solve for that. You could write the best article in the world on DIY Halloween costumes, but if you're not providing those visual elements that you so clearly see in a Google search result page, you're never going to rank on page 1.

☑ Entity and topical integration baked into your IA

Have a rich understanding of your audience and what they're seeking.

This plays well into entities and topical understanding. Again, we've gone from keywords and now we want to have this richer, better awareness of keyword buckets. 

What are those topical things that people are looking for in your particular space? What are the entities, the people, places, or things that people are investigating in your space, and how can you better organize your website to provide some of those answers and those structures around those different pieces? That's incredibly important, and I look forward to seeing where this goes in 2020. 

☑ Optimize for featured snippets

Featured snippets are not going anywhere. They are here to stay. The best way to do this is to find the keywords that you currently rank on page 1 for that also have a featured snippet box. These are your opportunities. If you're on page 1, you're way more apt to potentially steal or rank for a featured snippet.

One of the best ways to do that is to provide really succinct, beautiful, easy-to-understand summaries, takeaways, etc., kind of mimic what other people are doing, but obviously don't copy or steal any of that. Really fun space to explore and get better at in 2020. 

☑ Invest in visuals

We see Google putting more authority behind visuals, whether it be in search or you name it. It is incredibly valuable for your SEO, whether it be unique images or video content that is organized in a structured way, where Google can provide those sections in that video search result. You can do all sorts of really neat things with visuals. 

☑ Cultivate engagement

This is good anyway, and we should have been doing this before. Gary Illyes was quoted as saying, "Comments are better for on-site engagement than social signals." I will let you interpret that how you will.

But I think it goes to show that engagement and creating this community is still going to be incredibly important moving forward into the future.

☑ Repurpose your content

Blog post → slides → audio → video

This is so important, and it will help you excel even more in 2020 if you find your top-performing web pages and repurpose them into maybe a SlideShare, maybe a YouTube video, maybe various pins on Pinterest, or answers on Quora.

You can start to refurbish your content and expand your reach online, which is really exciting. In addition to that, it's also interesting to play around with the idea of providing people options to consume your content. Even with this Whiteboard Friday, we could have an audio version that people could just listen to if they were on their commute. We have the transcription. Provide options for people to consume your content. 

☑ Prune or improve thin or low-quality pages

This has been incredibly powerful for me and many other SEOs I know in improving the perceived quality of a site. So consider testing meta no-indexing low-quality, thin pages on a website. Especially on larger websites, we see a pretty big impact there. 

☑ Get customer insights!

This will continue to be valuable in understanding your target market. It will be valuable for influencer marketing, for all sorts of reasons. One of the incredible tools currently in the works from our Whiteboard Friday extraordinaire, Rand Fishkin, is SparkToro. So you guys have to check that out when it gets released soon. Super exciting. 

☑ Find keyword opportunities in Google Search Console

It's shocking how few people do this and how accessible it is. If you go into your Google Search Console and you export as much data as you can around your queries, your click-through rate, your position, and impressions, you can do some incredible, simple visualizations to find opportunities.

For example, if this is the rank of your keywords and this is the click-through rate, where do you have high click-through rate but low ranking position? What are those opportunity keywords? Incredibly valuable. You can do this in all sorts of tools. One I recommend, and I will create a little tutorial for, is a free tool called Facets, made by Google for machine learning. It makes it really easy to just pick those apart. 
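
As a sketch of that analysis: assuming you export the Search Console queries report as a CSV with Query, Clicks, Impressions, CTR, and Position columns (the exact column names and CTR format can vary by export), a few lines of pandas will surface those opportunity keywords:

```python
import pandas as pd

df = pd.read_csv("gsc_queries.csv")  # hypothetical export filename

# CTR is often exported as a string like "3.5%"; convert it to a float
df["CTR"] = df["CTR"].str.rstrip("%").astype(float)

# Opportunity keywords: users click them unusually often, yet the
# page still ranks outside the top results
opportunities = df[(df["CTR"] > df["CTR"].median()) & (df["Position"] > 10)]

print(opportunities.sort_values("Impressions", ascending=False)
                   .head(20)[["Query", "CTR", "Position", "Impressions"]])
```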

☑ Target link-intent keywords

Here are a couple of quick link building tactics for 2020 that will hopefully continue to work very, very well. What I mean by link-intent keywords is searches like "[your keyword] statistics" or "[your keyword] facts".

These are searches people naturally want to reference. They want to link to it. They want to cite it in a presentation. If you can build really great content around those link-intent keywords, you can do incredibly well and naturally build links to a website. 

☑ Podcasts

Whether you're a guest or a host on a podcast, it's incredibly easy to get links. It's kind of a fun link building hack. 

☑ Provide unique research with visuals

Andy Crestodina does this so incredibly well. So explore creating your own unique research and not making it too commercial but valuable for users. I know this was a lot.

There's a lot going on in 2020, but I hope some of this is valuable to you. I truly can't wait to hear your thoughts on these recommendations, things you think I missed, things that you would remove or change. Please let us know down below in the comments, and I will see you all soon. Thanks.

Video transcription by Speechpad.com






Thursday 30 January 2020

Going international with SEO: How to make your WordPress site globally friendly

International expansion is an expected ambition for progressive websites. The online nature of this global reach means that the uncertainties, legal dangers, and cultural hazards are minimized. The world is at your fingertips, and the costs of reaching it successfully are minimal. The rationale for reaching out to a new audience, readership, viewership, or listenership may be one of opportunity, exciting new prospects, or high growth potential, or to escape a domestic audience that has become too saturated or competitive.

With only some limitations, the internet is a global phenomenon that effectively ties us all together with invisible strings. Send a Tweet from Prague and reply to it in Illinois. Publish an ebook in Seattle and share it with your friends in Beirut. There are practically no boundaries when it comes to sharing content online.

When it comes to your WordPress website, the one you’ve dedicated time, money, and energy to building, I expect that you will want it to possess the maximum global reach possible. This doesn’t just happen by chance; it requires some key features within your site. The following tips and suggested plugins should set you and your website on the path to international influence.

Four tips to help make your site globally friendly

1. Globalize your content

The foundation of an internationally appealing website is its content transcreation. This does not focus on the mere translation of words but ensures the recreation of meaning, intent, and context.

It is important to make sure that the meaning of the content does not change when translated into another language and that it does not convey your message wrongly. Cultural hazards are rife when it comes to international expansion of any kind. To be accepted and welcomed in a different geographical area, you cannot afford to display misunderstood and potentially offensive content.

Unsurprisingly, over 73% of the global market prefers websites with content in their native language. If people cannot understand the content on your website, you cannot hope to keep their interest. In the same vein, inaccurate translations just won’t cut it. The best option is to find a content writer who can craft the copy in a specific language for better quality content.

2. Avoid rigid localized options

Some websites choose the default domain and language based on dynamic geolocation IP tracking. Others do not have rigid local settings and allow their websites to be accessed by users from anywhere. If you are hoping to reach as many readers as possible, the latter option is best: no matter the country from which your website is browsed, it can be accessed without limitations of location.

3. Avoid using text on images

Google cannot translate text on images. This is the same for logos, headings, and other information. This can be majorly off-putting for readers who do not understand some parts of your website. Further, no translator or software that runs on your multilingual site can translate graphical text. Therefore, avoid it altogether for the best results, or keep it to a minimum for a more international audience.

4. Localize checkout and shipping factors

Whether your WordPress site is an online store or sells software as a service that doesn’t require any shipping at all, your checkout process should be appropriately localized. Currency options are fundamental to users taking that final step to make the purchase. There are WordPress plugins available to allow for multiple currencies to be displayed and chosen from.

If you are offering international shipping, then inform buyers beforehand whether or not the product is available for shipping to their local address. Make the option to convert the currency clear and choose a suitable API tool for currency conversions. To keep abandoned-cart figures in check, allow the user to view the delivery charges and taxes prior to checking out. Finally, remember that people from different locations are more comfortable with different payment methods, so be sure to provide multiple options.
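
To illustrate the currency step, here is a minimal sketch with hypothetical, hard-coded exchange rates; a real store would pull live rates from a currency API or let a WordPress plugin handle this:

```python
# Hypothetical exchange rates relative to USD; a real checkout would
# fetch these from a live currency-conversion API
RATES_FROM_USD = {"USD": 1.0, "EUR": 0.91, "GBP": 0.77, "JPY": 109.5}

def localized_price(amount_usd, currency):
    """Convert a USD price into the shopper's preferred currency."""
    rate = RATES_FROM_USD.get(currency)
    if rate is None:
        raise ValueError(f"Unsupported currency: {currency}")
    return round(amount_usd * rate, 2)

print(localized_price(49.99, "EUR"))  # 45.49
```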

Plugins to help make your site globally friendly

1. TranslatePress

This full-fledged WordPress multilingual plugin translates every aspect of your website. Its main feature is that it allows you to translate directly from the front end. It lets you easily switch languages during translation, and the live preview is updated instantly. All translations of content, theme, plugins, and even metadata can be made without changing the interface.

It is ideal for manual translations. Do it yourself or assign a custom translator ‘user role’ to any user on your site. Users will then be able to translate as and when they want, without needing access to the admin area.

Lastly, the plugin creates SEO-friendly URLs for all languages and boosts you up the local SEO results. Ranking well will make the extra effort of globalizing your site worthwhile. Once you have established yourself as an authoritative and respectably ranking website abroad, you’re in and can continue the normal operation of your site.

2. Multi-currency for WooCommerce

As discussed, the need for multiple currencies on your international online store is unchallenged. This plugin allows users to easily switch between currencies and make use of a currency exchange rate converter with no limits. It can be used to accept only one currency or all currencies. Multi-currency for WooCommerce will enhance your site’s user experience, and it will do so for free. It’s a no-brainer.

Implementing these can surely get you some good traction for your WordPress site on a global scale.

Feel free to share your thoughts and queries in the comments section.

The post Going international with SEO: How to make your WordPress site globally friendly appeared first on Search Engine Watch.




The perils of tricking Google’s algorithm

Let’s admit it, all of us are trying our best to please search engines and crack Google’s algorithm. After all, who doesn’t want some extra visibility and revenue?

Naturally, billions of websites are adopting innovative practices to gain Google’s attention and approval. In order to rank high on the SERP, businesses should comply with the Google updates that are introduced on a regular basis. But this, in no way, means finding loopholes in these search engine algorithms or adopting strategies to trick them. In fact, businesses employing such empty SEO tricks have to face the music later. Many firms already have experienced Google’s wrath in the past.

Google has been regularly introducing algorithm updates to improve the quality of its search results. But it also penalizes sites that employ unethical or outdated practices to rank higher, which can adversely impact a brand’s reputation and bottom line. Ideally, these updates should be used as a guide for improving a site’s UX; a better ranking on SERPs is an end result that will follow.

Read on to learn the ill effects of chasing Google’s algorithms. There’s also a bonus involved: you will learn some effective tips for staying on top of these updates while boosting your business reputation.

1. Google penalties

Google’s algorithm updates are designed to reward good content and to identify and penalize websites using unethical or outdated SEO practices. Google absolutely doesn’t approve of tactics like keyword stuffing, buying links, linking to penalized sites, and unnatural links. Algorithm updates such as Panda, Penguin, Pigeon, RankBrain, and the broad core updates all aim at improving the quality of search results for users.

Image: Google Webmaster Guidelines (Source: Google)

Thus, web developers, digital marketers, bloggers, and online businesses messing with these updates are penalized, sending their websites plummeting down the SERP.

Google can penalize such websites in two ways:

A. Algorithmic penalty

Several other factors can cause your ranking to go down. Yet, with the introduction of an update, there’s a fair chance that your website may be affected. This is especially true if your site doesn’t adhere to the specific parameters assessed by the update.

For instance, Google Panda assigns a quality score to your site after checking for duplicate content and keyword stuffing. If your site has duplicate content, its ranking is bound to suffer.

Similarly, the latest January 2020 Core Update will be checking websites for authoritative and relevant content with a healthy E-A-T rating. So, if your website violates any of the guidelines shared by Google, it will automatically be penalized or filtered.

Make sure you check for issues in your domain on Google Search Console at regular intervals.

B. Manual penalty

This is a direct result of your website being penalized by a Google employee for not complying with the search quality guidelines. Manual penalties are Google’s way of punishing websites with spammy behavior. The Manual Actions Report on Search Console allows you to check such penalties, offering an opportunity to resolve them.

Check out this infographic by DigitalThirdCoast that shares an analysis of the businesses that tried to cheat Google along with the repercussions they had to face later.

2. Loss of reputation and credibility

Businesses obsessed with algorithm updates not only attract penalties but also lose focus on improving their site’s UX. Either way, the business loses its reputation and credibility. Lost reputation means an immediate loss of potential revenue, benefiting no one else but the competition.

Check out what John Mueller, Webmaster Trends Analyst at Google has to say about cleaning up the mess after being slapped by a Google penalty.

Image: John Mueller’s comment about Google penalties (Source: Reddit)

Of course, there are ways to recover from Google penalties. But it takes a lot of effort to rebuild a business’s reputation and trustworthiness, let alone improve the firm’s online ranking and win back lost customers.

3. Marketing myopia

One of the gravest dangers of being preoccupied with Google algorithm updates is losing sight of the business vision and goals. Instead of focusing on the audience’s needs, the firm tends to adopt an inward-looking approach aimed only at satisfying Google.

Google will forever introduce these updates. There’s no end to their journey towards improving the quality of search results. Google is clearly focused on its vision. Are you?

Don’t lose sight of your vision. Use Google’s algorithm updates as a guide to steer closer to your business goals.

What can you do to rank better on Google?

1. Don’t perennially chase Google updates

Google makes minor changes to its algorithm almost every other day. In 2019 alone, multiple updates were reported; not all were confirmed, as Google is not always upfront about these updates.

Image: List of Google’s algorithm updates in 2019 (Source: Moz)

The sole objective of these updates is to create a better user experience. Merely chasing them and going all over the place with execution will not only land you with a penalty but also affect your reputation in the long term.

Stop obsessing about these updates and focus on making your website and content better each day.

2. Focus on delivering first-rate digital experience

Google’s algorithms are constantly judging and rating sites based on the quality of experience they offer and their E-A-T rating. In a nutshell, you need to prioritize these pointers.

A. Serve quality content

“Quality” seems to be a subjective term but not for Google. The search giant clearly states that the content on a website should be in-depth, relevant, useful, and from a credible source. Simply put, it asks us to create E-A-T worthy content.

This is especially true for the YMYL websites that affect an individual’s health, happiness, safety, or financial stability.

Image: Google’s page quality rating standards for YMYL websites (Source: Google’s Search Quality Guidelines)

Ask yourself these three questions when creating a piece of content:

  • Is the content contributor an expert on the subject? (Expertise)
  • Is the content contributor an influencer or an authority in the domain? (Authority)
  • Is this content accurate and from a credible source? (Trustworthiness)

B. Work on your backlink profile

Backlinks are one of the top ranking factors that help Google decide a website’s authority and credibility in its niche. Focus on getting quality backlinks from authority sites.

How?

Well, authoritative sites will award links to websites serving relevant, useful, and shareable content. Build authority by creating great content in various forms like videos, podcasts, case studies, infographics, and others.

You should also collaborate with experts for content-creation projects. For instance, expert roundups can not only strengthen your network with influential people in a niche but also provide solid content for your upcoming posts.

Image: Tip on building backlinks via roundup posts (Source: https://ift.tt/1LEd4n3)

Check out how RankWatch conducted an expert roundup involving 25 marketing experts like Rand Fishkin and Barry Adams to discuss the future of SEO. Such inbound link-building initiatives have earned the website a healthy number of backlinks from websites with strong page authority (PA).

Here are the results as seen on MozBar.

Image: Inbound link results on MozBar (Source: Moz Analytics)

C. Improve your site speed

A website’s bounce rate is directly proportional to its load time. Google recommends a page load time of under three seconds.

Image: How page load time affects traffic (Source: Think with Google)

If your website takes longer than three seconds to load, be prepared to wear Google’s “badge of shame”. You read that right: Google is planning to slap slow websites with this badge.

Image: Google’s badge of shame (Source: Chromium blog)

It’s best to take effective steps to improve your site speed, which will, in turn, boost your site’s UX and improve your ranking.
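
One rough way to spot-check server response time is a short script like the sketch below. It measures time to the HTTP response with the requests library, which approximates server speed only, not the full rendered load time that tools like PageSpeed Insights report; the URL is a placeholder:

```python
import requests

def response_time_seconds(url):
    """Return the time taken to receive the HTTP response headers."""
    response = requests.get(url, timeout=10)
    return response.elapsed.total_seconds()

for url in ["https://www.example.com/"]:  # replace with your own pages
    seconds = response_time_seconds(url)
    flag = "OK" if seconds < 3 else "SLOW"
    print(f"{url}: {seconds:.2f}s [{flag}]")
```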

D. Avoid over-optimizing webpages

Google will see through any unscrupulous SEO hacks employed to game the system. Build sites to improve your audience’s online experience, not to trick Google. We will touch on such unethical practices in the next point.

3. Play by the rules

Though Google isn’t transparent with its algorithm updates, it keeps sharing valuable tips for webmasters and content creators, encouraging them to serve quality content and boost their site’s UX. Use these tips to your advantage.

A. Take learnings from the search quality guidelines

Google wants webmasters to follow its guidelines when building sites and posting online content. So, it’s important to constantly stay updated about the current guidelines. Refer to the search quality guidelines when creating an SEO strategy for your business.

B. Avoid black and gray-hat SEO tactics

Avoid black-hat SEO techniques and monetization schemes like keyword stuffing, private blog networks, spammy links, and affiliate links, among others. Google also disapproves of gray-hat SEO tricks like buying expired domains, cloaking, dummy social accounts, and scraped content. These techniques may go unnoticed at first, but when used excessively they are spotted by Google, attracting a penalty.

Therefore, it’s best to avoid both these unethical SEO tactics that only focus on tricking the algorithm. Make delivering value to users a priority!

4. Check for crawl errors

At times, your website isn’t featured in the top search results because Google’s spiders haven’t crawled it. One of the major reasons for this is a possible error in your code. Use Google’s Index Coverage report and URL Inspection tool to identify and fix the gaps in your code.

Also, remember to optimize your crawl budget and ensure that your important webpages are not blocked in robots.txt. Finally, watch out for 301 and 302 redirect chains, which can eat into your crawl limit and cause the search engine crawler to stop crawling your site.
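
To illustrate the redirect-chain point, here is a minimal sketch using the requests library; it follows a URL and reports the 301/302 hops that sit in front of the final page (the URL is a placeholder):

```python
import requests

def redirect_chain(url):
    """Follow a URL and return the list of intermediate redirect hops."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds one entry per 301/302 hop that was followed
    return [(hop.status_code, hop.url) for hop in response.history]

for url in ["http://example.com/old-page"]:  # replace with your own URLs
    chain = redirect_chain(url)
    if len(chain) > 1:
        print(f"{url} passes through {len(chain)} redirects:")
        for status, hop_url in chain:
            print(f"  {status} -> {hop_url}")
```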

Wrapping up

If a website doesn’t enjoy high visibility on Google, it practically doesn’t exist. Therefore, everyone’s bending over backward to crack Google’s algorithm updates. However, businesses adopting strategies merely to trick Google are headed down a slippery slope.

Google’s algorithms are smart enough to identify and punish websites that are up to no good. So, take my advice: instead of trying to crack Google’s algorithm updates, work towards creating awesome content and offering the best experience to users. The tips shared in this post will guide you in the process.

George Konidis is the co-founder of Growing Search, a Canada-based digital marketing agency providing optimal SEO and link building services worldwide. He can be found on Twitter @georgekonidis.

The post The perils of tricking Google’s algorithm appeared first on Search Engine Watch.




Tuesday 28 January 2020

Word of advice on exactly what to expect from SEO in 2020

Between 2010 and 2015, the SEO industry went from being seen as a shady backroom box of tricks to a leading and essential marketing channel, driven by data, trends, and user behavior statistics.

With ongoing changes, Google kept SEO agencies, freelancers, and internal teams on their toes by releasing update after update to hone and shape not only what it wants search results to look like, but how it wants us to act and work within them. This included the once-famed Penguin update, aimed at webspam and link building practices, which supposedly impacted around 0.1% of searches when originally launched but went on to shape the importance of positive link building and the utilization of tools and data, and birthed job roles around SEO content strategy while strengthening the importance of content marketing.

Longer term, with the development of RankBrain, (perceived) closer-to-real-time algorithm changes, more regular core updates, and the journey from ‘Content is King’ to UX, SEO has become theorized in some sense, with many of us having our own opinions and approaches to the same end result.

As we’ve reached 2020, we have in some parts seen new developments from Google slow down, with the company’s focus seemingly on updating reporting suites and releasing core updates that offer little more than ‘an improvement to search results’. We’re no longer beholden to the next big Penguin or Panda update, but more to the inner workings of Google and sporadic updates to its Search Quality Guidelines. With this in mind, what exactly can we expect from SEO in 2020?

Will adhering to Google guidelines become harder, or easier?

We all know how SEO works and many of us will have specialisms or approaches to SEO we feel get results quicker, but with vague updates and unannounced tweaks to algorithms, is it becoming harder to adhere to Google’s guidelines?

Certainly, the unpredictability is a factor at times – with the recent updates to search guidelines on YMYL and E-A-T being announced, there’s a perception the goalposts are moving ever so slightly, every so often.

This means that if you’re scoring just inside the post on Monday, you might be wide of the mark by a fraction on Tuesday. For websites where the SEO team is at the mercy of web development or other factors outside of their control, this can prove a challenge.

Of course, any SEO agency or specialist worth their weight in gold will be able to outline and approach any issues with a solution in hand.

The flipside to this, however, is that we all have a clear idea of what a good website looks like and what is going to rank on page 1 for chosen keywords. With guideline updates, an industry that shares knowledge like no other, and a focus on developing strategies that are future-proof, there is no reason for every update to send SEO campaigns spiraling.

In 2020, we predict that the next wave of guidelines will be released, and that these will again focus on trust and authority, not a million miles away from where we’ve been for the last few years.

Actioning and adhering to search quality guidelines

Google’s Search Quality Guidelines are updated regularly. These guidelines reflect how Google wants you to work within a website and the process the search engine’s algorithm takes to evaluate the relevance of the website for keyword usage.

These guidelines take into account:

  • E-A-T – The Expertise, Authoritativeness, and Trustworthiness of the website in relation to the target subject
  • Page Quality – How the page is laid out, how it works and whether it has the user’s best interests at heart
  • Needs Met – Factors around whether the page ANSWERS the needs of the query

Page quality is assessed to identify where the text is placed, the wording used, the content used, and the quality of that content.

Google’s most recent updates put E-A-T elements at the heart of the Page Quality section of its guidelines, based on industry and type of product.

The blanket approach, and the actions needed to adhere to (or in fact exceed) Google guidelines are that the page should be “more specific than the query, but would still be helpful for many or most users because” the company is reputable in the area.

Top nine factors content managers should audit for on-page SEO

  • Landing Page URL – the URL of the landing page (after the website name)
  • Meta Title – the blue link that shows in Google search results
  • Meta Description – the text that shows under the blue link in search results, written to draw a user to click
  • Heading 1 Tag – a title that shows at the top of a page
  • Heading 2/3 Tags – additional titles placed within the content of a page
  • Content – the physical content on the page, which needs to meet particular criteria
  • Keyword Density – the percentage of keywords to total text on a page
  • Images – the size, name, and title of images on the page
  • Internal Links – links that point to other pages on the website
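
Keyword density from the list above is simple to compute. Here is a minimal sketch using a naive whole-phrase count; real tools differ in the details:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the page's words taken up by the keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    keyword_words = keyword.lower().split()
    n = len(keyword_words)
    if not words or n == 0:
        return 0.0
    matches = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == keyword_words
    )
    return 100.0 * matches * n / len(words)

page = "SEO audits matter. A thorough SEO audit covers content and links."
print(f"{keyword_density(page, 'seo'):.1f}%")  # 2 of 11 words -> 18.2%
```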

Dependence on technical SEO reduced but is still important

Technical SEO has been on the rise for a number of years, but the buzz behind it has somewhat plateaued in the last 12 months or so, although it is still essential to audit from a technical perspective regularly. Traditionally, technical SEO would include web structure, speed, hosting, and so on, with JSON-LD, markup, and structured tagging following on from this.

Across client bases, we’ve seen the need for technical SEO work drop by just under 50%, with wider-ranging audits, work with web development teams on new site builds, and regular health crawls being the norm. Working in this way allows time to be split effectively across multiple areas of SEO and makes better use of budget. Educating clients on the technical aspects also means SEO agencies and professionals can focus time elsewhere.

Within semi-regular technical SEO audits, there are some core elements to check, all of which will help identify issues and improve the technical performance of a website, without impacting the day-to-day of search marketing.

Top eight factors you should audit for technical SEO

  • Web Structure and URL Structure – essentially the folder structure the website is built on
  • HTTPS/SSL – security for customers or users visiting the site
  • HTML Build – the code behind the core elements of a website
  • CSS/JavaScript – the code behind the theme and functionality of a site
  • Schema/JSON – code that allows websites to send additional information to search engines
  • Server Speed – the speed at which servers respond to requests from users
  • Sitemaps/Robots – files used by Google to crawl websites
  • Accessibility – whether all pages are able to be found

2020 and beyond

As always, Google is likely to throw a couple of curveballs. However, the SEO industry is coming of age again, and it’s no longer an area of expertise that “anybody” can have a go at. There’s a need to understand the market of your clients, their customers, their collateral, and the demands of Google to achieve success. Following a clear structure, regular audits, and systematic approaches will allow all of the above to be achieved.

Keith Hodges, Head of Search at POLARIS, is an SEO expert with over eight years’ experience in the industry. 

The post Word of advice on exactly what to expect from SEO in 2020 appeared first on Search Engine Watch.




Monday 27 January 2020

Google's January 2020 Core Update: Has the Dust Settled?

Posted by Dr-Pete

On January 13th, MozCast measured significant algorithm flux lasting about three days (the dotted line shows the 30-day average prior to the 13th, which is consistent with historical averages) ...

That same day, Google announced the release of a core update dubbed the January 2020 Core Update (in line with their recent naming conventions) ...

On January 16th, Google announced the update was "mostly done," aligning fairly well with the measured temperatures in the graph above. Temperatures settled down after the three-day spike ...

It appears that the dust has mostly settled on the January 2020 Core Update. Interpreting core updates can be challenging, but are there any takeaways we can gather from the data?

How does it compare to other updates?

How did the January 2020 Core Update stack up against recent core updates? The chart below shows the previous four named core updates, back to August 2018 (AKA "Medic") ...

While the January 2020 update wasn't on par with "Medic," it tracks closely to the previous three updates. Note that all of these updates are well above the MozCast average. While not all named updates are measurable, all of the recent core updates have generated substantial ranking flux.

Which verticals were hit hardest?

MozCast is split into 20 verticals, matching Google AdWords categories. It can be tough to interpret single-day movement across categories, since they naturally vary, but here's the data for the range of the update (January 14–16) for the seven categories that topped 100°F on January 14 ...

Health tops the list, consistent with anecdotal evidence from previous core updates. One consistent finding, broadly speaking, is that sites impacted by one core update seem more likely to be impacted by subsequent core updates.

Who won and who lost this time?

Winners/losers analyses can be dangerous, for a few reasons. First, they depend on your particular data set. Second, humans have a knack for seeing patterns that aren't there. It's easy to take a couple of data points and over-generalize. Third, there are many ways to measure changes over time.

We can't entirely fix the first problem — that's the nature of data analysis. For the second problem, we have to trust you, the reader. We can partially address the third problem by making sure we're looking at changes both in absolute and relative terms. For example, knowing a site gained 100% SERP share isn't very interesting if it went from one ranking in our data set to two. So, for both of the following charts, we'll restrict our analysis to subdomains that had at least 25 rankings across MozCast's 10,000 SERPs on January 14th. We'll also display the raw ranking counts for some added perspective.
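
In code terms, that filtering step might look like the following sketch, with illustrative data and column names standing in for the real MozCast set:

```python
import pandas as pd

# Illustrative ranking counts per subdomain on the two dates
df = pd.DataFrame({
    "subdomain": ["a.example.com", "b.example.com", "c.example.com"],
    "jan14": [40, 25, 1],
    "jan16": [80, 20, 2],
})

# Require at least 25 rankings on Jan 14 so tiny sites (like the 1 -> 2
# case, a misleading +100%) don't dominate the percentage view
df = df[df["jan14"] >= 25].copy()
df["pct_change"] = 100.0 * (df["jan16"] - df["jan14"]) / df["jan14"]

winners = df.sort_values("pct_change", ascending=False).head(25)
print(winners[["subdomain", "jan14", "jan16", "pct_change"]])
```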

Here are the top 25 winners by % change over the 3 days of the update. The "Jan 14" and "Jan 16" columns represent the total count of rankings (i.e. SERP share) on those days ...

If you've read about previous core updates, you may see a couple of familiar subdomains, including VeryWellHealth.com and a couple of its cousins. Even at a glance, this list goes well beyond healthcare and represents a healthy mix of verticals and some major players, including Instagram and the Google Play store.

I hate to use the word "losers," and there's no way to tell why any given site gained or lost rankings during this time period (it may not be due to the core update), but I'll present the data as impartially as possible. Here are the 25 sites that lost the most rankings by percentage change ...

Orbitz took heavy losses in our data set, as did the phone number lookup site ZabaSearch. Interestingly, one of the Very Well family of sites (three of which were in our top 25 list) landed in the bottom 25. There are a handful of healthcare sites in the mix, including the reputable Cleveland Clinic (although this appears to be primarily a patient portal).

What can we do about any of this?

Google describes core updates as "significant, broad changes to our search algorithms and systems ... designed to ensure that overall, we’re delivering on our mission to present relevant and authoritative content to searchers." They're quick to say that a core update isn't a penalty and that "there’s nothing wrong with pages that may perform less well." Of course, that's cold comfort if your site was negatively impacted.

We know that content quality matters, but that's a vague concept that can be hard to pin down. If you've taken losses in a core update, it is worth assessing if your content is well matched to the needs of your visitors, including whether it's accurate, up to date, and generally written in a way that demonstrates expertise.

We also know that sites impacted by one core update seem to be more likely to see movement in subsequent core updates. So, if you've been hit in one of the core updates since "Medic," keep your eyes open. This is a work in progress, and Google is making adjustments as they go.

Ultimately, the impact of core updates gives us clues about Google's broader intent and how best to align with that intent. Look at sites that performed well and try to understand how they might be serving their core audiences. If you lost rankings, are they rankings that matter? Was your content really a match to the intent of those searchers?






The Dirty Little Featured Snippet Secret: Where Humans Rely on Algorithmic Intervention [Case Study]

Posted by brodieclarkconsulting

I recently finished a project where I was tasked with investigating why a site (that receives over one million organic visits per month) does not rank for any featured snippets.

This is obviously an alarming situation, since ~15% of all result pages, according to the MozCast, have a featured snippet as a SERP feature. The project was passed on to me by an industry friend. I’ve done a lot of research on featured snippets in the past. I rarely do once-off projects, but this one really caught my attention. I was determined to figure out what issue was impacting the site.

In this post, I detail my methodology for the project that I delivered, along with key takeaways for my client and others who might be faced with a similar situation. But before I dive deep into my analysis: this post does NOT have a fairy-tale ending. I wasn’t able to unclog a drain that resulted in thousands of new visitors.

I did, however, deliver massive amounts of closure for my client, allowing them to move on and invest resources into areas which will have a long-lasting impact.

Confirming suspicions with Big Data

Now, when my client first came to me, they had their own suspicions about what was happening. They had been advised by other consultants on what to do.

They had been told that the featured snippet issue was stemming from either:

1. An issue relating to conflicting structured data on the site

OR

2. An issue relating to messy HTML which was preventing the site from appearing within featured snippet results

I immediately shut down the first issue as a cause for featured snippets not appearing. I’ve written about this topic extensively in the past. Structured data (in the context of schema.org) does NOT influence featured snippets. You can read more about this in my post on Search Engine Land.

As for the second point, this is closer to reality, yet still far from it. Yes, HTML structure helps considerably when trying to rank for featured snippets. But to the point where a site that ranks for almost a million keywords doesn’t rank for any featured snippets at all? Very unlikely. There’s more to this story, but let’s confirm our suspicions first.


Let’s start from the top. Here’s what the estimated organic traffic looks like:

Note: I’m unable to show the actual traffic for this site due to confidentiality. But the monthly estimation that Ahrefs gives of 1.6M isn’t far off.

Out of the 1.6M monthly organic visits, Ahrefs picks up on 873K organic keywords. When filtering these keywords by SERP features with a featured snippet and ordering by position, you get the following:

I then did similar research with both Moz Pro using their featured snippet filtering capabilities as well as SEMrush, allowing me to see historical ranking.

All three tools displayed the same result: the site did not rank for any featured snippets at all, despite ~20% of my client's organic keywords including a featured snippet as a SERP feature (higher than the MozCast average).

It was clear that the site did not rank for any featured snippets on Google. But who was taking this position away from my client?

The next step was to investigate whether other sites are ranking within the same niche. If they were, then this would be a clear sign of a problem.

An “us” vs “them” comparison

Again, we turn back to our tools, this time to figure out the top sites based on keyword similarity. Here’s an example of this in action within Moz Pro:

Once we have our final list of similar sites, we need to complete the same analysis that was completed in the previous section of this post to see if they rank for any featured snippets.

With this analysis, we can figure out whether they have featured snippets displaying or not, along with the % of their organic keywords with a featured snippet as a SERP feature.

The next step is to add all of this data to a Google Sheet and see how everything matches up to my client's site. Here’s what this data looks like for my client:

I now need to dig deeper into the sites in my table. Are they really all that relevant, or are my tools just picking up on a subset of queries that are similar?

I found that from row 8 downwards in my table, those sites weren’t all that similar. I excluded them from my final dataset to keep things as relevant as possible.

Based on this data, I could see five other sites that were similar to my client's. Out of those five sites, only one had results where they were ranking within a featured snippet.

80% of similar sites to my client's site had the exact same issue. This is extremely important information to keep in mind going forward.

Although the sample size is considerably lower, one of those sites is unable to be featured on ~34% of the search results it ranks for. Comparatively, this is quite problematic for that site (considering the 20% figure from my client's situation).

This analysis has been useful in figuring out whether the issue was specific to my client or the entire niche. But do we have guidelines from Google to back this up?

Google featured snippet support documentation

Within Google’s Featured Snippet Documentation, they provide details on policies surrounding the SERP feature. This is public information. But I think a very high percentage of SEOs aren’t aware (based on multiple discussions I’ve had) of how impactful some of these details can be.

For instance, the guidelines state that: 

"Because of this prominent treatment, featured snippet text, images, and the pages they come from should not violate these policies." 

They then mention 5 categories:

  1. Sexually explicit
  2. Hateful
  3. Violent
  4. Dangerous and harmful
  5. Lack consensus on public interest topics

Number five in particular is an interesting one. This section is not as clear as the other four and requires some interpretation. Google explains this category in the following way:

"Featured snippets about public interest content — including civic, medical, scientific, and historical issues — should not lack well-established or expert consensus support."

And the even more interesting part in all of this: these policies do not apply to web search listings nor cause those to be removed.

It can be lights out for featured snippets if you fall into one of these categories, yet you can still rank highly within the 10-blue-link results. A bit of an odd situation.

Based on my knowledge of the client, I couldn’t say for sure whether any of the five categories were to blame for their problem. It certainly looked like an algorithmic intervention, though (and I had my suspicions about which category was the potential cause).

But there was no way of confirming this. The site didn’t have a manual action within Google Search Console, and that is literally the only way Google communicates something like this to site owners.

I needed someone on the inside at Google to help.

The missing piece: Official site-specific feedback from Google

One of the most underused resources in an SEO's toolkit, in my opinion, is the Google Webmaster Hangouts held by John Mueller.

You can see the schedule for these Hangouts on YouTube here and join live to ask John a question directly if you want. You could always try John on Twitter too, but there’s nothing like video.

You’re given the opportunity to explain your question in detail. John can easily ask for clarification, and you can have a quick back-and-forth that gets to the bottom of your problem.

This is what I did in order to figure out this situation. I spoke with John live on the Hangout for ~5 minutes; you can watch my segment here if you’re interested. The result was that John gave me his email address and I was able to send through the site for him to check with the ranking team at Google.

I followed up with John on Twitter to see if he was able to get any information from the team on my client's situation. You can follow the link above to see the full exchange, but John’s feedback was that there wasn't a manual penalty in place for my client's site; it was purely algorithmic. This meant the algorithm was deciding that the site was not allowed to rank within featured snippets.

And an important component of John’s response:


If a site doesn’t rank for any featured snippets when they're already ranking highly within organic results on Google (say, within positions 1–5), there is no way to force it to rank.

For me, this is a dirty little secret in a way (hence the title of this article). Google’s algorithms may decide that a site can’t show in a featured snippet (but could rank #2 consistently), and there's nothing a site owner can do.

...and the end result?

The result of this, in the specific niche that my client is in, is that lots of smaller, seemingly less relevant sites (as a whole) are the ones that are ranking in featured snippets. Do these sites provide the best answer? Well, the organic 10-blue-links ranking algorithm doesn’t think so, but the featured snippet algorithm does.

This means that the site has a lot of queries which have a low CTR, resulting in considerably less traffic coming through to the site. Sure, featured snippets sometimes don’t drive much traffic. But they certainly get a lot more attention than the organic listings below:

In the Nielsen Norman Group study, when SERP features (like featured snippets) were present on a SERP, they received looks in 74% of cases (with a 95% confidence interval of 66–81%). This data clearly points to the fact that ranking within featured snippets, where possible, gives sites far greater visibility.

Because Google’s algorithm is making this decision, it's likely a liability thing; Google (the people involved with the search engine) don’t want to be the ones to have to make that call. It’s a tricky one. I understand why Google needs to put these systems in place for their search engine (scale is important), but communication could be drastically improved for these types of algorithmic interventions. Even if it isn’t a manual intervention, there ought to be some sort of notification within Google Search Console. Otherwise, site owners will just invest in R&D trying to get their site to rank within featured snippets (which is only natural).

And again, just because there are categories listed in the featured snippet policy documentation, that doesn’t mean the curiosity of site owners will go away. There will always be the “what if?”

Deep down, I’m not so sure Google will ever make this addition to Google Search Console. It would mean too much communication on the matter, and could lead to unnecessary disputes with site owners who feel they’ve been wronged. Something needs to change, though. There needs to be less ambiguity for the average site owner who doesn’t know they can access awesome people from the Google Search team directly. But for the moment, it will remain Google’s dirty little featured snippet secret.





from The Moz Blog https://ift.tt/38IEuYl
via IFTTT

Friday 24 January 2020

Measure Form Usage with Event Tracking - Whiteboard Friday

Posted by Matthew_Edgar

When it comes to the forms your site visitors are using, you need to go beyond completions — it's important to understand how people are interacting with them, where the strengths lie and what errors might be complicating the experience. In this edition of Whiteboard Friday, Matthew Edgar takes you through in-depth form tracking in Google Analytics. 


Video Transcription

Howdy, Moz fans. My name is Matthew Edgar. Welcome to another edition of Whiteboard Friday. I am an analytics consultant at Elementive, and in this Whiteboard Friday what I want to talk to you about are new ways that we can really start tracking how people are interacting with our forms.

I'm going to assume that all of you who have a form on your website are already tracking it in some way. You're looking at goal completions on the form, you're measuring how many people arrived on that page that includes the form, and what we want to do now is we want to take that to a deeper level so we can really understand how people are not just completing the form, but how they're really interacting with that form.

So what I want to cover are how people really interact with the form on your website, how people really interact with the fields when they submit the form, and then also what kind of errors are occurring on the form that are holding back conversions and hurting the experience on your site. 

1. What fields are used?

So let's begin by talking about what fields people are using and what fields they're really interacting with.

So in this video, I want to use just an example of a registration form. Pretty simple registration form. Fields for name, company name, email address, phone number, revenue, and sales per day, basic information. We've all seen forms like this on different websites. So what we want to know is not just how many people arrived on this page, looked at this form, how many people completed this form.

What we want to know is: Well, how many people clicked into any one of these fields? So for that, we can use event tracking in Google Analytics. If you don't have Google Analytics, that's okay. There are other ways to do this with other tools as well. So in Google Analytics, what we want to do is we want to send an event through every time somebody clicks or taps into any one of these fields.

On focus

So for that, we're going to send an on focus event. The category can be form. Action is interact. Then the label is just the name of the field, so email address or phone number or whatever field they were interacting with. Then in Google Analytics, what we'll be able to look at, once we drill into the label, is we'll be able to say, "Well, how many times in total did people interact with that particular field?"
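To make that concrete, here's a minimal sketch of what the on-focus tracking might look like with Universal Analytics' analytics.js. The form id and selectors are assumptions; adapt them to your own markup (gtag.js users would send the category and label as event parameters instead).

    // A minimal sketch, assuming analytics.js (Universal Analytics) is loaded
    // and the form uses the hypothetical id "registration-form".
    document
      .querySelectorAll('#registration-form input, #registration-form select')
      .forEach(function (field) {
        field.addEventListener('focus', function () {
          // Category: form, Action: interact, Label: the field's name
          ga('send', 'event', 'form', 'interact', field.name);
        });
      });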

GA report

So people interacted with the name field 104 times, the revenue field 89 times, sales per day 64 times, and phone number 59 times. Then we could go through all the other fields too to look at that. What this total information starts to give us is an idea of: Well, where are people struggling? Where are people having to really spend a lot of time? Then it also gives us an idea of the drop-off rate.

So we can see here that, well, 104 people interacted with the full name field, but only 89 made it down here to the revenue field. So we're losing people along the way. Is that a design issue? Is that something about the experience of interacting with this form? Maybe it's a device issue. We have a lot of people on mobile and maybe they can't see all of those fields. The next thing we can look at here is the unique events that are happening for each of those.

Unique events aren't an exact measure, but they're close enough to give a general idea of how many unique people interacted with those fields. So in the case of the name field, 102 people interacted 104 times, roughly speaking, which makes sense. People don't need to go back to the name field and enter their name again. But in the case of the revenue field, there were 47 unique interactions and 89 total interactions.

People are having to go back to this field. They're having to reconsider what they want to put in there. So we can start to figure out, well, why is that? Is that because people aren't sure what kind of answer to give? Are they not comfortable giving up that answer? Are there some trust factors on our site that we need to improve? If we really start to dig into that and look at that information, we can start to figure out, well, what's it going to take to get more people interacting with this form, and what's it going to take to get more people clicking that Submit button?

2. What fields do people submit?

The next thing that we want to look at here is what fields do people submit. Not just what do they interact with, but when they click that Submit button, which fields have they actually put information into? 

On submit

So for this, when people click that Submit button, we can trigger another event to send along to Google Analytics. In this case, the category is form, the action is submit, and then for the label what we want to do is we want to send just a list of all the different fields that people had put some kind of information in.

So there's a lot of different ways to do this. It really just depends on what kind of form you have, how your form is controlled. One easy way is you have a JavaScript function that just loops through your entire form and says, "Well, which of these fields have a value, have something that's not the default entry, that people actually did give their information to?" One note here is that if you are going to loop through those fields on your form and figure out which ones people interacted with and put information into, you want to make sure that you're only getting the name of the field and not the value of the field.

We don't want to send along the person's email address or the person's phone number. We just want to know that they did put something in the email address field or in the phone number field. We don't want any of that personally identifiable information ending up in our reports. 
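Here's a minimal sketch of that submit-time loop, again assuming analytics.js and a hypothetical "registration-form" id. Note that only field names are collected; values are never read into the label:

    // On submit, list the names of the fields that contain a value and send
    // them as the event label. Field values are deliberately never included,
    // so no personally identifiable information reaches your reports.
    document.getElementById('registration-form')
      .addEventListener('submit', function () {
        var filled = [];
        this.querySelectorAll('input, select, textarea').forEach(function (field) {
          if (field.name && field.value && field.value.trim() !== '') {
            filled.push(field.name); // the name only, never the value
          }
        });
        ga('send', 'event', 'form', 'submit', filled.join(','));
      });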

Review frequency

So what we can do with this is we can look at: Well, how frequently did people submit any one of these fields?

So 53 submissions with the full name field, 46 with revenue, 42 with sales per day, etc. 

Compare by interact

The first thing we can do here is we can compare this to the interaction information, and we can say, "Well, there were 53 times that people submitted the form with the full name field filled out. But there were 102 people who interacted with that full name field."

That's quite the difference. So now we know, well, what kind of opportunity exists for us to clean this up. We had 102 people who hit this form, who started filling it out, but only 53 ended up putting in their full name when they clicked that Submit button. There's some opportunity there to get more people filling out this form and submitting.
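As a quick illustration of that comparison, using the example numbers from the reports above:

    // Field-level completion rate: submissions that included the field,
    // divided by unique interactions with that field (example numbers only).
    var uniqueInteractions = { fullName: 102, revenue: 47 };
    var fieldSubmissions = { fullName: 53, revenue: 46 };

    Object.keys(uniqueInteractions).forEach(function (field) {
      var rate = (fieldSubmissions[field] / uniqueInteractions[field]) * 100;
      console.log(field + ': ' + rate.toFixed(0) + '% completion');
    });
    // Logs: fullName: 52% completion, revenue: 98% completion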

Segment by source

The other thing we can do is we can segment this by source. The reason we would want to do that is we want to compare this to understand something about the quality of these submissions. So we might know that, well, people who give us their phone number, that tends to be a better quality submission on our form. Not necessarily. There are exceptions and edge cases to be sure.

But generally speaking, we know people who give us their phone number are better quality. So by segmenting by source, we can say, "Well, people coming in from which source are more likely to give their phone number?" That gives us an idea of which source we might want to go after. Maybe your ad network is doing a really good job of driving people who fill out their phone number. Or maybe organic is doing a better job of driving people who submit that information.

3. What fields cause problems?

The next thing we want to look at on our form is which errors are occurring. What problems are happening here? 

Errors, slips, mistakes

When we're talking about problems, when we're talking about errors, it's not just the technical errors that are occurring. It's also the user errors that are occurring, the slips, the mistakes that people are just naturally going to make as they work through your form.

Assign unique ID to each error

The easiest way to track this is every time an error is returned to the visitor, we want to pass an event along to Google Analytics. So for that, what we can do is we can assign a unique ID number to each error on our website, and that unique ID number can be for each specific error. So people who forgot a digit on a phone number, that's one ID number. People who forgot the phone number altogether, that's a different ID number. 

On return of error

When that error gets returned, we'll pass along an event where the category is form, the action is error, and the label is that unique ID number.
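Here's a minimal sketch of how those error events might be sent, with hypothetical error IDs and a hypothetical reportFormError helper wired into your existing validation:

    // Give every distinct error a stable ID, then report that ID each time
    // the error is shown to the visitor. IDs and checks here are examples.
    function reportFormError(errorId) {
      ga('send', 'event', 'form', 'error', 'error-' + errorId);
    }

    var phone = document.querySelector('input[name="phone"]');
    var digits = phone.value.replace(/\D/g, '');
    if (digits.length === 0) {
      reportFormError(2); // ID 2: phone number missing altogether
    } else if (digits.length < 10) {
      reportFormError(1); // ID 1: phone number missing a digit
    }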

Frequency of errors

The first thing we can look at is the frequency of how frequently each error occurs. So we can say, "Well, Error ID No. 1 occurred 37 times, and Error ID No. 2 occurred 26 times."

Segment by form completion

It starts to give us an idea of how to prioritize these errors. But the more interesting thing to look at is we want to segment by the form completion, and then we can compare these two. So we can say, "Okay, people who completed this form, how often did they get these errors?" So in this case, we can say, "Well, Error ID No. 1, 29 people got it, but 27 people who submitted this form got it."



That means pretty much everybody who got that error was able to move beyond the error and submit the form. It's not that big of a deal. It's not hurting the experience on our site all that much. It's not hurting conversions all that much. Error ID No. 4 though, 19 people got the error, but only 3 of the people who got that error were able to submit the form. Clearly whatever this ID is, whatever this error is, that's the one that's really hurting the experience on our site.

That's the one that's really going to hurt conversions. So by improving or figuring out why that error is occurring, then we can start to improve conversions on our site. I hope these ideas have given you some new ways to really track and understand how people are interacting with your forms at a deeper level.

I look forward to hearing your comments about different things you're doing on your forms, and certainly if you start using any of these ideas, what kind of insights you're gaining from them. Thank you.

Video transcription by Speechpad.com





from The Moz Blog https://ift.tt/2RKdQHR
via IFTTT

Wednesday 22 January 2020

New study: Majority of consumers are unaware of how search engines work

As brands and their marketing departments deploy strategies to capitalize on record ecommerce spending — which soared to $586.92 billion in 2019 — new research from BrandVerity, a leading provider of brand protection solutions, has brought to light important findings and hidden risks pertaining to the journeys consumers are taking online.

In order to give brands a better understanding of the search experiences their customers are having, and how those experiences impact brand perception and customer experience, BrandVerity commissioned the “BrandVerity’s Online Consumer Search Trends 2020” research study in Q4 of 2019, surveying over 1,000 US consumers balanced against the US population for age, gender, region, and income.

Amongst the many findings, three main themes stood out:

Consumers confused by how search engine results work

Only 37% of consumers understand that search engine results are categorized by a combination of relevance and advertising spend.

The other 63% of consumers believe that Search Engine Results Pages (SERPs) are categorized by either relevance or spend, or they simply “don’t know.”

Additionally, nearly 1-in-3 consumers (31%) say they don’t believe search engines (e.g. Google) do a good job of labeling which links are ads.

Consumers more inclined to click on the result that appears first

Without a clear understanding of how search results are served up, consumers are more inclined to click on the result that appears first, believing it to be the most relevant option.

With 54% of consumers saying they place more trust in websites that appear at the top of the SERP, this isn’t just an assumption.

Consumers feel misled by the website they find in the search engine results

51% of consumers say that when searching for information on a product, they sometimes feel misled by one of the websites in the search results.

An additional 1-in-4 report feeling misled “often” or “always.”

Even further, 25% also say they often end up somewhere unexpected that does not provide them with what they were looking for when clicking on a search result.

“Against a backdrop where consumers have increasingly high expectations of the brands they do business with, and are holding them to equally high standards, companies must ensure that the entirety of the experiences they provide meet customer expectations,” said Dave Naffziger, CEO of BrandVerity.

“As these findings show, a general uncertainty of how search engines work, combined with the significant occurrence of poor online experiences, means oversight of paid search programs is more important than ever for brands today.”

The post New study: Majority of consumers are unaware of how search engines work appeared first on Search Engine Watch.



from SEO – Search Engine Watch https://ift.tt/37deqnT
via IFTTT

Tuesday 21 January 2020

How to Scale Your Content Marketing: Tips from Our Journey to 100,000 Words a Month

Posted by JotFormmarketing

In the fall of 2018 our CEO had a simple yet head-exploding request of the JotForm marketing and growth teams: Produce 100,000 words of high-quality written content in a single month.

All types of content would count toward the goal, including posts on our own blog, help guides, template descriptions, and guest posts and sponsored articles on other sites.

In case you don’t think that sounds like a lot, 100,000 words is the length of a 400-page book. Produced in a single month. By a group of JotFormers who then numbered fewer than eight.

Why on Earth would he want us to do all that?

My colleague and I trying to calculate how many blog posts it would take to reach 100,000 words.

It’s important to understand intent here. Our CEO, Aytekin, isn’t a crazy man. He didn’t send us on a mission just to keep us busy.

You see, for many months we’d dabbled with content, and it was working. Aytekin’s contributed posts in Entrepreneur magazine and on Medium were big hits. Our redesigned blog was picking up a lot of traction with the content we already had, and we were starting to understand SEO a lot better.

Still. Why would any software company need to produce that much content?

The answer is simple: infrastructure. If we could build a content engine that produces a high volume of quality content, then we could learn what works well and double down on creating great content. But in order to sustain success in content, we needed to have the pieces in place.

He allocated a sufficient budget and gave us the freedom to hire the staff we needed to make it happen. We were going to need it.

A full year later, I’m very proud to say we’ve officially crossed over the 100,000-word count in a single month [hold for applause].

However, it didn’t come without some painful learnings and mistakes.

Here’s what I figured out about scaling content through this process.

Develop a system early

Our old editorial calendar was a Google sheet. I started it back when JotForm was publishing one or two blogs per week and needed a way to keep it organized. It worked.

Back then, the only people who needed to view the editorial calendar were three people on the marketing staff and a couple of designers.

However, no spreadsheet on earth will be functional when you’re loading up 100,000 words. It’s too complicated. We discovered this right away.

After much discussion, we migrated our editorial workflow into Asana, which seemed like the closest thing to what we needed. It has a nice calendar view, the tagging functionality helped keep things orderly, and the board view gives a great overview of everyone’s projects.

This is where our marketing team lives.

Counterintuitively, we also use Trello, since it’s what our growth team had already been using to manage projects. Once the marketing team finishes writing a post, we send a request to our growth team designers to create banners for them using a form that integrates with their Trello board.

The system is intricate, but it works. We’d be lost if we hadn’t spent time creating it.

Style guides are your friends

Speaking of things to develop before you can really grow your content machine: style guides are paramount to maintaining consistency, which becomes trickier and trickier the more writers you enlist to help you reach your content goals.

We consider our style guide to be a sort of living, ever-changing document. We add to it all the time.

It’s also the first thing that any legitimate writer will want to see when they’re about to contribute something to your site, whether they’re submitting a guest post, doing paid freelance work, or working as your own in-house content writer.

Things to include in a basic style guide: an overview of writing style and tone, grammar and mechanics, punctuation particulars, product wording clarifications, and formatting.

Cheap writing will cost you, dearly

If you want cheap writing, you can find it. It’s everywhere — Upwork, Express Writers, WriterAccess. You name it, we tried it. And for less than $60 a blog post, what self-respecting marketing manager wouldn’t at least try it?

I’m here to tell you it’s a mistake.

I was thrilled when the drafts started rolling in. But our editor had other thoughts. It was taking too much time to make them good — nay, readable.

That was an oversight on my end, and it created a big bottleneck. We created such a backlog of cheap content (because it was cheap and I could purchase LOTS of it at a time) that it halted our progress on publishing content in a timely manner.

Instead, treat your freelance and content agencies as partners, and take the time to find good ones. Talk to them on the phone, exhaustively review their writing portfolio, and see if they really understand what you’re trying to accomplish. It’ll cost more money in the short term, but the returns are significant.

But good writing won’t mask subject ignorance

One thing to check with any content agency or freelancer you work with is their research process. The good ones will lean on subject matter experts (SMEs) to actually become authorities on the subjects they write about. It’s a tedious step, for both you and the writer, but it’s an important one.

The not-so-good ones? They’ll wing it and try to find what they can online. Sometimes they can get away with it, and sometimes someone will read your article and have this to say:

Screenshot of feedback for article saying it feels like it was written by a content creator, not a photographer.

That was harsh.

But they had a point. While the article in question was well-written, it wasn’t written by someone who knew much about the subject at hand, which in this case was photography. Lesson learned. Make sure whoever you hire to write will take the time to know what they’re talking about.

Build outreach into your process

Let’s be real here. For 99.9 percent of you, content marketing is SEO marketing. That’s mostly the case with us as well. We do publish thought leadership and product-education posts with little SEO value, but a lot of what we write is published with the hope that it pleases The Google. Praise be.

But just publishing your content is never enough. You need links, lots of them.

Before I go any further, understand that there’s a right and a wrong way to get links back to your content.

Three guidelines for getting links to your content:

1. Create good content.

2. Find a list of reputable, high-ranking sites that are authorities on the subject you wrote about.

3. Ask them about linking or guest posting on their site in a respectful way that also conveys value to their organization.

That’s it. Don’t waste your time on crappy sites or link scams. Don’t spam people’s inboxes with requests. Don’t be shady or deal with shady people.

Create good content, find high-quality sites to partner with, and offer them value.

Successful content is a numbers game

One benefit to creating as much content as we have is that we can really see what’s worked and what hasn’t. And it’s not as easy to predict as you might think.

One of our most successful posts, How to Start and Run a Summer Camp, wasn’t an especially popular one among JotFormers in the planning stage, primarily because the topic didn’t have a ton of monthly searches for the targeted keywords we were chasing. But just a few months after it went live, it became one of our top-performing posts in terms of monthly searches, and our best in terms of converting readers to JotForm users.

Point being, you don’t really know what will work for you until you try a bunch of options.

You’ll need to hire the right people in-house

In a perfect world JotForm employees would be able to produce every bit of content we need. But that’s not realistic for a company of our size. Still, there were some roles we absolutely needed to bring in-house to really kick our content into high gear.

A few of our content hires from the past 12 months.

Here are some hires we made to build our content infrastructure:

Content writer

This was the first dedicated content hire we ever made. It marked our first real plunge into the world of content marketing. Having someone in-house who can write means you can be flexible. When last-minute or deeply product-focused writing projects come up, you need someone in-house to deliver.

Editor

Our full-time editor created JotForm’s style guide from scratch, which she uses to edit every single piece of content that we produce. She’s equal parts editor and project manager, since she effectively owns the flow of the Asana board.

Copywriters (x2)

Our smaller writing projects didn’t disappear just because we wanted to load up on long-form blog posts. Quite the contrary. Our copywriters tackle template descriptions that help count toward our goal, while also writing landing page text, email marketing messages, video scripts, and social media posts.

Content strategist

One of the most difficult components of creating regular content is coming up with ideas. I made an early assumption that writers would come up with things to write; I was way off base. Writers have a very specialized skill that actually has little overlap with identifying and researching topics based on SEO value, relevance to our audience, and what will generate clicks from social media. So we have a strategist.

Content operations specialist

When you aim for tens of thousands of words of published content over the course of a month, the very act of coordinating the publishing of a post becomes a full-time job. At JotForm, most of our posts also need a custom graphic designed by our design team. Our content operations specialist coordinates design assets and makes sure everything looks good in WordPress before scheduling posts.

SEO manager

Our SEO manager had already been doing work on JotForm’s other pages, but he redirected much of his attention to our content goals once we began scaling. He works with our content strategist on the strategy and monitors and reports on the performance of the articles we publish.

The payoff

JotForm’s blog wasn’t starting from scratch when Aytekin posed the 100,000-word challenge. It was already receiving about 120,000 organic site visitors a month from the posts we’d steadily written over the years.

A year later, we receive about 230,000 organic site visitors a month, and that’s no accident.

The past year also marked our foray into the world of pillar pages.

For the uninitiated, pillar pages are (very) long-form, authoritative pieces that cover all aspects of a specific topic in the hopes that search engines will regard them as a resource.

These are incredibly time-consuming to write, but they drive buckets full of visitors to your page.

We’re getting more than 30,000 visitors a month — all from pillar pages we’ve published within the last year.

To date, our focus on content marketing has improved our organic search to the tune of about 150,000 additional site visitors per month, give or take.

Conclusion

Content isn’t easy. That was the biggest revelation for me, even though it shouldn’t have been. It takes a large team of people with very specialized skills to see measurable success. Doing it at large scale requires a prodigious commitment in both money and time, even if you aren’t tasked with writing 100,000 words a month.

But that doesn’t mean you can’t find a way to make it work for you, on whatever scale that makes the most sense.

There really aren’t any secrets to growing your content engine. No magic recipe. It’s just a matter of putting the resources you have into making it happen.

Best of all, this post just gave us about 2,000 words toward this month’s word count goal.





from The Moz Blog https://ift.tt/2Gc5Aem
via IFTTT

Social Media Today