Google (Almost Certainly) Has an Organic Quality Score (Or Something a Lot Like It) that SEOs Need to Optimize For – Whiteboard Friday

Posted by randfish

Entertain the idea, for a moment, that Google assigned a quality score to organic search results. Say it was based on click data and engagement metrics, and that it functioned in a similar way to the Google AdWords Quality Score. How exactly might such a score work, what would it be based on, and how could you optimize for it?

While there’s no hard proof it exists, the organic quality score is a concept that’s been pondered by many SEOs over the years. In today’s Whiteboard Friday, Rand examines this theory inside and out, then offers some advice on how one might boost such a score.

https://fast.wistia.net/embed/iframe/vq9fn7tfkj?videoFoam=true

Google's Organic Quality Score

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about organic quality score.

So this is a concept; it's not a real thing that we know Google definitely has. But it's something SEOs have sensed for a long time: similar to the paid Quality Score in Google's AdWords program, where a page has a certain score assigned to it, Google almost certainly has something like it on the organic side. I'll give you an example of how that might work.

So, for example, say that on my site.com (a very simplistic website) I have three subfolders: Products, Blog, and About. I might have a page in Products, 14axq.html, and Google associates certain metrics with it through activity they've seen from browser data, clickstream data, search data, and visit data from searches and bounces back to the search results: all the engagement and click data that we've been talking about a lot this year on Whiteboard Friday.

So they may have metrics like pogo-stick rate, bounce rate, deep click rate (the rate at which someone clicks to the site and then goes further in from that page), the average time spent on the site, the direct navigations that people make to it each month through their browsers, search impressions and search clicks, and perhaps a bunch of other statistics: whether people search directly for this URL, whether they perform branded searches, at what rate unique devices in one area versus another do these things, and whether there's a bias based on geography, device type, personalization, and so on.
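To make those metrics concrete, here's a minimal sketch of how rates like these could be computed for a single page. The visit-log schema and the 30-second pogo-stick cutoff are entirely hypothetical; Google's real signals and definitions are unknown.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Visit:
    """One visit record; fields are illustrative, not a real Google schema."""
    clicked_from_serp: bool              # arrived via a search result click
    returned_to_serp_s: Optional[float]  # seconds before bouncing back to results (None = never)
    pages_viewed: int                    # 1 = landed and left; >1 = clicked deeper in
    time_on_site_s: float

def engagement_metrics(visits, pogo_cutoff_s=30.0):
    """Aggregate the kinds of engagement signals described above for one page."""
    serp = [v for v in visits if v.clicked_from_serp]
    n = len(serp) or 1
    return {
        # a quick return to the results page counts as a pogo-stick
        "pogo_stick_rate": sum(1 for v in serp
                               if v.returned_to_serp_s is not None
                               and v.returned_to_serp_s < pogo_cutoff_s) / n,
        "bounce_rate": sum(1 for v in serp if v.pages_viewed == 1) / n,
        "deep_click_rate": sum(1 for v in serp if v.pages_viewed > 1) / n,
        "avg_time_on_site_s": sum(v.time_on_site_s for v in serp) / n,
    }
```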

But regardless of that, you get this idea that Google has this sort of sense of how the page performs in their search results. That might be very different across different pages and obviously very different across different sites. So maybe this blog post over here on /blog is doing much, much better in all these metrics and has a much higher quality score as a result.

Current SEO theories about organic quality scoring:

Now, I spend a lot of time talking with my fellow SEOs about theories around this, and when we do, a few things emerge. I think most folks are generally of the opinion that if there is something like an organic quality score…

1. It is probably based on this type of data — queries, clicks, engagements, visit data of some kind.

We don’t doubt for a minute that Google has much more sophistication than the super-simplified stuff I’m showing you here. Google publicly denies many individual metrics, like, “No, we don’t use time on site. Time on site can be very variable, and sometimes low time on site is actually a good thing.” Fine. But there’s something in there, right? They use some more sophisticated form of that.

2. We're also pretty sure that this applies on three different levels:

This is an observation from experimentation as well as from Google's statements:

  • Domain-wide. Across one domain, if there are many pages with high quality scores, Google might view that domain differently from a domain with a mix of quality scores, or one with generally low ones.
  • The same goes for subdomains. A subdomain may be looked at differently than the main domain, and two subdomains may be viewed differently from each other. If content appears to have high quality scores on one subdomain but not another, Google might not pass all the ranking signals, or give the quality scores the same weight, from one to the other.
  • The same is true for subfolders, though to a lesser extent. In fact, this list is roughly in descending order: you can generally surmise that Google passes these signals more readily across subfolders than across subdomains, and more readily across subdomains than across root domains.

3. A higher density of good scores to bad ones can mean a bunch of good things:

  • More ranking visibility, even without other signals. Even if a page is somewhat lacking in other quality signals, if it sits in a blog section where pages tend to have high quality scores, Google might give it an opportunity to rank well that it ordinarily wouldn't give a page with the same ranking signals in another subfolder, on another subdomain, or on another website entirely.
  • Some sort of "benefit of the doubt" boost, even for new pages. A new page is published; it doesn't yet have any quality signals associated with it, but it performs particularly well from the start.

    As an example: within a few minutes of this Whiteboard Friday being published on Moz's website, which is usually late Thursday night or very early Friday morning Pacific time, I'll bet that you can search for "Google organic quality score," or even just "organic quality score," in Google's engine, and this Whiteboard Friday will perform very well. One of the reasons is probably that many other Whiteboard Friday videos in this same subfolder have performed very well in the search results. They have, whatever you want to call it, great metrics, a high organic quality score, and because of that, the URL you see in the bar up above is almost certainly going to rank well, possibly in the number one position, even though it's brand new. It hasn't yet earned those quality signals, but Google gives it the benefit of the doubt because of where it lives.

  • We surmise that more value also gets passed from links, both internal and external, that come from pages with high quality scores. That is, right now, a guess, but it's something we hope to validate further, because we've seen some signs in testing that it's the case.
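As a toy illustration of the "density of good scores" idea from point 3, here's a sketch that computes the share of high-scoring pages per top-level subfolder. The scores themselves are hypothetical inputs (no such score is exposed anywhere), and the 0.7 threshold is arbitrary.

```python
from collections import defaultdict
from urllib.parse import urlparse

def section_quality_density(page_scores, threshold=0.7):
    """Share of 'good' pages (score >= threshold) per top-level subfolder.

    page_scores maps URL -> hypothetical quality score in [0, 1].
    """
    sections = defaultdict(list)
    for url, score in page_scores.items():
        path = urlparse(url).path.strip("/")
        top = "/" + path.split("/")[0] if path else "/"  # e.g. "/blog"
        sections[top].append(score)
    return {section: sum(s >= threshold for s in scores) / len(scores)
            for section, scores in sections.items()}
```

A section like `/blog` with a density near 1.0 would, under this theory, be the place where a brand-new page gets the benefit of the doubt.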

3 ways to boost your organic quality score

If this is true (and it's up to you whether you believe it or not), even if you don't believe it, you've almost certainly seen signs that something like it is going on. I'd urge you to do these three things to boost your organic quality score, or whatever you believe is causing these same effects.

1. Add more high-performing pages. If you know what your well-performing pages look like versus your poorly performing ones, you can make more good ones.

2. Improve the quality score of existing pages. If one page's score seems low, and its engagement and usage metrics, SERP click-through rate, and bounce rate from organic search visits all look poor in comparison to your other stuff, you can boost it: improve the content, the navigation, the usability and user experience, the load time, the visuals, whatever you've got there to hold searchers' attention longer, keep them engaged, and make sure you're solving their problem. When you do that, you should see higher quality scores.

3. Remove low-performing pages through a variety of means. You could take a low-performing page and say, "Hey, I'm going to redirect that to this other page, which does a better job answering the query anyway." Or, "Hey, I'm going to 404 that page. I don't need it anymore; in fact, no one needs it anymore." Or, "I'm going to noindex it. Some people may still need it, maybe visitors to my website who use it for some direct navigation or internal purpose, but Google doesn't need to see it and searchers don't need it. I'll apply noindex via the meta robots tag or the X-Robots-Tag HTTP header."
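That triage (redirect, 404, or noindex) can be sketched as a simple decision function. The field names and rules here are hypothetical, purely to make the three options concrete:

```python
def handle_low_performer(page):
    """Toy triage for a low-performing URL, mirroring the three options above.

    `page` is a dict with hypothetical fields; the rules are illustrative only.
    """
    if page.get("better_alternative_url"):
        # a stronger page answers the same query: consolidate with a redirect
        return ("301", page["better_alternative_url"])
    if page.get("internal_use_only"):
        # still needed by direct/internal visitors, but not by searchers:
        # keep it live and exclude it from the index (meta robots or X-Robots-Tag)
        return ("noindex", None)
    # nobody needs it anymore
    return ("404", None)
```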

One thing that's really interesting to note: we've seen a bunch of case studies, especially since MozCon, when Britney Muller, Moz's Head of SEO, shared that she had done some great testing around removing tens of thousands of really low-quality, low-performing pages from Moz's own website, and that we'd seen our rankings and our traffic for the remainder of our content go up quite significantly, even controlling for seasonality and other factors.

That was pretty exciting. When we shared that, we got a bunch of other people from the audience and on Twitter saying, "I did the same thing. When I removed low-performing pages, the rest of my site performed better," which strongly suggests that there's a system that works something like this.

So I'd urge you to go look at your metrics, find pages that are not performing well, see what you can do about improving or removing them, see what you can do about adding new ones with high organic quality scores, and let me know your thoughts on this in the comments.

We’ll look forward to seeing you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog https://moz.com/blog/organic-quality-score
via IFTTT

How to Find Your Competitor’s Backlinks – Next Level

Posted by BrianChilds

Welcome to the newest installment of our educational Next Level series! In our last episode, Brian Childs equipped copywriters with the tools they need to succeed with SEO. Today, he’s back to share how to use Open Site Explorer to find linking opportunities based upon your competitors’ external inbound links. Read on and level up!


In Moz’s SEO training classes, we discuss how to identify and prioritize sources of backlinks using a mix of tools. One tactic to quickly find high Domain Authority sites that have a history of linking to pages discussing your topic is to study your competitors’ backlinks. This process is covered in-depth during the SEO Link Building Bootcamp.

In this article, I’ll show how to create and export a list of your competitor’s backlinks that you can use for targeting activities. This assumes you’ve already completed keyword research and have identified competitors that rank well in the search results for these queries. Use those competitors for the following analysis.


How to check the backlinks of a site

Step 1: Navigate to Open Site Explorer

Open Site Explorer is a tool used to research the link profile of a website. It will show you the quality of inbound links using metrics like Page Authority, Domain Authority, and Spam Score. You can do a good amount of research with the free version, but to enjoy all its capabilities you’ll need full access; you can get that access for free with a 30-day trial of Moz Pro.

Step 2: Enter your competitor’s domain URL

I suggest opening your competitor’s site in a browser window and then copying the URL. This will reduce any spelling errors and the possibility of incorrectly typing the domain name. An example of a common error is incorrectly adding “www” to the URL when that’s not how it renders for the site.

Step 3: Navigate to the “Inbound Links” tab

The Inbound Links tab will display all of the pages that link to your competitor’s website. In order to identify sources of links that are delivering link equity, I set the parameters above the list as follows: Target This – Root Domain, Links Source – Only External, and Link Type – Link Equity. This will show all external links providing link equity to any page on your competitor’s site.

Step 4: Export results into .csv

Most reports in Open Site Explorer will allow you to export to .csv. Save these results and then repeat for your other competitors.

Step 5: Compile .csv results from all competitors

Once you have Open Site Explorer exports from the top 5–10 competitors, compile them into one spreadsheet.

Step 6: Sort all results by Page Authority

Page Authority is a 1–100 scale developed by Moz that estimates the likelihood of a page’s ability to rank in a search result, based on our understanding of essential ranking factors. Higher numbers suggest the page is more authoritative and therefore has a higher likelihood of ranking. Higher Page Authority pages also will be delivering more link equity to your competitor’s site. Use Page Authority as your sorting criteria.
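Steps 5 and 6 can be scripted rather than done by hand in a spreadsheet. Here's a sketch that merges the exported .csv files and sorts by Page Authority; the glob pattern and the "Page Authority" column name are assumptions, so match them to your actual exports:

```python
import csv
import glob

def compile_and_sort(export_glob, pa_column="Page Authority"):
    """Merge Open Site Explorer .csv exports and sort by Page Authority.

    The column name is an assumption; check it against your export's header.
    """
    rows = []
    for path in sorted(glob.glob(export_glob)):
        with open(path, newline="", encoding="utf-8") as f:
            rows.extend(csv.DictReader(f))
    # highest Page Authority first; blank cells sort last
    rows.sort(key=lambda r: float(r.get(pa_column) or 0), reverse=True)
    return rows
```

Running it over a folder of competitor exports, e.g. `compile_and_sort("exports/*.csv")`, gives you one list ready for the opportunity review in Step 7.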

Step 7: Review all linking sites for opportunities

Now you have a large list of sites linking to your competitors for keywords you are targeting. Go down the list of high Page Authority links and look for sites or authors that show up regularly. Use your preferred outreach strategy to contact these sites and begin developing a relationship.


Want to learn more SEO processes?

If you like these step-by-step SEO processes, you’ll likely enjoy the SEO training classes provided by Moz. These live, instructor-led webinars show you how to use a variety of tools to implement SEO. If you’re in need of comprehensive SEO training, you can save 20% by purchasing the 5-class bundle:

Sign up for the Bootcamp Bundle

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog https://moz.com/blog/find-competitor-backlinks-next-level
via IFTTT


11 Lessons Learned from Failed Link Building Campaigns

Posted by kerryjones

We’ve created more than 800 content campaigns at Fractl over the years, and we’d be lying if we told you every single one was a hit.

The Internet is a finicky place. You can’t predict with 100% accuracy whether your content will perform well. Sometimes content we expect to do just OK ends up being a massive hit, and in a few instances a campaign we expected to be a huge success garnered lackluster results.

While you can’t control the whims of the Internet, you can avoid or include certain things in your content to help your chances of success. Through careful analysis we’ve pinpointed which factors tend to create high-performing content. Similarly, we’ve identified trends among our content that didn’t quite hit the mark.

soup-nazi.gif

In this post, I’ll share the most valuable lessons we’ve learned from content flops. Bear in mind this advice applies if you’re using content to earn links and press pickups, which is what the majority of the content we create at Fractl aims to do.

1. There’s such a thing as too much data.

For content involving a lot of data, it can be tempting to publish every single data point you collect.

A good example of this is surveying. We’ve fallen down the rabbit hole of not only sharing all of the data we’ve collected in a survey, but also segmenting the data out by demographics — regardless of whether or not all of that data is super compelling. While this can give publishers a large volume of potential angles to choose from, the result is often unfocused content lacking a cohesive narrative.

george-and-jerry.gif

Only include the most insightful, interesting data points in your content, even if that means tossing aside most of the data you’ve gathered.

One example of this was a survey we did for a home security client where we asked people about stalker-ish behaviors they’d committed. The juiciest survey data (like 1 in 5 respondents had created a fake social account to spy on someone — yikes!) ended up getting buried because we included every data point from the survey, some of which wasn’t so interesting. Had we trimmed down the content to only the most shocking findings, it probably would have performed far better.

Furthermore, the more data you include, the more time it takes for a publisher to wade through it. As one journalist told us after we sent over an epic amount of data: “Long story short, this will take too much time.”

Consider this: It shouldn’t take a publisher more than 10 seconds of looking at your project to grasp the most meaningful data points. If they can’t quickly understand that, how will their readers?

2. Turning published data into something cool doesn’t always yield links.

If you’re going to use data that’s already been reported on, you’d better have a new spin or finding to present. Journalists don’t want to cover stats they’ve already covered.

A great example of this is a project we created about the reasons startups fail. The majority of the data we used came from CB Insights’ startup post mortems list, which had performed really well for them. (As of the time I’m writing this, according to Open Site Explorer it has 197 linking root domains from sites including BBC, Business Insider, Fortune, Vox, CNBC, and Entrepreneur — impressive!)

It worked well once, so it should work again if we repackage it into a new format, right?

We used the startups featured on the CB Insights list, added in a handful of additional startups, and created a sexy-looking interactive node map that grouped together startups according to the primary reasons they went under.

While the content didn’t end up being a failure (we got it picked up by Quartz, woo!), it definitely didn’t live up to the expectations we had for it.

Two problems with this project:

  1. We weren’t saying anything new about the data.
  2. The original data had gotten so much coverage that many relevant publishers had already seen it and/or published it.

But of course, there are exceptions. If you’re using existing data that hasn’t gotten a ton of coverage, but is interesting, then this can be a smart approach. The key is avoiding data that has already been widely reported in the vertical you want to get coverage in.

3. It’s difficult to build links with videos.

Video content can be extremely effective for viral sharing, which is fantastic for brand awareness. But are videos great for earning links? Not so much.

When you think of viral content, videos probably come to mind — which is exactly why you may assume awesome videos can attract a ton of backlinks. The problem is, publishers rarely give proper attribution to videos. Instead of linking to the video’s creator, they just embed the video from YouTube or link to YouTube. While a mention/link to the content creator often happens organically with a piece of static visual content, this is often not the case with videos.

Of course, you can reach out to anyone who embeds your video without linking to you and ask for a link. But this can add a time-consuming extra step to the already time-intensive process of video creation and promotion.

4. Political ideas are tough to pull off.

Most brands don’t want to touch political topics with a ten-foot pole. But to others, creating political content is appealing since it has strong potential to evoke an emotional reaction and get a lot of attention.

kramer-jerry.gif

We’ve had several amazing political ideas fail despite solid executions and promotional efforts. It’s hard for us to say why this is, but our assumption has been that publishers don’t care about political content unless it’s breaking news, and political news is always breaking. For this reason, we believe it’s nearly impossible to compete with the constant cycle of breaking political news.

5. Don’t make content for a specific publisher.

We’ve reached out to publishers to collaborate during content production, assuming that if the publisher feels ownership over the content and it’s created to their specifications, they will definitely publish it.

In general, we’ve found this approach doesn’t work because it tends to be a drain on the publishers (they don’t want to take on the extra work of collaborating with you) and it locks you into an end result that may only work for their site and no other publishers.

Remember: Publishers care about getting views and engagement on their site, not link generation for you or your client.

6. Hyperlocal content is a big risk.

If you focus on one city, even with an amazing piece of content featuring newsworthy information, you’re limited in how many publishers you can pitch it to. And then, you’re out of luck if none of those local publishers pick it up.

On the flip side, we’ve had a lot of success with content that features multiple cities/states/regions. This allows us to target a range of local and national publishers.

Note: This advice applies to campaigns where links/press mentions are the main goal – I’m not saying to never create content for a certain locality.

7. Always make more than one visual asset.

And one of those assets should always be a simple, static image.

Why?

Many websites have limits on the types of media they can publish. Every publisher is able to publish a static graphic, but not everyone can embed more complex content formats (fortunately, Moz can handle GIFs).

george-and-kramer.gif

In most cases, we’ve found publishers prefer the simplest visualizations. One classic example of this is a project where we compared reading levels and IQ across different states based on an analysis of half a million tweets. Our Director of Creative, Ryan Sammy, spent a painstaking amount of time (and money) creating an interactive map of the results.

What did most publishers end up featuring? A screenshot of a Tableau dashboard we had sent as a preview during outreach…

8. Be realistic about newsjacking.

Newsjacking content needs to go live within 24 to 48 hours of the news event to be timely. Can you really produce something in time to newsjack?

We’ve found newsjacking is hard to pull off in an agency setting since you have to account for production timelines and getting client feedback and approval. In-house brands have a more feasible shot at newsjacking if they don’t have to worry about a long internal approval process.

9. Watch out for shiny new tools and content formats.

Just because you are using cool, new technology doesn’t automatically make the content interesting. We’ve gotten caught up in the “cool factor” of the format or method only to end up with boring (but pretty) content.

10. Avoid super niche topics.

You greatly increase your risk of no return when you go super niche. The more you drill down into a topic, the smaller your potential audience becomes (and the pool of sites that might link shrinks, too).

There are a ton of people interested in music, there are fewer people interested in rap music, there are even fewer people interested in folk rap music, and finally, there are so few people interested in ’90s folk rap. Creating content around ’90s folk rap will probably yield few to no links.

Some questions to ask to ensure your topic isn’t too niche:

  • Is there a large volume of published content about this topic? Do a Google search for a few niche keywords to see how many results come up compared to broader top-level topics.
  • If there is a lot of content, does that content get high engagement? Do a search in Buzzsumo for keywords related to the niche topic. Is the top content getting thousands of shares?
  • Are people curious about this topic? Search on BloomBerry to see how many questions people are asking about it.
  • Are there online communities dedicated to the topic? Do a quick search for “niche keyword + forum” to turn up communities.
  • Are there more than 5 publishers that focus exclusively on the niche topic?
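The checklist above can be boiled down to a quick go/no-go score. The thresholds in this sketch are illustrative assumptions on my part, not numbers from Fractl; tune them to your own vertical.

```python
def niche_viability(search_results, top_share_count, questions_asked,
                    community_count, dedicated_publishers):
    """Score a topic against the five niche-check questions, one point each.

    All thresholds are illustrative assumptions -- adjust them to the
    vertical you're pitching into.
    """
    checks = [
        search_results >= 100_000,   # enough published content exists
        top_share_count >= 1_000,    # top content earns real engagement
        questions_asked >= 50,       # people are actively curious
        community_count >= 1,        # dedicated online communities exist
        dedicated_publishers > 5,    # more than 5 focused publishers
    ]
    return sum(checks)
```

A topic passing most of the checks is probably broad enough to pitch; one failing most of them is likely ’90s-folk-rap territory.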

11. Don’t make content on a topic you can’t be credible in.

When we produced a hard-hitting project about murder in the U.S. for a gambling client, the publishers we pitched didn’t take it seriously because the client wasn’t an authority on the subject.

From that point on, we stuck to creating more light-hearted content around gambling, partying, and entertainment, which is highly relevant to our client and goes over extremely well with publishers.

It’s OK to create content that is tangentially related to your brand (we do this very often), but the connection between the content topic and your industry should be obvious. Don’t leave publishers wondering, “Why is this company making this content?”

costanza.gif


Learning from failure is crucial for improvement.

Failure is inevitable, especially when you’re pushing boundaries or experimenting with something new (two things we try to do often at Fractl). The good news is that with failure you tend to have the greatest “a-ha!” moments. This is why having a post-campaign review of what did and didn’t work is so important.

Getting to the heart of why your content is rejected by publishers can be extremely helpful — we collect this information, and it’s invaluable for spotting things we can tweak during content production to increase our success rate. When a publisher tells you “no,” many times they will give a brief explanation why (and if they don’t, you can ask nicely for their feedback). Collect all of this publisher feedback and review it every few months. Like us, you may notice trends as to why publishers are passing up your content. Use these insights to correct your course instead of continuing to make the same mistakes.

And one last note for anyone creating content for clients: What should you do when your client’s campaign is a flop? To mitigate the risk to our clients, we replace a campaign if it fails to get any publisher coverage. While we’ve rarely had to do this, putting this assurance in place can give both you and your client peace of mind that a low-performing campaign doesn’t mean their investment has gone to waste.

What have you observed about your content that didn’t perform well? Does your experience contradict or mirror any of the lessons I shared?


from Moz Blog https://moz.com/blog/lessons-from-failed-link-building
via IFTTT
