In late 2012, Internet marketing tools platform Raven Tools raised eyebrows in the SEO world when the company’s CEO announced that Raven would remove “all unauthorized Google data” – or data obtained through scraping Google against its terms of service – from its platform, including its rank tracking service and data from third-party provider SEMRush.
The move was designed to bring Raven into compliance with the Google AdWords API Terms of Service and was billed as preparation for a future in which “every software company will have to make the same choice as Raven: either you comply with Google’s requests or you don’t.”
Since then, Raven has been rebuilding and repositioning itself with a new focus on metrics that matter and tools that bring together content, SEO, social, advertising, email and other marketing methods.
We talked with Raven’s co-founder Jon Henshaw about what the journey has been like.
How have things been since removing rankings from Raven?
We definitely had several customers leave who used us only for rankings, but the majority stayed because they get so much value out of our other tools.
Probably the best thing about the change was that it pushed us into full innovation mode. So while it wasn’t comfortable, it was probably one of the best things to happen to our company.
Can you say yet whether Raven was correct in predicting that Google compliance would become the choice “every software company has to make?”
Based on firsthand knowledge from Google, we know that they’re planning to aggressively fight scraping this year, and not just through enforcement of the AdWords API ToS. Basically, all companies that aren’t Google compliant – who are scraping or use scraped data – are in jeopardy of losing that data and are also in potential legal jeopardy.
It’s been interesting to watch the industry react to our decision. We saw Majestic come out with a strong statement in support of authorized data, and ahrefs even discontinued its keyword analysis tool because it used scraped data. (We also compared Majestic and ahrefs a few weeks ago.)
However, the rest of the industry has moved full steam ahead toward choosing to scrape or use scraped rankings and AdWords data, which we think is ultimately short-sighted and will end up hurting their business in the long run.
What has Raven been focusing on since this change?
We shifted our focus to things that really matter to SEOs and their clients: performance metrics that focus on organic traffic resulting in engagement and conversions. Knowing where you rank for a keyword may be an indication of how well a site is doing in search engines, but for the most part it’s not a necessary performance metric.
What is necessary is knowing the results of your campaign and the effectiveness of your strategies and techniques. That’s what we ended up focusing on: SEO metrics and insights that are easy to report on and matter the most to clients and site owners.
We gave our customers a better set of metrics to report to clients in the form of five new SEO reports based solely on organic search engine traffic. They focus on site engagement, goals, landing pages, search engine share and link referrals. The new metrics and reports have received a lot of praise from our current customers and they’ve told us that their clients love them.
Our new Site Auditor, which is still in beta, has also been extremely well received. As we started to build it, we looked at what was available in the market and did our best to create a unique tool that would be more task-oriented, simple and easy to use.
And now we’re launching our latest post-rankings feature, Top Search Queries from Google Webmaster Tools. It includes the average position data for the keywords related to your site.
How is average position different from rankings?
Average position is rankings, only better. It’s more relevant and accurate than the scraped data you get from a typical rank tracker, and the data comes straight from the source: Google.
Google takes into account personalized search, location, all of these different factors, and tells you: across the board, how well does your site rank for these keywords? Whereas scraped rankings (in general) are technically inaccurate, limited in scope and don’t reflect real world usage.
At the core, search query data with average position includes the keywords that actually perform and get displayed in search results, not the keywords you wish were visible. And you’ll know when the keywords you’re targeting do start to appear in SERPs, because they’ll be included in the search query data. Simple as that.
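To make the idea concrete, here is a minimal sketch of how an impressions-weighted average position could be computed from per-keyword ranking observations. The function name and the exact weighting are illustrative assumptions for this example, not Google’s published method:

```python
# Hedged sketch: an impressions-weighted "average position" for one keyword.
# The weighting scheme here is an assumption for illustration; Google does
# not document its exact calculation.

def average_position(observations):
    """observations: list of (position, impressions) pairs for one keyword.

    Returns the impressions-weighted mean position, or None if the
    keyword never appeared in search results.
    """
    total_impressions = sum(imp for _, imp in observations)
    if total_impressions == 0:
        return None  # the keyword was never displayed, so it has no position
    weighted_sum = sum(pos * imp for pos, imp in observations)
    return weighted_sum / total_impressions


# A keyword shown at position 3 for 700 impressions and position 5 for 300
# averages out to position 3.6 across all real searches:
print(average_position([(3, 700), (5, 300)]))  # 3.6
```

This is why a single scraped ranking snapshot can mislead: the same keyword may sit at different positions for different searchers, and only a weighted average reflects what users actually saw.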
So if average position is better and more accurate than rankings, why isn’t it more widely used?
GWT only provides keywords that actually rank and appear in searches, and many SEOs are used to having full control over tracking the keywords they are most interested in. It’s also a paradigm shift for them to move from a single point of reference to an average position, even though average position is the most accurate and realistic performance metric for keyword ranking.
There’s also a problem with access to the data. Google hasn’t made getting the information easy yet. You can manually export the data, but it’s currently not available via their GWT or GA API. The assumption is they’ll have it eventually; it just hasn’t happened yet.
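Since the data isn’t in the GWT or GA API yet, working with it programmatically means parsing the manual CSV export. The sketch below shows one way that could look; the column headers ("Query", "Impressions", "Avg. position") are assumptions about the export format, not a documented schema:

```python
# Hedged sketch: parsing a manually exported Top Search Queries CSV.
# Column names are assumed for illustration and may differ in the real export.
import csv
import io


def load_top_queries(csv_text):
    """Parse exported CSV text into (query, impressions, avg_position) rows."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rows.append((
            row["Query"],
            int(row["Impressions"]),
            float(row["Avg. position"]),
        ))
    return rows


# Example with made-up data in the assumed format:
sample = (
    "Query,Impressions,Avg. position\n"
    "raven tools,1200,2.4\n"
    "seo reports,300,8.1\n"
)
for query, impressions, avg_pos in load_top_queries(sample):
    print(query, impressions, avg_pos)
```

Once the data is in an API, a step like this presumably goes away; until then, an import script along these lines bridges the gap.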
What else is Raven working on for the future?
Our overall goal with Raven is to provide an all-in-one solution for managing, monitoring and reporting on every aspect of an online marketing campaign, including SEO, social media, content, PPC and email.
For the next few months we’re going to revamp and improve our campaign metrics and reporting across the board. We are going to launch a completely new dashboard system (which is going to be amazing) and a new alerts feature that will monitor important metrics and notify you when they change. We’re also planning some big improvements and surprises with our reporting system.