Search engines continually improve their algorithms to ensure a positive experience for search users.
As part of these efforts, they identify and remove anything they deem low quality or spam from search engine results pages.
What does that mean for marketers? If you aren’t updating your tactics as frequently as search engines update their quality guidelines, your website may fall behind your competition in rankings.
In this post, we’ll discuss the outdated SEO and marketing tactics that you should remove from your marketing playbook.
1. Misusing Keywords
Webmasters and marketers continue to misunderstand the role keywords play in SEO initiatives, and how they should be used in day-to-day strategy, in many ways.
Let’s take a more granular look at specific types of keyword misuse and mismanagement, including irrelevant usage, writing for a specific keyword density, and keyword stuffing.
Irrelevant Keyword Targeting/Confusion
All too often, novice SEO practitioners try to fit their content and messaging within the confines of their keyword research (and not much else).
They shape the content and its metadata around keywords the page isn’t properly aligned with, ignoring the actual intent of the users searching for the high-volume keywords being targeted.
This causes brands to lose the attention of readers before ever having the chance to communicate a real message with them.
If the keywords marketed for don’t align with the content on the page, the disconnect will hinder the success of content — even if it’s otherwise of good quality.
Don’t try to mislead users by directing them to content that is misrepresented by high-volume keywords in order to increase visibility.
Google knows what this looks like, and it can truly be defined as an obsolete SEO practice (as well as a “black hat” technique, in many instances).
Writing for a Specific Keyword Density
Writing for a specific keyword density, like many keyword-focused marketing tactics, is just missing the mark.
Google no longer depends on keyword density (or the ratio of specific keyword usage to the overall page copy) to determine whether a webpage is an effective source for answering a search query.
It is so much more advanced than simply crawling for keywords. Search engines like Google use a multitude of signals to determine search results.
While keywords remain important to the topics and ideas they represent, they are not the lifeline for ranking for high-value search queries.
The quality of content and how the messaging is delivered are the lifeline for that.
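For context, keyword density is a trivial calculation, which is part of why it was so easy to game. Here is a minimal sketch of the ratio the text describes (the sample copy and function name are illustrative, not taken from any SEO tool):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that match `keyword` exactly."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

copy = "SEO tips: our SEO guide covers SEO basics and more."
print(keyword_density(copy, "SEO"))  # 3 of 10 words -> 30.0
```

A page scoring high on a ratio like this once passed for "relevant"; today it more likely reads as unnatural copy to both users and search engines.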
Keyword Stuffing
This is probably the oldest trick in the book.
SEO is about keywords, right?
So, loading up our webpages with keywords – especially the same high-value keyword we are aggressively targeting throughout the website – is going to help us show up higher in search, thus outranking our competition. Right?
Search engines have long known what keyword stuffing is and which text combinations are unnatural. They recognize these as attempts to manipulate search results and demote the content accordingly.
Yes, there may still be valuable content that uses simple keyword stuffing, either intentionally or unintentionally, that is not demoted because of its actual value to users.
Back in the day, webmasters trying to game the system would go as far as putting every keyword variation of a high-value keyword in the website footer.
Or, even more sketchily, they might make those keywords the same color as the site’s background, effectively hiding them from humans but not the search engine crawlers.
Webmasters have also tried this with links. Don’t do anything like this.
Remember, you’re writing for humans, not search engines.
2. Writing for Robots
It’s important to understand that writing unnaturally is, well, not natural.
And search engines know it.
The misplaced belief goes like this: writing for the web means repeating a subject by its proper name every time it is mentioned, working in variations and plural/singular versions of the word so that “all bases are covered.”
When the page is crawled, the crawlers see the keyword repeated in several different versions, leading the page to rank well for the keyword variations used (over and over… and over again).
This just doesn’t work anymore.
Search engines are advanced enough to understand repeated keywords, their variations, and the unfavorable experience of generally bad content.
Write for humans, not search engine crawlers or any other robot.
3. Article Marketing
Any attempt to game the system doesn’t usually work out in the world of SEO.
But that doesn’t stop people from trying.
Especially when these tactics offer noticeable improvements to a brand, its website, and/or its associated digital properties.
Sure, article directories worked. And they worked pretty darn well for a long time, too.
Commonly considered one of the earliest forms of digital marketing, article syndication was low-hanging fruit to those in the know. And it made sense since the idea was similar to other channels like TV and print that already use syndicated content regularly.
But Google eventually caught on, unleashing its game-changing Panda update in 2011.
Panda chewed up the search landscape, targeting content farms and directories, as well as other websites offering crap content (whether it was simply bad/false, horribly written, made no sense, was stolen from someone else, etc.).
The idea behind article marketing doesn’t make sense in today’s world, where your high-quality content needs to be original and demonstrate expertise, authority, and trustworthiness.
4. Article Spinning
Typically done with software, article spinning is the black hat tactic of trying to recreate quality content using different words, phrases, and organization.
Essentially, the end result is a garbled mess of an article that makes the same points as the source material.
It’s no surprise this isn’t effective anymore.
While AI is getting better all the time at creating content, anything generated by a machine is still of a lower quality than what a human can produce – something original, helpful, and of substance.
5. Buying Links
This one is still biting webmasters many years later.
Like most SEO tactics, if it seems shady, you probably shouldn’t do it.
Buying links is no different.
Once upon a time, it was routine practice to quickly pay to get a high volume of links pointing at your site.
Now we know that backlink profiles need to be maintained and optimized just like the websites we oversee, and that a flood of backlinks from low-quality domains may be dangerous to a website’s health.
Google can easily identify low-quality sites, and it will also notice when those sites are sending out an abundance of links they shouldn’t be.
Today, if you want to legitimately help boost the authority and visibility of your website, you need to earn links — not pay someone to build them manually.
6. Overusing Anchor Text
Internal linking is a characteristic of any good site structure and user experience.
This is typically done with anchor text, the clickable text of an HTML link, which tells users what type of content they can expect if they click.
There are various types of anchor text (branded, naked, exact-match, website/brand name, page title and/or headline, etc.), but some have most certainly become more favorable than others, depending on the usage and situation.
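To make those categories concrete, here is a hypothetical set of examples (the domain, brand name, and link text are all made up for illustration):

```python
# Hypothetical anchor-text examples; "Example Co" and the URLs are invented.
ANCHOR_EXAMPLES = {
    "branded": '<a href="https://example.com/">Example Co</a>',
    "naked": '<a href="https://example.com/">https://example.com/</a>',
    "exact-match": '<a href="https://example.com/shoes/">running shoes</a>',
    "page-title": '<a href="https://example.com/shoes/">A Beginner\'s Guide to Running Shoes</a>',
}

for kind, link in ANCHOR_EXAMPLES.items():
    print(f"{kind}: {link}")
```

Branded and page-title anchors read naturally in context; an unnaturally high share of exact-match anchors is the pattern that tends to look over-optimized.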
In the past, using exact-match and keyword-rich anchor text was a standard SEO best practice.
Since Penguin, Google has been much better at identifying over-optimized content.
This goes back to the Golden Rule about producing well-constructed content that is user-friendly and natural.
If you’re optimizing for search engines and not humans, you’re likely going to fail.
7. Practicing Obsolete Keyword Research Tactics
Keywords have certainly gone through some drastic changes over the last five to 10 years.
Marketers used to have a plethora of keyword-level data at their fingertips, allowing us to see what works well for our brand and what doesn’t, but also to get a better understanding of idea targeting and user intent.
Much of this went by the wayside with keyword “(not provided)”.
In the years following, tools popped up that tried to replicate keyword data. But to fully recreate it correctly is simply impossible.
And yet, even without that keyword data, marketers still need to do keyword research of their own to understand the industry, the competition, the geographic region, etc.
To do this, many marketers turn to Google’s free Keyword Planner. While the data in there has been subject to some scrutiny over the years, it’s a free Google-owned product that gives us data we previously couldn’t really come by, so many of us continue to use it (myself included).
But it’s important to remember what the data actually represents for keywords.
“Competition” in the Keyword Planner pertains solely to paid competition and traffic, so it is practically useless for building an organic search strategy.
Some alternatives to this are the Moz Keyword Explorer tool and Semrush’s Keyword Magic Tool, both of which are paid tools.
Google Trends is helpful for this type of competitive analysis, too, and it’s free.
8. Creating Pages for All Keyword Variations
This was once a useful tactic: create a separate page for every variation of the high-value keywords targeted by your brand and its messaging, so each could rank on its own.
Today, the best, most useful content on a topic should be the most visible because of the value it offers users on that topic, not because it matches one particular variation of a word.
Aside from the fact that this will lead to brutal site self-cannibalization, it makes a website considerably harder to use and navigate since the content will be so incredibly similar.
The negative user experience alone is reason enough not to do this. But the added fact that Google knows better than to overlook this practice makes it a no-brainer.
This tactic evolved and eventually helped lead to the rise of content farms chasing traffic solely for its keyword value and visibility.
It epitomized the “old way” of optimizing a website – for keywords and search engines, rather than for users and their intent.
9. Targeting Exact-Match Search Queries
Before the full deployment of the Google Knowledge Graph, it became somewhat popular to target exact-match search queries in hopes of ranking for those queries purely for the traffic numbers – not because the search query or its answer actually pertained to the business optimizing for it.
Marketers would strive to rank in the top spot for exact-match search queries to trigger a breakout box and an increased click-through rate for their sites.
10. Buying Exact-Match Domains
Having high-value keywords in your URL makes sense. To some extent.
But when it becomes confusing or misleading (i.e., it results in a bad user experience), you have to draw the line.
A main best practice for domains is to keep them consistent with your brand.
Brand names should be short, concise, and somewhat meaningful.
Why wouldn’t you want the same from your domain?
A long time ago, Google gave weight to exact-match domains because it made sense to use them as a signal.
Behavioral data has since helped Google make common-sense, clean-up changes like this (and many others).
Run a good company and offer great products and/or services under the brand name, and Google will do the work of making your brand visible when it’s relevant to the people searching for it.
11. Relying on Third-Party Domain Authority Scores
Have you built a link building or content distribution campaign off of a list of high-quality sites?
If the list ranked websites based on domain authority alone, then you will have to do further analysis to ensure the websites you are contacting are valuable to your campaign.
- Is the content on the website relevant to what you are promoting?
- Does the website receive organic search traffic from relevant keywords?
- If traffic from North America is important to your business, does the website receive traffic from that region?
- Are the incoming links to the website relevant?
Domain authority scores can help you surface some quality sites. But they shouldn’t be the only metric or factor you rely on in your marketing strategy.
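The checklist above can be sketched as a simple screening function. Everything here is a made-up illustration (the field names, the 30-point authority cutoff, and the 50% region-share threshold are assumptions, not rules from any real tool):

```python
from dataclasses import dataclass

@dataclass
class Prospect:
    domain: str
    authority: int              # third-party domain authority score (0-100)
    content_relevant: bool      # is the site's content relevant to your campaign?
    organic_keywords: int       # relevant keywords driving organic search traffic
    target_region_share: float  # fraction of traffic from your target region
    links_relevant: bool        # are the site's own incoming links relevant?

def worth_contacting(p: Prospect, min_authority: int = 30) -> bool:
    """Use authority only as a first-pass filter; the checklist decides."""
    return (
        p.authority >= min_authority
        and p.content_relevant
        and p.organic_keywords > 0
        and p.target_region_share >= 0.5
        and p.links_relevant
    )

good = Prospect("runningblog.example", 55, True, 120, 0.7, True)
weak = Prospect("linkfarm.example", 80, False, 0, 0.1, False)
print(worth_contacting(good), worth_contacting(weak))  # True False
```

Note that the second site fails despite its higher authority score, which is exactly the point: the score alone says nothing about relevance.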
12. Publishing Subpar Content
Face it. There was a time in our world when crappy content could rank well.
Oh, how times have changed.
Stolen content, thin content, keyword-stuffed content, non-credible content – all of this could get past search engine crawlers and be regurgitated back to users as worthy results.
But no more.
We know what it takes to make quality content that is rewarded by search engines because they tell us what’s right and what’s wrong.
If you want to succeed at SEO today, you must do what’s right.
You need to be the best answer.
When you’re ready to create a new piece of content, start by researching the content that ranks for your target keywords. Chances are, the content on the first page of search results is above average quality.
Thus, your content needs to be above average quality if you want to ultimately outrank your competitors.
Once you write the first draft, have an editor fine-tune your content. They can fix mistakes you may overlook and enhance the overall readability of your content for visitors.
If you can’t hire an editor, you can run your content through editing tools like the Hemingway Editor, Grammarly, or ProWritingAid.
Outdated SEO tactics may seem like an easy win for your search marketing campaign. But in the long run, the low-hanging fruit could poison your marketing efforts.
Avoid anything that could be considered low-quality or spam in your search marketing to ensure the safety of your rankings from changes to the Google algorithm.
More SEO Resources:
- 5 Bad SEO Content Tactics You Should Have Abandoned Already
- 8 On-Page Optimization Techniques That Google Hates
- How to Do Keyword Research for SEO: The Ultimate Guide
Featured Image: eamesBot/Shutterstock