Every SEO professional worth their salt knows that links (along with content) are the backbone of SEO.
What happens when you get bad links on enough of a scale to harm your site?
Your site can get algorithmically downgraded by Google – or worse, you get a manual action.
While Google maintains they are good at ignoring bad links, enough bad links can harm your site’s ranking.
This guide will explain 10 different types of bad links that can get you penalized, and what you can do about them.
Press release links were once super easy to get.
All you had to do was write a press release and syndicate it to hundreds of press release distribution sites.
Like any SEO tactic that worked well, it got abused.
Now, Google considers press release links a link scheme because these are so easy to manipulate.
You especially want to avoid any press release links that rely on over-optimized anchor text targeting your main money keyword.
If you absolutely must have a website link due to factors beyond your control, use naked URLs or branded URLs as your anchor text, and use only one link from the contact area of the press release.
To be clear: not all forum discussion links are bad.
If a link is coming from a good quality site, an established user, and the link itself is not manipulative or spammy, you probably will want to keep it.
However, if you have thousands of low-quality, spammy links coming in from foreign discussion forums, and they continue to accumulate, you may want to disavow them.
Any links that look spammy won’t do you any favors in Google’s eyes.
Foreign guestbook links are also manipulative.
They can be placed manually or with the aid of an automated program.
Enough of these at scale can cause ranking drops.
Think you can fool Google by randomizing your footprint just enough so that your spammy link building will go undetected?
It is exceedingly difficult to create randomized footprints that Google will not detect.
If you are using an automated program, it is increasingly likely that Google will find the footprint of that automated program, unless it is truly random.
Why? The simple act of nofollowing the link is a footprint.
Thousands of links from many different sites that are all nofollowed is an indicator that something spammy is going on.
PBNs used to be a great way to build links to get rankings.
You could randomize your footprint and continue to see significant gains from these techniques.
Now, PBNs on a massive enough scale can tank your site and cause it to lose organic traffic.
Google is able to detect – and punish – most PBNs.
Some PBNs may take longer to spot than others, but eventually Google will catch on.
Social bookmarking links are also considered to be manipulative by Google.
This can get you in trouble if you do it too much.
Think about it: they are all manually placed and spammy as hell.
It’s no wonder Google considers these a link scheme.
Directory submission services love to tell you that you will get great traction from their links.
“We’ll help increase your Google rankings!!” they will say.
However, nothing could be further from the truth.
Submitting to low-quality directories will likely do more harm than good for your rankings.
As with many things in SEO, there is an exception.
It is OK to use relevant and targeted directories for natural link building – especially in local SEO.
In fact, here are 21 Web Directories That Still Have Value.
Historically, blog comments have been one of the most-abused tactics in SEO.
Comment spam is an ancient link acquisition tactic to avoid.
It. Does. Not. Work!
In fact, you can thank spammy blog comments for the introduction of nofollowed links.
The goal was to prevent spammers from getting SEO benefits from abusing the comments section.
But there is a right way to approach blog comments. The key is leaving topically relevant comments on topically relevant sites.
Paid link services are yet another abused tactic in SEO.
Again, this tactic is so egregiously bad that, even though these services are not explicitly named in Google’s guidelines, the patterns and footprints they leave behind are likely obvious to Google’s algorithms.
It would not be hard for Google to set up an investigative protocol to sign up for accounts for these services, pose as SEOs or other webmasters, and check out the most common patterns used by these services.
Just always remember – that person you’re talking to on the black hat forums regularly could very well be a Googler.
Ever heard of tools like GSA Search Engine Ranker, ScrapeBox, or XRumer?
Sure, these tools can build you lots of links. However, in recent years, these programs have become less effective.
This SEO professional does not recommend using these programs for your SEO efforts, especially not on your money site.
If an SEO can think of it, it is likely that Google is already several steps ahead with pattern variations already built into their algorithm.
So far we’ve talked about links that harm you.
But could something other than bad links be causing your problems?
To find out, you should perform multiple audits – link, content, and technical.
Assess the state of your site, then move forward with the steps needed to fix it.
If you are unfortunate enough to have a complex site with issues in all three areas, you will need to get to work.
In my opinion, nothing beats Link Detox by Link Research Tools. It can assess links from the most sources (25), such as Majestic, GSC, Moz, Ahrefs, SEMrush, and many more.
You will want to compile all links from as many sources as you can get your hands on.
Upload them according to the instructions in Link Detox.
Once you have done this, and you have gone through Link Detox’s process of reviewing and rating links, it will be necessary to prepare the disavow file.
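As a rough sketch of that compilation step – this is not Link Detox’s own format, and the column handling is an assumption about your exports – here is how you might merge raw backlink exports from several tools into a single, deduplicated, domain-level disavow file in Python:

```python
from urllib.parse import urlparse

# Hypothetical sketch: merge backlink source URLs exported from several
# tools (Majestic, GSC, Ahrefs, etc.) into one deduplicated disavow file.
# Adjust the input handling to match each tool's actual export format.

def compile_disavow(source_urls, out_path="disavow.txt"):
    """source_urls: iterable of backlink source URLs from any tool."""
    domains = set()
    for url in source_urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # normalize www/non-www to one domain entry
        if host:
            domains.add(host)
    lines = ["# Disavow file compiled from merged backlink exports"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    with open(out_path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines
```

Google’s disavow file format accepts one entry per line – either a full URL or a `domain:` prefix – with `#` lines treated as comments. Domain-level entries are generally the safer choice for sites that are spammy throughout.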
Google has said some interesting things about the disavow file over the years, and John Mueller has discussed how it is processed publicly as well.
Did you get a manual action notification?
Just check under Manual Actions within Google Search Console.
You’ll know immediately if Google has penalized your site.
Did your site get downgraded algorithmically?
Usually, you can assess whether you have an algorithmic downgrade by examining your Google Analytics data.
Typically, you’ll see an approximate 35-50 percent drop in overall traffic. This could be to certain pages, folders, or even sitewide.
A careful analysis can sometimes reveal other issues (e.g., technical or content) on a site that are causing such traffic drops.
In these cases, you should begin with a multi-tiered audit implementation approach designed to fix content-related issues along with link-related issues.
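To make the traffic-drop check above concrete, here is a minimal sketch, assuming you have exported daily organic session counts from Google Analytics for the periods before and after a suspected update (the 35 percent threshold mirrors the rough range mentioned above and is not an official figure):

```python
# Hypothetical sketch: flag a possible algorithmic downgrade by comparing
# average daily organic sessions before and after a suspected update date.

def traffic_drop_pct(sessions_before, sessions_after):
    """Both arguments: lists of daily session counts exported from Analytics."""
    avg_before = sum(sessions_before) / len(sessions_before)
    avg_after = sum(sessions_after) / len(sessions_after)
    return round((avg_before - avg_after) / avg_before * 100, 1)

def looks_like_downgrade(sessions_before, sessions_after, threshold=35.0):
    # A sustained drop at or above the threshold is worth investigating;
    # it is a signal to audit, not proof of a penalty.
    return traffic_drop_pct(sessions_before, sessions_after) >= threshold
```

Run this per page or per folder as well as sitewide, since the original drop may be confined to one section of the site.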
If you suspect you are the target of a negative SEO attack, a full audit isn’t always necessary, but reviewing the links pointing at your site can be helpful.
In general, if a link comes from an extraordinarily spammy source, disavow it. If it doesn’t, it’s probably a good idea to keep it.
You need to stop the attack as quickly as possible. If you are under a high-volume link attack, it’s only a matter of time before your site is penalized.
The process itself is fairly simple.
In general, your link profile should be fairly balanced, and no single type of link should exceed approximately 20 percent of your overall profile.
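A quick sketch of that balance check, assuming you have already labeled each backlink with a type from your audit tool (the type labels here are made up for illustration, and the 20 percent figure is the rough guideline above, not a Google rule):

```python
from collections import Counter

# Hypothetical sketch: flag any link type that dominates the profile,
# using the ~20 percent balance guideline described above.

def overrepresented_types(link_types, max_share=0.20):
    """link_types: list of labels, one per backlink (e.g. 'directory')."""
    counts = Counter(link_types)
    total = len(link_types)
    return {t: round(c / total, 2)
            for t, c in counts.items()
            if c / total > max_share}
```

Any type this flags is a candidate for closer review, since an outsized share of one link type is itself a pattern Google can notice.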
Avoid spammy, unnatural links and disavow any against Google’s guidelines.
It can take 6-8 months or more to remove an algorithmic downgrade, depending on its severity.
And if it’s a manual action, you could be looking at close to a year.
I’ve personally worked on a 200,000+ link profile in the legal industry that took over a year, and seven reconsideration requests submitted to Google, to fully reverse.
Don’t give up – it’s possible to repair even the worst of the worst link profiles.