
Do Links to Noindex Pages Help SEO?

Learn four possible scenarios involving robots and noindex directives and how exactly they can impact your website's SEO.


This week’s Ask an SEO question comes from Susan from Texas. She writes:

“I have a URL blocked in my robots file. I just learned that one of our marketing managers told everyone at a meeting to link to this specific URL because ‘it will help our SEO.’

Will all those people linking to a page on our website that I set to noindex really help our SEO, or will it be null and void to have all these links pointing to a page that I have set to noindex?”

Thanks, Susan, and great question!

For this answer, I’m going to start from the basics in case someone newer to SEO is reading and would like the background.

TL;DR – yes and no. It depends on the situation. I’d need to know your website to verify.

The Difference Between Robots.txt & Meta Robots

There are two types of robots we could be talking about – robots.txt and meta robots.

It sounds like you’re referencing a robots.txt file.

  • Robots.txt: This is a file that guides spiders and bots on what to crawl and what to skip during a crawl. It works more like a set of instructions – “ignore these files and folders if you’re spider X” and “do not crawl at all if you’re spider Y.” It can also point spiders to the files and folders that matter most.
  • Meta Robots: This is a tag that lives in the head of a page (or should) and tells a search engine spider whether it should index the page. It is also used to define whether or not to follow the links on the page.
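To make the difference concrete, here is a minimal, hypothetical example of each (the bot names, folder paths, and URL below are made up for illustration):

```
# robots.txt - lives at the root of your domain

# "Ignore these folders if you're spider X":
User-agent: Googlebot
Disallow: /internal-reports/

# "Do not crawl at all if you're spider Y":
User-agent: BadBot
Disallow: /

# Point spiders at the pages you consider important:
Sitemap: https://www.example.com/sitemap.xml
```

```html
<!-- Meta robots - lives in the <head> of an individual page -->
<!-- "Don't index this page, but do follow its links": -->
<meta name="robots" content="noindex, follow">
```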

Now for your answer!

So Do Links to Noindex Pages Help SEO?

There are four possible scenarios here:

  • Internal links pointing to a page you have in robots.txt set to disallow.
  • Inbound links (links from another website pointing to your website) pointing to a page you have in robots.txt set to disallow.
  • Internal links pointing to a page on your site with a meta robots noindex.
  • Inbound links pointing to a page on your website with a meta robots noindex.

1. Internal Links Pointing to a Page You Have in robots.txt Set to Disallow

Creating links to a page that you tell search engines not to crawl likely won’t harm or benefit you in a really big way.

But it does create a conflicting signal which can be problematic in the long run.

Imagine being told to walk the long way through an IKEA by a manager when there was a direct path.

You get the same result, but you get annoyed and it was more difficult than it had to be.

This is what you are doing to Google.

Robots.txt tells search engines what your most important sections are and the correct pathway to find them.

If the file says to disallow a folder, then you’re telling search engines the pages inside it are not important and shouldn’t be crawled. But those pages could still be indexed.
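A hypothetical example: the rule below blocks crawling of everything in one folder, yet a URL inside that folder can still appear in search results if other pages link to it. The spider just can’t read the page itself, so the listing typically shows the URL with little or no description.

```
User-agent: *
Disallow: /old-campaigns/
```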

Internal links also tell search engines which pages are the most important and what the topic of the page is.

If you have internal links pointing to a page that is disallowed, then you have created a conflicting signal which is also a bad practice for your SEO.

2. Inbound Links Pointing to a Page You Have in robots.txt Set to Disallow

Links from other sites are something you “technically” do not control.

They are supposed to be given by journalists, bloggers, community members, and other people who feel your website or a page on your website is credible.

Because they are “out of your control”, having inbound links pointing to a disallowed page could still be beneficial.

You aren’t giving a conflicting signal and other people are saying it is credible.

You may want to find out why other people are linking to the page and possibly allow crawling of it if you find good quality links.

By having natural and high-quality links point to this page, it can build the page’s authority.

You can then pass this authority via internal links to your most important pages when applicable.

That can benefit your SEO.

3. Internal Links Pointing to a Page on Your Site With a Meta Robots Noindex

You do not want to do this.

A meta robots tag with “noindex” on it says that this page should not be indexed.

To a search engine, that usually signals the page is low quality or not meant for search results.

By referencing it with internal links, you have given a search engine a conflicting signal by saying this is an important page, only to find that you also said it is not an important page via the meta robots.

These conflicting signals are normally bad for SEO.

If the page is worth building internal links to, then you may want to consider removing the noindex directive so the page can be indexed.

Or create a better resource and build your internal links to that page instead.
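If you go the first route, it’s a one-line change in the page’s head. The directive values below are standard meta robots options; “index, follow” is also the default behavior, so simply deleting the tag has the same effect:

```html
<!-- Before: the page is kept out of the index -->
<meta name="robots" content="noindex, follow">

<!-- After: the page can be indexed and its links followed -->
<meta name="robots" content="index, follow">
```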

4. Inbound Links Pointing to a Page on Your Website With a Meta Robots Noindex

The answer here is similar to the one above.

If people are legitimately linking to this page, there is some value here.

You may want to consider indexing it and using it to pass the authority to other pages.

People normally only link to a page when there is some value in it for their readers, so take advantage of this.

Great question and I hope this helps!

Editor’s note: Ask an SEO is a weekly SEO advice column written by some of the industry’s top SEO experts, who have been hand-picked by Search Engine Journal. Got a question about SEO? Fill out our form. You might see your answer in the next #AskanSEO post!


VIP CONTRIBUTOR Adam Riemer President at Adam Riemer Marketing


