8 Key Points to Multiple Niche Sites And Controlling Back Links

Ask ten SEO professionals whether it’s better to have all your content on one site or split it up, and you’ll get eleven answers (yeah, I’m usually able to see both sides of things and offer both opinions accordingly!)… Yet under certain circumstances, it just makes a great deal of sense to go with multiple web sites. Done right, the payoff can be huge. It’s a great way to control some of the back-link numbers.

Yet to me, it’s not just about building links. With the right client (read that someone with deep pockets who trusts your recommendations), it’s an opportunity to get even more value. Much more…

For those of you who are not familiar with the process, what I’m talking about here is where your client has one main web site, and X number of “micro-sites” or “satellite” sites. Or in some cases, a number of very large web sites that all drive traffic to the main site or to each other, depending on your needs and the scope of the client’s offerings.

Shameless Side Note: if you Sphinn this article, I’ll be your BFF!

Mesothelioma Clients Are Manna from Heaven

One of the first times I had an opportunity to create a multi-site strategy was back in November of 2006. I was brought on board to help take a legal web site and lift it up from the 15th page of Google. For those of you who have never heard of mesothelioma, let alone why it’s such a hot-button market for SEO, let me just say this:

CPC up to $98

One conversion can net $2,000,000 in revenue for the site owner

Nuff said?

Gray Hat – Black Hat Competition

Given the stakes in this arena, my initial analysis found all sorts of methods and techniques being used by the big players, and they’d been doing it for several years. Lots of gray hat methods, and so much black hat going on it made my head spin. (Not that I’m complaining – I don’t do drugs anymore, so I kinda like the head rush…)

A Real Challenge

My task was a real challenge: competitors had been doing SEO for years, and they all had deep pockets. I wanted to find a way to overcome the longevity issue and surpass them without using black hat. So I had to formulate a sound method that would stand the test of time and ongoing heavy competition.

Results Even a Mother Could Be Proud Of

I’m happy to say that as a result of my multi-site strategy, in short order my clients were showing up on the first page of Google for a few of their most valued phrases. In fact, to this day, we still own the #1 and #2 positions for those same exact phrases, even though all the while the competition has not let up on its efforts.

Since then, we’ve also gotten up there for several other important phrases. And a number of other clients in different industries and markets have come along where I’ve been able to apply and build on those same key concepts.

Square Pegs Don’t Fit In Round Holes

One size does not fit all in this process. Every client is going to be different, so the type of niche sites you suggest they build (and thus you get to optimize) will vary greatly. For companies that have a multi-regional, national or international reach, the obvious opportunity here is to have a site set up for each of several geo-locations.

Or if the company has X number of divisions or departments, perhaps you set up a site for each of those. Does your client have five major services they offer? Well heck, that’s six web sites, plus a blog of course, just waiting to be optimized and cross-linked!

The whole concept here is that if you do the footwork and can get buy-in from your client, then with enough time and leverage you can end up (pretty spectacularly, in some cases) with several highly optimized, highly authoritative web sites, all passing quality link value. (No, I didn’t say link juice, because I don’t drink that Kool-Aid, thank you very much…)

Proper Planning Prevents Poor Performance

Before you embark on such an endeavor, there are a number of key points that need to be thought through; otherwise you end up in the black-hat arena… And surely you’re of the same mind-set as I am, and wouldn’t ever consider violating the cardinal rule of client-services business ethics, right?

So here then, are eight of the most important key points to building multiple niche sites and taking control of at least part of your back-links…

1. Key Player Buy-In

Just because you see the potential value (back-link or otherwise) in having eighteen highly optimized web sites out there promoting your client’s offerings doesn’t mean they do. Or that their VP of marketing does. Or that their development team is up to the task when it comes to execution.

So the first thing you need to do is map out your vision, gather supporting information, and present a plan to your client that they can buy in on. You’ll need to be well prepared to answer any questions that get thrown your way, especially if their head of Marketing is just knowledgeable enough about SEO to be dangerous.

You’ll also need to have a specification document prepared that the developer(s) can execute – especially if there’s any way to automate aspects of the seeding and make the content management as effortless as possible.

Alan Bleiweiss
Alan Bleiweiss is a Forensic SEO audit consultant with audit client sites consisting of upwards of 50 million pages and tens of millions of visitors a month. A noted industry speaker, author and blogger, his posts are quite often as much controversial as they are thought provoking.


37 thoughts on “8 Key Points to Multiple Niche Sites And Controlling Back Links”

  1. Hi, Thank you for such great info. It’s perfect timing for me too! :) I’m afraid I don’t sphinn, but I do Delicious and did bookmark this :) AND print it out to re-read which I don’t often do.

    I would love to see a second article that expands the automation info for #1 with more details and specifics.

    I’m going to be looking forward to reading more since I am also subscribing! :) :) :)

  2. Great tips, Alan. I agree with your ways of addressing things in scale. So many people take a one size fits all approach and try to fit that square peg into the round hole. Sounds like you planned properly to avoid that and were rewarded. A valuable lesson, for sure.
    Try some of that link juice Kool-Aid, by the way. It’ll make you feel good.

  3. Really a great article. It helped iron out some of the kinks in my own multi-site plans. Had to come back twice to finish reading it, but well worth it.

  4. @kaye expand on this book-length article? :-) Actually, that’s a good idea – a post about tips for automating SEO, because there’s a right way and a very, very, very wrong way that most people don’t fully grasp.

    @Matt plan plan plan… and I just get all twisted at the phrase “link juice” because I think some of our readers are non-SEO types and the phrase potentially sets us up to look like a bunch of black hat types

    @Thompson – I’ve got a PhDuh in Sarcasm 😉

  5. @RSS2MYSQL – auto-seeding is the process where you set up your site to automatically populate various spots on a page with highly appropriate optimized content.

    So for images, which have an “alt” attribute for visually impaired readers and those who have images turned off, that can be, in very limited ways, used for keywords. If you have a web site that has five hundred or five thousand images, you shouldn’t have to manually enter the text you want for every image.

    So using an SEO-customized content management system, you can automate that process. The biggest challenge in that is knowing when and how to integrate keywords without abusing or corrupting the accessibility purpose of the alt attribute.
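
To make that concrete, here’s a rough sketch of that kind of alt-attribute auto-seeding. The category map and naming scheme are hypothetical – purely illustrative, not any particular CMS’s API:

```python
import os

# Hypothetical category-to-template map. In a real CMS this would live
# in the database alongside the media library, maintained by an editor.
ALT_TEMPLATES = {
    "masks": "Hand-carved African mask - {name}",
    "jewelry": "Sterling silver Indian jewelry - {name}",
}

def seed_alt_text(image_path, default="{name}"):
    """Derive an alt attribute from an image's folder and filename,
    falling back to a plain human-readable name when no template matches."""
    category = os.path.basename(os.path.dirname(image_path)).lower()
    stem = os.path.splitext(os.path.basename(image_path))[0]
    name = stem.replace("-", " ").replace("_", " ").strip().title()
    return ALT_TEMPLATES.get(category, default).format(name=name)

print(seed_alt_text("images/masks/dan-ceremonial.jpg"))
# -> Hand-carved African mask - Dan Ceremonial
print(seed_alt_text("images/misc/red_vase.png"))
# -> Red Vase
```

The real challenge, as noted above, is keeping the templates honest – the alt text still has to describe the image first, keywords second.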

  6. Thanks Alan for a very informative and thought provoking article.

    I guess you are saying that one can create new authority sites and then use these authority sites to boost the main site from page 15 to page 1. And that the new authority sites can essentially be thoughtful rewrites of the main content. How long after launching the new sites did you see a boost in the rankings of the main site for your target phrase?

    Am I right in thinking this strategy is primarily for “someone with deep pockets”, someone who has gone as high as possible with their main site – essentially who can pay for a thoughtful rewriting of content and the design of a CMS that can assist in the publishing of rewritten content? And is able to follow a plan without veering off into some kind of Grey/Black twilight zone.

    “a page has to be at least 60% unique” – do you have a source for this theory (experience)?

    what was the timetable of rolling out the new sites – did they all come online gradually? How wary were you of tripping any “spike/over optimization” filters?

    I don’t understand why you would be sending PPC traffic to the minor sites?

    How did you pitch around the uncertainty of this technique? I guess you made no guarantees but were confident that it could work – why were you confident?

    I’m thinking now your premise might be – you’ve got good enough content to make it to page 15, with enough other similarly good sites pointing to your main site you can go to page 1 – and you can essentially use your good content to kick start the process.

    My big fear here is that Google would see this as gaming – how can we be sure that these outside the square tactics would not return to bite us?



  7. OH BOY – answer may be as long as the article…

    Yes, in my mesothelioma client example, we had one site at first, buried deep in Google’s results. While part of that was scouring the site for duplicate page titles, poorly optimized pages, and the standard in-site issues, one problem they had was that their competition was primarily getting high rankings through multiple sites and back-links. Several of those competitors were using black hat link methods. (Look for an upcoming article I’ll be doing on my own blog about that mess.)

    At that point, we did all we could to clean up the main site. Yet we were still in the basement.

    Initially I created just two niche sites that were geo-targeted to areas my client serves. On those I focused on re-working the core message so that it still got the same info to the visitor but with unique text. We’re talking about five page mini sites. Nothing huge. But because they were only five pages each, I was able to optimize them so well as compared to other sites targeting the same geo-location that those sites rapidly got to the top of Google for their phrases. (Most competitors at that point only had one or maybe two pages on their own sites for those geo-locations).

    That alone got us to the 2nd page of Google within about two months.

    I then set up a news article system on the main site. Each article had its own fully optimized page, and there was a headlines landing page that was linked to in the main nav, so it was a quality funnel even for the main site.

    Then I pushed half the article headlines to one of the mini sites, and half to the other. Clicking on those article titles took the visitor to the main site’s article details page, in a new window. After a month of that and several dozen articles, we were in the top organic spot for the main site for two phrases.
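
For the technically curious, that headline split can be sketched in a few lines. Everything here – the article list, the domain, and the two-way split – is a placeholder for illustration; the real system was a custom CMS feature:

```python
def headline_links(articles, main_site, site_index, total_sites=2):
    """Return anchor tags for the slice of headlines assigned to one
    mini-site. Each link opens the main site's full article page in a
    new window, so the mini-site visitor lands on the money site."""
    return [
        f'<a href="{main_site}{path}" target="_blank">{title}</a>'
        for i, (title, path) in enumerate(articles)
        if i % total_sites == site_index
    ]

# Placeholder articles and domain:
articles = [
    ("Asbestos ruling expands claims", "/news/asbestos-ruling"),
    ("New treatment trial announced", "/news/treatment-trial"),
    ("Veterans exposure study released", "/news/veterans-study"),
]
for tag in headline_links(articles, "https://www.main-site.example", 0):
    print(tag)
```

Each mini-site gets its own slice, and every click funnels a visitor (and a crawler) back to the main site’s article page.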

    It’s important to note that we made it crystal clear on the mini sites that they were, in fact, local extensions, with links to the main site as I’ve described on various pages and in the footers. While most of my client’s competitors cloak the true ownership of their mini sites, I felt ethically that this was inappropriate. And that also allowed me to get much more link value because of the ability to have the client’s company name all over the mini sites and in anchor text.

    So by using white-hat methods, I was able to supplant the competition for what eventually became about a dozen phrases. That may not seem like a lot, but in the mesothelioma field every single phrase is worth 20 in most other markets.

    In many situations the cost involved with such endeavors is pretty substantial. So small biz clients can’t afford it. Of course, if the market you’re working within is not as competitive it could cost a lot less both in time and effort. I’ve gotten the same kind of results with clients who are only dealing with regional competition and for a fraction of the cost.

    This is less a matter of cost than it is of buy-in and establishing Best Practices Standard Operating Procedures that everyone involved can sign off on. Yet even with that, I’ve had clients hire copy editors and writers who claimed to be able to do the work, and we then had to go in and fix it. Most of the time clients are better off allowing us to interview the prospective contributors and train them as needed.

    I need to apologize for quoting a specific percentage in what makes unique content. I have only seen that specific number floated a few times and can’t recall the source, so it was inappropriate for me to say that. There is, of course, no set guideline from any of the search engines, because they factor probably fifty or a hundred different things into their algorithms.

    Yet in my experience, I have only seen real gain in my limited testing when the content is at least 60% unique. Other people in our industry will tell you otherwise, I am sure. Why I say at least 60%: if the header, footer, navigation and sidebar are all almost identical, that leaves the main content area, and each page, in my opinion, should have mostly unique inner content. Yet I have gotten very good results when one paragraph on a page of maybe three or four total is repeated on another page, as long as that page has at least three or four unique paragraphs of its own.

    I have the fortune of working with some very gifted developers, so those first two sites went up in about a week because they were so small compared to our client’s 200-page main site. But it took me a couple weeks before that to lock the plan and specification. Since then, we’ve rolled out two more sites for that client, one of which is a blog. The other site is much more robust and took about three months to build out because it had to be tied to the cross-site CMS and we did a LOT of QA testing.

    Sending PPC traffic to the minor sites was a way to get traffic to those sites – targeted traffic that the main site was not as likely to get due to the fact that those sites were so refined in targeting specific geo-locations. Now we still do get some of that geo-location traffic to the main site, but highly qualified leads come through the mini-sites.

    I never ever ever make guarantees to clients when it comes to organics. I explain my reasoning, show my research results, and answer every question thrown at me honestly. It’s just as much about instilling confidence in a client that if you want results, I am about as good as you will get when it comes to “if anyone can succeed, I believe I can”.

    Why was I confident? Here’s where I need to admit that there’s fear in my bones with this stuff. Except I follow my intuition most of the time and that seems to ALMOST always be amazingly dead on. Get one victory of this nature under your belt and it does wonders when it comes to listening to your intuition. Get too cocky and the algorithm Gods will slap you down every time too though.

    I personally do not see this as outside the square. Every aspect of my methods has been carefully thought out over countless hours of research looking at what the biggest sites in the world do, and what the top competition does in dozens of markets. (I currently have about 40 clients that I and my team are responsible for). While this perspective is far from bulletproof, I also spend countless hours reading and following the top people in our field. And I analyze Matt Cutts articles and videos to the nth degree.

    And with every major change to the algorithms, I have found that my clients’ sites remain at the top of Google for most of their phrases, unlike sites I see go down in flames.

    Bottom line though – like I said – I can’t guarantee anything, let alone that 1 day Google won’t slap every one of my client sites down.

  8. Thanks for the tips. Totally apart from links, there are site visitor benefits to having multiple sites as well. At least in my case, here in Smallpotatoesville. Shortly after launching my original site for a business that has an overarching theme (to me) but includes sub-special interests for the customers, I realized that a visitor didn’t want to wade through even one page of African mask info to get to stuff about Indian jewelry. So I set up separate sites for each sub-interest.
    Only after the fact did I realize I could include cross links that might help with SE indexing. So far, it has worked out well. The only glitch has been having all the sites hosted on the same IP, which is somewhat of a giveaway that they all may be coming from the same source, even though each is unique in content. So I am in the process of moving some of them to new hosts. I don’t consider that black-hatting; I could justify relocating them on reliability grounds alone.
    I realize now that I could have set up separately coded landing pages for one larger site. But, somehow that doesn’t seem to have the user benefits.

    Thanks again.
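
As an aside, anyone wanting to check the same-IP footprint William mentions can do it with the standard library’s DNS resolution – the domains below are placeholders, substitute your own sites:

```python
import socket
from collections import defaultdict

def group_by_ip(domains):
    """Resolve each domain to an IPv4 address and group the domains
    that share one - a shared IP is the footprint being discussed."""
    groups = defaultdict(list)
    for domain in domains:
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            ip = "unresolved"
        groups[ip].append(domain)
    return dict(groups)

# Placeholder domains - substitute your own mini-sites:
for ip, hosts in group_by_ip(["example.com", "example.org"]).items():
    if len(hosts) > 1:
        print(f"{ip}: {', '.join(hosts)}  <- same IP, same footprint")
```

Moving a site to a different host changes the answer here, which is exactly the “giveaway” William is working around.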

  9. I read your post this morning and it’s been in my head all day. I hope you are suitably flattered. :) Something occurred to me this evening and I’d love you to weigh in on the thought.

    I have seen the odd occasion of a site ranking even though the site had little to do with the search term, only because it “appeared” that the site was indexed at a time that a featured content ad on the site did have the search term.

    Could that quirk be put to work here? If the niche sites displayed content advertising, and the main site fed content ads TO the niche sites – ads that were optimized with key phrases to help the niche rank – would there be an effect?

    I don’t know the answer. I’ve never tested this and don’t even know if it’s ethical strategy, but I’d be interested in what you think about it.

    Thanks for giving me some toys for my brain to play with.

  10. To everyone who has commented – I am truly grateful for all the kind words, and simply amazed at how well this first article has been received.

    Martin – I honestly don’t know the answer to that, because I have never experimented with any techniques related to ads that show up. I would have to assume that unless some sort of cloaking was used, Google should be able to detect ad content as opposed to other content on the page, but then again, I’m not an expert there either.

    You may want to check with Rand Fishkin at SEOMoz or any one of the many other highly experienced people in our field who have put in serious testing time with all sorts of aspects of the process that I haven’t.

  11. Martin, I would also say that there could have been a host of other factors that caused that site ranking.

    What does come to mind is in the black hat realm having to do with inbound links or pages that were fed to the Googlebot that the human doesn’t see. Google doesn’t catch 100% of all the spam out there so sometimes things like that get through. (another rant for another blog article).

  12. William,

    You bring up a very good point. To me, the golden key to SEO has always been user experience. This is where Matt Cutts and I agree 110%. No matter how optimized a site is, when people come to the site, if they can’t get to what they want or need fast enough, bye bye!

    So this is how I approach SEO – and why I have zero problem with 93% of what Matt Cutts says is the best way to go about our work.

    And what I’ve found is when all else fails, if I am not sure of which optimization method to use, I put myself in the shoes of my clients customers. That almost always leads me to getting the SEO value I was going for.

  13. I am floored. Completely floored. My gratitude goes out to everyone who has re-tweeted, read, commented on or sphunn this article. When I was writing it, my only hope was that it would be helpful to a few readers. While it may be routine for some to get such a huge positive response, this experience in the few short days since it was published has been beyond my wildest dreams as someone just now becoming connected in our peer community.

    Now – the real question is – am I a one hit wonder? :-)

  14. Great article (and follow-up comments), but how do you handle all of the site registrations? Do you use the same details, or (fake) pen names & addresses?

  15. ooh a trick question from David!


    Okay that’s a very good question because the belief is that each additional site should be located on servers that are in completely separate IP blocks from the main site’s server, and that registration should be under a completely different company name and address.

    For most of my clients, since the niche sites all have the company name all over them, this one is a no-brainer to me – I have them all registered on the same IP block and using the same registrant name. Why hide things under the hood when the site itself is full disclosure?

    I have, however, seen a bump when the opposite approach is taken. So the question is – can this be achieved through white-hat methods? And is the bump important enough?

    Because if you’re moving the site to a different IP block, that might actually be white-hat if the sites are associated with geo-location based offerings. Over the years, many clients have expressed a desire (without knowing what SEO is) for a site closer to their geo-location target clients. They seem to like the concept of a web server being closer to their regional offices.

    And different registrants is only truly white-hat when it’s a division of the bigger company with its own company name, right?

    So that’s a couple examples of white hat ways to split the niche sites out.

    And how about registering each domain through the private registration process most domain companies offer now – shouldn’t that resolve that? I don’t know personally because I’ve never tested that – Does Google get a free pass to see what we’re told is truly private?

    Or registering one or more of the sites under the web development company’s name – how does that fit? That’s something that goes on all the time in our world. Again, never tested on my part.

    Then there’s the concept of having a main site registered through the corporate payment path.

    And if one or more niche sites are registered using the personal credit card information of somebody in the company, well I don’t necessarily recommend that but I can’t stop a client from not paying attention when they register their domain as to which card they use now can I?

  16. Excellent article. I haven’t worked on an environment of this scale so it was great for you to share your experiences and insights.

    Also found your mentions of corporate buy-in, planning, communication and having people on the same page to be right on the money. Long article but well worth the time to read and digest. Thanks.

  17. Mike,

    Glad you got something of value out of it. And remember that this tactic can work really well on a smaller scale as well. Doesn’t need to be only for mega-sites.

  18. Thxs Alan, that’s clear 😮

    But let’s assume you set up 100+ mini-sites (with good content of course, no crappy sites) to support your money site.

    If we look at the technical side of this setup, do you think Father G would see that as a link network and devalue the links, or give your actual money site a penalty…?

  19. Good question David.

    What’s the pattern of content? Is it all truly unique content that as a stand-alone site, each offers real value? Or is it all just really fluff? If they offer real value, then they’re all really money sites in their own right. Maybe not all direct sales conversion. But surely brand building and at the very least community contributing.

    I’d say that’s what it comes down to.

    Beyond that, I can’t offer future-gazing about how the Goog will change over time. Unfortunately I’ve never studied how to read Tarot cards.

  20. @ Alan, you don’t know how to read Tarot cards ? Ok now you are disappointing me 😮

    When creating these supporting sites (and they do have 100% unique, no-garbage content), they definitely become money sites in some way (referring the traffic to your actual money site where the transaction takes place). But I’m really curious in terms of link building: if I shoot 2-3 deep links from each of them to my money site, will G consider these links as coming from a network, if all of the sites are on the same IP and have the same registrant…?

    If somebody has experience with this, or can read Tarot cards 😮 , drop us a line.

    Once more, a terrific post Alan !

  21. Truly magnificent article and comment replies, Alan!

    In item 5 you state the risk of the Niche website overtaking the main website. I’ve seen this before, but what can you do to prevent this? Is it a matter of slowly expanding the niche site(s) so you can control it better?

  22. Paul

    I just started to answer your question and after 20 minutes realized I have the contents of an entirely new article, just when it comes to niche sites possibly overtaking the main site!

    So rather than provide yet another book-length reply, I’ll summarize here and work on the bigger article this weekend!

    You need to clearly think through how much content you optimize on the niche site for a particular phrase, how many pages the site has that link to each other, how many inbound links you have…

    And even then, because the niche site is much more refined than the main site, you still may not be able to control what happens at Google because we can’t get access to their algorithm to run offline tests.

    The best bet is, then, as you say – to start slow and go from there, though that’s still not a guarantee – because we work in a no guarantees industry.

    Then again, maybe you WANT the niche site to overtake the main site! And that’s why I need to put out an article on this aspect of the process…

  23. Thanx for your swift reply, Alan!

    In most cases the clients won’t be happy if the Niche site overtakes the main website, but there may be cases where they don’t mind.

    Looking forward to your next article! Have a good weekend 😉

  24. Greetings from Finland Alan!

    I found your article great and subscribed to your Search Marketing Answers blog as well. This gave me supreme tools to approach a client of mine. They are going to be excited.

  25. Hello Finland! Jan, I’m glad to hear you found real value in the article. Don’t hesitate to ask if you have questions after discussing with your client.

  26. Really solid, well thought out, long term info here.

    I first learned about SEO from Ed Dale and his 30 Day Challenge. They employ a similar system, but they use web 2.0 properties instead of creating their own mini-sites. They set up a money page, and then they create Squidoo lenses, HubPages, etc. with links pointing back to the money page. Over time the money page shoots up in the ranks. Seems like your approach is the same thing, but 10x’s larger.

    You’re building a web of trustworthy sites and obviously Google (not to mention your client) appreciates your hard work and careful planning.

  27. Raza,

    Yes – the concept is similar. However Squidoo lenses and the like are at a serious disadvantage compared to my method.

    On the one hand, one or more pages within Squidoo can themselves be a good source for linking – they carry the weight of the entire Squidoo domain.

    On the other hand however, because I create entire multi-page stand-alone web sites, each dedicated domain can ultimately carry a great deal more relevant weight when it comes to how closely two or more entire sites relate to each other.

    So while your single relevant page is more valuable from a linking perspective because it’s one of 900,000 pages (and thus the link from that page is considered to be coming from an “authority” site), it’s an extremely diluted link because it is only one page on the subject matter.

    Of course, the cost involved with my approach can be far beyond what most people can afford or justify and that’s why having Squidoo lenses can definitely help as part of a much more cost conscious approach if a site’s budget requires that much constraint.

  28. Alan, really great article- thanks for making all your hard won wisdom available. there are great ideas here so thanks.

    Quick question – can you use subdomains of the main domain for niche sites or should they be separate top level domains?

  29. Andy,

    Wow – long delay in responding. Somehow I missed your comment, and only found it upon a long overdue revisit here. Anyhow – sub-domains vs dedicated primary domains…

    By having sub-domains, you ultimately build up the depth of overall content associated with the primary domain, so there’s that to consider, given how important overall depth can be. But links from sub-domains aren’t truly “inbound”.

    Since this article was about inbound link building, in my experience having stand-alone niche domains offers the most potential in terms of inbound link value.

  30. I have a domain and I want multiple categories on it, probably in subdomains. Please give me some advice. Is it okay to have this kind of website, or do I have to purchase a domain for every topic? That would be very expensive. Thanks in advance!
