Local SEO: Using Multiple URLs for Rankings

One tactic frequently, and often successfully, implemented in local SEO is the use of multiple URLs, or micro-sites. To help you better understand how this works, I’ve chosen a real-life example of a site network in current use that is affiliated with neither me nor Search Engine Journal.

To begin, allow me to say that I will not be writing about how to implement this. You will be able to ascertain how to do it by reading between the lines (or by reading this post from the bottom up).

Rather, I will explain exactly how I was able to reverse engineer the SEO strategy of this particular business, and the thinking behind my methodology.

My journey began with a search for the term ‘Warminster carpet cleaning’.

Directly below the Google Local results, appearing in position one, was the following:

[Screenshot: the #1 organic result for ‘Warminster carpet cleaning’, directly below the Google Local results]

I clicked through to the site and noticed something very odd…

The site had a handful of pages, but it was specific to Warminster only.

Now, for anyone unfamiliar with Warminster (PA), it is a pretty small town. There’s no way a real carpet cleaning business would service only Warminster. There are dozens of towns within a few miles of it, yet the site does not mention any of them. That doesn’t make sense.

Between that oddity and the fact that the URL exactly matched my original, non-branded search of ‘Warminster carpet cleaning’, I concluded that there might be something worth looking into.

From here, I began searching for ‘carpet cleaning’ in other nearby towns to see if I could find something similar. After all, it’s only logical that anyone using this tactic would own similar geo URLs for the surrounding towns.
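As an aside, this kind of check is easy to scale with a few lines of Python. The sketch below is illustrative only: the town list is my own guess at a plausible service area, and a domain that resolves is merely a lead to investigate, not proof of who owns it.

    # Probe nearby towns for exact-match geo domains (stdlib only).
    # The town list is a guess at the service area; a resolving
    # domain is a lead worth checking, not proof of ownership.
    import socket

    towns = ["warminster", "lansdale", "doylestown", "hatboro", "horsham"]

    for town in towns:
        domain = f"{town}carpetcleaning.com"
        try:
            socket.gethostbyname(domain)  # does the name resolve at all?
            print(f"{domain}: registered and resolving")
        except socket.gaierror:
            print(f"{domain}: no DNS record found")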

I chose to search in ‘Lansdale’.

Sure enough, when I searched for ‘Lansdale carpet cleaning’, I found virtually the exact same result:

[Screenshot: the #1 organic result for ‘Lansdale carpet cleaning’]

The only difference was that the URL was an exact match for ‘Lansdale’ instead of ‘Warminster’.

Look at the sites themselves and the two are virtually identical (i.e., duplicate content) in every way.

So now I wanted to know how many sites (geos) are in this network and how they are linked. I wanted to know how big this operation is, and how successful.

I could have simply done a domain lookup to see exactly what the registrant owns, but I also wanted to see their linking strategy for myself.
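For anyone who does want the lookup route, the WHOIS protocol itself is simple enough to query with nothing but Python’s standard library. A minimal sketch, assuming .com domains (whose registry WHOIS server is whois.verisign-grs.com); note that registrant details are often redacted or thin:

    # Minimal WHOIS query over the raw protocol (RFC 3912).
    import socket

    def whois_lookup(domain, server="whois.verisign-grs.com"):
        # Registry WHOIS services listen on TCP port 43; send the
        # domain followed by CRLF and read until the server closes.
        with socket.create_connection((server, 43), timeout=10) as sock:
            sock.sendall((domain + "\r\n").encode())
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode(errors="replace")

    print(whois_lookup("warminstercarpetcleaning.com"))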

To see the linking for myself, I used Yahoo Site Explorer to check how, if at all, inter-linking was in play here. Plus, honestly, I was curious to see whether the URL ranked because of the exact match or because the site might be buying links.

After setting Site Explorer to show links ‘Except from this domain’, this was the result:

[Screenshot: Yahoo Site Explorer inlinks for Warminstercarpetcleaning.com, filtered to ‘Except from this domain’]

It’s now clear that the only site linking to ‘Warminstercarpetcleaning.com’ is ‘AllClean1.com’. The same holds true for ‘Lansdalecarpetcleaning.com’.

This means that when I go to ‘AllClean1.com’, I’ll probably find the core of this network and some of the answers I was looking for.

Sure enough, right in the footer of ‘AllClean1.com’ is a list of every geo in the network, with links out to the ‘<Geo>CarpetCleaning’ URLs.

[Screenshot: the AllClean1.com footer, listing links to each ‘<Geo>CarpetCleaning’ site]
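Harvesting a footer like this is also easy to automate. Here’s a minimal sketch using only the standard library; the hub URL is the site from this post, but the ‘carpetcleaning.com’ filter pattern is my own assumption based on the two domains found above.

    # Collect every <a href> on the hub page, then keep the links
    # matching the '<geo>carpetcleaning.com' footer pattern.
    import re
    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    html = urllib.request.urlopen("http://allclean1.com").read().decode(errors="replace")
    collector = LinkCollector()
    collector.feed(html)

    geo_sites = sorted({link for link in collector.links
                        if re.search(r"carpetcleaning\.com", link, re.I)})
    print("\n".join(geo_sites))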

Starting from a single ranking, I was able to use a bit of competitive intelligence to rip apart the local SEO strategy this company is implementing so effectively.

This is an effective exercise not only for competitive intelligence but for education as well. It’s a great way to learn SEO or SEM, and a great test project to give an employee to see if they can reach the correct answer.

This particular case study left me with a bunch of questions I’m hoping readers can answer. Do you think the mostly duplicate sites pose a long-term problem for the current effectiveness here? Is there an opportunity to create your own business model from what you’ve just seen? Do you look at rankings and do this in your free time, too? Please comment below and let everyone know what you think.

Matt Leonard currently directs SEO, SEM and Revenue Management for Cruise Critic, the world’s largest cruise site and part of the Trip Advisor Media Group. You can follow Matt Leonard on Twitter to keep up with his updates. Feel free to ask about his latest charity project, ‘Tweet for the Cure’, to benefit Susan G. Komen for the Cure. The opinions expressed are those of Matt Leonard and not necessarily those of Expedia, Trip Advisor or Cruise Critic.


36 thoughts on “Local SEO: Using Multiple URLs for Rankings”

  1. I have used geo domains pointing to a “master domain” and developed content on them for years, and it works like a charm :-) I find the throughput and the ability to match the user’s intent incredibly effective as an SEO strategy.

  2. Hey Matt, nice to see you… again. I have been missing you on Twitter. I had no idea you worked for a cruise site. In regards to this article, it looks like you have been busy as an “SEO geek” – why else would someone pick apart a site unless they were a total SEO geek? But then again, thanks for doing this; it gives meaning to the explanation of geo targeting. Maybe next time you can write a guest post for us ;) lol

  3. I’m fascinated that the Lansdale and Warminster sites were exact duplicates, aside from the geo targeting, but still showed up in the SERPs for that geographic area and industry term…

    It looks like they customized JUST the homepage with unique text; all the other pages are duplicates…

    Also weird/interesting, from their robots.txt:

    # Google
    User-agent: googlebot
    Disallow: /Lansdale_upholstery_cleaning.html

    It looks like they attempted to block their internal pages but got the URLs wrong (that page doesn’t exist). So they gambled that, by disallowing all the other dup pages, they would get by focusing on pretty much a single crawlable page with a relevant domain…

  4. Those SERPs are SUPER easy soft targets, with no traffic registered by Google’s external AdWords tool… so that’d be why it didn’t take much unique content to get to the top…

    I question whether this method works in high-competition cities/industries…

  5. This is indeed a viable geo-targeting strategy. And, in my experience, for sites geo-targeted to such a specific service area, the duplicate content likely won’t be an issue in the long term. Which is good, considering even the free Copyscape report picks up several occurrences of duplicate content.

    I’ve seen similar tactics employed using subdomains (e.g., warminster.carpetcleaning.com) with similar results. Truthfully, though, with the little allintitle: competition that exists for a term like “carpet cleaning warminster”, similar results could probably have been achieved using individual pages on the primary domain. Just a different means to the same end.

    I applaud your researching efforts, Sherlock Holmes. Well done. :)

  6. Hi,
    I think that to get a higher ranking you can also use special scripts. I have 9 special sites to get many pages indexed by Google. Using subdomains is a great idea; thank you for the good information.
    Greetings from Germany,
    Bernd

  7. In the past, Google talked about dupe content that has valid reasons for existing (one of Vanessa Fox’s last posts as a Google employee rings a bell); geo results are the kind that often get a pass. The thin interlinking, thin content, etc., appears to have power, but it probably doesn’t.

    This is easy to abuse. I worked with a company briefly that did the exact same thing with great success about 6 months ago.

    There are a few vendors that target this. This company is no doubt using one of those vendors. It’s fun to look up their DNS info.

    Gotta admit, it is kind of cool seeing the town I was born and raised in get a mention here. Warminster IS a small town, indeed.

  8. I’ve had success with this too, and find it works best to have as much unique content as possible intermingled with the duplicate content if you’re going for more competitive markets.

    Oh, and subdomains work, too, if you can logically set them up as if they were their own site.

  9. Foot in Mouth: You are correct (and your sleuthing is brilliant). It would be substantially more difficult to do this with a large city. Being from the Philadelphia area, I found it interesting that this company picked the Philly burbs well. Also, volume is lower, but I’m sure carpet cleaning in the burbs is substantially more profitable at the individual-transaction level.

    Bill Sebald: Good to see a fellow Warminster person passing by (Ivyland here). You’re right about the dup content with geos getting a pass.

  10. Wow, finally an example I can look at and learn from – this is very close to what I have a problem with right now.

    I have a number of sites – 15 or so – that are focused on the same service offered in different states. Each state has this service, and yet the sites are completely different from each other.

    Rather than create 60 pages/posts of separate content for each site, I noindexed the pages that are duplicates in every sense except that the state names on them are different.

    Then I added 12-15 unique pages of good content to each site.

    My question is: do I need to noindex all that duplicate content, or would I be OK leaving it in there because it’s FULL of great keywords?

    If these carpet cleaning sites are the same except for one page of unique content, and they’ve been indexed in Google for a long time – 6 months or so – then it’s probably worth me removing the noindex tag.

    Anyone else have experience with exactly this issue? I’ve never seen it addressed by Matt or anyone else.

    It would be nice if Matt addressed this at some point because there must be thousands of people who would love to have a definitive answer. In the meantime I’ll do a little research on this network you mentioned.

    Thanks for the excellent post!

  11. I’ve implemented similar solutions, and have been advocating them since 2007.

    You definitely can gain similar results by having one domain with geo-targeted content.

    Dupe content is a spotty penalty/algorithm, since many news sites repost other people’s content.

    I would recommend spinning those articles before posting them, and definitely keeping them indexed.

    Of course, there’s the dirtier ways, that we just won’t talk about here…

  12. Is there any evidence that Google eventually punishes this type of thing? Is there even any record of Google officially saying not to do this?

  13. Matt,

    Great detective skills – applause! Perhaps a canonical tag on all these sites pointing to the parent site might be sufficient?

  14. I have seen this on other websites. Some have only a home page with different content and no other pages. They do, however, have a lot of links.

  15. Great post! I think this page will soon get the #1 ranking for “Warminster carpet cleaning” =] LOL

    I have an idea that bothers me; maybe you guys can help:
    I’m working on a yellow-pages site and am considering adding every business name as a subdomain. There would be something like 100k of them. Is this a good idea?
    Well, if I were Google… I would punish this site, or just treat those keywords as if they were somewhere at the end of the URL.
    What do you think?

  16. It may work for a short period, but not for long, since Google doesn’t like repeated content. It should work only if you put unique content on each site.

  17. While I haven’t done this – although I have THOUGHT about doing this for several of my clients – I can vouch that it 100% works.

    One of my client’s competitors in the “crime scene” cleaning world employs this tactic very effectively with not just dozens but hundreds of geo-targeted URLs.

    He gets paid a referral fee for sending businesses in that industry a qualified lead, so if a site gets popped, he’s got plenty more to go around. It’s been working for him for several years, although I’ve kicked his ass in the more competitive markets.

    Be very cautious if using this method for a client’s site – or let them know of the potential danger of penalization. Reporting the sites as spam won’t do anything – I don’t even care what everyone else does anymore, unless they’re directly scraping my content. Not worth the effort.

    Great post.

  18. Hey Matt,

    Some people think this is a bit unorthodox but my perspective is different. This company is making the most of how search engines rank, especially from a local perspective. This process is no different than creating specific landing pages for PPC keywords. Everyone knows that the more targeted the landing page (to the search query), the better the results. Hats off to these guys!

  19. What about the people in Warminster and Lansdale?

    If these towns are really small, they’re probably tickled to see websites built just for them… and based on a service they’re looking for.

    Not saying it’s right or wrong – but let’s not forget about the customer perspective.

  20. Just to be clear, I’m all for the little guy and don’t think these guys are doing anything wrong at all. They are a legitimate business trying to rank for local search terms. I would hope that a search engine agrees with me long term because these people have invested time and effort to help people see their product. The reality is, no matter what keyword research says (‘no data’), people do search for services at their town level. I say good for this guy. Besides that, the only honest way to make things not duplicate is to write about the individual towns (since carpet cleaning is carpet cleaning, no matter how you spin it). Clearly people are not looking for town info. So I say, no harm, no foul here. So far Google agrees. We’ll see.

  21. Hi Matt,

    Thanks for bringing this up. We have a lot of sites with geo-targeted URLs. The key, in my opinion, is exactly what @MplsWebGuy says: when the consumer hits the page, does it look custom-tailored to them? In this case, somewhat – at a quick glance it does. Marchex and MerchantCircle do tons of this. From a pure SEO perspective, try unique home page content (even 500 words) and have each site’s inner content mashed up differently, and you will see this work on a huge scale.

    Now if you can partner with outside sites for links beyond your own network, you can rank for major cities too.

  22. Does it really work with Google? Google itself says it’s better to have unique content on each site… and so do many articles.

  23. Just checked AllClean1.com – and it appears the site has been penalized in Google. After searching for “allclean1” in Google, their official website (allclean1.com) is nowhere to be found. However, a “site:allclean1.com” query shows the site is in Google’s index. Searching Yahoo and Bing for “allclean1”, their website is number one. The microsites are still ranking well for their targeted keywords.

    Seems as though their SEO strategy caught up to them, at least with Google… but if all their microsites still do well, then they are still generating leads…

  24. I use this exact method with one of my clients, but without the (in my view unnecessary) main website that connects them all together.

    The most important thing is the domain name, and that’s about it.
    The anchor text you get, or that first link Google sees, is not a real factor and is not worth the chance that Google might punish you.

    People are very accustomed to having everything linked, for no good reason.
    If you simply submit the websites to the other search engines, Google included, through some service designed just for that, you get both the link and the orientation that serves as a “half” anchor.

    Does anybody have some light to shed on that?

    Because we do fairly well in the biggest cities (New York, LA…) without linking from anywhere, and we’re far less exposed.

  25. Not only is this effective, it’s the model we use to help businesses recapture traffic they’re giving up to the large corporations dominating the rankings in our clients’ cities and industries. For example, why would it do a carpet cleaning company in Arizona any good to pay for advertising (PPC) for better search engine results and end up paying for clicks from people looking for carpet cleaning in Washington? A BIG WASTE of money. Not to mention that when people search for a service like carpet cleaning, many will precede their search term with a city or state name to find local businesses offering it.

    Our company is based in Missouri, but in fact I just talked to a guy in Arkansas, and we may be working with him to build local traffic to his site. So even though he isn’t local to us, we will use the techniques discussed in this post to help him rank locally.

  26. Matt,

    Thank you for your article. I am just getting into local search services for small businesses in my area (Western Maryland), and nowhere have I ever seen information about how to get one company listed in several locations.

    As long as you don't abuse this method, it looks like it could be very effective. I remember reading one case study on the “Google 10-box” about a locksmith company that locked up listings in a large number of cities across the country. It may well have been similar to the example you have shown.

    Thanks again, you have been very helpful.

    Allen
