Local search has been a very interesting challenge, and a lot of old and new companies are taking a stab at it. Let's look at this entire continuum and where each player sits. The radical idea is to find information that is geo-tagged. This information can be as granular as a lat-long or as coarse as a city name. So when you set up shop to solve local search, the first question is: how do you collect & tag this information? How do you present this information so that it creates the most wholesome experience? How do you make money from this high-investment business?
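To make the granularity idea concrete, here is a minimal sketch of what a geo-tagged listing might look like. The class and field names are my own illustrative assumptions, not any real system's schema:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical record type: names and fields are illustrative only.
@dataclass
class GeoTaggedListing:
    name: str
    # Fine-grained tag: exact coordinates, when we can get them.
    latlong: Optional[Tuple[float, float]] = None
    # Coarse tag: sometimes a city name is all a page gives us.
    city: Optional[str] = None

    def granularity(self) -> str:
        """Report how precisely this listing is geo-tagged."""
        if self.latlong is not None:
            return "lat-long"
        if self.city is not None:
            return "city"
        return "untagged"

# One listing tagged only at city level, one with exact coordinates.
coarse = GeoTaggedListing(name="Cafe Coffee Day", city="Bangalore")
fine = GeoTaggedListing(name="Cafe Coffee Day", latlong=(12.9716, 77.5946))
print(coarse.granularity())  # city
print(fine.granularity())    # lat-long
```

Everything downstream (ranking, display, dedupe) has to cope with records sitting anywhere on this coarse-to-fine spectrum.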
You can start with smart-crawling the web. I call it smart crawling because you don't crawl everything like a general-purpose search engine does. You need to identify and extract local information from a given webpage, and dedupe listings that occur on multiple pages. Let's say you write a kickass neural network that labels & classifies this information. How does it know which one is a real address? How do you verify that the information you have is correct? You start thinking about this and realize that there are directory listing services that have been accumulating such information forever. So you get feeds from yellow pages. Then there is the problem of matching different feed formats against the crawled information. This is where most of the current search engines are today. Google/Yahoo/Live/Guruji can do “crawl + feeds” using existing infrastructure.
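The dedupe and feed-matching steps both boil down to deciding when two records describe the same place. A common trick is to normalize addresses into a canonical key and collapse on it. This is a toy sketch under my own assumptions (the abbreviation table and tuple format are made up, and real pipelines use far richer matching):

```python
import re

# Hypothetical abbreviation table; real systems use much larger ones.
ABBREV = {"street": "st", "road": "rd", "avenue": "ave"}

def normalize_address(addr: str) -> str:
    """Lowercase, strip punctuation, and canonicalize common abbreviations."""
    tokens = re.sub(r"[^\w\s]", " ", addr.lower()).split()
    return " ".join(ABBREV.get(t, t) for t in tokens)

def dedupe(listings):
    """Collapse (name, address) pairs that normalize to the same key."""
    seen = {}
    for name, addr in listings:
        key = (name.lower().strip(), normalize_address(addr))
        seen.setdefault(key, (name, addr))  # keep the first copy we saw
    return list(seen.values())

crawled = [
    ("Joe's Pizza", "7 Carmine Street, New York"),
    ("Joe's Pizza", "7 Carmine St., New York"),  # same place, different page
    ("Lombardi's", "32 Spring St, New York"),
]
print(dedupe(crawled))  # three raw listings collapse to two
```

The same normalization lets you join a yellow-pages feed against crawled records, which is exactly the "crawl + feeds" merge problem described above.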