Client-side scraping for contacts

By curious coincidence, just a day after my post on client-side scraping, I had a chance to demo this to a client. They were building a contacts database. Now, there are two big problems with managing contacts.

  1. Getting complete information
  2. Keeping it up to date

Now, people are happy to fill out information about themselves in great detail. If you look at the public profiles on LinkedIn, you’ll find more than enough detail about most people.

Normally, when getting contact details about someone, I search for their name on Google in double quotes and look at the information that turns up.

Could this be automated?

I spent a couple of hours and came up with a primitive contacts scraper. Click on the link, type in a name, and you should get the LinkedIn profile for that person. (Caveat: it’s very primitive. It works only for public profiles with the standard URL format. Try ‘Peter Peverelli’ as an example.)

It uses two technologies: the Google AJAX Search API and YQL. The search() function searches for a phrase…

google.load("search", "1");
var gs;
google.setOnLoadCallback(function () {
    gs = new google.search.WebSearch();
});
function search(phrase, fn) {
    // When the search completes, pass the results to the callback
    gs.setSearchCompleteCallback(gs,
        function () { fn(this.results); });
    gs.execute('"' + phrase + '"');
}

… and the linkedin() function takes a LinkedIn URL and extracts the relevant information from it, using XPath.

function scrape(url, xpath, fn) {
    // Public YQL endpoint; callback=? makes jQuery use JSONP
    $.getJSON('http://query.yahooapis.com/v1/public/yql?callback=?', {
        q: 'select * from html where url="' +
            url + '" and xpath="' + xpath + '"',
        format: 'json'
    }, fn);
}
function linkedin(url, fn) {
    scrape(url, "//li[@class][h3]", fn);
}

So if you wanted to find Peter Peverelli, it searches on Google for “Peter Peverelli” and picks the first result.

From this result, it displays all the &lt;LI&gt; tags that have a class attribute and contain an &lt;H3&gt; element (that’s what the //li[@class][h3] XPath does).
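The two functions can be wired together with a small glue function. This is a sketch, not from the original post: lookup is a hypothetical name, and it assumes the Google AJAX Search result objects expose an unescapedUrl property (which they do) and that search() and linkedin() behave as shown above.

```javascript
// Hypothetical glue: search for the name in quotes, then scrape the
// first result's URL with linkedin(). Assumes search() and linkedin()
// as defined above.
function lookup(name, fn) {
    search(name, function (results) {
        if (results.length) {
            linkedin(results[0].unescapedUrl, fn);
        }
    });
}
```

So lookup("Peter Peverelli", fn) would call fn with the YQL response for his LinkedIn profile, if one turns up as the first Google result.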

The real value of this is in bulk usage. When there’s a big list of contacts, you don’t need to scan each of them for updates. They can be automatically updated — even if all you know is the person’s name, and perhaps where they worked at some point in time.
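That bulk flow could be sketched roughly as below. refreshAll is a hypothetical helper, not in the original post; it assumes the search() and linkedin() functions above, and walks the list one name at a time so neither endpoint is hammered with simultaneous requests.

```javascript
// Hypothetical bulk updater: for each name, search for it in quotes,
// scrape the first result, and hand (name, data) to the callback.
// Processes names sequentially, one request chain at a time.
function refreshAll(names, fn) {
    function next(i) {
        if (i >= names.length) { return; }
        search('"' + names[i] + '"', function (results) {
            if (results.length) {
                linkedin(results[0].unescapedUrl, function (data) {
                    fn(names[i], data);
                    next(i + 1);
                });
            } else {
                next(i + 1);
            }
        });
    }
    next(0);
}
```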

  1. Prakash says:

    Have you tried the Xobni add-in for Outlook (in case you use that to manage contacts)?

  2. S Anand says:

    I don’t use it myself, but have tried it — and interestingly enough, recommended it to this client.

  3. […] Blog to create a feed. I had written custom scraping code to create the feed. Today after reading Anand’s blog I parsed the blog using YQL and created the feed using Pipes. Using YQL/Pipes much easier than […]

  4. Thejesh GN says:

    After reading this post, I redid Aamir Khan’s blog parsing using YQL and Pipes.
    It was very easy as YQL was available as a module inside Pipes.

  5. Vasanth Asokan says:

    Hi Anand,

    The scraper demo does not seem to be working (I tried the value you suggested as well, ‘Peter Peverelli’). Can you take a look?

  6. S Anand says:

    True, Vasanth. Honestly, scraping was only ever useful as a temporary strut until service providers opened up the data. It required too much maintenance anyway. With LinkedIn’s API, this is outdated, and I’m planning to leave it as it is. But thanks for pointing this out!