What is the best crawler tech to use?

I guess by "best" I mean easiest to learn and most functional.

I have looked around, and it seems that Perl, Python, and C, each paired with a MySQL database, are the three main choices. Other options might include Ruby or Lua, but I'm not sure.

What is a good software solution for a web crawler? I'd prefer to run it on a FreeBSD server, but I'm open to suggestions.

Thanks.
 
I would prefer C or Perl (I don't know Python). Perl, because you can easily parse whatever you want, and it deals with UTF-8 without headaches.
For C there are solid libraries you can build on, such as libiconv, libcurl, or libserf.
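For example, a minimal breadth-first fetch-and-extract loop in Perl might look like the sketch below. LWP::UserAgent and HTML::LinkExtor are CPAN modules; the seed URL and page cap are placeholders, and a real crawler should also respect robots.txt and rate limits:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;

my @queue = ('https://example.com/');   # placeholder seed URL
my %seen;
my $max_pages = 50;                     # arbitrary cap for this sketch

my $ua = LWP::UserAgent->new(agent => 'MyCrawler/0.1', timeout => 10);

while (my $url = shift @queue) {
    next if $seen{$url}++;
    last if keys %seen > $max_pages;

    my $res = $ua->get($url);
    next unless $res->is_success;

    # decoded_content applies the page's charset, so UTF-8 pages
    # arrive as proper Perl strings.
    my $html = $res->decoded_content;

    # Passing the base URL makes HTML::LinkExtor return absolute links.
    my $extor = HTML::LinkExtor->new(undef, $url);
    $extor->parse($html);
    $extor->eof;

    for my $link ($extor->links) {
        my ($tag, %attr) = @$link;
        push @queue, $attr{href} if $tag eq 'a' && $attr{href};
    }
    print "fetched $url (queue: ", scalar @queue, ")\n";
}
```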

So if you need to 'write code fast', write in Perl. If you need to 'write fast code', write in C ;)

Note that you can extend your Perl code with XS, so for a basic start I would choose Perl xD
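XS proper needs a .xs file and module scaffolding, so it's awkward to show inline; as a rough sketch of the same idea (moving a hot path into C), the CPAN module Inline::C lets you embed C straight in a script. count_char here is just a made-up example function:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Inline::C compiles the embedded C on first run and binds it as a Perl sub.
use Inline C => <<'END_C';
/* Hypothetical hot-path helper: count occurrences of a character. */
int count_char(char *s, char *c) {
    int n = 0;
    while (*s) {
        if (*s == c[0]) n++;
        s++;
    }
    return n;
}
END_C

# Callable like any ordinary Perl function.
print count_char("http://example.com/a/b/c", "/"), "\n";   # prints 5
```

Once the pure-Perl version of the crawler works, pieces like this can be rewritten in C wherever profiling shows it matters.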
 