I have a good idea for my second program for Black of Hat. It will be a type of web crawler. You give it a web site, and it finds all the pages on that site and records their URLs. I can imagine a number of uses for such a program. For example, you could analyze your own web site.
My goal is to knock out this program and release it here on my blog in a couple of days. I hope the code is as easy to write as I think it will be. This program will be a stepping stone for others like it. For example, I could then write a program that identifies dead links on your site. Or I could do a link analysis of your site for Search Engine Optimization purposes.
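A crawler along these lines could be sketched in plain Python with just the standard library. This is only a rough sketch of the idea, not the actual Link Crawler program: it does a breadth-first crawl starting from one URL, follows only links on the same host, and records the URLs it visits. The `fetch` parameter (my own addition, for testing without a network) lets you swap in any function that returns HTML for a URL.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch=None, limit=100):
    """Breadth-first crawl of pages on the same host as start_url.

    fetch(url) -> HTML string; defaults to a urllib-based fetcher.
    Returns the list of URLs visited, in crawl order.
    """
    if fetch is None:
        fetch = lambda url: urlopen(url).read().decode("utf-8", "replace")
    host = urlparse(start_url).netloc
    seen = {start_url}
    queue = [start_url]
    visited = []
    while queue and len(visited) < limit:
        url = queue.pop(0)
        visited.append(url)
        try:
            html = fetch(url)
        except Exception:
            continue  # unreachable page; a dead-link checker would record this
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            # Resolve relative links and drop #fragments before comparing
            absolute = urldefrag(urljoin(url, link))[0]
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return visited
```

The same-host check is what keeps it crawling one site instead of wandering off across the whole web, and the `seen` set keeps it from revisiting pages when links form a cycle.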
Keep an eye out for this second program coming soon to Black of Hat. I think I shall call it Link Crawler.
Work Smarter not Harder
-
We have large data sets in my current project. Every year tons more data is
loaded into the system, so we keep most of the data for only 4 years.
After...