I have a good idea for my second program for Black of Hat: a type of web crawler. You give it a web site, and it finds all the pages on that site and records their URLs. I can imagine a number of uses for such a program. For example, you could analyze your own web site.
My goal is to knock out this program and release it here on my blog in a couple of days. I hope the code is as easy to write as I think it will be. This program will be a stepping stone for others like it. For example, I could then write a program that identifies dead links on your site. Or I could do a link analysis of your site for Search Engine Optimization purposes.
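To give a feel for how small the core of such a crawler can be, here is a minimal sketch in Python using only the standard library. The function and class names are my own placeholders, not anything from a real release: one parser pulls the links out of a page, and a breadth-first loop visits every same-site URL it finds.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from collections import deque

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Return absolute URLs for every link found in the HTML."""
    parser = LinkParser()
    parser.feed(html)
    return [urljoin(base_url, link) for link in parser.links]

def crawl(start_url, max_pages=100):
    """Breadth-first crawl of one site; returns every URL reached."""
    domain = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    visited = []
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        try:
            page = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # a later version could record this as a dead link
        visited.append(url)
        for link in extract_links(page, url):
            # stay on the same site and skip URLs already queued
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return visited
```

Note where the `except OSError` branch sits: that is exactly the hook the dead-link checker mentioned above would use, since any URL that fails to fetch is a candidate dead link.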
Keep an eye out for this second program coming soon to Black of Hat. I think I shall call it Link Crawler.