SiteScan Version 1.1 Release
Now that I have this major bug fixed in SiteScan, it is time for me to combine the SiteScan code with some newer ideas into a new mega program. I am not sure yet what the name of this program will be. But it should allow you to simulate traffic to all the pages of your web site. This mega app could have all kinds of interesting uses - legit and otherwise. For example, you could check the performance of your web server. Or if you needed to test your page view counters, you could let this app run all day and "go to town".
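To give a flavor of the traffic simulation idea, here is a minimal C++ sketch that just loops over a list of pages and downloads each one with the WinInet library. The page list, the user agent string, and the number of passes are made-up placeholders for illustration; the real app would feed in the links that SiteScan discovers.

    #include <windows.h>
    #include <wininet.h>
    #include <stdio.h>
    #pragma comment(lib, "wininet.lib")

    // Download one page and discard the bytes - for simulated traffic we only
    // care that the request was made, not what came back.
    void HitPage(HINTERNET hSession, const char* url)
    {
        HINTERNET hUrl = InternetOpenUrlA(hSession, url, NULL, 0,
                                          INTERNET_FLAG_RELOAD, 0);
        if (hUrl == NULL)
            return;

        char buffer[4096];
        DWORD bytesRead = 0;
        while (InternetReadFile(hUrl, buffer, sizeof(buffer), &bytesRead)
               && bytesRead > 0)
        {
            // throw the page content away
        }
        InternetCloseHandle(hUrl);
    }

    int main()
    {
        // Placeholder page list - the real app would use what SiteScan found.
        const char* pages[] =
        {
            "http://www.example.com/",
            "http://www.example.com/page1.html",
            "http://www.example.com/page2.html"
        };
        const int pageCount = sizeof(pages) / sizeof(pages[0]);

        HINTERNET hSession = InternetOpenA("TrafficSim",
                                           INTERNET_OPEN_TYPE_PRECONFIG,
                                           NULL, NULL, 0);
        if (hSession == NULL)
            return 1;

        for (int pass = 0; pass < 10; pass++)      // 10 passes, just for the example
            for (int i = 0; i < pageCount; i++)
                HitPage(hSession, pages[i]);

        InternetCloseHandle(hSession);
        return 0;
    }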
All the code I need to put this mega app together already exists. I just need to sit down and do it.
SiteScan Known Issues
I will provide an updated version of the program soon that fixes this problem. Another issue that is becoming plain to me is that the program is slow. Part of the slowness is that the program has to download every linked web page in order to find new links. It also has to keep a unique list of links so it does not go in circles when pages link to each other. However, I have some ideas for having the program do more than one thing at a time, which should significantly speed up the throughput.
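To make "more than one thing at a time" concrete, here is a rough sketch of several download threads sharing one queue of links and one list of pages already seen. It uses the standard C++ thread classes (which are newer than the Visual C++ 6.0 I normally use), and FetchAndExtractLinks is just a placeholder for the real download-and-parse step, so treat this as an illustration of the idea rather than the actual SiteScan code.

    #include <iostream>
    #include <mutex>
    #include <queue>
    #include <set>
    #include <string>
    #include <thread>
    #include <vector>

    std::queue<std::string> pending;   // links waiting to be scanned
    std::set<std::string>   seen;      // every link already queued, to avoid circles
    std::mutex              guard;     // protects the two containers above

    // Placeholder: the real program downloads the page and pulls out its links.
    std::vector<std::string> FetchAndExtractLinks(const std::string& url)
    {
        std::cout << "scanning " << url << "\n";
        return {};
    }

    void Worker()
    {
        for (;;)
        {
            std::string url;
            {
                std::lock_guard<std::mutex> lock(guard);
                if (pending.empty())
                    return;            // simplistic stop condition for the sketch
                url = pending.front();
                pending.pop();
            }
            for (const std::string& link : FetchAndExtractLinks(url))
            {
                std::lock_guard<std::mutex> lock(guard);
                if (seen.insert(link).second)   // only queue links not seen before
                    pending.push(link);
            }
        }
    }

    int main()
    {
        seen.insert("http://www.example.com/");
        pending.push("http://www.example.com/");

        std::vector<std::thread> workers;
        for (int i = 0; i < 4; i++)    // 4 download threads, picked arbitrarily
            workers.emplace_back(Worker);
        for (std::thread& t : workers)
            t.join();
        return 0;
    }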
These are exciting times. Once I fix the SiteScan application, I plan to combine some other ideas with the SiteScan program to produce a truly useful and comprehensive tool. I am not sure yet what I am going to name this composite application. I am open to suggestions.
SiteScan App Released
Be warned that this application can take a long time to run on large web sites. I have a fast Internet connection, and the server where my biggest blog is hosted is quick, yet SiteScan still took 15 minutes to scan the whole blog. Part of the time goes to downloading all of your pages and looking for links in them. The other time-consuming activity is making sure the scanner does not get stuck in link circles. If Page 1 links to Page 2, which in turn links back to Page 1, then SiteScan has to detect this and not just bounce between both pages forever.
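For the curious, the circle detection boils down to remembering every page already queued in a set and skipping any link that is already in it. Here is a small self-contained C++ sketch using a made-up three-page site where page1 and page2 link to each other; it illustrates the idea, and is not the actual SiteScan source.

    #include <iostream>
    #include <map>
    #include <queue>
    #include <set>
    #include <string>
    #include <vector>

    // Stand-in for the real download-and-parse step: a tiny pretend site where
    // page1 and page2 link to each other, forming exactly the kind of circle
    // described above.
    std::vector<std::string> GetLinks(const std::string& url)
    {
        static const std::map<std::string, std::vector<std::string> > site = {
            { "/index.html", { "/page1.html", "/page2.html" } },
            { "/page1.html", { "/page2.html" } },
            { "/page2.html", { "/page1.html" } }
        };
        auto it = site.find(url);
        return it == site.end() ? std::vector<std::string>() : it->second;
    }

    int main()
    {
        std::set<std::string> visited;       // every page already scanned or queued
        std::queue<std::string> toVisit;
        toVisit.push("/index.html");
        visited.insert("/index.html");

        while (!toVisit.empty())
        {
            std::string page = toVisit.front();
            toVisit.pop();
            std::cout << "scanning " << page << "\n";
            for (const std::string& link : GetLinks(page))
            {
                // insert() returns false for a page we have seen before, so the
                // page1 <-> page2 circle gets scanned only once per page.
                if (visited.insert(link).second)
                    toVisit.push(link);
            }
        }
        return 0;
    }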
This application assumes that the site you are scanning is static. That is, if you go and change the web site while the program is running, you may get unexpected results. I plan to kick off SiteScan tonight and have it scan all of MSN. Let's hope this job finishes by morning. I will let you know how it goes. In the meantime, enjoy the SiteScan application.
For Programmers Only
Currently I code up most of my apps using Visual C++ version 6.0. This is actually a very old version of the C++ compiler from Microsoft. However, I have found that it is still very useful. When I release a new application to the public, I want the experience to be simple. I just want to give my users a single executable file that they can double-click and run.
The executable that gets distributed at release time must have all of the logic built into it. I don't like bothering my users with complicated install programs. Applications built with Visual C++ normally depend on some additional dynamic link libraries (DLLs) provided by Microsoft. I choose instead to have the library code embedded directly in the executable. Here is how you accomplish this with Microsoft Visual C++ version 6.0 Standard Edition:
- Choose Settings from the Project menu
- On the Project Settings dialog, click on the C/C++ tab
- On the left, change the "Settings For:" combo box to Win32 Release
- In the Project Options edit control on the bottom, change /MD to /MT
- In the Project Options edit control on the bottom, remove /D "_AFXDLL"
After you have followed the five steps above, rebuild the release version of your application. The resulting executable has all of the dependent library code built in. This does make the executable bigger. However, everything it needs is included in the executable. I don't recommend this method for huge applications. But for the size of the apps I have been releasing, it makes things very easy for the user, with minimal install problems.
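For reference, here is roughly what the Project Options line looks like before and after the change. The surrounding switches are typical Visual C++ 6.0 release defaults and will vary from project to project; the only parts that matter are swapping /MD for /MT and dropping /D "_AFXDLL".

    Before:  /nologo /MD /W3 /GX /O2 /D "WIN32" /D "NDEBUG" /D "_WINDOWS" /D "_AFXDLL" /D "_MBCS" /c
    After:   /nologo /MT /W3 /GX /O2 /D "WIN32" /D "NDEBUG" /D "_WINDOWS" /D "_MBCS" /c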
Be on the lookout for an upcoming post when I release my SiteScan application to the public.
Indexed by Google
- I added links to the blog on every page of my main web site
- I created a new post in all my major blogs, linking to my new blog
- I submitted my blog URL to many web directories that have high PageRank
Now let's see if I can keep up the good work. I really want the Google bot to keep coming back and indexing the new content I add to the blog. I might continue to add my blog URL to other web directories with lower PageRank. However, I believe what I really need to do is keep generating unique, interesting new content for the blog on a frequent basis.
I do have a number of other programs which are already complete, but have little to do with black hat activities. Maybe I will post one or two of them here. Perhaps I can modify them slightly to show some interesting programming techniques. Either way I shall also continue to generate new ideas and write and release programs here for your use. For now they shall all be free of charge. Enjoy.
New Idea for Prog
My goal is to knock out this program and release it here on my blog in a couple of days. I hope writing the code is as easy as I think it will be. This program will be the stepping stone for others like it. For example, I could then write a program which identifies dead links on your site. Or I could do a link analysis of your site for Search Engine Optimization purposes.
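To give a taste of the dead link idea, here is a small C++ sketch that asks the web server for a link's HTTP status code using WinInet. Anything in the 400 range (like the classic 404) would get flagged as a dead link. The URL is only a placeholder; the finished program would of course check every link the crawler finds.

    #include <windows.h>
    #include <wininet.h>
    #include <stdio.h>
    #pragma comment(lib, "wininet.lib")

    // Returns the HTTP status code for a URL, or 0 if the request itself failed.
    DWORD GetHttpStatus(HINTERNET hSession, const char* url)
    {
        HINTERNET hUrl = InternetOpenUrlA(hSession, url, NULL, 0,
                                          INTERNET_FLAG_RELOAD, 0);
        if (hUrl == NULL)
            return 0;

        DWORD status = 0;
        DWORD size = sizeof(status);
        HttpQueryInfoA(hUrl, HTTP_QUERY_STATUS_CODE | HTTP_QUERY_FLAG_NUMBER,
                       &status, &size, NULL);
        InternetCloseHandle(hUrl);
        return status;
    }

    int main()
    {
        HINTERNET hSession = InternetOpenA("LinkChecker",
                                           INTERNET_OPEN_TYPE_PRECONFIG,
                                           NULL, NULL, 0);
        if (hSession == NULL)
            return 1;

        const char* url = "http://www.example.com/some-page.html";  // placeholder URL
        DWORD status = GetHttpStatus(hSession, url);
        if (status == 0 || status >= 400)
            printf("DEAD LINK: %s (status %lu)\n", url, status);
        else
            printf("OK: %s (status %lu)\n", url, status);

        InternetCloseHandle(hSession);
        return 0;
    }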
Keep an eye out for this second program coming soon to Black of Hat. I think I shall call it Link Crawler.