Please help me,

I am working on a Unix server. The problem is that the website [ External links are visible to forum administrators only ] has a forum folder [ External links are visible to forum administrators only ], and I only want to create a sitemap of this folder. So I placed the generator in this forum folder.

This is not a blog but a working information site. There must be something wrong with my configuration. I am an old PHP programmer and can assure you that the flags are correct, as are the directories in the generator folder.

Server memory is set to 76 MB.

After a period of time crawling, the generator throws the error below.

I cannot disable the .htaccess file of the [ External links are visible to forum administrators only ] site because it is required to activate the subfolder "forums".

As a test, if I run just 2 levels, the program works fine. I am only looking to create the xml and xml.gz files.

Since I am saving the crawl state every 30 seconds, if the error below occurs I just restart the crawl and resume from the last position.

500: Internal server error

This error is generated when a script running on the server could not be executed, or permissions are incorrectly assigned for files or directories.

Troubleshooting suggestions:

Temporarily disable any rewrite rules by renaming your .htaccess file if it exists.

Ensure that any CGI or Perl scripts have at least 755 permissions.

If trying to run PHP and you get this error, you may have an invalid php.ini in your /cgi-bin, or may be missing your php.dat file in this folder.



It looks like your server configuration doesn't allow the script to run long enough to create the full sitemap. Please try increasing the memory_limit and max_execution_time settings in the PHP configuration at your host (php.ini file), or contact hosting support regarding this.
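For reference, the relevant php.ini directives might look like the sketch below. The values are only examples; the right numbers depend on the site being crawled and what the host allows:

```ini
; Example php.ini settings for a long-running crawl (values are illustrative)
memory_limit = 128M        ; maximum memory a single script may allocate
max_execution_time = 0     ; 0 = no time limit
```

On shared hosting these directives can sometimes also be set per-directory (e.g. via a local php.ini or .htaccess, depending on how PHP is run), so it is worth confirming with the host which mechanism applies.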

Called my ISP today and explained that memory_limit and possibly max_execution_time could be the problem. He and I both monitored the run of the sitemap generator again.

memory_limit is set to 96 MB. The ISP informed us that max_execution_time is not limited; we set the value to 0, which the ISP uses for unlimited time. We also tested max_execution_time set to 5000 seconds.

The run produced the following information, with ISP monitoring noting that the run interrupted itself approximately every 500-700 seconds:
1) The site has 5 levels
2) Consists of approximately 2000 pages
3) Memory used is ~4500 KB
4) Only building sitemap.xml and sitemap.gz
5) Total time to run 5 levels: approximately 3000 seconds

Apparently something is amiss in the configuration.

Any additional suggestions?

Hello, in this case I would recommend running the generator from the command line if you have SSH access to your server.
What is the recommended command line for the generator?

It is displayed on the Crawling page of the sitemap generator and looks like:
php /path/to/generator/runcrawl.php
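Since the crawl reportedly takes on the order of 3000 seconds, one option is to start it in the background so it survives a dropped SSH connection. This is only a sketch using the command shown above; the actual path comes from your generator's Crawling page:

```shell
# Start the crawler detached from the terminal; output goes to a log file.
nohup php /path/to/generator/runcrawl.php > crawl.log 2>&1 &

# Watch progress from the same or a later SSH session:
tail -f crawl.log
```

Running from the CLI also matters because the command-line PHP binary often reads a different php.ini than the web server, so its memory_limit and max_execution_time may differ from the values the ISP reported.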