I've run the Site Spider through one of my sites and it turned up quite a few errors.
Is there any way I can keep the results page viewable and then open up the offending links/pages to correct the errors?
I tried minimising it, but that minimises the whole programme.
I tried exporting to a .csv file, but it doesn't save the important part - the referring URL.
I have also tried exporting to a Word doc, with the same result.
There must be a way of working on the problems without losing that info in the first place.
Site Spider problem
Philip,
Are you using the HTML Editor spider? Under File, there is an export function that will give you a CSV file. If you highlight all the entries, you can also click Edit|Copy and paste them into a file.
If you are using the Google SiteMapper, just save it and you'll have a web page with everything displayed.
Hope this helps.
Hi Bill.
It's the HTML site spider. I tried exporting, but the important part - the referring URL - doesn't come through in the export.
The problem started when Google began reporting bad links such as http://www.harriers-online.co.uk//squad/smith. Note the two forward slashes together where there should only be one.
The Site Spider finds which page these errors originate from, so I can pinpoint the offending link. I just need to be able to edit that particular page without losing the information the Site Spider has given me.
I estimated that there were around 50 bad links, and to run the Site Spider, find a bad link, close the results page, edit the offending page and then run the Site Spider again to find the next bad link would take an eternity.
I don't think the Google SiteMapper would be the answer for me, because although it could tell me the offending URL, it would not tell me from whence it came.
The problem could be cured - Scott? - by being able to open the Site Spider in a different window. This could also be done for the Website Colour Schemer, because that also has to be closed every time you use it before returning to the Editor.
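For what it's worth, until the spider can stay open alongside the Editor, a stand-alone script can do the same job outside the programme: crawl the site, flag any link whose path contains a doubled slash, and print the referring page next to it, so the whole list can sit in a text file while the offending pages are edited. This is only a rough Python sketch, not part of the Site Spider itself; the start URL is illustrative and it only follows links on the same host.

# Rough sketch: crawl a site with the standard library and report any link
# whose path contains "//", together with the page it was found on.
import urllib.request
import urllib.parse
from html.parser import HTMLParser
from collections import deque

START = "http://www.harriers-online.co.uk/"   # illustrative start page

class LinkCollector(HTMLParser):
    """Collects every href found in <a> tags on one page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start):
    host = urllib.parse.urlparse(start).netloc
    seen, queue = set(), deque([start])
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        try:
            html = urllib.request.urlopen(page, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue   # skip pages that cannot be fetched or decoded
        parser = LinkCollector()
        parser.feed(html)
        for href in parser.links:
            absolute = urllib.parse.urljoin(page, href)
            parts = urllib.parse.urlparse(absolute)
            # A doubled slash after the host shows up in the path component.
            if "//" in parts.path:
                print(f"bad link {absolute}  <- referring page {page}")
            # Only follow links that stay on the same site.
            if parts.netloc == host:
                queue.append(absolute.split("#")[0])

if __name__ == "__main__":
    crawl(START)

Run with the output redirected to a file (python findlinks.py > badlinks.txt) and you have a list of every doubled-slash link and its referring page that stays put while you edit in the HTML Editor.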
Philip,
Now I see what you mean. Having the spider results visible while having the editor open is almost a requirement for your situation.