That's great, I think. Could anybody make a kind of full copy of all this? That way the damage would be limited, and if it allowed recovering the previous structure and content, all losses could be neutralised. The same content in a different location is still the same forum, in my view. Is it possible to make an offline copy of all this before it disappears or gets damaged by "unknown persons"??
It's all about a month old, but I think it's more recent than anything anyone else has. If the whole wiki is cached, that would be even more useful. Maybe we could create a bot that would repost all these threads, but I'm not sure it would be worth the effort.
I just think we should try to save as much as we can. Even if it's only a part of it, that's more than nothing, and it's easier to rebuild by adding to something, IMHO. I don't know, but we should do it ASAP before it disappears or gets overwritten... Do you know of any plugin (or anything else) that could save this to disk? Like a whole site? The whole cached service?
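One generic way to save a whole site to disk (just a sketch, assuming wget is installed and the pages are still reachable somewhere; the URL below is a placeholder, not the real forum address) is wget's mirror mode:

```shell
# Sketch: mirror a site to disk with wget (the URL is a placeholder).
# --mirror            recurse through the site and keep timestamps
# --convert-links     rewrite links so the copy browses offline
# --adjust-extension  save pages with .html extensions
# --page-requisites   also fetch the images/CSS needed to render pages
# --wait/--random-wait  pause between requests so we don't hammer the server
wget --mirror --convert-links --adjust-extension --page-requisites \
     --wait=2 --random-wait \
     "http://example.com/forum/"
```

The --wait options matter here: without them a full-site crawl can get the crawler blocked, which defeats the whole rescue attempt.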
Well, Warrick (http://warrick.cs.odu.edu/) does the job. It needs Perl (or the ActiveState version for Windows), XML::Simple and SOAP::Lite.
No point in fetching the live cache again, but it fetched almost no Google cache pages... (and Yahoo cache fetching is broken) because it got blacklisted by Google after a few requests. I still have an API key for the deprecated SOAP requests, but Warrick doesn't use it to fetch the cached pages.
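If anyone wants to retry the Google cache by hand, a slow, throttled loop might avoid the blacklisting Warrick ran into. This is only a sketch: urls.txt is a hypothetical file of original page URLs, the cache: query form and the 30-second delay are assumptions, and Google may still block automated requests regardless.

```shell
# Sketch: fetch cached copies of pages slowly to avoid blacklisting.
# urls.txt is a hypothetical file with one original page URL per line.
while read -r url; do
  # Turn the URL into a safe local filename by replacing ':' and '/'.
  out="$(echo "$url" | tr ':/' '__').html"
  wget -q -O "$out" "http://www.google.com/search?q=cache:$url"
  sleep 30   # long pause between requests to stay under the radar
done < urls.txt
```

The point is the sleep: Warrick's problem was firing requests too quickly, so anything manual should space them out generously.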