
Recovery of the Wiki data

PostPosted: 17 Jun 2009 15:56
by outcrop
There are about 760 old wiki pages in Google's cache right now.

Does anyone have the wiki database? If not, it would be best to collect the pages manually from the cache.

I tried to use my spider to grab all the cached pages, but unfortunately Google starts treating my spider as malware after a few dozen grabs.
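
For the record, here is a minimal sketch of the kind of grab I mean, slowed down so Google hopefully doesn't flag it. The wiki host, page list, and exact cache-lookup URL format are placeholders, not the real values:

```python
import time
import urllib.request
import urllib.error

# Hypothetical values: the real host and page list would come from
# the ~760 cached pages mentioned above.
WIKI_HOST = "wiki.example.org"
PAGES = ["Main_Page", "FAQ"]

for title in PAGES:
    # Google cache lookup; the exact URL form is an assumption here.
    url = "http://www.google.com/search?q=cache:%s/%s" % (WIKI_HOST, title)
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req) as resp:
            data = resp.read()
        with open(title + ".html", "wb") as out:
            out.write(data)
    except urllib.error.HTTPError as err:
        print("blocked or missing:", title, err.code)
    time.sleep(30)  # long pause between grabs to avoid the malware flag
```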

Re: Recovery of the Wiki data

PostPosted: 17 Jun 2009 18:01
by ce3c
If there's no backup, I'll take a look at it on Friday or Saturday.
This thread could be merged with the other one.

Re: Recovery of the Wiki data

PostPosted: 17 Jun 2009 20:04
by ocexyz
This also concerns the national-language versions. I don't even have an idea how to do it for Chinese (even simplified), Persian, Yiddish, or Greek... :? Please save it, guys, if you can. Preferably with structure, if possible.

Re: Recovery of the Wiki data

PostPosted: 18 Jun 2009 06:54
by ocexyz
I think http://warrick.cs.odu.edu/ can be used for the recovery. At least, that's what it says. We ought to try. Note there is a limit of up to 1000 grabs a day.
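
For anyone scripting the recovery themselves instead, a minimal sketch of pacing requests to stay under that daily cap; the fetch function is a placeholder, not part of Warrick:

```python
import time

DAILY_LIMIT = 1000                    # stated cap on grabs per day
DELAY = 24 * 60 * 60 / DAILY_LIMIT    # ~86 seconds between requests

def recover(urls, fetch):
    # `fetch` stands in for whatever actually retrieves a page;
    # sleeping between calls keeps the run under the daily limit.
    for url in urls:
        fetch(url)
        time.sleep(DELAY)
```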