by ailurophobe » 20 Feb 2012 22:39
Collections or a ZIP archive. You can compress all the files of the website into a ZIP file, share the file, and then distribute the magnet link to it. People would then download and share the file. To view the site, you would just open the archive (most operating systems understand ZIP natively), double-click the HTML file inside (it's better to package anything you don't want people to start with under sub-folders), and the site would open in the browser. Of course, if anything on the site changed you'd have to do everything over from the beginning... but that is a limitation of hash-based P2P links no matter how you do it. If that is a problem (the site is large and actively updated), collections would work better. I don't think they are designed to do exactly what you want, but a collection of all the files on the site would give a downloader a local copy of the site just like the ZIP method, with the difference that it should (I've never done anything with collections...) know to only download files that are new or changed.
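If it helps, here is a rough sketch of the packaging step in Python. The folder name "mysite" and the entry page "index.html" are just placeholder names, not anything Shareaza requires:

    # Rough sketch: pack a local website folder into one ZIP for sharing.
    # "mysite" and "mysite.zip" are example names only.
    import os
    import zipfile

    site_dir = "mysite"          # folder containing the site's files
    archive = "mysite.zip"       # the single file you would share and link to

    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, dirs, files in os.walk(site_dir):
            for name in files:
                path = os.path.join(root, name)
                # Store paths relative to the site folder so index.html ends up
                # at the top level and everything else stays in its sub-folders,
                # out of the downloader's way.
                zf.write(path, os.path.relpath(path, site_dir))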
The ZIP method would have better performance, so use that if the site does not change, or only changes incrementally. Like a journal: the latest issue does not include or change the previous issues, just a list of links to them. Include the date of release or a serial number in the filename of the archive and it will work just fine with Shareaza search. So you'd have something like "<descriptive name>.<tag 1>.<tag 2>.<date>.zip". The <descriptive name> would be a name that says something about the topic of the site so people could stumble upon it in a search. The tags would be constants like "compressed website", "456ertyU", or "jefft0". People doing a search could include one of them in the search along with the descriptive name and use the second as a filter. This would first cut down on false hits and second filter out random spam. The date should be obvious.
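As a quick illustration of that naming scheme (the descriptive name and tag values here are only examples, pick whatever constants suit you):

    # Build a search-friendly archive name of the form
    # <descriptive name>.<tag 1>.<tag 2>.<date>.zip
    from datetime import date

    descriptive_name = "model-railway-site"   # example: says what the site is about
    tag1 = "compressed website"               # constant tag people can search on
    tag2 = "jefft0"                           # second constant, used as a filter
    release = date.today().isoformat()        # e.g. 2012-02-20

    filename = "{}.{}.{}.{}.zip".format(descriptive_name, tag1, tag2, release)
    print(filename)
    # model-railway-site.compressed website.jefft0.2012-02-20.zip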