Search idea to reduce congestion.
Posted: 14 Jan 2011 20:40
I've noticed that a download can have several sources that are all missing the same portion. When doing a search, one has no way to know this, so a lot of time is wasted downloading a file that can never be completed because the whole file isn't available anywhere. So, I suggest a filter that removes results whose combined known sources (maybe counting only stable, unfirewalled, non-busy ones) don't add up to a complete file, or alternatively one that requires at least a single source holding the entire file.
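Roughly, I picture the filter working something like this. This is just a sketch; the names, the Source structure, and the result fields are made up for illustration, not taken from any real client:

```python
# Hypothetical sketch of the proposed search filter: given the chunk
# availability bitmaps reported by each known source, keep a result
# only if the union of (usable) sources covers every chunk.

from dataclasses import dataclass
from typing import List

@dataclass
class Source:
    stable: bool          # long-lived, reliable connection
    firewalled: bool      # behind NAT/firewall, hard to reach
    busy: bool            # all upload slots currently taken
    chunks: List[bool]    # True = this source has that chunk

def is_completable(sources: List[Source], num_chunks: int,
                   usable_only: bool = True) -> bool:
    """Return True if the combined sources cover every chunk."""
    covered = [False] * num_chunks
    for src in sources:
        if usable_only and (not src.stable or src.firewalled or src.busy):
            continue  # skip sources unlikely to actually deliver
        for i, has in enumerate(src.chunks):
            covered[i] = covered[i] or has
    return all(covered)

def filter_results(results):
    """Drop search results whose file can never be completed.

    Assumes each result carries its sources and chunk count.
    """
    return [r for r in results
            if is_completable(r.sources, r.num_chunks)]
```

The stricter variant mentioned above would simply replace the union test with a check that any single source's bitmap is all True.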
Another idea I presented before (I was welcomed to help develop it... I would if I knew how) is to identify files that are really the same file under different hashes, and combine them. Many candidate downloads are identical except for a name change or some other trivial difference (perhaps a single corrupted bit). Rather than having, say, 100 possible downloads with one source each, you would have one download with 100 sources. I know such an algorithm exists, as I read about it a few years back; I believe it could also find partial files whose bit ranges are identical to the whole file and incorporate those sources as well.
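To illustrate what I mean, here is a rough sketch (again with made-up names; I'm only assuming the client hashes files in fixed-size chunks, the way eD2k-style clients already do). Instead of trusting the whole-file hash, compare chunk hashes and group results that share enough of them:

```python
# Hypothetical sketch of the merging idea: a renamed copy shares every
# chunk hash; a copy with one flipped bit shares all chunks except the
# damaged one. Both land in the same group, so their sources can be
# combined for the chunks on which they agree.

import hashlib
from collections import defaultdict

CHUNK_SIZE = 9_728_000  # eD2k-style ~9.28 MB chunks, as an example

def chunk_hashes(data: bytes):
    """Hash each fixed-size chunk of the file independently."""
    return [hashlib.sha1(data[i:i + CHUNK_SIZE]).hexdigest()
            for i in range(0, len(data), CHUNK_SIZE)]

def group_by_overlap(results, min_shared=2):
    """Map each result id to the set of results sharing at least
    `min_shared` chunk hashes with it.

    `results` is assumed to be a dict of result id -> list of chunk
    hashes, as reported by the network.
    """
    index = defaultdict(set)           # chunk hash -> result ids
    for rid, hashes in results.items():
        for h in hashes:
            index[h].add(rid)

    groups = {}
    for rid, hashes in results.items():
        overlap = defaultdict(int)     # other result id -> shared count
        for h in hashes:
            for other in index[h]:
                if other != rid:
                    overlap[other] += 1
        groups[rid] = {o for o, n in overlap.items() if n >= min_shared}
    return groups
```

The same index would cover the partial-file case: a partial whose chunk hashes match a range of the wanted file simply becomes an extra source for those chunks.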