
Search idea to reduce congestion.

PostPosted: 14 Jan 2011 20:40
by iamhuston
I've noticed that a download can have several sources, yet all of them are missing the same portion of the file. When doing a search you have no way of knowing this, so a lot of time is wasted downloading a file that can never be completed because the whole of it isn't available. So, I suggest a filter that hides results whose combined known sources (maybe limited to stable, unfirewalled, not-busy ones) don't add up to a complete file, or perhaps one that requires at least one source with the entire file.
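
To make the idea concrete, here is a rough sketch in Python of the availability check I have in mind (all the names and the chunk bookkeeping are made up for illustration, not taken from any real client's internals):

Code:
# Decide whether a search result is worth downloading by checking that the
# union of the chunks offered by all known sources covers the whole file.
def is_completable(num_chunks, sources):
    """sources: a list of sets of chunk indices each source claims to have."""
    covered = set()
    for chunks in sources:
        covered |= chunks
        if len(covered) == num_chunks:
            return True          # every chunk is available from somewhere
    return False                 # at least one chunk is missing from all sources

# Example: a 4-chunk file where every known source is missing chunk 3
print(is_completable(4, [{0, 1}, {0, 2}, {1, 2}]))   # False -> hide or de-prioritise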

Another idea I presented before (I was invited to help develop it, and I would if I knew how) is to identify files that are actually the same file but have different hashes, and combine their sources. Many possible downloads are identical except for a name change or some other trivial difference (perhaps a single corrupted bit). Rather than having, say, 100 possible downloads with one source each, you would have one download with 100 sources. I know such an algorithm exists, as I read about it a few years back; I believe it could also find partial files whose bit ranges are identical to the whole file and incorporate those sources as well.
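
As a rough illustration of the chunk-matching part, something like this could index which files share identical chunks, so their sources could be pooled even when the top-level hashes differ (again Python, purely illustrative, and not how any particular client actually does it):

Code:
# Hash fixed-size chunks of each candidate file and map each chunk hash to
# the (file, chunk index) pairs that contain it. Chunks shared between files
# could then be fetched from any source that has them.
import hashlib
from collections import defaultdict

CHUNK_SIZE = 256 * 1024          # 256 KiB chunks, an arbitrary choice

def chunk_hashes(path):
    """Return the SHA-1 of each fixed-size chunk of the file."""
    hashes = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            hashes.append(hashlib.sha1(chunk).hexdigest())
    return hashes

def index_shared_chunks(paths):
    """Map each chunk hash to the (file, chunk index) pairs that contain it."""
    index = defaultdict(list)
    for path in paths:
        for i, h in enumerate(chunk_hashes(path)):
            index[h].append((path, i))
    return index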

Re: Search idea to reduce congestion.

PostPosted: 15 Jan 2011 10:21
by siavoshkc

Re: Search idea to reduce congestion.

PostPosted: 15 Jan 2011 21:03
by old_death

Re: Search idea to reduce congestion.

PostPosted: 22 Jan 2011 04:24
by cyko_01
we had a small team of developers working on an algorithm (under the name "flox", I believe) to find matching chunks in files with two different hashes, but the project sort of fizzled out.

More info HERE
sourceforge project page HERE

Re: Search idea to reduce congestion.

PostPosted: 22 Jan 2011 05:04
by old_death