by ailurophobe » 18 Jul 2011 14:19
I am not sure about DC++, but neither ED2K nor the Gnutella networks were really designed to share more than a thousand or so files at once. Four to five thousand files works pretty well most of the time. Anyway, boosting Shareaza's ability to handle 30k+ shared files would not really help that much unless the protocols were updated too. You are better off unsharing some files, and maybe cycling which files you share and which you do not. You can also pack small files into archives, each containing a few dozen or even a few hundred of them. Apart from reducing the file count and making sharing work better, this also makes distribution more efficient: anyone downloading one of the files in the archive is automatically sharing and making available all the files in the archive, which should give a huge boost to their availability.
As for the downloads, the active connections used by Gnutella 2 cause growing overhead and inefficiency as the number of active downloads increases. That is why the number of active downloads is capped, which in turn means that optimizing for a large number of downloads (most of which could not possibly be active because of the cap) has not been a priority, or even a concern. That said, OldDeath has shown interest in replacing the CLists with std::sets, and that might fix this problem at least partially.