Please, someone tell me: is there anything I can do to get a proper, stable connection to G1 and force it to stay fully connected? Today I can't seem to get it to spend any length of time with more than one ultrapeer connected. Any 2nd, 3rd, or 4th connection to G1 lasts about as long as the proverbial snowball in hell, when it manages to establish any others at all. It has both a very high rate of failure establishing connections and a very short average connection lifetime. This poor quality is simply unacceptable. Is there any setting change I can make that will make the connection to G1 something at least vaguely resembling stable???
And second: the long-term chart over at crawler.doxu.org shows G2 strong up until 2012, then entering what looks like an exponential decay in network size that has been ongoing for over two years now. What is the cause of this, and has anyone researched how to reverse it and get G2 back to normal?
If both G1 and G2 become unusable, the former by becoming too unstable to be useful and the latter by becoming too small to be useful, there will no longer be any decent way to search for and get many smaller files, for example pictures and music. Every other P2P system seems to be geared toward larger files, to use (unreliable, often ad- or even malware-ridden) web sites as the search interface (*cough*BitTorrent*cough*), and to be balkanized, so there's no global search where, if a file exists that the protocol can download, a single query will find it.
As for non-P2P, the only real alternative is the web, which is much less suited to "search, find, right click, download": one must navigate a thicket of links and ads, and is prone to run across obstacles if Javashit is disabled, obstacles that often seem to be intentionally put there by website owners just to be a jackass to people who take the security of their computers seriously. And no web browser deals nearly as nicely with managing significant numbers of simultaneous downloads, including resumption of interrupted downloads and the like, as Shareaza does. And, of course, Shareaza's library and "files you have already" filter make avoiding downloading the same thing twice a cinch. Web browsers try to keep track of visited vs. unvisited links, but they don't do a great job, are further crippled by websites that insist on using stylesheets that don't color the two differently, and sometimes have short memories.
So:
Is there a way to fix my connection to G1?
Is there a way to fix G2, which seems to have problems beyond what might be addressed by settings changes at my end?
Is there a viable alternative to the two Gnutellas that has all of these killer features:
- Sharers of content can't advertise, hack you or crash your browser with dodgy scripts, etc. (other than by falsely labeling files)
- Easy to globally search for files of a specific type matching something
- Easy to download results, without a lot of manual pointer-chasing per result first, even when hoovering up big sets of results, like dozens or even hundreds of relevant files.
- Easy to track results you've seen before/already downloaded and avoid duplications
- Good download management, progress indication, resumption of interrupted downloads, etc.
- In other words, basically works the same way as using a gnutella client: search, download, manage library all in one place, and streamlined.