
Thread: Alt.Binz creator puts on "nice guy" hat, releases all newer versions as FREEWARE!

  1. #21
    mjmacky (an alchemist?)
    Join Date
    May 2010
    Location
    Idol's guest bedroom
    Posts
    9,771
    Quote Originally Posted by Hypatia View Post
    What extras? How the f can disabling or enabling extras possibly affect decoding speed, and how much CPU does the decoding process use?

    In terms of decoding, it's one of the worst pieces of code I've seen.
    You can just chalk that up to JustDOSE not having a clue.

    Quote Originally Posted by zot View Post
    Your CPU utilization graph will look like a series of mesas, each mesa representing a text-to-binary decoding cycle, and the faster your download speed, the closer together these mesas get. When the valleys between mesas disappear (which for this decade-old laptop was at about 20 megabit/sec download speed), the decoding starts getting backlogged, which, besides causing the usual problems of a maxed-out processor, means the PC is at that point running at its fastest effective download speed, regardless of the actual line speed.

    There are also ways to set thread priority/CPU utilization so Alt.Binz's decoding spikes don't cause delays for other running processes. My main concern would be how fast a newsreader can download before maxing out the processor continuously. So for me right now, it's not a big issue. But if I had a gigabit internet connection (and anything less than a "super"-computer), Alt.Binz would obviously be totally unsuitable.

    Just as a casual observation, it seemed to me that most other news clients I've tried, such as Grabit, BNR, NNTPgrab, Xnews, and others, used at least as much CPU as Alt.Binz, though I've never done a formal comparison. (Usenet Explorer is exceptional - kind of like the µTorrent of news clients.) One problem is that news clients have traditionally written downloaded articles to HDD, then turned around and read them back off HDD when decoding (rather than just holding the 15 or 50 MB worth of articles in memory), so the extra read/write redundancy adds to decoding time. I'm not sure if Alt.Binz's settings allow changing this, but I think I remember that function being added a year or so ago.
    Sounds like you have an Arizona yearning. I can't be absolutely positive, but it feels like you've embedded a Usenet Explorer endorsement in there. I can't for the life of me recall why I swore off UE in the past without ever trying it. It was something someone said in particular... hmmm... damn.
    prostitutin aint easy ®™©
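The disk round trip described in the post above can be sketched in a few lines of Python. This is purely illustrative, not Alt.Binz's actual code: `decode()` is a stand-in for the real yEnc step, and the two functions contrast the traditional write-then-read-back flow with decoding straight from memory.

```python
import os
import tempfile

def decode(article: bytes) -> bytes:
    # Stand-in for the real yEnc/uudecode step (yEnc subtracts 42 mod 256).
    return bytes((b - 42) % 256 for b in article)

def disk_flow(articles):
    """Traditional flow: write each article to disk, read it back, decode."""
    out = bytearray()
    with tempfile.TemporaryDirectory() as tmp:
        for i, art in enumerate(articles):
            path = os.path.join(tmp, f"article_{i}.yenc")
            with open(path, "wb") as f:   # extra write...
                f.write(art)
            with open(path, "rb") as f:   # ...and extra read, for nothing
                out += decode(f.read())
    return bytes(out)

def memory_flow(articles):
    """Hold the articles in RAM and decode directly -- no disk round trip."""
    out = bytearray()
    for art in articles:
        out += decode(art)
    return bytes(out)
```

Both produce identical output; the disk version just adds a redundant write/read pair per article, which is exactly the overhead being described.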

  3. Newsgroups   -   #22
    newsgroupie
    Join Date
    Mar 2007
    Posts
    1,038
    I was comparing Alt.Binz vs. UE, and indeed UE has a fraction of the processor load at the same download speed, maybe somewhere around 15%-25% of what Alt.Binz consumes. Alt.Binz still uses an 'old school' (actually 2nd generation) method (which UE also used a few revisions ago) of writing downloaded articles to disk, then decoding them into rars once enough have finished to complete a rar, while UE creates an eDonkey/BitTorrent-like dummy file at the start of the download, one for every rar, then gradually fills it in with (decoded) articles as they download. This is one big difference between them as far as their download mechanisms go.
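The dummy-file mechanism described here can be sketched roughly as follows. This is an illustration only; `preallocate` and `write_article` are made-up names, and a real client would take the file and article sizes from the NZB headers.

```python
import os

def preallocate(path: str, size: int) -> None:
    """Create a BitTorrent-style dummy file at its final size up front."""
    with open(path, "wb") as f:
        f.truncate(size)  # sparse (reads back as all zeros) on most filesystems

def write_article(path: str, index: int, decoded: bytes,
                  article_size: int) -> None:
    """Slot a decoded article straight into place, in any order."""
    with open(path, "r+b") as f:
        f.seek(index * article_size)
        f.write(decoded)
```

Articles can then arrive and be decoded out of order, and there is no separate pass at the end to assemble the rar from loose article files.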

    It seemed that many of the "first generation" binary newsreaders used a rather queer one-connection-per-rar method that allowed a fast initial download but then caused the computer to lock up: all the articles composing each rar file would finish at about the same time, so all the rars would decode at about the same time, pegging out the processor. Given the many other problems one-connection-per-rar created, I never understood why any newsreader developer back then chose to process binaries this way (or why the Newsflash Plus developer even today insists on this method).

    But then Alt.Binz and the '2nd generation' Usenet download clients emerged (not quite newsreaders, since they lack header support); these would at least complete a full rar before starting on the next one, so decodes would overlap with downloading throughout. I could not see any visual (eyes-on-the-graph) improvement in resource efficiency between the old and new Alt.Binz (if anything it was the opposite), and article decoding still resulted in a hefty processor hit. A difference in settings (default vs. tweaked-years-ago) could have been at least partly responsible for the apparent hunger of v39.4. This was running XP on a 10-year-old laptop that doesn't even handle a lot of recent Java or Python applications very well.
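The decode-timing difference between the two generations can be modeled with a toy calculation (purely illustrative; the strategy names and numbers are invented):

```python
def decode_start_times(n_rars: int, total_dl_time: float, strategy: str):
    """When does each rar's decode kick off, under each download scheme?"""
    if strategy == "one_connection_per_rar":      # "1st generation"
        # Each rar downloads in parallel at 1/n of the line speed, so
        # they all complete -- and all start decoding -- at the same moment.
        return [total_dl_time] * n_rars
    if strategy == "complete_rar_first":          # "2nd generation"
        # All connections gang up on one rar at a time, so rar k finishes
        # after (k+1)/n of the total download time, and decoding overlaps
        # with the rest of the download.
        return [total_dl_time * (k + 1) / n_rars for k in range(n_rars)]
    raise ValueError(strategy)

def peak_concurrent_decodes(starts, decode_time: float) -> int:
    """How many decodes overlap at the worst moment."""
    return max(sum(s <= t < s + decode_time for s in starts) for t in starts)
```

With 4 rars, a 100-second download, and 10-second decodes, the first scheme hits 4 simultaneous decodes at the very end (the lockup described above), while the second never runs more than one at a time.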

  4. Newsgroups   -   #23
    Quote Originally Posted by zot View Post
    I was comparing Alt.Binz vs. UE, and indeed UE has a fraction of the processor load at the same download speed, maybe somewhere around 15%-25% of what Alt.Binz consumes. Alt.Binz still uses an 'old school' (actually 2nd generation) method (which UE also used a few revisions ago) of writing downloaded articles to disk, then decoding them into rars once enough have finished to complete a rar, while UE creates an eDonkey/BitTorrent-like dummy file at the start of the download, one for every rar, then gradually fills it in with (decoded) articles as they download. This is one big difference between them as far as their download mechanisms go.

    It seemed that many of the "first generation" binary newsreaders used a rather queer one-connection-per-rar method that allowed a fast initial download but then caused the computer to lock up: all the articles composing each rar file would finish at about the same time, so all the rars would decode at about the same time, pegging out the processor. Given the many other problems one-connection-per-rar created, I never understood why any newsreader developer back then chose to process binaries this way (or why the Newsflash Plus developer even today insists on this method).

    But then Alt.Binz and the '2nd generation' Usenet download clients emerged (not quite newsreaders, since they lack header support); these would at least complete a full rar before starting on the next one, so decodes would overlap with downloading throughout. I could not see any visual (eyes-on-the-graph) improvement in resource efficiency between the old and new Alt.Binz (if anything it was the opposite), and article decoding still resulted in a hefty processor hit. A difference in settings (default vs. tweaked-years-ago) could have been at least partly responsible for the apparent hunger of v39.4. This was running XP on a 10-year-old laptop that doesn't even handle a lot of recent Java or Python applications very well.
    That was an interesting read. Do you know how Sabnzbd handles the downloading process? It feels very CPU-efficient on my old laptop, even though it's a Python app.
    Last edited by heiska; 04-11-2012 at 05:04 PM.
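I can't speak for SABnzbd's internals, but one plausible reason a Python client can stay light on CPU is that the hot yEnc loop doesn't have to run in the interpreter at all. A sketch of the same byte mapping done two ways (ignoring yEnc's '=' escape bytes for simplicity):

```python
# yEnc encodes each byte as (b + 42) % 256, so decoding subtracts 42.
DECODE_TABLE = bytes((i - 42) % 256 for i in range(256))

def decode_slow(data: bytes) -> bytes:
    # Pure-Python inner loop: one interpreter round trip per byte.
    return bytes((b - 42) % 256 for b in data)

def decode_fast(data: bytes) -> bytes:
    # Identical mapping, but the loop runs inside CPython's C code.
    return data.translate(DECODE_TABLE)
```

Both return the same bytes, but `translate()` is typically far faster because the per-byte work happens in C - the same general idea as shipping the decoder as a compiled extension.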

