
Thread: Newsbin Pro 6 beta 1 is out.

  1. #1
    So let's get this beta-testing party started.

    I'm especially interested in whether anyone with a ~100 Mbit line has any problems or speed slowdowns while downloading and unpacking simultaneously,

    because I'm having major issues here...

    1) Usenet Explorer 3.1.1: RAM used 35 MB. Download speed (unpack + download): 10.4+ MB/s.
    http://www.imagebam.com/image/a8084a117821963

    SABnzbd: two 2 GB files. Download speed while unpacking never dropped below 10.4 MB/s.

    2) Newsbin Pro 6 Beta 1: RAM used 100-120 MB. Download speed (unpack + download) dropped to 4-5 MB/s, then to 2 MB/s and below.
    http://www.imagebam.com/image/c401b1117821984


    And I have absofuckinglutely no idea what is going on here.

    So I'm interested in other people's experience...

    I would be really interested if someone on a 100 Mbit line posted screenshots of NBPro 6 downloading and unpacking two 2 GB files simultaneously (after a while, not when it has just started unpacking) with the cache set to 200.
    Last edited by Hypatia; 02-02-2011 at 09:07 PM.

  2. #2
    newsgroupie (Join Date: Mar 2007; Posts: 1,037)
    For how long did your download speed drop? Was it only while decoding each rar?

    Quote Originally Posted by Hypatia
    And I have absofuckinglutely no idea what is going on here.
    I believe that Usenet Explorer recently changed its default decoding setup to address this very issue: downloaded articles are now kept in memory rather than being written to the hard disk and then read back from disk, which has always been the "traditional" approach for newsreaders -- no doubt a holdover from the days when memory was much smaller and connection speeds were much slower.
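
    To make the contrast concrete, here is a minimal Python sketch of the two approaches. The function names and the decode callback are purely illustrative, since neither program's internals are public:

        import os, tempfile

        def decode_via_disk(article_chunks, decode):
            # "Traditional" newsreader approach: spool the article to disk,
            # then read it back for decoding - two extra rounds of disk I/O
            # that compete with the ongoing download for the same spindle.
            with tempfile.NamedTemporaryFile(delete=False) as spool:
                for chunk in article_chunks:
                    spool.write(chunk)
                path = spool.name
            with open(path, "rb") as f:
                data = f.read()
            os.unlink(path)
            return decode(data)

        def decode_in_memory(article_chunks, decode):
            # The approach Usenet Explorer reportedly switched to: keep the
            # article in RAM and decode it directly, so the disk is touched
            # only once, to write the final decoded file.
            return decode(b"".join(article_chunks))

    The trade-off, of course, is RAM: on a fast line you hold more undecoded data in memory, which would have been unaffordable back in those days.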

    I'd assume that having the download speed momentarily slow down while the newsreader -- any newsreader -- is decoding could be considered "normal", or at least that seems to be how most newsreaders have always worked.

    But I think that's probably down to the software's internal coding (or lack thereof). If the decoding of articles/rars were done on a completely independent, parallel thread -- one that efficiently utilized a multi-core processor -- then the download speed, even on 100+ Mbit lines, could conceivably be completely unaffected during rar decoding.
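
    Purely to illustrate that point, here is a Python sketch of such a producer/consumer arrangement -- a guess at the general technique, not how Newsbin or any other newsreader is actually written. With a bounded queue between a download thread and a decode thread, the downloader only ever stalls when the queue fills up, which is precisely the buffer-exhaustion symptom described later in this thread:

        import queue, threading

        def downloader(articles, work):
            # Producer: pulls articles off the wire and hands them to the
            # decoder's queue; it never blocks on decoding itself, only on
            # the queue reaching its bound.
            for article in articles:
                work.put(article)
            work.put(None)  # sentinel: nothing more to decode

        def decoder(work, decode):
            # Consumer: runs on its own thread (and potentially its own
            # core), decoding at its own pace, independent of the download.
            while (article := work.get()) is not None:
                decode(article)

        work = queue.Queue(maxsize=200)  # bounded, much like a chunk cache
        t = threading.Thread(target=decoder, args=(work, lambda a: None))
        t.start()
        downloader([b"article %d" % i for i in range(10)], work)
        t.join()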

    Personally, I tend to think a lot of these kinds of problems are down to outdated (or sloppy) coding on the part of the programmer. It's not just newsreaders, either. Consider the case of using Nero (or another CD-writing program) to burn an album's worth of MP3s to an audio CD: one processor core could convert MP3s to WAVs while the other core simultaneously concentrates on writing the CD. But that's not how the programmers apparently designed it. Those CD-writing apps that do on-the-fly MP3 decoding (most don't, of course) appear to do it as a serial, not parallel, process -- probably a holdover from earlier times when the limitations of slow, single-core processors were the major design consideration.

    This is of course just pure speculation on my part, as I'm not a programmer, nor do I have any "inside" information.

  3. #3
    sandman_1 (Poster; Join Date: Aug 2010; Location: Somewhere; Posts: 519)
    Awesome, I've got my beta up and running now. Anyway, I hope you can figure out what is wrong.
    Who needs cloud storage when you got the NSA?

  4. #4
    Zot, thanks for the detailed answers.
    Now, back to your questions.

    For how long did your download speed drop? Was it only while decoding each rar?
    I didn't inspect every rar.

    You see the cache status on the screen? It says 0/200 (chunks).

    Here is the quote:

    If you look at the status bar you'll see "Cache X/200". This represents the number of blocks of data in the cache. If it hits "0/200" (it's like a gas gauge), it means Newsbin has used up all of its buffers and is waiting for disk IO to finish. That's a sign your disks are slower than your download speed -- either because they're busy, because they're just slow, or because something is getting in the way of writing.

    Chunks: assume anywhere from 600 KB to 2 MB per chunk (on average it's 600 KB).

    If your disk is slowish, chunks can help you ride through the slow bits. It's just a buffer, though. If the gap between your download speed and your disk speed is too great, no amount of buffering will really help.
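
    Some rough arithmetic on that gas gauge (the chunk count and average size come from the quote above; the disk speed is an assumed figure, picked only to show the shape of the problem):

        # ~200 chunks of ~600 KB each is roughly 117 MB of buffer.
        chunks      = 200
        chunk_bytes = 600 * 1024
        buffer_mb   = chunks * chunk_bytes / 2**20

        download_mb_s = 10.4   # line speed from the screenshots above
        disk_mb_s     = 5.0    # assumed effective write speed under load

        # The buffer drains at the difference between the two rates.
        drain_s = buffer_mb / (download_mb_s - disk_mb_s)
        print(f"cache hits 0/200 after ~{drain_s:.0f} seconds")  # ~22 s

    In other words, if the disk really does fall that far behind during an unrar, the whole 200-chunk cache buys you well under a minute before the gauge reads 0/200.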

    So when it hits 0, all hell breaks loose.

    Twice I even had the download speed drop to zero. Once it hits 0 it stays there, and my download speed is totally awful.

    I tried increasing the chunk size (it took NBPro to 600+ MB of RAM!) but it didn't help much at all, even for 2 GB files.
    Besides, what the f is that? To do, and to do badly, what UE and SABnzbd manage with just a small amount of RAM, NB6 beta 1 needs tons of RAM? No way.

    Anyway, some guys have been trying to persuade me that it's my disk that is slow, but hey, when I see three newsreaders (well, two newsreaders and one NZB downloader) proving them wrong (10.4+ MB/s, sweet!), I guess that means something.


    I believe that Usenet Explorer recently changed its default decoding setup

    Yep, a major change. I was helping test it, but I don't actually remember what the developer did, lol. I believe it says something in the changelog, or in my MSN history... if it's still there, lol.

    Btw, you are gonna laugh your ass off... Even that buggy piece of code junk, Newsleecher (Newsleecher fans, please forgive me), managed to keep a steady 10.54 MB/s while unpacking. It eventually slowed the unpacking down (unlike SABnzbd/Usenet Explorer), but that's not the point. It passed the test anyway.

    sandman_1, enjoy the ride.


    Here is an example of the developer's awesome speed while unpacking:


    I set up a test rig: 36 GB of file sets on my server. The server is a Linux box on the same network as my PC.

    - Queued up two 6 GB sets, a 4 GB set, and a 10 GB set, one right after the other.
    - 10 connections to the server.
    - The machine is a very powerful quad-core, downloading to a 7200 RPM 2 TB WD Caviar Black drive.

    Set #1, 6 GB - needed an unrar.
    Set #2, 6 GB - needed a repair and an unrar.
    Set #3, 4 GB - just a download.
    Set #4, 10 GB - needed an unrar.

    Set #1 downloaded, averaging 200 Mbps (24 MB/s).
    Set #1 started to unrar while Set #2 started downloading. Set #2 downloaded at about 200 Mbps (24 MB/s). Speed dipped a little and the chunk cache dropped as low as 150.
    Set #1 finished the unrar.
    Set #2 finished and started to repair.
    Set #3 downloaded during the repair and never went below 24 MB/s.
    Set #4 started to download while Set #2 was still repairing, and it downloaded at 24 MB/s. If anything, it dipped less than Set #2, probably because the files were 500 MB each.
    Set #2 finished the repair and started to unrar. Set #4 continued to download.
    Set #2 finished the unrar.
    Set #4 finished and unrared.

    I don't doubt what you're seeing, but I'm pretty sure the reason you see it is that Newsbin does the par scan as each file downloads, instead of later. Maybe it's something I can look at down the road. It's clear to me that your disk is the bottleneck, at least as far as Newsbin's current design is concerned; a different design might work better on a modestly powerful machine.
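
    For a sense of what an on-the-fly par scan costs in I/O, here is a hedged Python sketch. Par2 verification is based on per-file MD5 checksums, but the helper below is illustrative only, not Newsbin's actual code: the point is that verifying each file as it completes means re-reading every byte that was just written, on the same disk that is still receiving new articles.

        import hashlib

        def verify_file(path, expected_md5):
            # Hash the file that just finished downloading. On a single
            # spindle this read-back competes with the ongoing writes of
            # the next file - the bottleneck described in the quote above.
            h = hashlib.md5()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(1 << 20), b""):
                    h.update(block)
            return h.hexdigest() == expected_md5
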
    As far as I noticed, Newsleecher does verification on the fly only for the first item; it pauses verification for the second item while the first unpacks, and just downloads it.

    As for SABnzbd, I don't know how it works; they have something like a quick check, but I have no idea what that is.

    UE also does verification on the fly for the first item only; files for the second item are "kept" in a separate "location" (not on the hard drive) while the first unpacks. So it looks like this:

    Item I: UE says downloading - saving - downloading - saving (that's where, I suppose, the checking occurs), etc.
    Item I gets unpacked while item II gets downloaded: you see just the unpacking status. When unpacking is done, there are several "saves" of all the downloaded item II files.

    That's why I got all these newsreaders working all right...
    Last edited by Hypatia; 02-03-2011 at 09:26 AM.
