
Thread: Poor Postings (newbie?) HUGE rar's LOW par's

  1. #1
    Member
    Join Date
    Mar 2006
    Posts
    1,244
    I've noticed lately (as have others around usenet) that several postings are HUGE rar's (>400MB each) with very limited pars to fix any errors (less than one part's worth).

    Some of these are okay, but many have too many upload errors (and not enough attendant parity files) to correct.

    There is a reason WHY uploads since the 'beginning of time' have been 50MB rar parts, with enough pars to recreate at least a half dozen (or more) missing parts. Even if you use Giganews or Astraweb, errors happen.

    There have been some folks here who have hit those postings, and they simply can't be repaired with the extremely limited pars.

    Why these folks think that their postings get out intact is beyond me. They think that usenet (and the internet!) are extremely error-free. Right.
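    A rough way to put numbers on that convention (my own sketch, in Python, with hypothetical block sizes; PAR2 can rebuild at most as many missing source blocks as it has recovery blocks, so coverage is just block counting):

    Code:
    # Minimal sketch of PAR2-style recovery arithmetic (hypothetical numbers).
    def blocks_needed(missing_bytes, source_block_size):
        """Recovery blocks required to cover this much missing data."""
        return -(-missing_bytes // source_block_size)   # ceiling division

    MB = 1024 * 1024

    # Classic layout: 50MB RAR parts, pars sized to survive ~6 missing parts.
    print(blocks_needed(6 * 50 * MB, source_block_size=1 * MB))    # 300 blocks

    # New style: a single 400MB RAR part lost, with 4MB source blocks.
    print(blocks_needed(400 * MB, source_block_size=4 * MB))       # 100 blocks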

  2. #2
    Just don't download from people who post that way, or use something like nzbval to test a file's validity against your news server (a rough sketch of that idea follows this post). Yes, posters who don't include enough pars are a pain, but I have also downloaded unsplit 700MB files on usenet and had no problems. Maybe usenet is starting to become the perfect tool.
    Last edited by Windy72; 11-27-2008 at 11:29 PM.
    Life in the fast lane usenet freak
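    For the nzbval suggestion above, a bare-bones sketch of the same idea: ask your news server whether each article of a post actually exists, by Message-ID. The server name, credentials, and Message-IDs below are placeholders (real tools like nzbval read them from an NZB file), and this assumes Python's standard nntplib:

    Code:
    # Rough sketch: check article availability with NNTP STAT.
    import nntplib   # standard library up to Python 3.12

    message_ids = [
        "<example-part001@poster.invalid>",   # hypothetical Message-IDs;
        "<example-part002@poster.invalid>",   # normally taken from an .nzb file
    ]

    with nntplib.NNTP("news.example.com", user="user", password="pass") as srv:
        for mid in message_ids:
            try:
                srv.stat(mid)                 # succeeds only if the article is on the server
                print("OK     ", mid)
            except nntplib.NNTPTemporaryError:
                print("MISSING", mid)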

  3. #3
    Usenet today is remarkably error free compared to the recent past.

    Do you need enough PARs to re-create as many as half a dozen 50MB RAR parts?

    Do you need enough PARs to re-create an entire 400MB RAR part?

    If so, there could be room for improvement in your downloading skills.

    Missing entire RAR parts, if that's what you're saying, doesn't sound right and I wonder what's really happening.

    I would be interested to know specifics: a Message-ID from a problem post (I seldom get one when I ask, because so many are downloading so much porn), the newsgroup where you say others are also having trouble with one poster, and the downloader program you are using.

  4. #4
    Member
    Join Date
    Mar 2006
    Posts
    1,244
    Quote Originally Posted by mesaman View Post
    Usenet today is remarkably error free compared to the recent past.

    Do you need enough PARs to re-create as many as half a dozen 50MB RAR parts?

    Do you need enough PARs to re-create an entire 400MB RAR part?

    If so, there could be room for improvement in your downloading skills.
    And just EXACTLY how? I've RUN usenet servers. If usenet were so 'error-free', then I guess everyone could simply upload one HUGE rar file, with no pars whatsoever, eh? There are always transmission errors, on both transmit and receive, but I guess those are completely 'error-free' as well.

    https://filesharingtalk.com/vb3/f-new...-pars-321030/?

    is just such a file problem that was reported here a bit ago.

    But I guess we'll just have to wait until you run into one of these.

  5. #5
    Some posters are increasing RAR sizes because there are fewer errors since one particularly nasty newsserver software bug was finally killed a year ago, partly because of my frequent urging to solve the problem. Easynews keeps logs of yEnc articles which fail the CRC32 check, and those logs are public. The logs of recent days are only a small fraction of what they were in 2005-07. I try to keep it that way; back in September I posted to the Eweka board asking the administrator to fix a transit server that was corrupting many articles, which he did.
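    For reference, the check behind those Easynews logs is simply the CRC32 value from the yEnc trailer compared against the decoded article data. A minimal sketch of that comparison (the decoding itself is omitted; 'decoded_bytes' and 'trailer_crc' are placeholder inputs, the latter normally parsed from the '=yend ... crc32=' line):

    Code:
    # Minimal sketch of the yEnc CRC32 integrity check.
    import zlib

    decoded_bytes = b"...decoded article payload..."   # placeholder
    trailer_crc = "ded29f4f"                           # placeholder, from '=yend ... crc32='

    computed = zlib.crc32(decoded_bytes) & 0xFFFFFFFF
    if computed == int(trailer_crc, 16):
        print("article passes CRC32")
    else:
        print("article was corrupted in transit")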

    The topic you cite is yet another one where someone is complaining, but doesn't get specific, so no observer can verify that the problem was actually the poster's fault. I prefer to talk about specific posts, so here is one:

    Newzbin ReportID: 3279855 (NB32 ID: drn9x)

    This is exactly what you're talking about, because the poster uses a 4 MB article size, 4 MB PAR2 source block size, 400 MB RAR size, and for this post only 31 PAR2 recovery blocks. He's also quite chatty in the newsgroup with text posts about the propagation and availability of his posts, if you want to discuss it there.

    I already have the DVD, but if I wanted it I could use my ISP's news server 'news.qwest.net' with a block account server automatically filling in missing articles. I downloaded one of his other posts this way and no repair was needed. So I have run into the kind of posts you're talking about, and there's no problem. Would you have any trouble with this post?

  6. #6
    Member
    Join Date
    Mar 2006
    Posts
    1,244
    Quote Originally Posted by mesaman View Post
    I already have the DVD, but if I wanted it I could use my ISP's news server 'news.qwest.net' with a block account server automatically filling in missing articles.
    'Chasing' fills is something I gave up on years ago, when GN got extremely stable. And folks shouldn't 'have' to do that, period. Posters 'should' keep an eye on the proper propagation of their files, but that may be asking a lot these days.

    Absolutely, things have gotten much better over the last few years, but the example you cite has tons of pars compared to what I've seen just in the last week, where postings of 8GB are accompanied by <2% pars. Now, back in 'ye olde days' 10% was the 'rule', which is a bit over the top these days. But with that low a parity rate, combined with these bloated RAR sizes, the possibility of non-recovery becomes excessive. 2% might work (most of the time) if the rar sizes were 50-100MB, NOT 400MB+.

  7. #7
    That post has 1.5% pars for 8.5 GB in 400 MB RARs, which is what you're complaining about. Read his text posts in the smaller newsgroup; he's experimenting and asking for feedback. He says he's pioneering the RAR and article sizes that will be common in 2010, when capitalism will force news server providers to adapt to Blu-Ray. Reading your message that began this topic, I think you were born to have a philosophical discussion with the guy.
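    Back-of-the-envelope, using the block sizes from the Newzbin example above (my own arithmetic, not figures taken from the post itself): 31 recovery blocks of 4 MB can repair roughly 124 MB of missing articles scattered anywhere in the 8.5 GB, about 1.4%, but rebuilding one entirely missing 400 MB RAR would take 100 blocks.

    Code:
    # Back-of-the-envelope for the post under discussion (figures quoted above).
    MB = 1024 * 1024
    GB = 1024 * MB

    post_size       = 8.5 * GB
    source_block    = 4 * MB        # article size == PAR2 source block size
    recovery_blocks = 31

    recoverable = recovery_blocks * source_block            # ~124 MB of losses
    print(f"repairable: {recoverable / MB:.0f} MB ({recoverable / post_size:.1%} of the post)")
    print(f"a whole 400 MB RAR would need {400 * MB // source_block} recovery blocks")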

    GN gets DMCA notices and is sometimes forced to run a takedown bot on their own servers, which causes some of their subscribers to give up trying to download certain posts even though there are always methods to get around the damage.
    Last edited by mesaman; 11-29-2008 at 05:46 PM.

  8. #8
    JustDOSE
    look at my meatwad
    Join Date
    Apr 2008
    Location
    California
    Posts
    517
    What's going on here?
    Pimpn aint easy ®

  9. #9
    asmithz
    Hi-Definition
    Join Date
    Jun 2003
    Posts
    8,642
    I'm curious as to which usenet server you use, Beck38? Mesaman is certainly right, usenet is remarkably error-free nowadays. I barely even download pars anymore, so it is strange that you say that.

  10. #10
    Poster BT Rep: +1
    Join Date
    Nov 2008
    Location
    Kentucky
    Posts
    194
    While we are on the subject, please explain to someone pretty new to usenet (me) how these pars work. I've clicked them and let them repair, but I just don't get how they fix the rars when nothing new is downloaded to replace the errors. (There's a sketch of the idea after this post.)

    Usenet is freaking awesome though. I should download nzbs here but I already paid for vip at nzbmatrix. Hmph.
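    Since the question above is really "how can parity rebuild data I never re-downloaded": PAR2 uses Reed-Solomon coding, but the simplest version of the idea is plain XOR parity, sketched below with made-up blocks. One XOR parity block can rebuild any single missing block; PAR2 generalizes this so that N recovery blocks can rebuild any N missing blocks.

    Code:
    # Toy illustration of parity repair (XOR parity; PAR2 itself uses
    # Reed-Solomon, which extends the same idea to many recovery blocks).
    from functools import reduce

    blocks = [b"part-one", b"part-two", b"part-3!!"]   # equal-sized source blocks
    xor = lambda a, b: bytes(x ^ y for x, y in zip(a, b))
    parity = reduce(xor, blocks)                       # the "par" block posted alongside

    # Suppose the second block never arrived: XOR the parity with what did arrive.
    received = [blocks[0], None, blocks[2]]
    rebuilt = reduce(xor, [b for b in received if b is not None], parity)
    assert rebuilt == blocks[1]
    print("rebuilt missing block:", rebuilt)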
