
Thread: Splitting the posts, why bother?

  1. #1
    You see the constant rar/par exercise on usenet, and also the join-with-QuickPar exercise. Looking at some posts, especially the 30-gig-plus HD content, some are split into 1-gig pieces. Your posting program can split files into so many lines automatically anyway. I don't profess to be a usenet expert, and I seldom post anything, but is there really a need to rar and par stuff these days? I think par at 5% and take your chances, especially on the XviD posts. Posters, what do you think?
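
    For a sense of scale, here's a minimal sketch of the arithmetic behind that kind of post; the 30-gig size, 1-gig parts, and 5% figure come straight from the question above and are just illustrative:

        # Rough arithmetic for the kind of post discussed above: a 30 GiB
        # release split into 1 GiB rar parts with 5% par2 redundancy.
        GIB = 1024 ** 3

        post_size = 30 * GIB        # total release size (assumed)
        part_size = 1 * GIB         # rar part size
        redundancy = 0.05           # par2 recovery percentage

        parts = -(-post_size // part_size)      # ceiling division
        par_data = post_size * redundancy       # recovery data to upload

        print(f"{parts} rar parts, {par_data / GIB:.1f} GiB of par2 recovery data")
        # -> 30 rar parts, 1.5 GiB of par2 recovery data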
    Last edited by Windy72; 10-25-2009 at 11:11 PM.
    Life in the fast lane usenet freak

  2. #2
    newsgroupie
    I look forward to the day when completion will be a perfect 100% -- and when usenet reaches that level of perfection we probably won't need either pars or rars. But I can't see that ever happening unless NNTP undergoes a major structural revision -- an unlikely prospect for a standard that's barely changed over the last 30 years.

    Personally, I generally prefer small rars over big/huge ones, but for some reason most people seem to prefer the big rar sizes (especially people who upload to file-hosting sites like Rapidshare, who usually split files to the biggest size allowed). I dread the prospect of downloading a 1-gigabyte file from a file-hosting service, having the connection crap out at 90% complete, and then having to start over again from the beginning. As far as the *accepted* size of rars being posted goes, I think that will largely depend on how Scene releases are packaged.
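
    The restart-from-scratch problem can be put in numbers. A toy model, assuming the connection drops at a uniformly random point and the host doesn't support resume; the part sizes are made up for illustration:

        # If a transfer dies at a random point and can't resume, the
        # expected wasted download is about half a part per failure,
        # so smaller parts bound the damage.
        for part_mb in (100, 500, 1000):
            print(f"{part_mb:4d} MB parts: ~{part_mb // 2} MB re-downloaded per drop")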

    One thing I'd like to see is for newsgroup providers to increase their maximum article size. Astraweb recently tripled their max posting size, now set at 1.5MB (11718 lines) per article. But even at that maximum, a typical HD/bluray release still comprises tens of thousands of articles. Allowing bigger articles means fewer headers, smaller NZBs, and less overhead, but I think the main thing holding NSPs back from allowing, say, 15MB articles is that they try to maintain universal compatibility, so their oversized articles don't end up getting rejected by other providers. As usenet has no real central authority, changes seem to happen at a snail's pace.
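
    To make the 'tens of thousands' concrete, the back-of-the-envelope version; the 40 GB release size is an assumption, and the 128-byte line is the usual yEnc convention:

        # Article counts at Astraweb's 1.5MB (11718-line) limit vs. a
        # hypothetical 15MB limit. Release size is assumed.
        line_bytes = 128                      # typical yEnc line length
        article_bytes = 11718 * line_bytes    # ~1.5 MB per article

        release_bytes = 40 * 1000 ** 3        # assumed HD/bluray release

        for size in (article_bytes, 10 * article_bytes):
            print(f"{size / 1e6:4.1f} MB articles -> {release_bytes // size:,} total")
        # ->  1.5 MB articles -> 26,668 total
        # -> 15.0 MB articles -> 2,666 total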
    Last edited by zot; 10-26-2009 at 01:22 AM.

  3. #3
    http://en.wikipedia.org/wiki/Usenet

    Scroll down to 'Binary Content'

  4. #4
    Member
    These gargantuan rar parts are really ridiculous (I've also seen, and tried to 'process', those gigabyte-plus things). Even a top 4-core CPU has trouble running QuickPar and unrar on them!

    You're probably right about 5% vs. the 10% of years past...

    But NOT the 1% or less, and I've seen that!

    A few technical truths apply to what we're talking about here, namely digital transmission (both 'local' and 'long-haul').

    ALL, repeat ALL, transmission, whether over fiber, copper coax, or twisted pair, is ANALOG. We all talk about 'digital' transmission (even over full FIOS attached to terabit internet routers). And I'm not talking about the 2-foot piece of ethernet between the home router and the computer at the end user's home/office.

    Folks 'think' the fiber links criss-crossing the planet are 'digital', when in reality they aren't. They're analog. If you don't believe it, I can point you to some heavy-duty engineering books explaining transmission theory.

    So... errors will happen, period. From the originator to the usenet server (where hopefully it's pretty 'safe'), from that server to the other usenet servers (the 'bucket brigade'), then from there to the end 'leech'. The entire process is, of course, only as 'strong' as its weakest link (usually those end-user-to-server links, traveling across copper wire or fiber of some sort).
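
    Putting this together with the par-percentage point above, here's a toy loss model; the hop count, per-hop survival rate, and article count are invented numbers for illustration, not measurements:

        # Each article must survive a chain of hops, so small per-hop
        # losses compound across the bucket brigade.
        articles = 26_668           # e.g. a ~40 GB post at 1.5 MB/article
        per_hop_ok = 0.999          # assumed per-hop survival probability
        hops = 5                    # poster -> server -> peers -> reader

        expected_lost = articles * (1 - per_hop_ok ** hops)
        print(f"expected articles lost: {expected_lost:.0f}")   # -> 133

        # What each par2 percentage can repair, roughly one lost article
        # per recovery block. Losses cluster in practice (whole chunks
        # missing at one server), so averages understate the risk of 1%.
        for par in (0.01, 0.05, 0.10):
            print(f"{par:.0%} pars cover ~{int(articles * par)} lost articles")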

    Most posters don't do the basics; in other words, they don't 'watch' their posts from a usenet server that ISN'T the one they're posting to. Example: post to Giganews, watch the propagation at Astraweb.
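
    If anyone wants to script that check, a minimal sketch using Python's standard nntplib module; the hostname and message-id are placeholders:

        # Ask a server you did NOT post through whether an article
        # arrived. STAT checks existence without downloading the body.
        import nntplib

        srv = nntplib.NNTP("news.example.com")      # placeholder host
        try:
            srv.stat("<message-id@example>")        # placeholder message-id
            print("article has propagated")
        except nntplib.NNTPTemporaryError:          # e.g. 430 no such article
            print("not there (yet)")
        finally:
            srv.quit()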

    I think the traffic that the HD stuff is generating is being posted by folks 'new' to usenet. I can't otherwise figure out why a ton of it is being 'stripped down' from 45G to 40G by removing things like commentary tracks, foreign-language captions, and other things, which saves at best 1/10th of the upload time. I thought this kind of wackiness had gone out a LONG time ago with stripped-down DVD5s, but I guess increased bandwidth didn't bring increased smarts.

    Along the same lines: why the decreased pars? Well, I think a lot of the folks doing that, judging from the speed of their uploading, are on FIOS or some other super-duper-speed system and have either forgotten or never had to deal with the 'real world'. Again, at the speed they're posting, we're talking about the 'thin edge of the wedge' time-wise to post more.

    Oh well, there are a lot of good posters out there doing mkv stuff, and a few doing full BD posts. But the BD folks are being drowned out by the wackos doing the 'strip' jobs. At least they admit to it in the nfo's, so one is forewarned before spending x minutes/hours/days.
