
QuickPar doesn't detect downloaded Files



LazyGuy
11-29-2011, 03:53 PM
So I am in the habit of running QuickPar in the background whenever I am DLing something.
Dunno why, but it gives me peace of mind that my bandwidth isn't being wasted on dud files.

I was DLing a file from FST.
After downloading some 50% of the file, I was a bit confused to find QuickPar reporting
"Select More Recovery Volumes"
and in the window I saw "0 Blocks Found",
even though 15 RARs had already been DLed.

So is this file a dud?
Should I abandon it for a torrent mirror? :(


Name: Will disclose only if needed
Type: XviD HD, RAR'ed
Age: 55 days (while I am writing this post)
Size: 3.x GB distributed into 30 RARs

Usenet server: XSUsenet (yeah, I am a freeloader) :P
Retention: 600 days claimed, but according to Hypatia's analysis, 380 days.
Client: SABnzbd+

P.S.
:fst:

zot
11-29-2011, 08:02 PM
QuickPar is not intended to repair files "on the fly" -- you should download all the RAR files first, and then run QuickPar. Alternatively, use SABnzbd (or another newsreader/NZB downloader) set up to do this automatically.
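
If you want to script that "download everything first, then check" order yourself rather than rely on SABnzbd's post-processing, something like this rough Python sketch would do it. It assumes the par2cmdline tool is on your PATH, the directory name is just a placeholder, and a non-zero exit code from "par2 verify" means repair is needed (which is how I understand par2cmdline behaves):

import glob
import subprocess
import sys

download_dir = "/downloads/incomplete/some.release"   # placeholder path

# Find the PAR2 index file that came with the post.
par2_files = glob.glob(download_dir + "/*.par2")
if not par2_files:
    sys.exit("No .par2 file found -- nothing to verify against.")
index = min(par2_files, key=len)   # heuristic: the index file has the shortest name

# Run this only once the newsreader says every RAR has finished downloading.
verify = subprocess.run(["par2", "verify", index], cwd=download_dir)
if verify.returncode != 0:
    # Blocks are missing or damaged; let par2 pull them from the recovery volumes.
    subprocess.run(["par2", "repair", index], cwd=download_dir)

That is essentially what SABnzbd's post-processing step does for you anyway.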

If you want to avoid wasting bandwidth, I suggest using a completion checker to test completion before downloading (as well as opening and playing the first RAR file [with its order arranged so it downloads first] to check quality).
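
A do-it-yourself completion check is not hard either: pull the article IDs out of the NZB and ask the server whether each one exists, without downloading any bodies. A rough Python sketch -- the NZB name, server address, and login are placeholders you would obviously swap for your own:

import xml.etree.ElementTree as ET
import nntplib

# Placeholders -- point these at your own NZB and news server.
tree = ET.parse("something.nzb")
ids = [seg.text for seg in tree.iter() if seg.tag.endswith("segment")]

server = nntplib.NNTP_SSL("news.example.com", 563)
server.login("username", "password")

found = 0
for msgid in ids:
    try:
        # STAT only asks whether the article exists; nothing is downloaded.
        server.stat("<" + msgid + ">")
        found += 1
    except nntplib.NNTPError:
        pass

print("%d of %d articles present (%.1f%%)" % (found, len(ids), 100.0 * found / len(ids)))
server.quit()

Anything noticeably below 100% and the PARs will have to work for a living.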



Type: XviD HD, RAR'ed
Age: 55 days (while I am writing this post)
Size: 3.x GB distributed into 30 RARs
Usenet server: XSUsenet (yeah, I am a freeloader) :P
Retention: 600 days claimed, but according to Hypatia's analysis, 380 days.
Client: SABnzbd+


Judging from my own experience with XSUsenet, I would expect this file to probably be 100% complete and not require any PAR repair. But that's just my guess.

However, this release seems a bit strange to me. Only 30 RARs -- and each bigger than 100 MB? I generally try to stay away from releases that don't follow a more "standard" or "scene-spec" format, as things like odd file sizes and other irregularities can often indicate a low-quality -- or fake -- release.

Re: Hypatia's analysis of 380 days -- the times I checked, I got close to exactly 400 days of retention on that server, but you can easily check retention for yourself.
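
For example, you can ask the server for the oldest article still held in a busy binaries group and look at its date. A quick Python sketch using the standard nntplib module -- the server address, login, and group name here are only placeholders, and I haven't tested this against XSUsenet specifically:

import nntplib
from email.utils import parsedate_to_datetime
from datetime import datetime, timezone

# Placeholders -- use your own server details and a high-traffic group.
server = nntplib.NNTP_SSL("news.example.com", 563)
server.login("username", "password")
resp, count, first, last, name = server.group("alt.binaries.boneless")

# Overview of the lowest-numbered article the server still carries.
resp, overviews = server.over((first, first))
artnum, fields = overviews[0]
posted = parsedate_to_datetime(fields["date"])
age = datetime.now(timezone.utc) - posted
print("Oldest article in %s is roughly %d days old" % (name, age.days))

server.quit()

Whatever number that prints is the retention you are actually getting, no matter what the sales page claims.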

mjmacky
11-29-2011, 11:43 PM
Type: XviD HD, RAR'ed
Age: 55 days (while I am writing this post)
Size: 3.x GB distributed into 30 RARs
Usenet server: XSUsenet (yeah, I am a freeloader) :P
Retention: 600 days claimed, but according to Hypatia's analysis, 380 days.
Client: SABnzbd+

However, this release seems a bit strange to me. Only 30 RARs -- and each bigger than 100 MB? I generally try to stay away from releases that don't follow a more "standard" or "scene-spec" format, as things like odd file sizes and other irregularities can often indicate a low-quality -- or fake -- release.

If it's not a scene release, there's not much to worry about with non-standard packaging, I would think. I usually break each archive part up into 102,400,000 bytes (160 parts * 640,000 bytes, or 5,000 lines each). Actually, for some seasons that have several MKVs I've merely gone with posting the MKV files with their PAR2 set (~120-140 MB; the PAR2 will check the MKVs). I do it all using QuickPar, just out of the hatred and spite I have for WinRAR.
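
For reference, that split size is just 160 * 640,000 = 102,400,000 bytes per part. If you want to do the PAR2 half of that from a script instead of the QuickPar GUI, the par2cmdline equivalent would look roughly like this -- the show name is a placeholder and the 10% redundancy is just an example value, not a recommendation:

import glob
import subprocess

# Placeholder: the season's MKV files sitting in one directory.
mkvs = sorted(glob.glob("Some.Show.S01/*.mkv"))

# One PAR2 set covering all the MKVs:
#   -s640000  -> 640,000-byte blocks, matching the piece size above
#   -r10      -> roughly 10% recovery data (example value, tune to taste)
subprocess.run(["par2", "create", "-s640000", "-r10", "Some.Show.S01.par2"] + mkvs,
               check=True)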

LazyGuy
11-30-2011, 12:10 PM
Okay, SABnzbd completed the whole download but reports that the download failed.
It didn't pass the PAR check in QuickPar either,
but I unRAR'ed the files anyway and the content is intact.

So is this a case of a PAR file mismatch?

zot
11-30-2011, 01:36 PM
It's possible that the pars in the set were for an entirely different file. It happens.
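
One way to check whether that's what happened, without trusting any GUI, is to read the file names straight out of the PAR2 "file description" packets and compare them with the RARs you actually downloaded. A quick Python sketch based on my reading of the PAR2 packet layout (the .par2 name is a placeholder; treat the parsing as a rough illustration rather than a reference implementation):

import struct

MAGIC = b"PAR2\0PKT"
FILEDESC = b"PAR 2.0\0FileDesc"

def par2_filenames(path):
    """Yield the file names a PAR2 set claims to protect."""
    with open(path, "rb") as f:
        data = f.read()
    pos = 0
    while True:
        pos = data.find(MAGIC, pos)
        if pos < 0:
            break
        # Packet header: magic (8), length (8), packet MD5 (16),
        # recovery set ID (16), packet type (16) = 64 bytes.
        (length,) = struct.unpack_from("<Q", data, pos + 8)
        if length < 64:
            break   # corrupt packet; bail out
        if data[pos + 48:pos + 64] == FILEDESC:
            # Body: file ID (16), file MD5 (16), MD5 of first 16k (16),
            # file length (8), then the null-padded file name.
            body = data[pos + 64:pos + length]
            yield body[56:].rstrip(b"\0").decode("ascii", "replace")
        pos += length

for name in par2_filenames("something.par2"):
    print(name)

If the names it prints don't match your RARs, the PARs were simply posted for a different set and no amount of extra recovery volumes will help.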

B18C5
11-30-2011, 03:29 PM
I see bad PARs from time to time. I've got a sample here where every PAR is bad but the RARs were fine. Newsbin will ignore the bad PARs and attempt the unRAR anyway.

Beck38
11-30-2011, 04:12 PM
QuickPar is not intended to repair files "on the fly" -- you should download all the RAR files first, and then run QuickPar. Alternatively, use SABnzbd (or another newsreader/NZB downloader) set up to do this automatically.

I don't get all this fuss about getting things to 'automagically' do things. It's really been my experience that such 'schemes' fall 'off the rails' often enough to make using them self-destructive. Machines and internet connections today are so fast that if one doesn't grab hold of the 'process' and hold on tight, one can be overwhelmed by the amount of data, and whatever 'reasonable' plan one has to store and catalog all of it will rapidly go off the cliff.

Then again, maybe being able to find and use 'stuff' isn't at the top of the schema for some. If so, disregard what I just said and keep the fire hose running until your machine either bursts into flames or starts flooding your storage...!

B18C5
11-30-2011, 05:20 PM
I don't get all this fuss about getting things to 'automagically' do things.

Isn't the whole point of a computer to make it do the repetitive thoughtless tasks that you don't want to do manually?

The only issues I see with automatic repair and unRAR are:

1 - Spammers who try to avoid being filtered.

2 - People who can't follow the "standard" or who use broken posting/PAR tools.

3 - People who just F up the process of generating a PAR set.

Had a set last night where the PAR set included both the RARs AND the original file that was inside the RARs, so it was impossible for QuickPar to complete the file. Classic "chicken and egg" problem. I also see cases where the PAR set is good, but the first RAR wasn't included when the set was generated. Thankfully, these are rare cases.