
NZB Checker (Download/Posting/etc)



Beck38
06-15-2011, 08:34 PM
Okay, so I decided to see what was available out there, but I'm really NOT in the mood to do 'Software Chief Analyzer' mode anymore, and all of the programs I've tripped across have MAJOR problems (IMHO). So I'm looking for something that checks NZBs on multiple servers, works in ALL versions of Windows (okay, anything later than W95), and DOESN'T require OS 'add-ons' in order to work (i.e., lazy programmers).

Should be dirt simple, but all I see is junk slapped together by folks who can't be bothered to actually TEST what they write.

Would be nice...

B18C5
06-16-2011, 12:09 AM
I started working on one the other day, then got distracted by real work. I might build it into Newsbin before I do a standalone. The intent is to have a standalone, though.

What would you want it to display? I wouldn't really use it, so it's not clear to me what people want to see. I assume a list of files, maybe one file per server, so each file can be compared against the server it's on. I'm thinking single threaded because "STAT" is pretty fast. You'd want some way of seeing a summary view, maybe a count of how many records couldn't be downloaded across all servers.
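
Something along these lines is what I have in mind: parse the NZB, STAT every message-id against a server, keep a missing count per server. Rough Python sketch only; the server, login, and NZB file name are placeholders, and it leans on the old standard-library nntplib module (dropped from newer Pythons):

    # Rough sketch only: parse the NZB, then STAT every message-id against one
    # server and count what's missing. Server, credentials and NZB file name
    # are placeholders.
    import nntplib
    import xml.etree.ElementTree as ET

    NZB_NS = "{http://www.newzbin.com/DTD/2003/nzb}"

    def load_message_ids(nzb_path):
        """Collect every <segment> message-id listed in the NZB file."""
        tree = ET.parse(nzb_path)
        return ["<" + seg.text.strip() + ">" for seg in tree.iter(NZB_NS + "segment")]

    def count_missing(host, port, user, password, message_ids):
        """Return how many message-ids the server refuses via STAT."""
        missing = 0
        with nntplib.NNTP(host, port, user=user, password=password) as srv:
            for mid in message_ids:
                try:
                    srv.stat(mid)        # 223 = article exists
                except nntplib.NNTPError:
                    missing += 1         # typically 430 "no such article"
        return missing

    if __name__ == "__main__":
        ids = load_message_ids("example.nzb")                    # placeholder
        gone = count_missing("news.example.com", 119, "user", "pass", ids)
        print(f"{gone} of {len(ids)} segments missing")

The summary view would basically be that last count repeated once per server in the list.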

nntpjunkie
06-16-2011, 02:37 PM
It would be useful to know the percentage completion of the total archive. Here is an app called NZB Completion Checker that you might get some ideas from: http://www.zoon.dk/?p=98. It's very cool and useful.

zot
06-20-2011, 10:48 AM
NZB Completion Checker might well be the 'latest and greatest', but I've never tried it since it uses DotNet -- which has not worked well for me on this PC. (Unzbin only works for me using an old DotNet version, and I got tired of switching out different versions all the time to run various programs that only worked on specific DotNet versions.)

There is also "NZB Validator" (NZBval) and "NZB Download Checker" (my favorite) - both of which don't require DotNet.



What would you want it to display? I wouldn't really use it, so it's not clear to me what people want to see. I assume a list of files, maybe one file per server, so each file can be compared against the server it's on. I'm thinking single threaded because "STAT" is pretty fast. You'd want some way of seeing a summary view, maybe a count of how many records couldn't be downloaded across all servers.

It probably would not need to be as detailed as apps like NZB Completion Checker (though for me personally, I always prefer more information rather than less).

The best feature would be to check server completion against par redundancy to see if the file can be completed. (None that I'm aware of do this yet.) It might be nice to give color-coded results as a red, yellow, or green light, so a user can know ahead of time whether the pars are barely adequate or much more than sufficient.
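
To put rough numbers on that: PAR2 repairs at the block level, one recovery block per missing or damaged data block, so the 'light' really just comes down to a comparison. A toy Python sketch (the yellow threshold is something I made up, not taken from any existing checker):

    # Toy traffic-light: repairable while missing_blocks <= recovery_blocks.
    # The 25% margin used for "yellow" is an arbitrary example.

    def par_health(missing_blocks, recovery_blocks):
        spare = recovery_blocks - missing_blocks
        if spare < 0:
            return "red"       # not enough recovery blocks, repair will fail
        if spare < recovery_blocks * 0.25:
            return "yellow"    # repairable, but the margin is thin
        return "green"         # plenty of headroom

    print(par_health(missing_blocks=3, recovery_blocks=50))    # green
    print(par_health(missing_blocks=45, recovery_blocks=50))   # yellow
    print(par_health(missing_blocks=60, recovery_blocks=50))   # red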

Having each file graphically represented (as individual green/red segments) might help people easily see, for example, when every rar in a set is missing its first segment. (Knowing that if every rar's first segment is missing from Highwinds, it will certainly be missing from Giganews/Supernews as well.)

As an integrated newsreader function, I wonder how well it would work to check completion in the background as a file downloads? Doing so might require two connection-thread-number settings. For instance, 10 (dedicated) connections for article download, 20 additional connections to be used for checking completion only.

Another suggestion might be to check a missing segment a second/third time.

I often use public Wi-Fi, and under poor signal conditions I will get a lot of "incompletes" in the newsreader when those "missing" segments actually exist on the server. I usually get around this problem by chaining fill servers together (even if using an alternate URL for the exact same server) so a missing segment always gets asked for at least twice. (Of course, under extremely bad, corruption-plagued transfer conditions, BitTorrent/ED2K works better than NNTP, since every segment gets re-requested as many times as it takes.)

I also discovered long ago that even under good conditions, a server can sometimes 'mis-report' a missing article, then have the article a few minutes later when it is re-requested.

... which brings me to another question ... Why don't any newsreaders automatically re-check missing segments a second time?
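
The logic seems simple enough; something like this little Python sketch would do it, walking the server list (even the same host listed twice under a different URL) and only declaring a segment missing once every entry has refused it. Hosts and logins are placeholders, and it uses the old standard-library nntplib module:

    # Sketch of the "ask at least twice" approach: try each server in turn and
    # only give up once all of them refuse. Hosts and credentials are placeholders.
    import nntplib

    SERVERS = [
        ("news.example.com", 119, "user", "pass"),
        ("news-eu.example.com", 119, "user", "pass"),   # could even be the same backend
    ]

    def fetch_body(message_id):
        """Return the article body lines, or None if every server refuses it."""
        for host, port, user, password in SERVERS:
            try:
                with nntplib.NNTP(host, port, user=user, password=password) as srv:
                    resp, info = srv.body(message_id)
                    return info.lines        # got it; no further retries needed
            except nntplib.NNTPError:
                continue                     # a 430 here may still succeed elsewhere
            except OSError:
                continue                     # dropped wi-fi connection: try the next one
        return None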

Beck38
06-20-2011, 02:57 PM
...it uses DotNet -- which has not worked well for me on this PC

I'm constantly fighting that on two of my newest Vista machines, and this with commercial s/w. I'm not about to fight it with some PD/Free s/w.



There is also "NZB Validator" (NZBval) and "NZB Download Checker" (my favorite) - both of which don't require DotNet.


Tried both, failed on both. NZB Validator rings alarms on my malware detector (Malwarebytes; it immediately goes out and deletes it), and NZB Download Checker constantly says it needs a 'newer' version of Windows, even though the documentation says it runs on 'all Windows'.

mesaman
06-20-2011, 05:20 PM
If you have poor Wi-Fi conditions, I would imagine NNTP transfer would be nearly impossible; connections to the server would be broken often enough to quickly get 'too many connections' responses while waiting for the connections to time out at the server end. You wouldn't want retries to happen after 'too many connections', because that just escalates a war between client and server. For what it's worth, I think retries can be set in the Alt.binz newsreader. I have so many backup/fill servers that retries of article retrieval commands are unnecessary. Servers that would benefit from retries are rare anyway.

The STAT command on servers in the 'Readnews' group is partly broken. First of all, STAT on Readnews servers requires GROUP first (any newsgroup will work), even though ARTICLE, HEAD, and BODY don't need GROUP first. Many of the responses to STAT on Readnews are inaccurate or broken, such as sending '430 No such article' for articles that can be retrieved, and often the response to STAT <msg-id> is just the nonsensical '501 newsgroup'. Then there's also the lag time on Readnews; many seconds pass between command and response. This lag is also seen on Astraweb EU servers.
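
For anyone scripting around this, the workaround is just to select some group before the first STAT. A small Python sketch using the old nntplib module; the group name is arbitrary and the error handling is just my guess at a sensible default:

    # Work around servers that reject STAT <msg-id> until a group is selected.
    # The group name is arbitrary; any group the server carries will do.
    import nntplib

    def stat_with_group(srv, message_id, any_group="alt.binaries.test"):
        """Select a group first, then treat any STAT error as 'missing'."""
        try:
            srv.group(any_group)
        except nntplib.NNTPError:
            pass                        # some servers refuse GROUP too; try STAT anyway
        try:
            srv.stat(message_id)        # 223 = present
            return True
        except nntplib.NNTPError:
            return False                # 430, or the bogus '501 newsgroup', = missing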

If using fill servers in a newsreader, the most useful feature of NZB Download Checker is just checking the NZB for completeness, which is done before it connects to a news server to send STAT commands.
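
My reading of that offline check: before touching any server, make sure each <file> in the NZB actually lists a contiguous run of segment numbers. Whether NZB Download Checker does exactly this I can't say, so treat the gap test below as a guess:

    # Offline sanity check: flag files whose <segment> numbering has holes.
    # This is a guess at what a "completeness" pre-check could look like.
    import xml.etree.ElementTree as ET

    NZB_NS = "{http://www.newzbin.com/DTD/2003/nzb}"

    def files_with_gaps(nzb_path):
        """Return subjects of files whose segment numbers aren't 1..N with no holes."""
        suspect = []
        for f in ET.parse(nzb_path).iter(NZB_NS + "file"):
            numbers = {int(s.get("number")) for s in f.iter(NZB_NS + "segment")}
            expected = set(range(1, max(numbers) + 1)) if numbers else set()
            if numbers != expected:
                suspect.append(f.get("subject"))
        return suspect

    print(files_with_gaps("example.nzb"))   # placeholder file name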

TechSono
06-23-2011, 12:07 AM
That's right; STAT doesn't work. The NZB-checker in SuperNZB uses the HEAD command, which transfers more data, but works well. SuperNZB doesn't rely upon dot-net, and has an elaborate retry system. If it can't get a post from one server, it will try all of the other servers.

zot
06-24-2011, 12:22 AM
NZB Validator rings alarms on my malware detector (Malwarebytes; it immediately goes out and deletes it)

Indeed. I was shocked to learn this. VirusTotal shows nearly half of all AVs classify NZB File Validator as malware. I wonder why exactly?

I've not tried NZBval in a long time, not since I discovered NZB Download Checker. NZBval now seems to be abandonware; the developer's website has been down for a while. The author put out a lot of NZB-related applications, so it just didn't fit the usual pattern of malware.


If you have poor Wi-Fi conditions, I would imagine NNTP transfer would be nearly impossible; connections to the server would be broken often enough to quickly get 'too many connections' responses while waiting for the connections to time out at the server end. You wouldn't want retries to happen after 'too many connections', because that just escalates a war between client and server. For what it's worth, I think retries can be set in the Alt.binz newsreader. I have so many backup/fill servers that retries of article retrieval commands are unnecessary. Servers that would benefit from retries are rare anyway.


Under poor-connectivity Wi-Fi conditions, I generally only run one or two connection threads, since running multiple threads greatly increases file corruption. So the "too many connections" response never even gets close to occurring. I never even noticed that 'retry server' setting in AltBinz - I feel like a fool. :D



The STAT command on servers in the 'Readnews' group is partly broken. First of all, STAT on Readnews servers requires GROUP first (any newsgroup will work), even though ARTICLE, HEAD, and BODY don't need GROUP first. Many of the responses to STAT on Readnews are inaccurate or broken, such as sending '430 No such article' for articles that can be retrieved, and often the response to STAT <msg-id> is just the nonsensical '501 newsgroup'. Then there's also the lag time on Readnews; many seconds pass between command and response. This lag is also seen on Astraweb EU servers.

Thanks for the technical rundown. Have you reported this to tech support? Sadly, resellers are only middlemen when it comes to these kinds of non-routine issues, so it pays to be persistent.



If using fill servers in a newsreader, the most useful feature of NZB Download Checker is just checking the NZB for completeness, which is done before it connects to a news server to send STAT commands.
I agree.


That's right; STAT doesn't work. The NZB-checker in SuperNZB uses the HEAD command, which transfers more data, but works well. SuperNZB doesn't rely upon dot-net, and has an elaborate retry system. If it can't get a post from one server, it will try all of the other servers.
I've never tried SuperNZB (or even seen it talked about anywhere), but it seems very promising. The downsides are a price considerably higher than the competition's (Newsbin is $15/lifetime) and no mention of any trial version.

B18C5
06-24-2011, 03:50 PM
I have it on good authority that STAT from another server only checks the header database and not the actual file records, so you could say it's broken there too. HEAD transfers a shitload of data in comparison to a "STAT", though. Maybe you'd need a "USE STAT or USE HEAD" option. XHDR is probably as bad as STAT and only checks the header DB. I guess I should write it that way, then.
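
To make the option concrete, the probe itself would be about this much code either way (old nntplib module again; the function name is made up):

    # STAT gets one status line back; HEAD pulls the whole header block but
    # proves the article is really retrievable. Which to use is just a setting.
    import nntplib

    def article_present(srv, message_id, use_head=False):
        """Probe one message-id with STAT or HEAD, per the caller's choice."""
        try:
            if use_head:
                srv.head(message_id)    # heavier, but exercises the article store
            else:
                srv.stat(message_id)    # cheap, may only hit the header database
            return True
        except nntplib.NNTPError:
            return False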
"

TechSono
06-25-2011, 02:17 AM
SuperNZB will run with two server connections in unregistered mode. Nothing else is restricted. The $40 price helps to ensure that it will not become abandonware. The program came out in January 2006, making it one of the oldest dedicated NZB downloaders.

The HEAD command transfers a lot more data than STAT, but it's a tiny amount compared to the files to be downloaded. And HEAD has the popular "it works" feature.

False positives from anti-virus programs are common. They only scan files for small "signatures", and with the nearly infinite number of apps out there, those strings of bits are bound to show up in legit apps. They don't actually test the app to determine whether or not it is harmful. In fact, what they do might even be considered libel. I would love to see a developer bring a suit one of these days.

zot
06-25-2011, 06:58 PM
SuperNZB will run with two server connections in unregistered mode. Nothing else is restricted. The $40 price helps to ensure that it will not become abandonware. The program came out in January 2006, making it one of the oldest dedicated NZB downloaders.
I like that limited-connection/unlimited-time feature. (15-day trials are rarely enough for me, as I might only end up using it once in that time span.) A few newsreaders limit trials to a set number of runs, which is also OK -- unless the software needs a lot of restarts.



False positives from anti-virus programs are common. They only scan files for small "signatures", and with the nearly infinite number of apps out there, those strings of bits are bound to show up in legit apps. They don't actually test the app to determine whether or not it is harmful. In fact, what they do might even be considered libel. I would love to see a developer bring a suit one of these days.
Also, the reverse is true, sadly all too often. I think a major factor might be the size of the company -- and the number of lawyers they employ.

When Sony started infecting millions of computers with its infamous Rootkit a few years ago -- dangerous malware by any definition -- many of the major AV companies (most notably Symantec) steadfastly refused to label it as malware, at least until the bad publicity became too much to bear.