Is a 0-day tracker worth it compared to specialized trackers?



baskinghobo
07-05-2014, 12:08 PM
Is a 0-day tracker like FTN or SCC worth it compared to specialized trackers such as PassThePopcorn, BlackCatGames, BTN, etc.? How much faster are they at grabbing content compared to these sites?

IdolEyes787
07-05-2014, 12:51 PM
Worth what?

And I suppose that depends on whether you value getting stuff infinitesimally faster over retention or content.

Rart
07-06-2014, 09:39 PM
In my mind, the only type of content where pretimes could really even matter is TV shows. And even then I'm talking about the 10-20 minutes it takes for a show to get on TVT.com, not the difference between SCC and BTN, which is largely irrelevant.

Also, since the content at BTN is unrar'd, I think you could make the case that you'd actually be able to watch the show more quickly on BTN (that is why you're downloading it, right?), since you don't have to unrar it after downloading.

megabyteme
07-06-2014, 10:00 PM
I find that IPT, with all of its P2P groups, has some content (movies mainly, and some shows) that never appears on scene-only trackers. If you are desperately waiting for a show, P2P may have it a lot sooner in some cases. There are also more choices when it comes to resolution and file size.

piercerseth
07-06-2014, 10:07 PM
Also, since the content at BTN is unrar'd, I think you could make the case that you'd actually be able to watch the show more quickly on BTN (that is why you're downloading it, right?), since you don't have to unrar it after downloading.

I absolutely despise when trackers do this. Pack P2P content however you like (or don't), but leave scene stuff unmolested. At the very least, go to the trouble of including an .srr/.srs so I can rebuild the sets properly.

I can cross seed scene rars on half a dozen trackers with hardlinks. I can go back and forth with it on usenet no problem. Keeping duplicates of larger 1080p content isn't practical.

Rart
07-06-2014, 10:44 PM
Eh. I much prefer the ease of having unrar'd content. It's a minor inconvenience to have to unrar things, but given the choice between rar'd and unrar'd I would choose unrar'd every time.

It also means I don't have to keep two copies of the video for seeding purposes (otherwise I'd have to unrar it every time I want to watch). It's minor, but it also means less disk usage, so it's less taxing on your SSD when you don't have to unrar everything.

In addition, I personally feel that any tracker where you need to cross-seed just to keep your ratio up isn't really worth the effort of staying at.

I guess going back and forth with usenet is a valid concern, but in my mind it's enough of a fringe case that generally having unrar'd files is the better choice.

megabyteme
07-06-2014, 11:00 PM
I agree with Rart here. As an end user, which most torrent users are, it is FAR more convenient and less space-consuming to download a playable file. For me, this makes a difference because I play my files from a network-attached media player. If something is RAR'ed, I have to let it finish downloading, try to remember which files are RAR'ed and which are not, then come back to my computer to extract the file(s).

Stooopid Usenet. :fist:

EDIT- Software is a bit different, though. RARs are fine there. Not multi-compressed files within other compressions :frusty: , but a single RAR compression is fine.

mjmacky
07-07-2014, 01:01 AM
It's fine the way it is. If a tracker keeps everything split in rars because it helps them achieve faster pre times, then it's easy for me to assess it as a twat site. I don't have to ponder it. I'm not against pondering, but I would prefer to spend my pondering time on things that don't make me feel gross inside.

piercerseth
07-07-2014, 01:25 AM
In addition, I personally feel that any tracker where you need to cross-seed just to keep your ratio up isn't really worth the effort of staying at.

I guess going back and forth with usenet is a valid concern, but in my mind it's enough of a fringe case that generally having unrar'd files is the better choice.

I don't care for ratio games either; I just look at it as the path of least resistance when I share. It's just me whining in the context of 0day scene stuff. I bitch when they don't race Samples/Proof too :P

Ultimately, Idol's criteria (retention and content) are the most important. A strong request section with posters who actually fill is a huge bonus.

I didn't realise there were issues with standalone devices etc playing rarchives. I'm a vlc/mpc-hc guy for the most part.


EDIT- Software is a bit different, though. RARs are fine there. Not multi-compressed files within other compressions :frusty: , but a single RAR compression is fine.
Yeah, the zip/diz is a legacy thing they have yet to sunset. They're pretty stubborn.

megabyteme
07-07-2014, 02:27 AM
I didn't realise there were issues with standalone devices etc playing rarchives.

With the 4-year-olds, a standalone device works for our needs and is easy to restart if someone presses an unknown sequence of buttons that would be quite irritating on a Window$ machine. We have Logitech Revue units, which do enough for now. I am tempted to see what Amazon comes out with next year to replace their first-gen Fire device. XBMC can be installed on that, which would give me the best of both worlds. Hopefully.


I'm a vlc/mpc-hc guy for the most part.

Rart
07-07-2014, 05:54 AM
I'm actually curious: while I'm vaguely familiar with the concept of compression for encoding/transcoding things such as music and video, I have no idea how general-purpose file compression (actually making files smaller) works. I'm assuming the two concepts are somewhat similar, although obviously things such as varying bitrates during lulls in the music/video wouldn't work for something like software.

The reason I bring this up is that I was wondering whether it would be possible for trackers to use WinRAR or whatever archiving program to actually compress the files, putting less strain on the servers and the users and allowing files to be downloaded more quickly. Is this feasible? Is there some technical reason why trackers haven't done this?

megabyteme
07-07-2014, 06:25 AM
For most of us, bandwidth is abundant. The work involved in reducing a file's size is no longer worth the effort. There was a time when file size mattered; that was when everyone used dialup connections. While what you are suggesting might make some difference, nobody would really notice it. Can you tell the difference between downloading a 1GB file and an 800MB one? Is that going to change your mind about what you download? I expect most people, since we are not paying by the MB, do not care. We want what we want, and we don't want complications; none seems to be the standard tolerance. We are spoiled by cheap, abundant bandwidth and a never-ending supply of media. We can also look for that media on a number of trackers, and we'll grab from the one that causes the least resistance.

mjmacky
07-07-2014, 11:58 AM
I'm actually curious: while I'm vaguely familiar with the concept of compression for encoding/transcoding things such as music and video, I have no idea how general-purpose file compression (actually making files smaller) works. I'm assuming the two concepts are somewhat similar, although obviously things such as varying bitrates during lulls in the music/video wouldn't work for something like software.

Many executable files are already compressed (self-extracting archives, for example). That's not to say they can't be compressed further; you'd just get a marginal reduction in file size. In other words, they're like a zip file that doesn't require another program to extract it.

General (lossless) compression is similar in spirit to what you're already familiar with: repeated information/sequences/code gets indexed, the index is stored in the archive, and it's used to rebuild the file(s) when extracting. Media codecs get their reduction differently (and usually lossily): MP3 uses a Fourier-related transform plus perceptual coding, H.264 codes frames as differences from a reference frame, and so on. The entire point is to get rid of redundancy.

Different algorithms don't stack, because once one method has squeezed out the redundancy, a second pass can't find much more to remove. In fact, if you try to compress a file that was already compressed with a superior method, you can end up with a file slightly larger than the original (on account of the index overhead).

In case this seemed too pleasant and helpful a disposition, suck a dirty dick, bitch.