• US ISP AT&T Developed New Torrent Tracking System

    US provider AT&T has developed a new method to track traffic on peer-to-peer networks. With this new algorithm it is now possible to detect torrent files not by their names, but by their content.

    AT&T is one of the biggest telecommunications companies in the United States and invests heavily in developing tracking methods to monitor file-sharing traffic. The new development is meant to help the provider detect users who upload and download child pornography. It can also be used to monitor traffic for pirated content and to find the IP addresses of users who download and share such files.

    In simple terms: AT&T's system crawls search engines and RSS feeds for torrent files and adds them to a database. It analyzes the content, gathers information about peers, and then joins the swarm as a fake seeder. Customers' torrent clients automatically connect to this monitored AT&T peer and start downloading from it, and their IP addresses are logged and stored in a record that can later be used as evidence of copyright infringement or other illegal activity.
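    For readers curious what the "fake seeder" approach amounts to in practice, below is a minimal, purely illustrative Python sketch (not AT&T's actual system, whose internals have not been published). It reads a .torrent file, announces to the torrent's HTTP tracker while claiming to be a seeder, and prints the peer IPs and ports the tracker returns. The file name and peer id are made up.

    import hashlib
    import socket
    import struct
    import urllib.parse
    import urllib.request

    def bdecode(data, i=0):
        """Tiny bencode decoder; returns (value, next_index)."""
        c = data[i:i+1]
        if c == b'i':                                   # integer: i<digits>e
            end = data.index(b'e', i)
            return int(data[i+1:end]), end + 1
        if c == b'l':                                   # list: l ... e
            i += 1
            items = []
            while data[i:i+1] != b'e':
                value, i = bdecode(data, i)
                items.append(value)
            return items, i + 1
        if c == b'd':                                   # dict: d ... e
            i += 1
            out = {}
            while data[i:i+1] != b'e':
                key, i = bdecode(data, i)
                value, i = bdecode(data, i)
                out[key] = value
            return out, i + 1
        colon = data.index(b':', i)                     # byte string: <len>:<bytes>
        length = int(data[i:colon])
        start = colon + 1
        return data[start:start+length], start + length

    def bencode(obj):
        """Tiny bencode encoder (enough to re-hash the info dict)."""
        if isinstance(obj, int):
            return b'i%de' % obj
        if isinstance(obj, bytes):
            return b'%d:%s' % (len(obj), obj)
        if isinstance(obj, list):
            return b'l' + b''.join(bencode(x) for x in obj) + b'e'
        if isinstance(obj, dict):
            return b'd' + b''.join(bencode(k) + bencode(v)
                                   for k, v in sorted(obj.items())) + b'e'
        raise TypeError(type(obj))

    def monitor_swarm(torrent_path):
        meta, _ = bdecode(open(torrent_path, 'rb').read())
        info_hash = hashlib.sha1(bencode(meta[b'info'])).digest()
        params = {
            'info_hash': info_hash,
            'peer_id': b'-MN0001-MONITORPEER0',         # fake 20-byte peer id
            'port': 6881,
            'uploaded': 0,
            'downloaded': 0,
            'left': 0,                                  # left=0 means "I am a seeder"
            'compact': 1,
        }
        # Assumes an HTTP tracker that honours compact peer lists.
        url = meta[b'announce'].decode() + '?' + urllib.parse.urlencode(params)
        resp, _ = bdecode(urllib.request.urlopen(url, timeout=10).read())
        peers = resp.get(b'peers', b'')
        for off in range(0, len(peers), 6):             # 4 bytes IP + 2 bytes port
            ip = socket.inet_ntoa(peers[off:off+4])
            (port,) = struct.unpack('!H', peers[off+4:off+6])
            print(ip, port)                             # this is what gets logged

    if __name__ == '__main__':
        monitor_swarm('example.torrent')                # hypothetical local file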

    For now, this technology is applied only to AT&T's own customers. The company's representatives have already shared details of the new technology with other internet service providers and offered them the same tracking method. The development has sparked a big discussion among ISPs and copyright holders, who could sponsor this kind of tracking to identify real online users, obtain their personal details, and hold them responsible for illegal actions.

    The whole situation shows that the internet is becoming a censored place where each individual will be monitored 24 hours a day. The only way to protect the connection between your PC and the internet is to use a VPN service.

    Source: http://torrentus.to/blog/us-isp-att-...ng-system.html
    12 Comments
      megabyteme -
      I am always outraged by the politics of tying child pornography to ANY need to monitor file transfers. Cheap fucking, low-life trick. No kind way to censor that.
      piercerseth -
      Originally Posted by megabyteme:
      I am always outraged by the politics of tying child pornography to ANY need to monitor file transfers. Cheap fucking, low-life trick. No kind way to censor that.
      Why won't you think of the children? Seriously though, trotting out the appeal-to-emotion fallacy is de rigueur. Cuomo did something similar regarding Usenet when he was NY AG a few years ago, iirc. Can't spell panopticon without 'c' & 'p.'
      turin -
      they are trying so many things, but it won't stop torrenting
      eyekey -
      at&t just made my personal boycott list...which i can start once my cell contract ends..lol
      TheFoX -
      Considering that images are nothing more than a sequence of pixels, how could an algorithm discriminate between a family photo of mum, dad, brother and little sister on a beach somewhere, and little sister being used for child pornography purposes?

      In fact, how could an algorithm determine the difference between a home movie file you have uploaded, and a blockbuster?

      It cannot. If it could, we would already have robots walking the streets with the ability to reason.

      This is nothing more than Public Relations designed to scare people into thinking they have something special that they can use to detect this material. The only way to detect this material is for someone, a human, to scan it and make a reasoned judgement.

      Google made the same statement about their Streetview software, stating that it automatically blanks out faces and registration plates of vehicles it photographs. It doesn't. It is done by a human. If you are observant, you will notice lots of registration plates that are legible, because someone in their haste missed them. Also, many of the faces on Streetview aren't blanked out because, again, a human missed them.

      Some things cannot be left to a computer algorithm, because no algorithm can match the mind of a reasoned human, but then again, only a human could miss the obvious.

      So, to summarise, this is nothing more than scare tactics to make the gullible think that they have some tool to detect transgressions on the internet. They don't. They are using our fear of getting caught against us, to frighten us into submission. If they genuinely had such a tool, they wouldn't need to announce it. They would simply trap those abusing the rules and prosecute them, extorting millions in the process.
      duke0102 -
      How can an algorithm accurately detect child porn? Are people who download midget porn going to get court letters soon? lol
      brilman -
      Great, now for some reason I want to see midget porn lol
      torrentus -
      Originally Posted by duke0102:
      How can an algorithm accurately detect child porn? Are people who download midget porn going to get court letters soon? lol
      This is exactly the point - it's not possible! So nothing they do will have any success.

      All these companies don't actually care about child porn or other illegal content; they just want to track people who download copyrighted material. To everyone: use a VPN while downloading torrents. That way you stay safe!
      KRink -
      Originally Posted by TheFoX:
      Considering that images are nothing more than a sequence of pixels, how could an algorithm discriminate between a family photo of mum, dad, brother and little sister on a beach somewhere, and little sister being used for child pornography purposes?

      In fact, how could an algorithm determine the difference between a home movie file you have uploaded, and a blockbuster?

      It cannot. If it could, we would already have robots walking the streets with the ability to reason.

      This is nothing more than Public Relations designed to scare people into thinking they have something special that they can use to detect this material. The only way to detect this material is for someone, a human, to scan it and make a reasoned judgement.

      Google made the same statement about their Streetview software, stating that it automatically blanks out faces and registration plates of vehicles it photographs. It doesn't. It is done by a human. If you are observant, you will notice lots of registration plates that are legible, because someone in their haste missed them. Also, many of the faces on Streetview aren't blanked out because, again, a human missed them.

      Some things cannot be left to a computer algorithm, because no algorithm can match the mind of a reasoned human, but then again, only a human could miss the obvious.

      So, to summarise, this is nothing more than scare tactics to make the gullible think that they have some tool to detect transgressions on the internet. They don't. They are using our fear of getting caught against us, to frighten us into submission. If they genuinely had such a tool, they wouldn't need to announce it. They would simply trap those abusing the rules and prosecute them, extorting millions in the process.
      Verizon already does this. It only works with known images, meaning they have people on payroll viewing and archiving child porn to a corporate database.

      http://arstechnica.com/information-t...-in-its-cloud/
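    For anyone wondering how "known image" matching of the kind the Ars Technica piece describes generally works: files are fingerprinted and the fingerprints are compared against a database of hashes of already-identified material (real systems use robust perceptual hashes along the lines of PhotoDNA rather than a plain SHA-256). The rough Python sketch below is only an illustration; the hash value and directory are placeholders, not anything from Verizon or AT&T.

    import hashlib
    from pathlib import Path

    # Placeholder database of fingerprints of already-reviewed files.
    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def fingerprint(path):
        """SHA-256 of the file contents (a stand-in for a robust perceptual hash)."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def scan(directory):
        """Return files whose fingerprint matches the known-content database."""
        return [p for p in Path(directory).rglob("*")
                if p.is_file() and fingerprint(p) in KNOWN_HASHES]

    if __name__ == "__main__":
        for hit in scan("./uploads"):                   # hypothetical directory
            print("match:", hit)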
      duke0102 -
      Originally Posted by brilman:
      Great, now for some reason I want to see midget porn lol
      I warn you, it can't be unseen....
      megabyteme -
      Originally Posted by duke0102:
      Are people who download midget porn going to get court letters soon?
      bobbintb -
      Originally Posted by TheFoX:
      Considering that images are nothing more than a sequence of pixels, how could an algorithm discriminate between a family photo of mum, dad, brother and little sister on a beach somewhere, and little sister being used for child pornography purposes?

      In fact, how could an algorithm determine the difference between a home movie file you have uploaded, and a blockbuster?

      It cannot. If it could, we would already have robots walking the streets with the ability to reason.

      This is nothing more than Public Relations designed to scare people into thinking they have something special that they can use to detect this material. The only way to detect this material is for someone, a human, to scan it and make a reasoned judgement.

      Google made the same statement about their Streetview software, stating that it automatically blanks out faces and registration plates of vehicles it photographs. It doesn't. It is done by a human. If you are observant, you will notice lots of registration plates that are legible, because someone in their haste missed them. Also, many of the faces on Streetview aren't blanked out because, again, a human missed them.

      Some things cannot be left to a computer algorithm, because no algorithm can match the mind of a reasoned human, but then again, only a human could miss the obvious.

      So, to summarise, this is nothing more than scare tactics to make the gullible think that they have some tool to detect transgressions on the internet. They don't. They are using our fear of getting caught against us, to frighten us into submission. If they genuinely had such a tool, they wouldn't need to announce it. They would simply trap those abusing the rules and prosecute them, extorting millions in the process.
      i've used forensic software that scans images for suspected porn, so why not child porn? it's already well established that facial recognition software exists. it has just been adapted to recognize much more of the body, its age, and state of undress. hell, an xbox can do most of that.

      and google does have software that automatically blurs faces and license plates. it is not done by a human unless the software misses it and they catch it later. the occasional unblurred face or license plate isn't unblurred because a person missed it, it's because the software missed it.

      as for a home movie vs a blockbuster, that's also possible. cinematic movies tend to have better lighting, audio, jump cuts, steadier and better camera work, credits, etc. home movies, not usually. that, and they could also scan the file for predetermined markers such as the title screen. it's very possible (although difficult, at least for me) to scan a video file for those kinds of things.

      if you think about it, many, if not most, of the things we do with computers on a regular basis were once thought to be impossible for a computer to perform.
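    A footnote on bobbintb's point about scanning files for predetermined markers: matching content that survives resizing or re-encoding is usually done with perceptual hashes rather than exact checksums. The small Python sketch below shows a difference hash (dHash) as one illustration; it assumes Pillow is installed, and the file names are hypothetical.

    from PIL import Image

    def dhash(path, size=8):
        """Return a (size*size)-bit difference hash built from luminance gradients."""
        img = Image.open(path).convert("L").resize((size + 1, size))
        px = list(img.getdata())                        # row-major, width = size + 1
        bits = 0
        for row in range(size):
            for col in range(size):
                left = px[row * (size + 1) + col]
                right = px[row * (size + 1) + col + 1]
                bits = (bits << 1) | (1 if left > right else 0)
        return bits

    def hamming(a, b):
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    if __name__ == "__main__":
        # Two copies of the same frame saved at different sizes/qualities
        # should land within a few bits of each other.
        h1 = dhash("known_marker_frame.png")            # hypothetical reference frame
        h2 = dhash("candidate_frame.jpg")               # hypothetical scanned frame
        dist = hamming(h1, h2)
        print("hamming distance:", dist, "match:", dist <= 10)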