So what happened to the rest of the internet while they tried this piece of crap?
Fucking wankers.
Political correctness is based on the principle that it's possible to pick up a turd by the clean end.
Nothing. They are just using the existing cables more efficiently and have better TCP.
I hadn't read it properly, I thought they had effectively commandeered a section of the internet, causing congestion elsewhere.
However, I think the measurement is nonsensical. The distance involved is largely irrelevant: the routers are the same ones as before, so the transmission speed between routers is unchanged. That only covers the links between routers, though; there is also a delay between a packet being received at a router and being transmitted on to the next one.
The significance of the number of routers depends on their topology. The more routes you can use in parallel, the higher your available bandwidth; and the greater the distance between routers, the lower the end-to-end retransmission lag for a given distance (or, equivalently, the greater the distance covered per router).
This means that by picking their routes carefully, they may have achieved much longer distances. The fact that they only achieved 4.23 Gbps compared to Caltech/CERN's 6.25 Gbps shows that their actual throughput was lower, but an increased number of routers in the end-to-end circuit could also account for this, or at least some of it.
While I am aware that this is a land speed record, it is important to demonstrate the absurdity of the measurement. To do this we could bounce a signal off the moon using no routers at all, and therefore no retransmission lag. Since the round trip would be about 800,000 km, the data rate would only need to be about 100 Mbps to smash the current record significantly. However, the recipients of the large data files (the ultimate purpose of the Internet2 consortium) would be pretty upset about the time taken for them to receive their data.
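To make that arithmetic concrete, here is a minimal sketch of the throughput-times-distance figure the record is judged in. The 16,000 km terrestrial route length is an assumed round number for illustration, not the officially measured record distance.

```python
# Back-of-the-envelope comparison of "land speed record" figures,
# measured as throughput multiplied by distance (bit-metres per second).
# NOTE: the 16,000 km terrestrial route length is an assumption for
# illustration, not the actual measured record distance.

def record_metric(throughput_bps, distance_m):
    """Throughput-distance product: the unit the record is judged in."""
    return throughput_bps * distance_m

# Hypothetical terrestrial run: 4.23 Gbps over an assumed 16,000 km route.
terrestrial = record_metric(4.23e9, 16_000e3)

# Moon bounce: a mere 100 Mbps over the ~800,000 km round trip.
moon_bounce = record_metric(100e6, 800_000e3)

print(f"terrestrial: {terrestrial:.2e} bit*m/s")   # ~6.77e+16
print(f"moon bounce: {moon_bounce:.2e} bit*m/s")   # ~8.00e+16
```

Even at a fraction of the terrestrial throughput, the sheer distance of the moon bounce wins on this metric, which is exactly the absurdity being pointed out.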
It is all very well to say that "carefully designed, all-purpose networks" are the way to go (and without a doubt this is true), but the whole point is that they deliberately chose their start and end points and the route between to maximise the result. In real life, data travels from source to destination wherever these may be, over whatever routes are available between them. These variables can't be picked to maximise an arbitrary measurement.
I am not suggesting that the Sprint/SUNET group did anything significantly different from the Caltech/CERN group, but that the actual measurement used by both groups is not completely meaningful.
Originally posted by Alex H@5 July 2004 - 23:33
Cool - my only question is: when can I get it at home?

by that time media player will be about 500GB
I only licked you for the salt