Instant Gratification   

Tue, 21 Dec 2004

The history (and future?) of P2P

So, I was reading blogs today and ran across Bram Cohen’s latest posts. Bram’s the smart guy who wrote BitTorrent. The less interesting of his two new posts is a response to the recent SwarmCast announcements.

…the sweet spot for streaming is fairly small. Lots of formats can’t be displayed in real time at current speeds of net connections…


A lot of people think because of the history that being stuck on a TV schedule is what people want, but in fact people vastly prefer a tivo/netflix interface to a real-time streaming interface, and frequently completely dump the real time when the other one becomes available. Play on demand is when there’s a specific file which someone might want to watch and it starts playing immediately when they click on it. That’s the vast bulk of what users want. Tapping into whatever’s going on in a live stream is a niche market.


What frustrates me is how goofy the Konspire people are. (Please avert your eyes if you aren’t interested in a brief interlude on streaming / mass distribution).

In the beginning there was FTP (File Transfer Protocol). Then came HTTP, which is/was somewhat less efficient at transferring files, but it’s one less client / server to keep running and maintaining, so HTTP it is. Both use what might be called a ~star~ topology, where the file sits at the center on a central server, and (let’s pretend) 100 people are all connected to that server trying to download it. Space, time, and bandwidth are all finite, so simultaneous requests can overload that central server. (Visualize a star / nexus).
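Here’s a back-of-the-envelope sketch of why the star topology buckles. The numbers (a 10 Mbit/s server pipe, a 700 MB file) are made up for illustration; the point is just that the server’s upload bandwidth gets divided among everyone downloading.

```python
# Toy arithmetic: in a star topology, a single server's upload pipe is
# split among all simultaneous downloaders.
# The numbers below (10 Mbit/s server, 700 MB file) are illustrative only.

def star_download_time(file_mb, server_mbit_per_s, clients):
    """Seconds for each client to finish when the server's upload
    bandwidth is shared evenly among all clients."""
    per_client_mbit = server_mbit_per_s / clients
    file_mbit = file_mb * 8
    return file_mbit / per_client_mbit

one = star_download_time(700, 10, 1)        # one downloader
hundred = star_download_time(700, 10, 100)  # a hundred downloaders

print(f"1 client:    {one:.0f} s")
print(f"100 clients: {hundred:.0f} s")  # 100x slower: the star's bottleneck
```

Double the downloaders, double everyone’s wait; the server is the only source of upload bandwidth.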

Now enter P2P. First came Napster, where searches were centralized but transfers were peer-to-peer. This alleviates (somewhat) the burden of bandwidth because (apart from searches) no file ever sits on the central server (if I recall, this was a portion of Napster’s legal defense). You were still downloading the file from one place, and that place could get overloaded, but with Napster the odds were good that there was more than one source for the file. (Visualize parachutists holding hands in a circle).

The next generation of P2P is the BitTorrent style (“swarming” downloads). In the simplest case, the file is made available (seeded) by the server; let’s pretend that one client downloads the file and finishes. With BitTorrent, when the next client comes in to download, it will download a portion of the file from the original central source and request the rest from the first client that finished (coordinated by the central tracker). The neat thing is that BitTorrent doesn’t download the file sequentially, but in random chunks, so that each client can be both downloading and uploading (sharing) different parts of the file simultaneously. What this does is open up the upload bandwidth of all the clients who are currently downloading the file. Listen to Bram, do some googling; he talks about “tit for tat”, “encouraging sharing”, and all the other neat tricks that make this happen. (Visualize a woven basket / tumbleweed. Lots of interconnecting points).

What Konspire does is say: all these goofy protocol tricks are a waste of everyone’s time. Instead, Konspire uses the 2^n power of exponential distribution and just transmits honking huge files all at once, hoping the clients will re-transmit (re-broadcast) them to other people waiting in line. (Visualize a tree: the leaves come from the branches, which come from other branches, which come from a trunk).

…this concludes the P2P protocol gobbledygook…

Now, to get to the real point. Konspire is basically saying: “We want to be the distribution platform for ‘The Apprentice’ over the internet.” Which is cool. You subscribe to a “file-channel” (like a blog), you wake up on random mornings, and you have hot, fresh files waiting for you. Seems like a pretty good way (protocol) to do it. The only problem is that if your “Konspire Receiver” isn’t online at the right times, you never “Katch” the broadcast. Which isn’t necessarily that bad; consider TiVo / PVR. If your satellite receiver / TiVo isn’t on, the broadcast of ‘The Apprentice’ isn’t recorded either.

What frustrates me is that Konspire seems to be so affronted by the fact that people are using BitTorrent that they frame everything in the context of “how much better we are than BitTorrent”, when what they should be doing is recognizing how the two can complement each other. Ever since I got my laptop, I don’t have an “always on” PC anymore. And since I don’t see any decently advertised sites / channels pushing Konspire content, and I’m not into pirating whatever large files somebody decides to push down my pipe, I haven’t even tried downloading anything via Konspire (I played with their client, but nothing beyond that).

What the Konspire people need to recognize is that they have a decent system, and they shouldn’t (ever!) compare it in a challenging way to BitTorrent. Konspire (mathematically) serves its purpose very well, but Konspire and BT serve very different purposes.

Instead, the Konspire people need to figure out how to integrate BitTorrent into the Konspire protocol, along with at least a basic kind of specialized blogging software focused on distributing large(r) files. I’d recommend it to Music Reviewers and/or Game-Mod people (maps, skin-packs, etc).

The reviewer posts a file / TV show, and all Konspire subscribers receive it via Konspire at their leisure. If you miss the broadcast, you ding the file 1-2 weeks later and get it served up as a .torrent. (Perhaps a Konspire “NextGen(tm)” client could do this automatically? Transparently .torrent received files on all clients?) Different interaction needs → different interaction protocol.

Also, Konspire (in my opinion) needs to do more to forcibly encourage sharing / re-broadcasting of content (I haven’t seen this referenced at all in their design documents beyond the vague hand-waving that “it is likely that the receiver will stay on to re-transmit”). For example, any Konspire broadcaster could pair two receivers together: N+1 and N+2. N+1 receives the encrypted file “X”, and N+2 receives the decryption key for “X”. N+2 holds the key hostage until it receives the file (possibly re-encrypted?) from N+1.
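The pairing idea above, sketched in a few lines. Everything here is hypothetical (the receiver names, the dictionary-as-receiver shorthand), and XOR stands in for real encryption just to keep the sketch short:

```python
import os

# Sketch of the key-hostage pairing: the broadcaster gives receiver N+1
# the encrypted file and receiver N+2 the key, so N+2 only gets usable
# content after N+1 forwards the ciphertext. XOR is a stand-in for real
# encryption; all names here are illustrative, not Konspire's protocol.

def xor_bytes(data, key):
    """Toy symmetric cipher: XOR data against a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"episode bytes..."
key = os.urandom(16)
ciphertext = xor_bytes(plaintext, key)

# Broadcaster's pairing step: split data and key across two receivers.
receiver_n1 = {"ciphertext": ciphertext}  # has the data, not the key
receiver_n2 = {"key": key}                # has the key, not the data

# Neither side can read the file until they trade:
receiver_n2["ciphertext"] = receiver_n1["ciphertext"]  # N+1 re-transmits
receiver_n1["key"] = receiver_n2["key"]                # N+2 releases the key

assert xor_bytes(receiver_n2["ciphertext"], receiver_n2["key"]) == plaintext
```

The incentive is structural: N+1 must upload to ever see the key, which is exactly the kind of forced sharing their design documents hand-wave past.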

Of course this has issues / difficulties with clients entering / leaving in the middle of a broadcast, and with malicious or uncooperative clients, but fundamentally, this is why BitTorrent (at least from its protocol descriptions) appears to have the edge. The instant that two people on BitTorrent have received the file, you have redundancy, no matter if one of them drops off in the middle of a transfer or becomes malicious / uncooperative. And BitTorrent’s protocol behaviour is to prefer sending (uploading) files to peers it has downloaded from. Tit-for-tat. Simple. Konspire’s protocol pages describe how they handle malicious clients via signatures on broadcast files, but not how they handle uncooperative clients in the chain.
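That tit-for-tat preference is simple enough to sketch: each round, hand your upload slots to the peers that have uploaded the most to you lately. Peer names and rates below are made up; the real client also rotates in an “optimistic unchoke” slot so newcomers get a chance.

```python
# Minimal tit-for-tat sketch: upload slots go to whichever peers have
# recently uploaded the most to us. Illustrative only -- not the real
# BitTorrent choking algorithm, which adds optimistic unchoking etc.

def pick_unchoked(download_rates, slots=2):
    """Return the peer ids we will upload to this round, best first."""
    ranked = sorted(download_rates, key=download_rates.get, reverse=True)
    return ranked[:slots]

rates = {"peer1": 50, "peer2": 5, "peer3": 120, "peer4": 0}
print(pick_unchoked(rates))  # the two fastest uploaders to us win slots
```

A freeloader like peer4 simply never gets a slot, which is the whole enforcement mechanism; Konspire’s docs describe nothing comparable for the uncooperative link in the chain.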

To sum up: neither BitTorrent nor Konspire is perfect. Konspire is what will power some brave soul’s re-distribution of HBO via MPG on a 4hr tape-delay around the world. Think satellites! Why launch a satellite when you can launch a 128kb upstream ADSL line that can service 1 million people in 19.9 timesteps, and 1 billion people in 29.9 timesteps? If the last episode of Seinfeld is meant to be watched live, transmit it encrypted (a-la Steam?) and publish the decryption key at midnight on a Tuesday. If you agree with Bram, scale the distribution of the bulk of the file via P2P, and narrow the real-time requirement down to the 1kb-per-person key request.
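The doubling arithmetic checks out: if every receiver re-broadcasts to one new receiver per timestep, the audience doubles each step, so reaching N people takes log2(N) timesteps. (Purely illustrative; it ignores churn, asymmetric upload speeds, and uncooperative receivers.)

```python
import math

# If the audience doubles each timestep (every receiver re-broadcasts),
# reaching N people takes log2(N) timesteps.
def timesteps_to_reach(n):
    return math.log2(n)

print(f"1 million: {timesteps_to_reach(10**6):.1f} timesteps")  # ~19.9
print(f"1 billion: {timesteps_to_reach(10**9):.1f} timesteps")  # ~29.9
```

Ten more doublings buys you a thousand times the audience, which is the whole appeal of the tree model.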

While this works fantastically for “prepared media / content” (TV Shows, Movies, Music, Video Games, Books, etc), it doesn’t address how to do massively live streaming to 1M+ simultaneous clients, which is where SwarmCast comes into the picture (Nightly News, Speeches, Debates, Sports?). Maybe it isn’t possible to service 1M clients on a live stream? Maybe we need multicast? But the sheer volume of content that is effectively serviced by massively parallel distribution of prepared media means that these P2P broadcasters will only become more utilized as file sizes grow larger on the ‘net.

Bram’s entry #2 on great programmers will have to wait until tomorrow.

01:27 CST | category / entries

