JamieG Analysis

JamieG looks deep into the ramifications of current trends in Technology and Media


Evolution of file copy. “cp”, “ftp”, “http”, “p2p”

January 11th, 2008 · 2 Comments

In a previous post, a commenter pointed to a company called “ReelTime”. This is a movie rental service like “Jaman”. Both use a downloaded client that implements a DRM system and a P2P file transfer system to download the files. The ReelTime P2P service is supplied by “GridNetworks”.

I also recently posted a story on proxies for P2P networks, found here.

This got me thinking of the evolution of the “copy” command. In unix/linux the command is “cp”.

The copy command is used to copy data from one place to another. For example, if I wrote a story, I could copy it onto a floppy disk so I could take it with me or give it to a friend. The birth of social networking as we know it today.
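In its simplest form, the disk-to-disk copy looks like this on any unix/linux box (the file names here are just illustrative):

```shell
# Write a "story" to a file, then copy it elsewhere on the same machine.
echo "my story" > story.txt

# cp SOURCE DEST -- a straight disk-to-disk copy.
cp story.txt story-for-a-friend.txt

# Both files now hold identical bytes.
cmp story.txt story-for-a-friend.txt && echo "identical"
```

Every later technology in this post is, at heart, this same operation with the source and destination moved further apart.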

The network was then invented.

The copy command evolved into “ftp” (File Transfer Protocol). This is basically the same as “cp”; however, it allows you to copy a file not just from disk to disk, but from computer to computer over the network. It was mainly used by scientists to share data when it was first implemented, which explains why it is so hard to use. 😉

Then the WEB was invented.

The web, initially, was simply being able to type in memorable URLs such as www.crafted.com.au to get to a repository of information. You would use a web browser to access this repository. It would request a “COPY” of the data be sent to you so it could be displayed in the browser window. The FTP protocol was initially used for this but was found to be very inefficient, so the HTTP protocol was established. It was designed specifically for web pages, delivering them faster and with less overhead.
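Under the hood, that “copy request” is just a few lines of plain text. A minimal sketch of the raw request a browser sends (the host name is only an example):

```shell
# The entire HTTP/1.0 request for the front page of a site.
# Sent over a TCP connection to port 80, this asks the server
# to send back a COPY of the page.
printf 'GET / HTTP/1.0\r\nHost: www.example.com\r\n\r\n'
```

Pipe that text into a tool like nc (netcat) aimed at a real web server and the page comes straight back; a browser’s “copy” amounts to little more than this.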

So what now?

Well, it appears many internet users are attempting to copy/download very large files. From this, a new form of “COPY” has evolved: “P2P” networking.
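The core idea of the P2P form of copy can be sketched with ordinary shell tools: split a file into chunks, fetch different chunks from different “peers”, and reassemble them in order. This is only a toy illustration; the “peers” here are plain local directories, not machines on a network.

```shell
# Make a "large" file and split it into fixed-size chunks,
# just as a P2P client would.
seq 1 1000 > movie.dat
split -b 1024 movie.dat chunk.

# Pretend two peers each hold a full copy of every chunk.
mkdir -p peer1 peer2
cp chunk.* peer1/
cp chunk.* peer2/
rm chunk.*

# "Download": take alternating chunks from each peer,
# then reassemble them in lexical (i.e. original) order.
i=0
for name in $(ls peer1); do
  if [ $((i % 2)) -eq 0 ]; then cp "peer1/$name" .; else cp "peer2/$name" .; fi
  i=$((i + 1))
done
cat chunk.* > movie-downloaded.dat

# The reassembled copy matches the original.
cmp movie.dat movie-downloaded.dat && echo "copy complete"
```

A real P2P client does the same thing, except the chunks arrive over the network from many peers at once, which is exactly why it suits very large files.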

As the previous evolution suggests, P2P-type technologies will eventually become part of the web browser, if not the operating system. For this to happen, an open P2P standard will evolve: a free-to-use-and-implement P2P specification, similar to FTP (see the RFC 959 standards document for specifications) followed by HTTP (see RFC 2616).

Personally, I am surprised that this has not already happened. There is talk of implementing it into Firefox in the future.

And obviously, once it is an RFC standard, implementation of a P2P proxy server by all ISPs would be a logical next step, as it was for HTTP.

But really, at the end of the day, any company that builds its business model on charging for access to its P2P infrastructure is not a very good long-term bet. Obviously, Jaman and ReelTime may be using these proprietary P2P networks due to the lack of any standard. But at the same time, their business model is very much tied to it.

And finally, if you like conspiracies, one would be inclined to start asking WHY a standard P2P protocol has not evolved. Many companies with a lot of cash are very keen to make sure it never does. Keep this in mind.

Additional: See an update to this story here.

Tags: IPTV · P2P · Standards

2 responses so far ↓

  • 1 Davis Freeberg // Jan 11, 2008 at 1:49 am

    I’ve used both Jaman and Reeltime and I think that these programs still have a few limitations. I don’t mind them going P2P to help lower their distribution costs, but as a customer it’s pretty annoying to have to give up bandwidth to help them save money. It wouldn’t be such a big deal except with the weak transfer speeds in the US, it’s pretty painful to use the net while you are uploading content for them. Perhaps even worse, I’m unable to cleanly stream content from my PC to my Xbox or to my DivX Connected player while their p2p program is running. I have the same problems using bit torrent, but at least with bit torrent, you’re not also having to pay for the content.

    I think it’s interesting that the Pirate Bay is trying to define a new p2p protocol and I’m hoping that they’ll figure out a way to better meter the bandwidth. If I could tell the program to only use so much to receive or deliver content, then I don’t think I’d have these same problems. I’m not sure when TPB will be releasing more details about their P2P protocol, but I think that it has the best chance of becoming the new standard.

  • 2 Matt Waring // Jan 12, 2008 at 4:06 pm

    I have used Reeltime’s service for a while, and their P2P player only shares pieces of the title you are watching while you are watching it. It never has interfered with my bandwidth that I noticed, since the picture of their movies rarely, if ever, buffers. In any event, if it ever changes its settings and begins to have an effect on my web surfing, it is easy to shut off, and all parameters are controllable, and uninstall is easy and complete, as far as I can see.

    It seems that Grid Networks is looking to create a P2P service that does not share the pitfalls of the older technologies, like Bit Torrent. Time will tell, but the early results are very encouraging. I will be watching intently. If people use their eyes instead of their prejudices, they have a chance to do something valuable to all of us.
