Wired's online site is taking a look at some unsolved mysteries of the net. Their first article is this missive on how much of the net is consumed by file sharing. It seems that no one really knows what's going on. A good summary of the problem, partway down the article, is:
NBC filed something with the FCC using the Cache Logic study, done a year after the Pew Internet study, saying that file sharing was dropping and our study showing file sharing was increasing. And the Cache Logic study just came out with a number -- no trends, just that file sharing was 30 to 50 percent of traffic, and NBC uses that number -- way old, no peer review, no methodology -- to say 'You guys, the FCC, have to start policing the network and getting this file sharing off the network.'
All of the data out there is suspect.
I don't know exactly how it would be done, but now that I think of it, how could you really tell? If P2P used HTTP, then in theory you would have to open each packet and have a look; a packet sniffer does this, but I doubt it's done at that scale. Even then, if P2P always ran on port 2990, they could capture the data traveling through that port, but is that really an accurate picture? Methinks not.
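To make the problem concrete, here's a rough sketch of the port-based tally I'm describing. All the port numbers and flow records below are made up for illustration; the point is just that a classifier keyed on port numbers can't tell P2P tunneled over port 80 apart from ordinary web traffic:

```python
# Port-based traffic classification sketch (hypothetical data).
# Shows why counting bytes by port undercounts P2P that tunnels
# over HTTP: the classifier only ever sees the port numbers.

# Well-known P2P ports (illustrative, not exhaustive):
# BitTorrent 6881-6889, Gnutella 6346, eDonkey 4662.
P2P_PORTS = set(range(6881, 6890)) | {6346, 4662}

def classify(src_port, dst_port):
    """Label a flow 'p2p' if either endpoint uses a known P2P port."""
    if src_port in P2P_PORTS or dst_port in P2P_PORTS:
        return "p2p"
    return "other"

# Hypothetical flow records: (src_port, dst_port, bytes transferred)
flows = [
    (51234, 6881, 900_000),  # BitTorrent on its default port
    (52000, 80,   850_000),  # P2P tunneled over HTTP -- misclassified
    (53000, 80,   100_000),  # ordinary web browsing
]

totals = {"p2p": 0, "other": 0}
for src, dst, nbytes in flows:
    totals[classify(src, dst)] += nbytes

# The port-based tally says P2P is under half the bytes, but the
# "true" share here (counting the tunneled flow) is ~95 percent.
print(totals)
```

With these made-up numbers, the port-based count reports about 49 percent P2P while the actual figure is about 95 percent, which is exactly why none of the published traffic studies agree with each other.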