The Internet as a utility

A recent Slashdot question to readers on How Much Does Your Work Depend on the Internet? includes a number of posts remarking on how important redundancy is to running an effective network.

Also, Ars is noting that the fight over encrypted BitTorrent is coming down to the creation of systems that systematically examine each packet a user sends or receives.

This has brought a few questions to mind:

First, how technically effective could a network be that is neither redundant nor packet neutral? The original ARPANET was designed as a decentralized, redundant network to withstand attack, and it was later found that this design also had the benefit of greater reliability, since packets could be routed along the path of least resistance. Routers know where to send packets, but if they must additionally take on the task of looking ‘inside the envelope,’ will they still be as effective (read: fast and reliable)?
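To make the ‘inside the envelope’ worry concrete, here is a minimal sketch (all names and the toy banned-pattern rule are my own invention, not any real router’s logic): ordinary forwarding reads only the destination address, while payload inspection has to scan every byte of every packet, so its cost grows with traffic volume, not just packet count.

```python
# Hypothetical sketch: header-only forwarding vs. payload inspection.
# The dict "packet" format and the pattern rule are illustrative only.

def route_by_header(packet: dict) -> str:
    # Ordinary forwarding reads only the 'envelope': the destination.
    return packet["dst"]

def route_with_inspection(packet: dict, banned_patterns: list) -> str:
    # Deep inspection must scan the payload of every packet, so the
    # work done grows with bytes carried, not just packets forwarded.
    payload = packet["payload"]
    for pattern in banned_patterns:
        if pattern in payload:
            return "drop"
    return packet["dst"]

pkt = {"dst": "10.0.0.7", "payload": b"ordinary web traffic"}
print(route_by_header(pkt))                      # -> 10.0.0.7
print(route_with_inspection(pkt, [b"torrent"]))  # -> 10.0.0.7
```

The speed question above is visible even in this toy: the first function does constant work per packet, the second does work proportional to payload size times the number of patterns.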

Second, I am having a difficult time understanding the basis for throttling the bandwidth of BitTorrent users (or users of any other resource-intensive application). My understanding of purchasing a network service is that one may use the service in any legal way one sees fit, up to the capacity purchased (be it minutes or kb/s). BitTorrent users may use a greater amount of this capacity over time…but didn’t they already pay for it?
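One common mechanism behind this kind of rate limiting is a token bucket; the sketch below is a generic illustration (the rates are made up, and I’m not claiming any particular ISP works this way). A subscriber’s paid-for rate refills the bucket; sustained heavy transfers drain it and get slowed, which is exactly the tension in the paragraph above.

```python
# Hypothetical token-bucket sketch of bandwidth throttling.
# rate/burst values are illustrative, not any real ISP's policy.

class TokenBucket:
    def __init__(self, rate: float, burst: float):
        self.rate = rate      # tokens (bytes) refilled per second
        self.burst = burst    # maximum bucket size (allowed burst)
        self.tokens = burst   # start with a full bucket
        self.last = 0.0       # timestamp of the last refill

    def allow(self, size: float, now: float) -> bool:
        # Refill according to elapsed time, then spend if possible.
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= size:
            self.tokens -= size
            return True
        return False

bucket = TokenBucket(rate=1000.0, burst=2000.0)
print(bucket.allow(1500, now=0.0))  # True: within the initial burst
print(bucket.allow(1500, now=0.1))  # False: only ~600 tokens remain
```

Note the policy question this leaves open: a bucket like this enforces a rate, not an application. Throttling BitTorrent specifically requires identifying the traffic first, which loops back to the packet-inspection question above.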

Perhaps these are two other aspects of the definition of network neutrality: packet neutrality and service/content neutrality (or the freedom to use a service, in whatever capacity desired, as long as it’s within the bounds of time/’bandwidth’ limitations).

What is Web 2.0?

Tim Berners-Lee on Web 2.0: “nobody even knows what it means”

I haven’t listened to the interview yet, but it looks as though the inventor of the Web isn’t too hot on the idea of Web 2.0 as something new. Specifically, he argues that the original web acted as a collaborative space.

For geeks like me who were learning HTML in 1995, that may have been partially true, but there are a number of emergent aspects that, I would argue, make the ‘connecting of people’ more of a reality:

  • Greater interactivity through scripting and databases has brought the web beyond just static pages,
  • Greater attention to design has made using web pages, and even publishing, much easier (think Blogger), and finally
  • Greater connections are being made between content points. Digg, delicious, trackbacks and other collaborative/responsive linking and moderation is tying the web much closer together than simple static links.
  • (edit: add to this extensible web applications and open web services/APIs which allow for greater customization of experience)

Sure, this might not be “2.0,” but the name does indicate what has been happening on the web: something new which has even more greatly democratized mass communication.

Group response to the RIAA video

RIAA copyright education contradictory, critics say | CNET News.com

It looks as though a number of groups are going to “issue a joint statement condemning some statements on the Recording Industry Association of America’s video.” Pointing out the discrepancies and generous interpretations of the law must be done, but I hope that the statement also notes the role that Educause played in the creation of the video.

“First, we were told we should not enforce our rights,” said an RIAA representative responding to critics of the video. “Now we are told education is wrong, too. We won’t accept such a do-nothing approach. We’ll continue to work with respected higher-education groups to engage students to think critically about these issues.”

This RIAA spokesperson has received an important message: education about copyright is not the answer. For the public to truly accept and adhere to copyright law, we should attempt to open a dialogue between owners and users. If the public is allowed to have a stake in the law they are expected to adhere to, there may be greater compliance than with a law simply imposed on them.
For more on the video and the role of Educause, see the last few posts under “Copyright.”

Times Withholds Web Article in Britain – New York Times


Here’s an interesting note about how the Times blocked an article from IP addresses within Britain because it contained information that might prejudice an English jury. Kudos to the Times for coming right out and admitting the block.

The problem with this action is the type of precedent it might set. Would I be liable for the same offense if I posted the same information on my blog, which is available to all of England? What about a more prominent blogger? It’s interesting to think in this case about what separates the Times as an institutional speaker from less notable (but, in theory, just as accessible to the public) publishers.
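For what it’s worth, the blocking itself is technically trivial, which is part of why the precedent matters. A sketch of the likely approach (the UK address range below is made up for the example; real geolocation uses large commercial databases of ranges):

```python
# Illustrative sketch of IP-range geoblocking.
# The network range here is invented; real UK ranges come from
# geolocation databases, not a hard-coded list.
import ipaddress

UK_RANGES = [ipaddress.ip_network("81.0.0.0/8")]  # hypothetical

def blocked(client_ip: str) -> bool:
    # Serve or withhold the article based on the requester's address.
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in UK_RANGES)

print(blocked("81.2.3.4"))  # True:  treated as a British reader
print(blocked("64.9.9.9"))  # False: article served normally
```

The obvious weakness also bears on the liability question: anyone in Britain using a proxy outside these ranges sees the article anyway, so the block is a gesture of good faith toward the English court more than an airtight seal.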