Monday, May 24, 2010

New Developments in the Net Neutrality Debate

In a recent issue of the Feliciter, two articles discuss issues and problems in the "net neutrality" debate in Canada and the United States. In the first article, Bruce Harpham gives a brief summary of how net neutrality is debated in each country: he identifies some of the main arguments made in the American context and contrasts them with how the Canadian debate is taking shape. This issue is particularly interesting to me because, in my opinion, it may shape whether a "service" like the internet comes to be treated as a utility.

In the second article, Devon Greyson provides additional detail about how the net neutrality debate will affect library professionals.

From these two articles, I gather that there are two main issues at stake in the general discourse about "net neutrality". One is the notion that data packets should travel with equal efficiency across the servers and routers that make up the infrastructure of the internet. The other is that anyone should be able to access any content they want on the internet, regardless of the source. In the second article, Devon Greyson seems to conflate the two: she implies that the ability to access information and the restriction-free transfer of information are the same thing. I disagree.

The reason I think these are two issues rather than one comes down to how the actors are involved. When it comes to the transfer of information (in the form of data packets) across the network, we are talking about a passive characteristic: there should not be an active regulatory mechanism that gives certain data "priority" over other data. In her article, Ms. Greyson outlines a model of corporate interference in the free flow of information, in which advertising and paid content are treated as more valuable than free and open-source content. Though the argument is a little sensationalist, I believe it is a valid fear of what corporate interest in the flow of information could lead to. Ms. Greyson argues that internet service providers (ISPs) should be held to a "non-interference" standard. I agree with this argument insofar as it relates to the transfer of information across the network.
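
To make concrete what "priority" could mean at the packet level, here is a minimal sketch of my own (neither article describes an actual implementation) contrasting a neutral first-in, first-out queue with a scheduler that lets paid traffic jump ahead. The packet fields and priority values are invented for illustration:

import heapq
from collections import deque

# Hypothetical packets: priority 0 = paid/sponsored traffic,
# priority 1 = everything else (lower number forwarded first).
packets = [
    {"src": "open-source mirror", "priority": 1},
    {"src": "paid advertiser", "priority": 0},
    {"src": "personal blog", "priority": 1},
]

def forward_neutral(packets):
    # A neutral router forwards in arrival order; the source is ignored.
    queue = deque(packets)
    return [queue.popleft() for _ in range(len(queue))]

def forward_prioritized(packets):
    # A non-neutral router ranks by the commercial "priority" field,
    # so paid traffic is transferred ahead of equally old packets.
    heap = [(p["priority"], i, p) for i, p in enumerate(packets)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

print([p["src"] for p in forward_neutral(packets)])
print([p["src"] for p in forward_prioritized(packets)])

The "non-interference" standard amounts to insisting on the first behaviour and forbidding the second.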

On the second issue, the ability of any user to access any content equally, there is a similar implication that no regulatory mechanism should assign priority to some data and not to others. However, I believe that in this regard we are missing an important aspect: users actively seek and access information. That is, users make two decisions: the type (category, keywords, database name) of information being accessed, and the size, meaning the extent, of the data. When looking up information about using Adobe Dreamweaver, for instance, one could access individual content pages (which are relatively small) or try to download the entire guidebook (which is potentially quite large). As with the first issue, Ms. Greyson argues that users should be able to access any content of any size found on the internet. However, in both articles, the internet providers in Canada and the United States contend that they should be able to "throttle" certain users who, by using peer-to-peer sharing networks, "consume" more bandwidth and server capacity than casual internet browsers. In other words, ISPs are throttling users who access large amounts of content that has been deemed "low priority" (or pirated).
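
Neither article explains how throttling works technically; in practice it is usually a form of rate limiting. Here is a minimal token-bucket sketch of my own, with invented numbers, showing why a sustained peer-to-peer transfer gets slowed while a casual browser is barely affected:

class TokenBucket:
    # Minimal token-bucket rate limiter: 'rate' bytes of allowance
    # are added per second, up to 'capacity'; a transfer can only
    # use the tokens currently available.
    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity

    def tick(self, seconds=1):
        # Refill the allowance once per simulated second.
        self.tokens = min(self.capacity, self.tokens + self.rate * seconds)

    def send(self, requested_bytes):
        # Return how many bytes actually get through this second.
        sent = min(requested_bytes, self.tokens)
        self.tokens -= sent
        return sent

# Invented numbers: roughly 1 MB/s sustained rate, 5 MB burst allowance.
bucket = TokenBucket(rate=1000000, capacity=5000000)
for second in range(4):
    bucket.tick()
    # A peer-to-peer client asking for 10 MB every second is quickly capped.
    print("second", second, "->", bucket.send(10000000), "bytes sent")

Once the initial allowance is spent, the heavy user is held to the refill rate, which is exactly the asymmetry between "casual" and "consuming" users that the ISPs are defending.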

In my opinion, Ms. Greyson's model requires internet providers to let users access whatever they want, as much as they want, while burdening the providers with the cost of treating each user equally. This is troubling because it is not sustainable. If anyone, for personal or business/industrial purposes, could consume as much electricity as they wanted whenever they wanted, the electric grid would be put under undue strain. Even with generous caps on the cost of utilities, users are constrained by the cost of their consumption and by the available supply of power.

A recent article in The Economist outlines a model that provides a middle ground. In the Economics Focus section, the implications of a recent FCC ruling are described: apparently, the FCC has ruled that internet providers in the US are subject to the common carriage rules that govern telephony, railroads, and hoteliers. I will not outline what "common carriage" entails here, but I will explain why I think the ruling is important.

First, subjecting ISPs to common carriage restrictions denies them the ability to regulate the priority given to certain packets of data being transferred, and it denies them the ability to restrict the types of data accessed by users. So the first issue has been addressed, as well as part of the second.

In my opinion, this leaves room for a model of internet use that is sustainable, because the only "wiggle room" left for ISPs is to charge more for higher volumes of usage, that is, for a greater amount of data sent and received.
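
As a sketch of what such volume-based pricing could look like (the flat fee, allowance, and per-gigabyte price below are all invented for illustration), the charge grows only with the amount of data moved, never with what that data is or where it comes from:

def monthly_bill(gb_used, base_fee=30.0, included_gb=60, per_gb=0.50):
    # Hypothetical bill: a flat fee covers an included allowance,
    # and traffic beyond it is charged per gigabyte.
    overage = max(0, gb_used - included_gb)
    return base_fee + overage * per_gb

print(monthly_bill(20))   # casual browser: 30.0
print(monthly_bill(300))  # heavy peer-to-peer user: 150.0

The key property is that the meter is content-blind: a gigabyte of open-source software costs the same as a gigabyte of paid video, which is what keeps this kind of pricing compatible with common carriage.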

I'm not sure how the FCC ruling might impact the Canadian debate on net neutrality, but I think that it is an important step towards encouraging a more mature “internet” market.
