PFIR - People For Internet Responsibility
Breaking the Internet Network Neutrality Deadlock
Update: October 1, 2007 -- Please see:
September 27, 2007
Practical Issues of the Proposed "Global Internet Measurement Analysis Array"
The battle over Internet network neutrality appears to have reached an impasse, with both the pro-regulation camp and the anti-regulation forces increasingly entrenched in their largely diametrically opposed positions and associated rhetoric.
Essentially, the pro-regulation side argues that telephone companies, ISPs, and related entities cannot be trusted, based on their past actions, to behave in a pro-competitive manner that treats all users of Internet bandwidth -- both consumers and the independent providers of Internet-based services -- in a non-discriminatory way. The concern is that "after the fact" remedies may be too slow and unwieldy to appropriately redress perceived or demonstrated neutrality abuses.
The anti-neutrality-regulation contingent argues that any attempt to codify network neutrality in law would stifle innovation, and that there's plenty of time to deal with any unlikely abuses after they occur. They also assert that existing regulatory mechanisms and laws are adequate to ensure a balanced and fair competitive landscape.
These arguments on both sides of the issue have been exhaustively explored in many venues, and I will not revisit their details here.
Rather, I propose that now is the time to consider a different, more quantitative approach to network neutrality issues -- one that could benefit all stakeholders in the Internet ecosystem, across the spectrum of concerns and opinions.
The first step in this process would be to establish, to the greatest extent possible, a quantitative, rigorous understanding of operational bandwidth, throughput, and other parameters of public Internet traffic on the broadest practicable scale, and in a wide variety of operational contexts, ranging from home users to large corporate enterprises.
A key aspect of this effort would be to gather for comparison and study both user end-to-end metrics and measurements of traffic statistics between end users and central server facilities. Characterizing both of these topologies is important for detecting and analyzing possible distortions or other undesirable artifacts that could negatively affect Internet data flow, and with it application performance and user satisfaction.
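As a minimal sketch of the end-to-end half of this comparison, a volunteer endpoint might, for instance, time TCP connection setup to a cooperating peer or server and reduce the samples to comparable summary metrics. Everything here -- the probe, the `summarize` helper, and the use of TCP handshake time as a latency proxy -- is an illustrative assumption, not a specified part of the proposal:

```python
import socket
import statistics
import time

def measure_tcp_rtt(host, port, timeout=3.0):
    """Approximate round-trip latency by timing a TCP three-way handshake."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close immediately, no payload sent
    return time.monotonic() - start

def summarize(samples):
    """Reduce a list of RTT samples (in seconds) to comparable summary metrics."""
    return {
        "min": min(samples),
        "median": statistics.median(samples),
        "max": max(samples),
    }
```

A central collector could gather the same summaries from many endpoints, allowing end-to-end paths and user-to-server paths to be compared for the kinds of distortion described above.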
Accomplishing this sort of data analysis appropriately and realistically will necessitate the collection of Internet-related metrics of a form and scope that do not readily exist at this time.
An infrastructure for the collection of such data, and for the long-term monitoring of associated performance parameters in the real-world Internet, could be deployed on a widely distributed basis (much like the SETI@home project's distributed data analysis environment, for example), via small software programs running voluntarily in the background on the PCs and other systems of cooperating Internet users and organizations.
These systems would serve as the data collection endpoints of a vast global Internet traffic measurement and analysis environment, ideally comprising a very large number of individual participants -- likely tens of thousands, and potentially very many more. The greater the number of high-quality data sets available, the better and more widely applicable (on scales ranging from local to global) the overall analysis can potentially be.
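The volunteer software described above might take a shape like the following: a low-impact background agent that samples periodically, batches results, and reports them to a coordinating center. The `sample_fn` and `submit_fn` callables stand in for a real measurement probe and upload protocol, neither of which this document specifies:

```python
import json
import time
from collections import deque

class MeasurementAgent:
    """Background agent sketch: sample periodically, batch, and report."""

    def __init__(self, sample_fn, submit_fn, batch_size=10):
        self.sample_fn = sample_fn    # callable returning one measurement dict
        self.submit_fn = submit_fn    # callable uploading one JSON batch
        self.batch_size = batch_size
        self.pending = deque()

    def tick(self):
        """Take one sample; flush a full batch to the coordinating center."""
        sample = dict(self.sample_fn(), timestamp=time.time())
        self.pending.append(sample)
        if len(self.pending) >= self.batch_size:
            batch = [self.pending.popleft() for _ in range(self.batch_size)]
            self.submit_fn(json.dumps(batch))

def run(agent, interval_seconds=60):
    """Low-impact main loop: one sample per interval, indefinitely."""
    while True:
        agent.tick()
        time.sleep(interval_seconds)
```

Batching keeps the reporting overhead small relative to the measurements themselves, which matters when tens of thousands of endpoints report to a handful of processing centers.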
To coordinate and analyze this collected data, and to make it available in various user-friendly mapping, report, and other forms, I anticipate the participation of one or more large Internet service entities, which could act as the processing and distribution centers for the significant amounts of traffic-related data that the project would entail.
Given such a measurement infrastructure as a starting point -- likely with a relatively "formal" core group of coordinating participants plus the widely distributed volunteer force of data collection sites -- it would then be possible to derive a relatively rigorous algorithmic characterization of network traffic and associated patterns in various Internet contexts and service spaces (obviously not set in stone, but subject to change and adjustment over time).
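One deliberately simple example of such a characterization -- chosen only to illustrate the idea of a quantitative, adjustable baseline, not as a proposed algorithm -- would be a per-context percentile band of observed throughput, recomputed as new data arrives:

```python
import statistics

def throughput_baseline(samples_mbps, low_pct=10, high_pct=90):
    """Characterize one context's normal throughput as a percentile band.

    samples_mbps holds historical throughput samples for a single context
    (e.g., one ISP/region/time-of-day bucket).  The band is recomputed
    periodically, so the characterization adjusts over time rather than
    being fixed in advance.
    """
    ordered = sorted(samples_mbps)
    # quantiles(n=100) yields the 1st..99th percentile cut points.
    cuts = statistics.quantiles(ordered, n=100)
    return {
        "low": cuts[low_pct - 1],
        "median": statistics.median(ordered),
        "high": cuts[high_pct - 1],
    }
```

Separate bands per context allow a local slowdown to be judged against local norms rather than a single global figure.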
The data, reports, and other output generated by this project could then serve as key inputs for characterizing -- and providing early warning of, again on a variety of scales -- the extent, if any, to which Internet traffic patterns are distorted at any given time by technology breakdowns, constriction of throughput due to purposeful discriminatory behaviors, or other causes.
This proposed distributed global Internet measurement environment would provide technologists, corporate executives, legislators, and other interested parties with a systematized, full-time, universally accessible means to assess quantitatively the degree to which the various segments and aspects of the Internet are being managed in a manner considered reasonable, or the degree to which Internet management abuse is taking place.
To the extent that unacceptable distortions in Internet traffic characteristics are discovered, this continuing stream of data could -- as one likely, but not the only possible, option -- be used as the basis for legislation incorporating data-related "triggers" to activate predefined and immediately implementable remedies. Many observers may consider this preferable to the approach taken in some existing pro-neutrality legislation: simply opening a window for the filing of neutrality or antitrust complaints for usually prolonged consideration by regulatory or other agencies.
Triggers and remedies under the approach proposed here would be as specific and quantitatively precise as possible, and only activated in the face of defined violation conditions based on the hard data from the measurement environment. In the absence of any defined abuse conditions being triggered, ISP and related operations would proceed on a free market basis without new constraints.
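A trigger of this kind might be sketched as a small state machine that fires only after a sustained, quantitatively defined violation -- a single bad interval (congestion, an outage) does not fire it. The 50%-of-baseline threshold and six-interval duration below are placeholder assumptions for illustration, not proposed values:

```python
class NeutralityTrigger:
    """Fires only after a defined, sustained violation condition is met."""

    def __init__(self, baseline_mbps, fraction_of_baseline=0.5,
                 required_consecutive=6):
        self.threshold = baseline_mbps * fraction_of_baseline
        self.required = required_consecutive
        self.streak = 0

    def observe(self, measured_mbps):
        """Feed one interval's measurement; return True when the trigger fires."""
        if measured_mbps < self.threshold:
            self.streak += 1
        else:
            self.streak = 0  # the condition must be sustained, not sporadic
        return self.streak >= self.required
```

Because the condition is precise and mechanical, ISPs would know exactly what behavior stays on the safe side of it, and absent a firing, operations would proceed without new constraints.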
This proposal, if implemented both from the technological measurement standpoint and on a legislative basis to whatever degree may be deemed appropriate, would offer ISPs what amounts to a "status quo" operating environment so long as they continued to compete in an open, fair, and nondiscriminatory manner. At the same time, it would enable quick and decisive corrective actions in the face of any specific abuses as detected by, and defined in conjunction with, the proposed global Internet measurement infrastructure.
This document obviously does not attempt to detail all aspects of this proposed project and any associated legislative efforts. A wide variety of technical and non-technical facets would need to be further researched and developed before even experimental initial deployments of the Internet measurement infrastructure would be possible. Nor have I attempted to address organizational, logistical, or funding-related issues at this time.
Nonetheless, I hope that this text provides some food for thought -- a possible starting point along a course that could help break through the current Internet neutrality deadlock. I would of course welcome any comments, questions, suggestions, objections, endorsements, or even polite flames.
The responsible stewardship of the Internet is a critical issue not only for the Net itself and its direct users, but, by extension, for technological cultures around the world. By working together toward an appropriate operational balance based on solid data, we can hopefully evolve the best possible Internet for all of us.
Thank you very much for your consideration.
- - - - -