Wednesday, September 23, 2009

Let's Rally Around Net Neutering

Those of us opposed to Internet regulation under the guise of the "net neutrality" label – admittedly a superficially appealing slogan – have always needed an equally appealing moniker to capture the reality behind the proposed net regulation. For many years, I've thought "net neutering" was just such a banner.

Indeed, I had started to use the term "net neutering" in the 2004-2005 time frame in opposition to the net neutrality forces. To my knowledge, that was the earliest use of the term. In 2005, I published a paper titled, "Net Neutrality and Net Neutering in a Post-Brand X World: Self Regulation, Policy, Principles and Legal Mandates in the Broadband Marketplace." And, when I edited a book with my then-colleague Tom Lenard that was published in 2006, I titled it: "Net Neutrality or Net Neutering: Should Broadband Internet Services Be Regulated?" In 2007, I published a law review article titled, "Net Neutrality Mandates: Neutering the First Amendment in the Digital Age."
(The constitutional objection to net neutrality is a whole other story.)

Thus I was pleased to see Holman Jenkins' column in today's Wall Street Journal titled, "Neutering the 'Net," in which he explains why FCC Chairman Genachowski's just-announced proposal to impose new net neutrality mandates would have the effect of neutering the net.

For many years, I have thought that, in today's digital age, imposing analog-age common carrier-type rules – yes, that is truly what net neutrality mandates are – would have the effect of neutering the net. In other words, such Internet regulation would slow down innovation and investment by removing incentives for Internet service providers to differentiate themselves lest they violate nondiscrimination (neutrality) mandates.

Because there are many sound policy and legal arguments why new common carrier-type regulation should be rejected for the Net, it is not essential to have a sound bite slogan to refute proposals for such regulation. After all, it ought to be the ideas that matter. But I sure think "net neutering" is a good banner around which to rally, one which fairly captures the peril inherent in net neutrality regulation.

Let me know what you think.

Tuesday, September 22, 2009

Richard Epstein on Net Neutrality

Richard Epstein, a member of FSF's Board of Academic Advisors, is widely acknowledged to be one of the nation's foremost law and economics scholars. Among many other positions, he is Distinguished Service Professor of Law at the University of Chicago, where he has taught since 1972. His impressive bio is here. Please peruse it.

But more importantly for the moment, please consider carefully the last paragraph (reproduced below) from Professor Epstein's chapter in the Free State Foundation's new book, New Directions in Communications Policy, published by Carolina Academic Press. Richard's chapter is titled, "What Broadcast Licenses Tell Us About Net Neutrality: Cosmopolitan Broadcasting Corporation v FCC." In it, he shows -- brilliantly, as usual – what one piece of the FCC's history of broadcasting regulation teaches us about the perils of net neutrality regulation.

In his summary and conclusion, Professor Epstein explains that we ought to be able to distinguish between regulations that strengthen markets and those that undermine them, but that there is a long-standing pattern of government's failure to do so. The regulators' proclivities to overreach are often driven by a denigration of property rights and systems of voluntary exchange. So here is how he concludes his chapter:

"A similar pattern is at work in the modern debates over net neutrality. The defense of that position starts out as a plea to end discrimination. Yet there is little evidence that the new dose of regulation will produce any gains in the short run. In the long run, we can expect a repetition of the sorry performance of the FCC (or, for that matter, Congress) with respect to broadcast rights to work its way through the law of net neutrality. The sad truth is that the parties who seek to develop sophisticated and sensible schemes for state control quickly lose control over the administrative process to persons whose ambitions for state control are not bound by any fine-grained rationale. The dangers for this predictable drift usually suffice to err on the side of caution. Stated otherwise, the expected rate of depreciation of sound public norms that rely on administrative discretion is high. There are too many pressure points to keep the rascals at bay. So the recommendation here is to follow classical liberal principles that treat all state intervention as a mistake until it is shown to be a good. More practically, and much to the point of the current public policy debate: Keep private control over broadband pipes by abandoning the siren call for net neutrality."

For anyone with even a bit of regulatory history in mind, it is difficult to put more concisely the case against the net neutrality mandates that FCC Chairman Genachowski intends to have the FCC impose.

I wish those FCC commissioners considering new Internet regulation would stop for a moment and think about Professor Epstein's warning:

"The sad truth is that the parties who seek to develop sophisticated and sensible schemes for state control quickly lose control over the administrative process to persons whose ambitions for state control are not bound by any fine-grained rationale."

In Dynamic Markets, Potential Competition Matters

Yesterday, FSF President Randolph May and I submitted comments to the FCC in an important forbearance remand proceeding. At stake in the proceeding is the importance of the potential for competition by new entrants in the broadband marketplace when the FCC conducts regulatory forbearance analysis. The FCC has questioned whether or to what extent it should consider potential competition when it weighs petitions for forbearance from regulation. As our comments relate, it is crucial for the FCC to adhere to those precedents in which it has considered both existing and potential competition in the telecommunications marketplace.

FSF advisory board scholars Dr. Dennis L. Weisman and Prof. Glen O. Robinson have aptly described telecommunications as a dynamic industry characterized by rapid innovation and competition, not stasis. The FCC has similarly recognized this dynamism, as we point out in our comments. Unfortunately, the FCC failed to take this dynamic market seriously when it rejected both Verizon's and Qwest's respective forbearance petitions from certain unbundled network element (UNE) regulations.

As explained in the FSF Perspectives piece, "Assessing the FCC’s Competition-Assessing Competence," the D.C. Circuit recently held that the FCC failed to consider the marketplace impact of potential competition in rejecting Verizon's forbearance petition. The D.C. Circuit remanded the case for the FCC to give a reasoned explanation of its decision. It also remanded Qwest’s forbearance case because the FCC had similarly rejected Qwest's forbearance petition without properly considering potential competition. At stake in the remand proceeding is the fate of both forbearance petitions, and (possibly) the continuing role of potential competition in the FCC's forbearance analysis.

Forbearance is an important deregulatory tool that Congress intended the Commission to use to keep the fast-evolving telecommunications marketplace free from outdated and burdensome regulations. (For more on that, see the FSF Perspectives piece, "Why Forbearance History Matters.") It's especially important for the FCC to exercise its forbearance powers with respect to a dynamic market, and consideration of potential competition by new entrants or existing rivals is a crucial element in any dynamic market.

The FCC's own precedent concerning UNE regulations includes careful consideration of potential competition as an important force for ensuring improved price and service options for consumers. Our comments to the FCC emphasized that the Commission, especially in such a dynamic market as communications, should carefully examine potential competition and take it into account when conducting a forbearance analysis.

Monday, September 21, 2009

An Immodest Proposal for Internet Regulation

Despite the appealing modesty of FCC Chairman Julius Genachowski's tone, the Internet regulation proposal unveiled in his speech today at the Brookings Institution is anything but modest. Indeed, the peril lies in its immodesty. It is immodest in a way that those imbued with a certain faith in their own good intentions often fail, in their regulatory zeal, to recognize.

I will be saying much more about the FCC’s proposal in the weeks and months ahead. But here I just want to explain briefly what I mean about the troubling immodesty of Chairman Genachowski's remarks. In order to appreciate what I am saying, please have in mind these two statements from his address: "We cannot know what tomorrow holds on the Internet, except that it will be unexpected." And: “I understand the Internet is a dynamic network and that technology continues to grow and evolve." There are others in a similar vein but these suffice.

Here is some of what makes Genachowski’s proposal so immodest from a regulatory perspective:

While asserting that he cannot know what tomorrow holds for the Internet and that the Internet is a dynamic network that continues to evolve, it is clear that Genachowski nevertheless presumes that all worthwhile innovation takes place at "the edge of the network," and not within the various broadband networks. This presumption drives his view that there must be a strict non-discrimination rule – the same type of rule that was at the heart of nineteenth and twentieth century common carrier regulation. Ultimately, this rule means enforcing a strict separation of content and applications from transmission media. This separation precludes any efficiencies, cost savings, and innovation that might result from integration of content, applications, and transmission. What's more, in the digital broadband world of integrated bit streams, such separation of content and applications from transmission necessarily will involve more metaphysical conjecture than sound regulatory policy. On this point, see my piece, "The Metaphysics of VoIP," written back in 2004. The principle is the same.

It may be that, as the Internet evolves in the dynamic way that Genachowski concedes it has without common carrier regulation, separation of content and applications from transmission will prove to be the best business model. Or, more likely, the 'best' business model will change continually over time. But Genachowski presumes to lock a business model in place based on all innovation "at the edge," even while admitting he cannot know what tomorrow holds for the Internet.

He says "this is not about government regulation of the Internet," it is about "fair rules of the road for companies that control access to the Internet." Of course, it is all about government regulation of the Internet, regardless of whether he believes the rules are "fair" or not, or whether they are put in terms of "access to the Internet" or some other way. Genachowski surely knows this. But he asserts, soothingly, that "we will do as much as we need to do, and no more, to ensure that the Internet remains an unfettered platform for competition, creativity, and entrepreneurial activity." Nice thought, but not very comforting, despite the good intentions. The history of the FCC shows that the error costs from overregulation – especially in a dynamic environment in which Genachowski concedes we cannot know what tomorrow holds for the Internet – are likely to be exceedingly high. The presumption that the FCC will know the point at which it has regulation "just right" – as much as is needed, but no more – demonstrates an immodest faith in the prescience of the FCC commissioners that likely would cause Goldilocks to blush.

Despite acknowledging the growth and many successes of the Internet in the past decade, Genachowski asserts at one point that "as American consumers make the shift from dial-up to broadband, their choice of providers has narrowed substantially." It is discouraging that he would make this claim, because it is true only if one does not appreciate the difference between the facilities-based competition that continues to grow in the broadband world and the artificial or "managed" competition that prevailed in the analog dial-up world. While the FCC may have created more dial-up "competitors" under its regulated resale mandates, these so-called competitors weren't investing in network facilities and innovating. They were offering what generally were acknowledged to be plain vanilla services. It is true that we would like to see even more robust competition among providers than we have today, even as the competition between wireline, cable, satellite, and wireless providers continues to increase. But to say that consumers had more choices of providers in the dial-up world than now is like comparing a 56 Kbps modem – remember? – to a DSL, FiOS or cable broadband connection, or even today's wireless or satellite broadband connections. Apples and oranges, as they say.

Finally, the notion of proposing new Internet regulation in the midst of the FCC's efforts to develop a national broadband plan is immodest as well. With all the emphasis the FCC has placed on gathering data in that proceeding, it would make sense for the agency to finish that job before proposing new rules. After all, if the agency thinks new common carrier-type Internet regulation should be adopted, wouldn't it be advisable to include such a proposal in the broadband report being prepared for Congress before forging ahead?

Tuesday, September 15, 2009

FSF Conference Video Posted

The video from the Free State Foundation's September 10 conference at the National Press Club is now available on the NextGenWeb site here.

The keynote addresses of FCC Commissioner Meredith Baker – her maiden FCC address – and Blair Levin, the Coordinator and Executive Director of the FCC Omnibus Broadband Initiative, deservedly have garnered much press attention. It's worth emphasizing that the presentations of Professors James Speta, Christopher Yoo, and John Mayo not only were scholarly, as one might expect from these prominent scholars, but were informative, interesting, and entertaining as well. Looking out into the audience while they were speaking, I was struck – nearly dumbstruck – that the berries, pods, and other assorted handheld devices had all but disappeared.

Both Jim Speta and Christopher Yoo, in the context of addressing the FCC's decision last year in the Comcast file-sharing case now on review, discussed procedural and substantive failings in the way the FCC handled the case. While their focus was principally on net neutrality and the agency's decisionmaking process, the cautionary note they sounded concerning the perils of regulatory overreach, and the potential consumer harms from such overreach, has more general applicability as the Commission considers other competition issues. Similarly, John Mayo's discussion of the urgent need for universal service reform, while cogently making that case, does double duty as a basic refresher, for those in need, of some fundamental economic principles.

In other words, the video contains much useful and thought-provoking information, and much for the new FCC to consider as it gets ready to tackle formulating the broadband plan and addressing other communications issues. So, as they invariably say in New York, when they bring your hot pastrami sandwich and pickle: "Enjoy!"

PS: As I said at the conference, there is 20% discount for Free State Foundation friends and family on FSF's new book, New Directions in Communications Policy, published by Carolina Academic Press. In addition to Professors Speta, Yoo, and Mayo, other contributors to the book are: Professors Gerald Brock, Diane Disney, Richard Epstein, Bruce Owen, Glen Robinson, Dennis Weisman, and Steven Wildman. They are all members of FSF's Board of Academic Advisors. The secret code for the discount is COMM20 – but don't spread it around too freely. You can order the book on CAP's website here, or call 800-489-7486.

Monday, September 14, 2009

The Fiction of Forced Access

Adam Thierer, my friend and former colleague, was kind enough to link to my piece, "When Friends Agree to Disagree: Data and Regulatory Philosophy," in a blog on the Technology Liberation site.

Adam has always contributed greatly to the fight against "forced access" regimes in the broadband world, and, as usual, his latest piece, "Fiction of Forced Access 'Competition' Revisited," is well worth reading.

Saturday, September 12, 2009

When Friends Agree to Disagree: Data and Regulatory Philosophy

At last Thursday's Free State Foundation conference at the National Press Club, celebrating the release of FSF's new book, New Directions in Communications Policy, there was an exchange between me and Blair Levin concerning the respective roles of data and regulatory philosophy in the development of the broadband plan. As I said in my introduction before his opening keynote: "I am an admirer and respecter of Blair's communications law and policy expertise, his experience, and his dedication. And I consider Blair a good friend." Those of you who have heard me over the years introduce Blair, now the Coordinator of the FCC's national broadband plan efforts, know that when I call him a good friend and say I respect his expertise, I am not just making small talk.

I think Blair will agree – and indeed he said so in his remarks at the conference – that he considered our exchange, now carried on, and perhaps magnified a bit, in the press, valuable in helping to think about the development of a broadband plan that best serves consumers and the overall national interest. You can read the reports about the conference in Communications Daily [subscription required], Broadcasting & Cable, and TR Daily [subscription required] to get a flavor of the exchange.

As background, I had been quoted earlier in Communications Daily as saying that while I thought the broadband workshops had been useful, I would like to see, in addition to the Commission's ongoing emphasis on data-gathering, more focus on what regulatory philosophies or models work best. In my introduction of Blair on Thursday, I said:

"Certainly, the essays in the New Directions book contain considerable data concerning broadband subscription, the number of competitors, the growth in USF funds, and the like. I agree, of course, that it is important for our public policymakers to have good, reliable data as they make decisions – and I bet our opening keynoter, a very key player at the FCC, might make the point that the new Commission is going to be 'data-driven,' and that it needs good data to develop a broadband plan. True enough."

And then I went on to say:

"But the development of sound policy truly does depend on what you do with the data, how you interpret it. Here's an example. Free Press says that prior to the abandonment of the FCC's sharing requirements for the incumbent telcos' lines there were 6000 ISP competitors in the country and now, as a result of deregulation, there are less than a dozen. Well, that is true only if you don't recognize as a matter of regulatory policy the important distinction between facilities-based competition and competition based on regulated resale, and if you don't think it makes any difference whether competition is facilities-based or resale-based. So if you take the 6000 figure at face value and run with it, you would be making a big mistake, in my view, in terms of developing sound policy. Indeed, the development of sound policy depends on having good data and having a good understanding of the economics of network industries and, in the case of communications policy, especially of the dynamic nature of technological change in the marketplace."

During his presentation, which is now posted on the FCC's Blogband website, Blair made several points responsive to my contention concerning the need for some focus on regulatory models and perspectives. I agree with much of what he said. For example, Blair said that if the FCC doesn't get data on who is unserved, where they are, and what it costs to build out a network to serve them, "you are going to get the wrong answer no matter how thoughtful your philosophy." Agreed. He also said it is not the staff's role to focus on regulatory policy, but rather the Commission's. I agree, of course, that it is the Commission's role, ultimately, to establish regulatory policy. But I think it is also the staff's role, with its expertise and the tools available to it, to facilitate the commissioners' understanding of how various regulatory policies and perspectives may affect their decisions. In my view, the broadband workshops would be one appropriate way of doing so, along with others. But I don't disagree with Blair's statement that the fact-finding comes first, during what he called "the early stages of a policy process." After all, he did state that we "will get to that point in the dialogue when philosophy does matter, where different views about the appropriate role of the public sector will be germane to the issues."

Thus, on this matter of the intertwined relationship between "data" and "regulatory philosophy," in reality our differences probably are not that large, especially as Blair agrees that there is a point in the process – before decisions are made – where "philosophy does matter," when the appropriate role of the public sector is germane. (There is a reason why FSF's masthead proclaims "Because Ideas Matter.")

Let me illustrate a couple of instances in which good data certainly matter, and an instance in which regulatory perspectives or philosophy matter, and why in this latter instance being "data driven" in and of itself is insufficient to lead to sound policy.

In my comments filed with the FCC on the plan, I urged that the priority should be the targeting of federal support for the deployment of broadband facilities in presently unserved areas. If deploying broadband to unserved areas is a priority, and it should be, it is necessary to have good (which is not the same as perfect) data that indicates which areas presently lack service. Fortunately, through the mapping exercises now underway, such data, to the extent it previously was unavailable, is quickly becoming available.

With respect to increasing adoption, an objective I consider consistent with the development of a sound broadband plan, having good data is also important in many respects. For example, in my comments I suggested it may be appropriate for the government to subsidize the purchase of computers by low income persons as a means of promoting more widespread adoption. Whether this makes sense or not, or the extent to which it does, depends on having sound data concerning computer ownership by low income households, the number of persons affected, the cost of the computers and the like.

And regarding adoption, there is good data, especially from the Pew Internet and American Life Project, showing that, while there are many non-financial reasons for non-subscription to broadband, for some the principal reason is related to income. FSF Distinguished Adjunct Fellow Deborah Tate, who chaired the Federal-State Joint Board on Universal Service while serving as FCC Commissioner, recently advocated in a Baltimore Sun commentary that the Lifeline/Linkup program be changed to provide support to low income persons for broadband subscriptions. Again, having good data on the extent to which income level correlates with broadband subscription, and the extent to which a particular level of financial aid is likely to correlate with an increase in take rates, is useful in developing sound policy in this regard.

So, these are examples where good data points are necessary if policies designed to increase deployment and adoption are to be implemented in the most effective and efficient way.

Now, consider an instance in which regulatory perspective and philosophy matters a lot. As I said on Thursday, Free Press has suggested in its Dismantling Digital Deregulation paper that at one time there were 6000 Internet Service Providers in the country and that the average American consumer had access to more than a dozen ISPs. As a result of the FCC's abandonment of DSL line sharing and wholesale price regulation policies around 2003-2004, this story line goes, only a handful of the 6000 ISPs remain and consumers no longer, on average, have access to a dozen ISPs. Well, in deciding how to interpret this data, how to process it, here regulatory philosophy matters a lot.

Let's assume for the sake of argument that the 6000 figure for the number of independent ISPs is an indisputable fact. Nevertheless, I would not want the FCC's development of a broadband plan to be "data driven" (in the wrong way) by this particular data point. Rather, I would want commissioners to understand that the 6000 ISPs existed merely at the sufferance of an agency policy of "managed competition" through regulated common carrier resale, and that such a "managed competition" policy does not provide incentives either for the incumbent providers to upgrade their networks or for the so-called "competitors" actually to build out their own network facilities. And I would want them to understand that, in the long run, which is what matters, consumers benefit more from facilities-based competition, which is sustainable, than from managed resale, which is not.

And here is the key point: In deciding whether to recommend returning to the common carrier/net neutrality/open access regulatory regime advocated by Free Press and others, or whether to adhere to the deregulatory broadband policies the Commission largely adopted five or so years ago, what matters in reaching the right decision is not only knowing the 6000 ISP data point. Rather, what matters is having in mind a regulatory philosophy that enables one to understand why this particular data point should not be considered a reason for supporting common carrier regulation, in light of the difference in investment and innovation incentives created by facilities-based and non-facilities-based competition. This is what I mean when I say that, in developing the broadband plan, the commissioners should have in mind a regulatory philosophy.

Commissioner Meredith Baker, who delivered the closing keynote at Thursday's conference, was most eloquent in speaking to the importance of having both good data, what she called "fact-based evidence," as well as a regulatory philosophy grounded in free market principles. In her speech, titled, "Incentives Matter: Decision Making at the FCC," Commissioner Baker said that having "good data is critical," especially with respect to what ought to be the highest priority in broadband policy, "to get broadband to all remaining unserved areas of the country." (As stated above, I fully agree this should be the top priority in the plan.)

As for regulatory principles, Commissioner Baker said she starts "with an assumption that markets work better than government intervention and that competition regulates market behavior more efficiently than regulators can." This is a simple but fundamental "first principle," but not one to which everyone subscribes, certainly not those parties urging the Commission to impose common carrier regulation on broadband ISPs. Commissioner Baker added: "[M]ost importantly, I believe incentives matter. Actions government takes – or doesn't take – affect market behavior and create incentives. I recognize that nearly any regulatory change the Commission makes will disturb the balance of the market – maybe for better, but possibly for worse. Therefore, our decisions must be fact based, fully considered, and reasoned. Good intentions are not enough."

Speaking more specifically about the analytical framework she proposes to use in reaching policy decisions, Commissioner Baker identified these elements: (1) identify objectives when launching proceedings; (2) look to the statute; (3) understand the context of the policy at issue; and (4) weigh the costs and benefits of any decision, including the potential unintended consequences. This last element is critical, because if the wrong balance is struck, "we risk imposing costs and other regulatory burdens on providers." That raises prices, reduces quality of service, and harms innovation, outcomes that "damage consumer welfare and endanger economic growth in the dynamic sector of the economy at a time when we can least afford it."

The upshot is this: The extent to which one starts with a basic assumption that markets work better than government intervention, or the other way around, is part and parcel of a regulatory philosophy. Whether one embraces cost-benefit analysis, or appreciates the law of unintended consequences, or the principle of "first, do no harm," is part and parcel of a regulatory philosophy. So is the extent to which one accepts, especially in regard to communications markets, the power of Schumpeterian dynamism to change markets, as opposed to embracing a static or snapshot market view. And while it is easy to hypothesize that regulators should "let go" of regulation at just the right time as competition becomes more robust, one's regulatory philosophy influences whether one believes the error costs of deregulating too late generally exceed those of deregulating too early. These are but some of the ways in which one's regulatory philosophy is brought to bear on analysis of available data.

In my view, as Commissioner Baker articulated so nicely, a regulatory philosophy grounded in an understanding of free market principles best serves consumers. As I said in my comments to the Commission back in June, the broadband plan should be grounded firmly in such principles.

I am grateful to Blair Levin for his thoughtful and thought-provoking remarks at Thursday's FSF conference. It is true we don't always agree, but we always learn something by listening to each other. As Blair said on Thursday: "One of the nice things about friendship is it does not require agreement." I agree.

And, finally, I am very grateful to Commissioner Baker for delivering her maiden address at the FSF conference, and for doing so after flying back all night from Brazil. Her keynote will repay many times those who read it closely, as I have several times.

PS – I should add that, by all accounts, the presentations of the three scholars – Jim Speta, Christopher Yoo, and John Mayo – were extremely informative and useful. More about these terrific presentations when the transcript is released. In the meantime, you will be able to watch them on the video soon if you missed attending the conference.