Thursday, January 31, 2013

An Experiment Is Worth a Thousand Pleadings


As regular readers of this space know, I consider Blair Levin a good friend, and a longstanding one at that. Aside from our friendship, I recognize and respect the considerable contributions that he has made and continues to make in the communications policy arena.
Here I will only mention, most recently, Blair's leadership of the team that put together the FCC's voluminous National Broadband Plan (regardless of whether you agree with all its recommendations, and I don't, preparing the plan was a huge undertaking), and his efforts with regard to the ambitious (and, in my view, significant) GigU project.
Despite our friendship, Blair and I certainly don't always agree, especially with regard to assessing the competitiveness of communications markets and weighing the costs and benefits of regulation. Simply put, I take a more deregulatory perspective, in light of my assessment of the increasing competitiveness of most communications market segments and the costs and unintended consequences of regulation.
All this is a prelude to saying that I was pleased to read about Blair's remarks yesterday at a "Georgetown on the Hill" forum regarding AT&T's IP-Transition petition. AT&T has asked the FCC to allow it to conduct some trials regarding the ongoing transition to all-IP services and the retirement of the legacy time-division multiplexed (TDM) facilities. Communications Daily [subscription required] reports in today's edition that, at the Georgetown forum, Blair stated: "If a picture is worth a thousand words, an experiment is worth a thousand pleadings." He declared the requested trials are unlikely to cause any harm. And he added, "the sooner they start that experimentation, the sooner the FCC will have real data" so that the telephone companies will no longer be required to invest "in an infrastructure we know will be stranded."
In its piece on the forum, TR Daily [subscription required] reports Blair said: "We are losing investment, we are losing jobs by not making those investments."
With regard to AT&T's petition, I have to say Blair and I are in "wild" agreement, and I commend him for the points he made. In comments that my colleague Seth Cooper and I filed with the FCC on Monday, we supported AT&T's request, and we urged the agency to act promptly.
Here is a brief excerpt from our summary:
"In recent years, the market for voice services has undergone dynamic change. This includes the emergence of cross-platform competition in local and long distance services as well as the employment of IP- and broadband-enabled technologies. Trends over the last several years reveal an ongoing migration of providers and consumers from legacy TDM networks to all-IP networks. For consumers, and for the nation as a whole, to realize the full benefit of these innovative and competitive breakthroughs, the Commission must replace its legacy monopoly approach to ensuring access to voice services with a much less regulatory approach. Otherwise, the service providers necessarily will have fewer funds available to invest in new broadband facilities as they continue to sink scarce capital into legacy facilities utilized by a dwindling number of customers."
As I see it, this ought not be a left-right issue, or a pro-regulation or anti-regulation issue. It ought to be a matter of the FCC facilitating, in a way that best serves consumers, the completion of the transition from analog to broadband networks that began in earnest more than a decade ago. If the telephone companies are not allowed to retire legacy facilities in a timely manner, but instead are required to continue expending funds operating and maintaining them, then they will, as we said in our comments, and as Blair's remarks suggest, lose investment opportunities to build out new broadband Internet infrastructure.
A final note: As I think more about this, it occurs to me that perhaps there are other subject areas, with their own particular sets of legacy regulatory requirements, where the Commission usefully might employ trials or experiments to determine whether outdated regulations can be eliminated, or at least relaxed, while at the same time protecting consumers. In proper circumstances, conducting such trials might be one way the agency can help reinvent itself as an institution more attuned to the fast-changing marketplace realities of the digital age. Or put another way, to quote a friend, in proper circumstances, "an experiment is worth a thousand pleadings."
I suspect that Blair might be of the same mind with regard to thinking about other areas where trials might be a means towards implementing further reforms. But, in any event, I'm heartened he added his voice to those urging the FCC to move forward to facilitate completion of the transition to all IP broadband networks.

Wednesday, January 30, 2013

Watch the Video of FSF's January 23 Book Event on Communications Policy!

A video of the Free State Foundation's January 23 book launch and luncheon at the National Press Club is now available here.

The event celebrated FSF's new book, Communications Law and Policy in the Digital Age: The Next Five Years. Several of the book's contributing authors, all prominent experts in the field of communications law and policy, discussed their book chapters and engaged in a lively, interactive exchange with the audience concerning today's most pressing issues.  Topics included broadband policy, net neutrality, USF reform, and public media reform. 

The authors who participated in this event were: Moderator: Randolph J. May, President, The Free State Foundation; Christopher Yoo, Professor, University of Pennsylvania Law School, and Member of FSF's Board of Academic Advisors - "Internet Policy Going Forward: Does One Size Still Fit All?"; Daniel Lyons, Professor, Boston College Law School, and Member of FSF's Board of Academic Advisors - "Reforming the Universal Service Fund for the Digital Age"; Ellen Goodman, Professor, Rutgers School of Law - Camden, and Member of FSF's Board of Academic Advisors - "Public Media Policy Reform and Digital Age Realities"; and Seth Cooper, FSF Research Fellow - "Restoring a Minimal Regulatory Environment for a Healthy Wireless Future."

Tuesday, January 29, 2013

FSF Urges FCC to Speed Transition to IP Broadband Networks


On January 28, Free State Foundation President Randolph May and Research Fellow Seth Cooper submitted comments in the Federal Communications Commission’s proceeding considering the transition from legacy transmission platforms to broadband services based on Internet Protocol. FSF’s comments urge the Commission to exercise its forbearance and waiver authorities to replace its legacy monopoly approach with a much less regulatory, and more market-driven, approach to ensuring universal service through an interconnected voice network with basic consumer protections. FSF's comments assert that this change will provide greater funds for service providers to invest in broadband deployment, and will benefit consumers by increasing access to the advanced capabilities and prospectively reduced prices of new all-IP networks.

Friday, January 25, 2013

GAO Report Urges Agencies to Improve Efforts to Respond to Public Comments


In a report released on January 22, the Government Accountability Office (GAO) found that federal agencies did not publish a notice of proposed rulemaking (NPRM) for more than a third of major rules, making meaningful public participation more difficult, decreasing transparency of rulemaking records, and creating the perception that final rules are issued without consideration of public comments.  Agencies published 568 major rules and 30,000 non-major rules from 2003 through 2010, and failed to publish NPRMs for about 35 percent of major rules and about 44 percent of non-major rules during that period.  Although agencies often requested comments on major final rules published without an NPRM, they did not usually respond to the comments received, likely because agencies are not obligated to request or respond to comments submitted after a final rule is issued.

Several rules issued by the Department of Commerce were published without an NPRM, including the Broadband Initiatives Program and the Broadband Technology Opportunities Program, rules with an economic impact of over $7 billion.  NTIA and the Rural Utilities Service determined that an NPRM was unnecessary in these cases because “notice and comment procedures would unduly delay the provision of benefits associated with these broadband initiatives and be contrary to the public interest.”  However, GAO notes, “when rulemaking is expedited, there is a trade-off between obtaining the benefits of advanced notice and comment and the goal of issuing the rule quickly.”

NTIA issued several other major rules without an NPRM: the 2009 State Broadband Data and Development Grant Program, which awarded $240 million to track broadband availability and adoption; the 2009 rule extending the deadline for analog-to-digital conversion and the converter box coupon program; and the Public Safety Interoperable Communications Grant Program, which gave nearly $1 billion to public safety agencies in 2007.  For the first two rules, NTIA cited “good cause,” and for the third, NTIA said an NPRM was not necessary because the rule related to public property, loans, grants, or benefits.

For many of the missing NPRMs, agencies cited the “good cause” exception and other statutory exceptions for publishing final rules without an NPRM; agencies used the “good cause” exception for 77 percent of major rules, and 61 percent of non-major rules, published without an NPRM.  Agencies may use the “good cause” exception when they find that notice and comment procedures are “impracticable, unnecessary, or contrary to the public interest.”  The Administrative Conference of the United States (ACUS) recommended that agencies request comments whenever they invoke the “impracticable” or “contrary to public interest” reasons for the “good cause” exception.  ACUS believes this effort will help improve transparency and public participation in rulemaking.

GAO recommended that the Office of Management and Budget (OMB), in consultation with ACUS, issue guidance to encourage agencies to respond to comments on final major rules that are issued without a prior NPRM.  However, OMB disagreed that guidance would offer substantial improvement.  GAO maintains that agencies’ failure to publish NPRMs before issuing a final rule and failure to respond to public comments is a “missed opportunity.”  GAO found that when agencies did respond to public comments, they frequently made changes to improve the rules.  GAO emphasized that “what matters is not simply providing opportunity for comment but also public understanding of whether comments were considered . . . .  As courts have recognized, the opportunity to comment is meaningless unless the agency responds to significant points raised by the public."


Tuesday, January 22, 2013

Don't Miss FSF's Book Luncheon Tomorrow, January 23rd!


Register now for the Free State Foundation's Wednesday, January 23, 2013, luncheon to celebrate FSF’s new book, "Communications Law and Policy in the Digital Age: The Next Five Years," at the National Press Club. Several of the book's contributing authors, all prominent scholars and experts in the field of communications policy, will discuss their chapters and lead a lively, interactive exchange concerning today's most pressing issues. Speakers include: Christopher Yoo, Professor, University of Pennsylvania Law School and Member of FSF's Board of Academic Advisors - "Internet Policy Going Forward: Does One Size Still Fit All?"; Daniel Lyons, Professor, Boston College Law School, and Member of FSF's Board of Academic Advisors - "Reforming the Universal Service Fund for the Digital Age"; Ellen Goodman, Professor, Rutgers School of Law – Camden, and Member of FSF's Board of Academic Advisors - "Public Media Policy Reform and Digital Age Realities"; and Seth Cooper, FSF Research Fellow - "Restoring a Minimal Regulatory Environment for a Healthy Wireless Future."

For more information, contact: info@freestatefoundation.org

Thursday, January 17, 2013

USF Surcharge Hikes Hit Over-Taxed Wireless Consumers Hardest


On December 12 the FCC announced that the universal service contribution factor will be 16.1% for the first quarter of 2013. The contribution factor translates into the line-item surcharge added to the interstate long-distance portion of consumers' monthly phone bills. For wireless consumers, already hit with multiple state and local taxes far exceeding general sales tax rates, the USF surcharge is, in effect, just another tax. For the first quarter of 2013, wireless customers will be saddled with an extra 16.1% surcharge (or tax, in effect) on the long-distance part of their bills.
USF surcharges are now the main source of the growing and disproportionate tax burden placed on wireless consumers. Heavy taxation of wireless services hits consumers and hampers investment in wireless infrastructure. This slows deployment of next-generation wireless networks that have a force-multiplier effect on business activity and job creation. Given the USF system's counterproductive burdening of wireless consumers and economic growth, the FCC should take concerted action in 2013 to reduce surcharge rates.
USF surcharges subsidize telephone companies in rural or high-cost areas, as well as schools, libraries, and some health care facilities. In some instances, the USF system subsidizes providers serving qualified low-income consumers. As the FCC has itself acknowledged, the system is outdated, lacking in accountability, and increasingly costly. Over the last decade, program subsidies for telecommunications service in high-cost areas surged from $2.6 billion in 2001 to $4.5 billion in 2011.
The chart below shows the steady climb of USF contribution rates over the past decade. This upward trend has translated into ever-higher USF surcharge rates.
After kicking the can down the road for several years, the FCC finally adopted a framework for modernizing and disciplining the system in its 2011 USF Reform Order. Reforms include a first-time budget to govern USF's "high-cost" fund and keep costs under control. These reforms are commendable and should be carried out. But the FCC has only reached the early stages of implementation. And the agency has yet to adopt reforms to the contribution side of USF. The growing burden of rising USF surcharges (or taxes) on wireless consumers highlights the urgency of fully implementing and extending the scope of its comprehensive USF reforms.
In my November blog post, "Growing Tax Bills Burdening Wireless Subscribers," I called attention to a timely analysis of wireless taxes by Scott Mackey titled "Wireless Taxes and Fees Continue Growth Trend." According to Mackey, "[t]he average burden on consumers increased from 16.26 percent in July 2010 to 17.18 percent in July 2012, a 5.5 percent increase in just two years." Moreover, "Wireless customers now pay taxes, fees, and surcharges nearly two and a half times higher than the average 7.33 percent general sales tax rate imposed on other taxable goods and services."
My prior blog post focused on state and local taxation of wireless services. Yet Mackey's article also called attention to federal taxation of wireless:
[T]he primary source of the growing wireless consumer burden during the last two years is the continued increase in the federal Universal Service Fund (USF) contribution rate and the corresponding surcharge imposed on consumers to cover that obligation. The federal USF surcharge has nearly tripled over the last decade, from 2.07 percent in 2003 to 5.82 percent in 2012. In fact, the 5.82 percent federal USF rate in 2012 almost exceeds the combined federal rate imposed in 2005, when the 3 percent federal excise tax still applied to wireless service.

Under the FCC's "safe harbor," 37.1% of a wireless consumer's calling plan is treated as interstate long distance, and hence subject to the USF surcharge. That is, 37.1% of a wireless consumer's bill is subject to the 16.1% surcharge (or tax) for the first quarter of 2013. However, the FCC permits wireless providers to classify a lower percentage of consumers' calling plans as interstate long distance if providers supply the FCC with supporting network-wide traffic studies. The FCC's 2012 USF Contribution Reform FNPRM indicates that most wireless providers submit studies rather than rely on the safe harbor. Interstate long distance averaged 23% of network traffic in submitted studies. So, for the first quarter of 2013, the average wireless consumer will see a 16.1% USF surcharge assessed on some 23% of their calling plans.
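The surcharge arithmetic described above can be sketched in a few lines. The 16.1% contribution factor, 37.1% safe harbor, and 23% traffic-study average are the figures from the text; the $100 monthly bill is purely a hypothetical illustration.

```python
# Effective USF surcharge as a share of a wireless consumer's total bill,
# using the Q1 2013 figures discussed in the text.

CONTRIBUTION_FACTOR = 0.161   # FCC contribution factor, Q1 2013
SAFE_HARBOR = 0.371           # FCC safe-harbor interstate share
TRAFFIC_STUDY_AVG = 0.23      # average interstate share in submitted studies

def effective_surcharge(interstate_share, factor=CONTRIBUTION_FACTOR):
    """Fraction of the total bill paid as USF surcharge: the surcharge
    applies only to the interstate long-distance portion of the plan."""
    return interstate_share * factor

bill = 100.00  # hypothetical monthly calling-plan charge
for label, share in [("Safe harbor", SAFE_HARBOR),
                     ("Traffic study", TRAFFIC_STUDY_AVG)]:
    rate = effective_surcharge(share)
    print(f"{label}: {rate:.2%} of the bill (${bill * rate:.2f} on a ${bill:.0f} plan)")
```

On these figures, a safe-harbor carrier's customer pays roughly 6% of the total bill in USF surcharge, while a customer of a carrier relying on the average traffic study pays roughly 3.7%.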
USF surcharge mechanics aside, the current tax treatment of wireless and overall trend is unmistakable: wireless is taxed at a higher rate than other services, and the tax burden on wireless consumers is growing. Multiple state and local wireless taxes, fees, and surcharges – piled one on top of the other – subject wireless services to tax rates far above services subject to standard state and local sales taxes. And federal USF surcharges surge higher, increasing the effective tax burden on wireless consumers.
This heavy and disproportionate tax burden on wireless consumers has negative consequences for wireless infrastructure investment. Mackey cited a study indicating that wireless consumers are price-sensitive, the takeaway being that high taxes negatively impact consumer adoption of wireless services. Onerous taxes therefore impact revenues for wireless service providers. Mackey estimated that "[i]f wireless services were subject to the same tax treatment as other taxable goods and services, increased carrier revenue could make as much as $3 billion more per year available to invest in network expansion and improvements."
Reductions in wireless investment on account of over-taxation result in lost economic opportunities and job creation. Next-generation wireless networks offer increased speeds, reliability, and capacity that allow business enterprises to operate more effectively and efficiently. Taxes that discourage investment, however, potentially slow deployment of expensive infrastructure that is necessary to support next-generation wireless networks. And according to published research by economists Robert Shapiro and Kevin Hassett, "every 10 percent increase in the adoption of 3G and 4G wireless technologies could add more than 231,000 new jobs to the U.S. economy in less than a year." Heavy and disproportionate taxation of wireless that depresses investment therefore serves as a barrier to job creation.
The FCC's USF system bears special responsibility for the growing tax burden on wireless. So the FCC has an obligation to reduce the USF system's share of that burden. In order to bring about much-needed tax reform and relief for heavily and disproportionately taxed wireless services, "Universal Service Reforms Must Continue To Be Implemented." As the FCC addresses contribution reform and modernizes the system in 2013, it should take action to reduce USF surcharges on over-taxed wireless consumers.

Monday, January 14, 2013

Chevron Deference and Independent Agencies


On Wednesday (January 16) the Supreme Court will hear oral argument in City of Arlington v. FCC. As we have observed previously, City of Arlington is likely to be one of the most important administrative law decisions of the last quarter century, impacting not only the FCC but most other federal agencies as well.
The question before the Court is whether courts, upon judicial review, should give Chevron deference to an agency’s determination of its own jurisdictional boundaries. When Chevron deference applies, courts give what the Court called "controlling weight" to an agency's statutory interpretations. While, at times, an agency's statutory interpretation may not prevail even if Chevron deference is held to apply, this is rare.
In my December 27 Washington Times opinion piece and in FSF Academic Board Member Jonathan Adler's November 26 Perspectives from FSF Scholars, we explained why the Supreme Court should hold that deference should not apply to agency determinations concerning the bounds of an agency's own jurisdiction. As Professor Adler concluded: "There are good reasons for [the Supreme Court] to make clear that agencies should only receive Chevron deference when they are exercising that authority Congress has delegated, and they should not receive deference when facing the question of whether the agency has authority at all."
And, as I explained in my Washington Times commentary:
"The very separation of powers principles upon which the Chevron deference doctrine primarily rests should mean that such deference doesn’t apply to agencies’ interpretations regarding the bounds of their own authority….If agencies themselves are allowed, by virtue of receiving extraordinary judicial deference, to presumptively resolve statutory ambiguities over the outer bounds of their own power, then it is far easier for legislators to avoid political accountability for decisions Congress makes regarding the expansive reach of the regulatory state. Finally, if agencies’ decisions about the scope of their own jurisdictions are given 'controlling weight' under the Chevron doctrine, the bureaucratic imperative naturally will be for officials to continue enlarging the limits of their regulatory authority."
Before Wednesday's argument, I want to call your attention to another point, one not addressed directly either in my piece or Professor Adler's. For a long time, I have suggested that independent agencies should not receive Chevron deference in the same way that Executive Branch agencies do. In other words, I have argued there ought to be a difference in application of the deference standard depending on whether the agency is an Executive Branch agency – like EPA, the agency whose ruling was at issue in Chevron itself – or an independent agency like the FCC, whose ruling is before the Court in City of Arlington.
The principal reason I have advocated this difference in judicial review deference standards has to do with constitutional separation of powers principles. In Chevron, the Court grounded the deference doctrine primarily (but not exclusively) in notions of political accountability inherent in separation of powers principles. In this regard, the Court explained that when congressional intent regarding delegated authority is not clear:
"[A]n agency to which Congress has delegated policymaking responsibilities may, within the limits of that delegation, properly rely upon the incumbent administration's views of wise policy to inform its judgments. While agencies are not directly accountable to the people, the Chief Executive is, and it is entirely appropriate for this political branch of the Government to make such policy choices -- resolving the competing interests which Congress itself either inadvertently did not resolve, or intentionally left to be resolved by the agency charged with the administration of the statute in light of everyday realities."
Simply put, independent agencies such as the FCC are, as a matter of our current understanding of the law and of historical practice, largely free from executive branch political control. They are therefore less politically accountable, and under Chevron's political accountability rationale their statutory interpretations should receive correspondingly less judicial deference.
I have published two articles in the Administrative Law Review articulating my position at some length and won't elaborate further here. The first, published in 2006, is titled, Defining Deference Down: Independent Agencies and Chevron Deference, and the second, published in 2010, Defining Deference Down, Again: Independent Agencies, Chevron Deference, and Fox. You may find these law review articles interesting if you are following the City of Arlington case.
A final intriguing note: Then Harvard Law School Dean Elena Kagan – now Justice Kagan – took essentially the same position that I have advocated regarding the judicial deference owed independent agencies in her widely-praised 2001 Presidential Administration law review article. And she based her view on the same rationale that I have suggested – that independent agencies are not as politically accountable as Executive Branch agencies because the President cannot control the independent agencies to the same degree. We'll see whether in City of Arlington Justice Kagan, or any other Justice, finds the differences between executive and independent agencies relevant for Chevron purposes.   

Studies Emphasizing Negative Effects of Taxes on Economic Growth


The Tax Foundation's Dr. William McBride published a Special Report in December titled "What Is the Evidence on Taxes and Growth?" McBride conducted an economic literature review of more than two dozen peer-reviewed academic journal articles from the last thirty years examining the empirical relationship between taxes and economic growth.
The Special Report contains a helpful table listing the referenced studies, summarizing their methodologies and findings.  McBride concluded that, "the results consistently point to significant negative effects of taxes on economic growth even after controlling for various other factors such as government spending, business cycle conditions, and monetary policy." 
According to McBride, "a review of the empirical studies establishes some standards by which a tax system may be judged."  Applying those standards, McBride judged that "the U.S. has probably the most inefficient tax mix in the developed world."  Here, McBride pointed to U.S. tax policy regarding personal and corporate income taxes, including double taxation of corporate income through capital gains and dividend taxes.  As explained in the Special Report, these kinds of taxes can adversely affect production, innovation, and risk-taking, thereby harming economic growth.
But McBride also offers a way forward for U.S. tax policy:
Pro-growth tax reform that reduces the burden of corporate and personal income taxes would generate a more robust economic recovery and put the U.S. on a higher growth trajectory, with more investment, more employment, higher wages, and a higher standard of living.

Thursday, January 10, 2013

Captive Audience's Captive Thinking


I've just finished reading Susan Crawford's new book, "Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age." It is interesting in many ways, and even entertaining in others. 
But, most importantly, "Captive Audience" is fundamentally flawed. 
"Captive Audience" is flawed because Professor Crawford relies on an incorrect – indeed, a hypothesized – view of the communications and information services marketplace to construct the case for monopoly power. And then she offers anachronistic, legacy regulatory measures to remedy the supposed ills that exist in her hypothesized market. In my view, the book more appropriately might have been titled, "Captive Thinking: Viewing Today's Telecom Industry Through An Analog-Era Lens." 
The book's central thesis is unmistakably clear: Comcast possesses monopoly power with respect both to the provision of broadband services and the provision of video programming. While less clear, at times it appears Professor Crawford may be making the same monopolistic power claim with regard to Time Warner Cable and other cable operators. 
While it doesn't come until the very end of the book, the proposed remedy for this supposed monopolistic power is unmistakably clear as well: "America needs to move to a utility model." Professor Crawford looks back to the days of the early twentieth century creation of AT&T – even further back to the late nineteenth century's railroad regulation model – to support her argument that America needs a "utility" regulatory regime for today's broadband companies, especially for Comcast. 
Indeed, consistent with her backwards-looking focus on public utility models, Professor Crawford equates the provision of electricity service and broadband service. She argues that electricity service, while first considered a luxury, soon came to be viewed as a necessity, and that broadband is like electricity in this respect. Therefore, broadband companies, like power companies, should be regulated as public utilities. 
A central problem with Professor Crawford's argument, however, is that the provision of electricity, at least to a large segment of America's population, does exhibit, to a degree that differs materially from broadband provision, natural monopoly characteristics. There are no significant competitive facilities-based distributors of electricity to end users. (Resellers of government-regulated wholesale services are in a different market category, and, in my view, do not provide sustainable competition to end users.) This lack of alternative competitive providers is not true with respect to broadband, including video programming. 
There are not alternative providers of electricity that employ cable, wireless, telephone, or satellite network infrastructures. There are for broadband. I dwell on the electricity analogy because, at the end of the day (and at the end of the book!) Professor Crawford rests so much weight on it. She considers the markets for electricity and broadband both to be natural monopolies. I don't. 
Professor Crawford's argument that cable operators like Comcast are now monopolies is based on her assertion that "cable has decisively won the battle for high-speed wired communications in America." Over the decades I've learned it is wise not to be overly certain when predicting the future course of developments in the communications marketplace. Technological advances occur in unpredictable ways, sometimes more rapidly than predicted, sometimes more slowly. For example, when I first got involved in communications law and policy, there were many pundits predicting that domestic satellites (we called them "domsats" and the FCC for years carried on "domsat" proceedings) might become dominant providers of all forms of communications. Today they play an important role in our communications landscape, but they certainly are not dominant. 
In support of her claim that cable already has "decisively" won the battle, Professor Crawford engages in a bit of sleight of hand. This is because her claim is based entirely on substantially narrowing the market definition by defining the relevant market – and this is buried in a footnote [page 284, note 3] – as "cable companies with DOCSIS 3.0-enabled infrastructure that can offer very high peak download speeds." In other words, in the few places where Professor Crawford addresses market power and market definition in any serious way, she distinguishes, albeit very subtly, between "high-speed" and "very high peak download speed" broadband services. The latter only includes services with speeds above 100 Mbps. In other places, she distinguishes between high-speed wired broadband and what she calls "truly" high-speed broadband. 
The problem with this approach is that it doesn't reflect present marketplace reality. Even though Professor Crawford may think otherwise, for most consumers, telephone company-provided high-speed broadband services provide a satisfactory alternative to those of cable operators, even in areas in which fiber technology is not employed. Many consumers consider the broadband service provided by satellite operators a satisfactory alternative to cable broadband. And still other consumers consider wireless broadband services not only complementary to cable (as Professor Crawford maintains), but substitutable as well. This is increasingly so as high-speed 4G wireless services become more ubiquitous. 
Indeed, over 80% of American households have access to at least two high-speed wireline providers consistent with the FCC's definition of high-speed, even if one does not provide service, at peak times, at the "truly" or "very" high-speed above-100 Mbps that Professor Crawford insists on employing for market-defining purposes. 
So, the problem with Professor Crawford's claim of cable monopoly power is that it rests on a hypothesized marketplace based on an unduly restrictive market definition, not on the broadband market as it presently exists. Her hypothesized market definition is based on her own personal predilections concerning the level of service she thinks she will demand in the future, rather than on an analysis of the services consumers demand in today's actual marketplace. Note that she does not report that when the FCC last surveyed consumers, the agency found that 93% were satisfied with their broadband service. And note also that right at the outset of her book [on page 2] she predicates her assertion of Comcast's dominance on the level of service that she hypothesizes will be sufficient to satisfy Americans "in the near future," without explaining what that means.   
It is possible – although I don't think it likely – that the market may evolve in the way Professor Crawford hypothesizes, that is, that Comcast and other cable companies will come to dominate the broadband distribution marketplace because telephone companies, wireless providers, and satellite operators simply cannot compete to provide the "truly" high-speed services that consumers may one day come to demand. I doubt this will be the case because, barring regulatory policies that constrain ongoing investment by cable's competitors, I expect that technological advances and innovation, along with new business models, will continue to evolve in ways that provide consumers with meaningful competitive alternatives satisfying their evolving demands. If I am wrong about this, there will be time enough then to consider what remedial measures may be appropriate to prevent consumer harm.
Rather than going on at length now, I want to make a few additional points briefly. They likely warrant more attention in subsequent posts. 
As I said at the beginning, Professor Crawford claims that Comcast possesses dominant market power not only with respect to broadband transmission but also with respect to video program acquisition and distribution, although she doesn't make the latter claim with the same degree of certainty. Although she apparently envisions a "captive audience," most Americans, living in an age of media abundance, would beg to differ. Indeed, in a chapter describing the Comcast-NBCU merger, accompanied by broad allegations concerning Comcast's alleged chokehold on video program distribution, Professor Crawford pivots, declaring at page 134 that broadcasters "have more distribution outlets for video than they had before – they can get their programing out through satellite (Dish, DirecTV) or telco (Verizon, AT&T) video offerings as well as cable." This concession – or at least recognition – concerning the availability of satellite and telco video distribution alternatives undercuts the claim that Comcast monopolizes this market segment, whether with regard to broadcasters or other independent programming entities looking for distribution outlets.
Now, even if Professor Crawford were to concede the existence of alternative providers or outlets, as she seems at times to do, I'm quite certain she would not back away from her plea for public utility regulation of Comcast and other cable operators. This is because she would claim, as she does throughout, that these service providers nevertheless possess enough market power that other entities (that she happens to favor) do not receive "fair treatment." [p. 175]. Especially with respect to Netflix (a company she particularly favors) and other online video providers, Professor Crawford expresses concern that these entities may not be treated fairly. And in this regard, she targets usage-based billing offerings as an especially egregious means of treating Netflix and other online video providers "unfairly" because, in her view, such plans may cause users to curtail the amount of video streamed. Thus, she argues the FCC needs to examine cable operators' revenues, costs, quality of service, and other data – in other words, conduct a traditional public utility rate case – to determine the ultimate "fairness" of usage-based billing. 
We have explained many times on these pages why this plea for the FCC to conduct what, in effect, would amount to an old-fashioned rate case is a recipe for disaster. Usage-based billing plans are employed throughout the economy to enhance economic efficiency by using the price mechanism to allocate scarce resources, here limited bandwidth capacity. Employing price signals this way promotes overall consumer welfare. Not coincidentally, to the extent Professor Crawford is really concerned about "fairness" to consumers, and not merely to Netflix, such plans may be fairer to consumers than non-usage-based plans because consumers' charges are more nearly aligned with the costs they impose on the network. Here I am just going to refer you to the recent FSF Perspectives published by Daniel Lyons, a member of FSF's Board of Academic Advisors, entitled "Why Broadband Pricing Freedom Is Good for Consumers." Please review.
Professor Crawford's single-minded focus on "fairness" for Netflix and others who ride on top of broadband operators' networks – and on the supposed harm these "on-toppers" might experience under usage-based billing plans – leaves little room for acknowledging that broadband providers, having invested billions of dollars in building and maintaining their networks, are entitled to a fair return on their invested capital.
Again, Professor Crawford's book is interesting and informative in many respects, especially with regard to some of the historical information concerning the companies and industry leaders she discusses. At times, it is even gossipy, and this can be entertaining. 
But "Captive Audience" is fundamentally flawed because its call for public utility regulation of cable companies is based on an incorrect view of the market. Only by hypothesizing market developments that she predicts – wrongly in my view – will come to pass in the future is Professor Crawford able to construct an argument that leads her to reach back into history for ill-conceived remedial measures. The call for public utility regulation may have been appropriate for the early analog era of Theodore Vail and the late nineteenth century's railroad magnates. But Professor Crawford's plea to apply such utility regulation to broadband providers operating in today's dynamic digital environment should be rejected.