The Best and the Brightest of Local Regulatory Reform

The Regulatory Reform Committee addresses municipal innovation


By Jake Auchincloss • July 29, 2014

This post is part of the Regulatory Reform for the 21st Century City project.

The $18 billion valuation of Uber is, in the words of L. Gordon Crovitz writing in the Wall Street Journal, a market estimate of the value of the waste caused by taxi regulations around the world. And taxi regulations are but one example of the myriad anachronistic, duplicative, or unduly cumbersome rules that have turned many American local codes into demoralizing puzzles. Apart from headline cases like Uber that have challenged regulatory landscapes, however, these local codes have received far less scrutiny than their federal counterparts.

Some of the best theorists and practitioners behind the movement for local regulatory reform met at the Harvard Kennedy School on July 15th to help chart the way forward. The Regulatory Reform Committee, chaired by Professor Stephen Goldsmith, former Mayor of Indianapolis and Deputy Mayor of New York City, drew together leading scholars of the subject with senior policy-makers from Boston, Chicago, and New York City, which have been among the most innovative cities in the country on the issue. At the meeting, hosted by the Project on Municipal Innovation at the Ash Center, the committee reviewed a framework for guiding cities through regulatory reform and discussed the politics and policy of balancing economic development with public health and safety.

One of the scholars, Quinton Mayne, an Assistant Professor of Public Policy at Harvard Kennedy School who has investigated regulatory theory in Europe and the United States, suggested organizing the far-reaching analysis according to the four modules by which government reduces risk to the population and thereby answers the purpose of regulation: nodality, authority, treasure, and organization. These four modules proved useful in capturing both the realities of practice brought forward by Rosemary Krimbel, Special Deputy for Regulatory Reform in Chicago; Alex Lawrence, Permitting Project Manager for Boston; and Robinson Hernandez, Executive Director of New York City’s New Business Acceleration Team, and the expert perspectives on the limitations and possibilities of those practices offered by Mayne; Lisa Robinson, President of the Society for Benefit-Cost Analysis; and Cary Coglianese, Professor of Law and Political Science at the University of Pennsylvania, director of the Penn Program on Regulation, and author of numerous books on regulation.

Nodality refers to the government’s role as curator of information, and it is perhaps the regulatory module ripest for innovation. Regulation is fundamentally a means to redress information asymmetry. The shopper, the driver, and the diner do not have the time, access, or expertise to determine whether the building they are inside is structurally sound, whether each gallon of gasoline they purchase truly contains four quarts, or whether the head of lettuce they wish to buy is esculent. Instead, government validates that information for them.

In a world of frictionless data curation, the market would internalize information and enforce standards through reputation rather than regulation. Third parties, counter-parties, and customers would validate information instead of government. That world is permanently hypothetical: markets can never perfectly disperse information, and, even if they could, the private sector often lacks the means to remediate shortcomings before hazards appear. However, as Prof. Goldsmith noted, the transaction costs of data curation have fallen far enough that local governments should be examining the extent to which reputation levers can replace enforcement levers. Drivers of these reduced costs include mobile technology that can store, display, and share data; social media’s capacity to rate and revise reputations in real time; for-profit guarantors of safety like Uber; third-party enforcement by professional associations and insurance companies; and self-certified or independently certified experts.

The committee was skeptical that local governments should, or would, significantly devolve information validation to the market, but it embraced one driver of reduced transaction costs as key to better regulatory regimes: more transparent and accessible municipal data. Ms. Krimbel, for example, recounted her struggles with implementing change in Chicago, but also explained how the city had reduced transaction costs for businesses, especially restaurants, by downsizing permitting and licensing requirements and moving the process online. New York City and Boston have passed similar initiatives, with New York City even appointing case managers to guide small businesses through regulatory compliance. The committee agreed that making information more digestible for civic hackers, and more accessible to consumers and businesses, was the highest nodal priority.
Under the government’s authority module, the direct enforcement of rules, the highest priority was making inspection more predictable and efficient. Ms. Lawrence of Boston described a deep-seated resistance to change within the ranks of inspectors, who especially disliked the internal peer-comparison system Boston uses to highlight underperformers. That system, however, is an important step toward ensuring predictability and what Prof. Coglianese termed “obligatee service” for businesses: the analogue of customer service when the consumer has no choice of provider. Greater efficiency will come in the short term from tablets connected to a cloud of municipal data, enabling inspectors to comprehend the full scope of a firm’s compliance record and even remediate areas outside their normal purview. That data also feeds predictive analytics, which enable cities to target the most likely offenders and fast-track the good actors. New York City’s Fire Department is at the forefront, but the approach should percolate throughout large and medium-sized cities.

The predictability and efficiency of existing regulations, however, are undermined when perverse ones are added through hasty or thoughtless deliberation. Mr. Hernandez, the New York City policy veteran, asserted that public clamor following a high-visibility accident, like a crane toppling over, can force the bureaucracy to enact new, unnecessary rules. This dynamic holds, Prof. Mayne pointed out, whenever citizens perceive that the costs of hazards are borne by innocent third parties rather than the responsible actors, and when, as is true throughout the United States, those citizens have minimal confidence in the competency of government. At the federal level, the requirement for vetting and analysis makes impulsive regulation difficult, but Ms. Robinson, the expert on benefit-cost analysis, explained that the extensive time and cost of properly conducting such analyses largely precludes their municipal use.

The committee embraced the concept of a quick impact assessment to ensure analytical rigor and to provide regulators with a buffer zone. The standardized assessment would necessarily be far more cursory than its federal cousin, but it would improve the weighing of options and incentives that form the treasure module of governance. Prof. Coglianese also introduced a complementary process to help rationalize rule-making and corral public support for the best option: POCER, a five-step outline of the (1) problem, (2) options, (3) criteria, (4) evidence, and (5) reason for regulation. City governments could follow POCER, or variants thereof, in both internal and external regulatory deliberation to foster a culture of empiricism and thoughtfulness.

That culture must also be encouraged by inter-city collaboration mechanisms. Ms. Krimbel noted that proponents of regulatory reform often feel like “a voice in the wilderness” within city hall. Bureaucratic inertia within the departments is often even more intense; Mr. Hernandez explained that sometimes the only way to innovate is to enact small, innocuous pilots that slowly expand. To overcome the pervasive sentiment that “this is the way we have always done things,” the committee advocated a more robust network among cities to share and reinforce best practices, so that policy-makers could go to their mayors with persuasive evidence that a new, risky proposal really had worked elsewhere. This network, furthermore, could draw upon the expert counsel of academics like Prof. Coglianese, Prof. Mayne, and Ms. Robinson, who lend further credibility to reform efforts.

Inter-city collaboration is by no means the only important dimension of the organizational module of governance (Mr. Hernandez, for example, stressed how vital it is to bring city agencies on board before publicizing new initiatives), but it is where the Project on Municipal Innovation has a comparative advantage. The Project has already developed a network of senior policy-makers in the 44 largest cities in the United States, who are next scheduled to meet in early September, and it will revise the regulatory reform framework for their use based upon the committee’s recommendations. Over the succeeding months, the Project will also publish a series of academic white papers on regulation.