Every day, seven million rows of new data are created in the City of Chicago alone, and countless billions of rows of data are generated across all of state, local, and federal government. And yet while there are islands of innovation in government where data drives day-to-day decision-making, there is still a long way to go before this approach becomes the status quo. According to McKinsey, while 90 percent of all digital data has been created within the last two years, only one percent of it has been analyzed, across both public and private sectors. There is ample opportunity to apply more data analysis to the improvement of government operations using a variety of methods, including descriptive statistics, predictive models, and data visualization.
We recently launched a new project to advance efficiency in government. The Operational Excellence in Government project seeks to share ideas for improving performance across a wide range of government functions. The project makes available, for the first time in one place, a searchable database of 30 exemplary studies of government efficiency, chosen from among more than 200 that we identified. Looking across the studies, we found the two most common tools needed to improve government efficiency were data analytics and lean process improvement methods.
Reflecting on the 2,000 recommendations in the studies gathered for the Operational Excellence project, and our combined experience as public servants and advisors to government, we developed a list of ten great ways that the public sector can use data to improve the efficiency of its operations. Governments across the country have found countless creative solutions to urban problems by leveraging data, and we highlight just a sampling below. We selected ideas that we think have value across a wide range of municipalities seeking to deliver more value to taxpayers and that address important policy and operational issues. Ten categories of these efforts stand out for their transferability across cities.
Three ways to use data to improve public safety
- Data and analytics enable more efficient, life-saving 911 responses. Data analytics have been applied to 911 response times in multiple cities with significant benefit. In New York City, analysis of the processing time for 911 calls led to streamlining the process to speed up the assignment of calls. In Cincinnati, data analytics helps dispatchers decide which patients need to be taken to the hospital by ambulance and which can be treated on site. Using data about the call type, as well as other variables such as the historical results for that type of call, the time of the call, the weather, and the location of the incident, the team reduced delays in getting patients to the hospital by 22 percent. Faster response came from better allocating the scarce resource of ambulances, while incidents that could be treated on site still received an appropriate response. In Boston, analysis of 911 calls identified hotspots with high call volumes for substance abuse and mental health issues that require treatment but not hospital transport. Now, in a pilot project, mobile EMTs (MEMTs) proactively patrol these areas on foot or by bike, provide onsite assessment and immediate medical care when needed, and call for an ambulance only when transport is required. When appropriate, they connect people with substance abuse and mental health outreach workers, helping them get the services they need. This makes ambulance deployment more efficient and strengthens the connection between health and social services agencies and first responders, with the goal of delivering ambulance services faster to the locations that actually need them. In Baltimore, which implemented Budgeting for Outcomes, teams of city employees now use data to solve complex problems. One team spanning the Fire and Health Departments decided to assign nurses to frequent 911 callers to prevent repeat calls. This innovation reduced 911 call volume significantly, improving response time and saving money.
- Data-driven inspections identify risks faster and improve public safety. Chicago has used data analytics to improve its restaurant inspection process, increasing operational efficiency by 20 percent, which means finding the critical health-code violations that can make people sick more than a week sooner. The City of Chicago made the model available to others on an open-source platform, and Montgomery County, MD used the code to build its own model. This modified model found 27 percent more violations and found them three days sooner than the existing process. Boston, meanwhile, developed its own predictive analytics model; the pilot found 20 to 25 percent more violations. Using a restaurant inspection model as a starting point can speed the process of creating other inspection models, as they are all processes with a binary pass-fail result.
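The intuition behind these results is that when inspection capacity is fixed, visiting the highest-risk establishments first finds violations sooner. The sketch below illustrates this with invented risk scores and an assumed two-inspections-per-day capacity; it is not any city's actual model, just a minimal demonstration of why risk-ordering a queue beats first-come-first-served.

```python
# Toy inspection queue: (name, predicted_risk, actually_has_violation).
# All values are invented for illustration.
restaurants = [
    ("A", 0.9, True), ("B", 0.2, False), ("C", 0.7, True),
    ("D", 0.1, False), ("E", 0.5, False), ("F", 0.8, True),
]

def days_to_find_violations(queue, per_day=2):
    """Average day on which the violating restaurants get inspected,
    given a fixed number of inspections per day (assumed capacity)."""
    days = [i // per_day + 1 for i, (_, _, bad) in enumerate(queue) if bad]
    return sum(days) / len(days)

fifo_days = days_to_find_violations(restaurants)  # arrival order
risk_ordered = sorted(restaurants, key=lambda r: r[1], reverse=True)
ranked_days = days_to_find_violations(risk_ordered)
# Risk-ordering finds the same violations earlier with the same capacity.
```

With these toy numbers, the risk-ordered queue reaches violating restaurants on average earlier than the arrival-order queue, which is the mechanism behind finding critical violations "over a week sooner."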
- Data-driven pretrial decisions save money while improving safety and increasing fairness. Cities, counties and states have been slow to optimize the efficiency of decisions about who should be released and who should stay in jail while awaiting trial. Because only ten percent of jurisdictions use data when making these decisions, jail days are often wasted on those who aren’t dangerous. Those who are dangerous are too often released because, instead of using data about an individual’s flight risk and safety risk, judges use a preset schedule of bail amounts determined solely by the crime the person is accused of. Using data would mean more fairness and an estimated $3 billion in savings. Washington, D.C., Allegheny County and the State of Kentucky are all among the growing minority of jurisdictions using data for decision-making at the pretrial stage. This approach could deliver taxpayer value and improve public safety if applied more broadly.
Four ways to improve infrastructure with data
- Smarter water management with sensors and analytics reduces leaks and water main breaks. Cities can improve the efficiency of water services by using wireless “smart meters” to get real-time information on water flow and by collecting data via remote sensors. Baltimore, New York City, and San Francisco are already demonstrating the value of such approaches to monitor water flow and to identify and fix water leaks. Washington, D.C. uses analytics to identify potential leaks or plumbing problems by examining water usage trend data and flagging spikes in usage. It can then send alerts to customers who may be experiencing a problem, preventing or minimizing damage from water leaks in homes. The University of Chicago Data Science for Social Good program developed a predictive model for the City of Syracuse to identify the water mains at greatest risk of breakage, so that they can be prioritized for preventive repairs. The model incorporates 12 years of water main break data, along with data about the age and diameter of the pipes, the geology of the soil, and the quality of the road above them. The model allows for strategic deployment of sensors on the water mains, as well as for preventive repair and replacement in coordination with other Public Works projects. The new model is six times as effective as the status quo process at preventing water main breaks.
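The Syracuse model is a trained predictive model; the sketch below is a deliberately simplified, hand-weighted stand-in that combines the same kinds of inputs (break history, pipe age, diameter, soil conditions) into a score used to rank mains for preventive repair. The weights, thresholds, and pipe records are illustrative assumptions, not values from the actual model.

```python
def break_risk_score(past_breaks, age_years, diameter_in, corrosive_soil):
    """Hand-weighted risk score for a water main. Weights are assumed
    for illustration; a real model would learn them from break data."""
    score = 2.0 * past_breaks                    # prior breaks: strongest signal
    score += 0.05 * age_years                    # older pipes are more fragile
    score += 1.0 if diameter_in < 8 else 0.0     # small mains tend to break more
    score += 1.5 if corrosive_soil else 0.0      # aggressive soil chemistry
    return score

# Hypothetical inventory of mains
mains = [
    {"id": "M-17", "past_breaks": 3, "age_years": 90, "diameter_in": 6,  "corrosive_soil": True},
    {"id": "M-42", "past_breaks": 0, "age_years": 25, "diameter_in": 12, "corrosive_soil": False},
    {"id": "M-88", "past_breaks": 1, "age_years": 60, "diameter_in": 8,  "corrosive_soil": True},
]

# Rank mains from highest to lowest risk to prioritize repair crews
ranked = sorted(
    mains,
    key=lambda m: break_risk_score(m["past_breaks"], m["age_years"],
                                   m["diameter_in"], m["corrosive_soil"]),
    reverse=True,
)
```

A ranked list like this is also what makes the coordination benefit possible: the highest-risk segments can be matched against scheduled Public Works projects so pipes are replaced while the street is already open.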
- Data makes commute times shorter and improves safety. Most major cities have made data available about their transit systems, from arrival times to maintenance status. The City and County of Denver has taken this a step further with its Denver Go app, which integrates information across modes of transportation and allows a user to compare not only how long a journey takes with each method but also the cost, environmental impact, and health benefits of each. The app uses real-time traffic data to create accurate estimates of arrival times. Denver has grown 40 percent in the last decade and a half, bringing increasing congestion. By integrating bike share, ride share, public transit, and parking information, the app aims to reduce congestion and improve traffic flow. The app will also collect data that can help as Denver plans long-term improvements to its transportation infrastructure. Other cities are innovating too – as part of its Vision Zero efforts, Los Angeles just released a map that allows users to plot their travel informed by past pedestrian, cyclist, and motorist fatalities. Atlanta developed an app to help its residents navigate the traffic challenges imposed by its recent bridge collapse. Kansas City, another leading user of data for municipal decision-making, now publishes real-time data and a visualization of streetcar, pedestrian, and car traffic in its downtown corridor.
- Demand-based parking pricing reduces congestion, improves safety, and increases revenue. Many cities are leveraging data for their parking efforts, but two examples stand out – San Francisco and Boston. When San Francisco implemented dynamic pricing for city metered parking, the time drivers spent circling in search of a space decreased 43 percent and violations for double parking dropped 22 percent, results that improved both efficiency and public safety. On the opposite coast, Boston is a city dense with drivers, and studies have shown that an average of 30 percent of traffic on congested streets consists of drivers looking for a parking spot. To address this problem, Boston is experimenting with dynamic pricing for on-street parking in one of its busiest neighborhoods. In a one-year experiment, the city will collect parking availability data using sensors and will use the data to steer demand toward underused spaces and away from overused ones. By reducing the volume of cars circling to find a parking space, Boston hopes to lower its carbon footprint and improve safety by reducing the number of distracted drivers hunting for spaces. The experiment affects over 500 parking meters in a 40-block area, with rates varying by time of day and location. Every two months, based on the data, rates will go up 50 cents per hour at busy spots and down 50 cents per hour at less busy ones. The experiment seeks to adjust the pricing of metered spots so that turnover reaches the optimal level for supporting economic activity.
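The adjustment rule is simple enough to sketch. Only the 50-cent step and the bimonthly cadence come from the pilot as described above; the occupancy thresholds and the minimum and maximum rates below are illustrative assumptions, not Boston's actual parameters.

```python
def adjust_meter_rate(current_rate, occupancy, step=0.50,
                      busy_threshold=0.80, quiet_threshold=0.50,
                      min_rate=1.00, max_rate=4.00):
    """One bimonthly repricing pass for a single meter.
    Thresholds and rate bounds are assumed for illustration."""
    if occupancy > busy_threshold:           # overused: raise the price
        return min(current_rate + step, max_rate)
    if occupancy < quiet_threshold:          # underused: lower the price
        return max(current_rate - step, min_rate)
    return current_rate                      # occupancy in the target band

# Hypothetical meters after two months of sensor data
rates = {"Meter 101": 2.50, "Meter 102": 2.50, "Meter 103": 2.50}
occupancy = {"Meter 101": 0.92, "Meter 102": 0.35, "Meter 103": 0.65}
new_rates = {m: adjust_meter_rate(r, occupancy[m]) for m, r in rates.items()}
```

Run repeatedly, a rule like this nudges each block toward the target occupancy band, which is the mechanism by which dynamic pricing reduces circling.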
- Smart technology and data analysis save energy - and money. Gathering and analyzing data can produce dramatic reductions in energy usage. Emerging technologies such as sensors that track energy demand, combined with analysis of energy data, provide growing opportunities to save energy and money. Two examples come from San Francisco and Wellesley, MA. By managing and reporting regularly on energy use for each town building and benchmarking the results, the small suburban town of Wellesley reduced energy use by nine percent over a three-year period, even with 37 percent more extreme weather days, for a total avoided cost of $132,000. Installing exterior LED lights on town buildings is delivering 15 percent more light and a ten percent overall reduction in electricity costs. On the other coast, San Francisco lowers energy and monitoring costs with smart streetlights; after testing three brands of sensor-enabled streetlights and evaluating the pilot results, San Francisco is now on a path to use data-enabled lighting on its streets. The city will lower energy costs by running the streetlight system from wireless smart controllers that let city staff remotely monitor the performance of every light in the city and adjust light intensity as needed, dimming lights when no human activity is detected. Real-time data updates can alert workers to burned-out bulbs, significantly reducing downtime for repairs and improving public safety.
Three ways to improve internal government operations with data
- Identifying underreporting and fraud in tax returns can unearth revenue. No government can afford to allow uncollected tax revenue to lie dormant in the accounts of taxpayers who have chosen not to pay. Two examples demonstrate how applying analytics can improve the efficiency of targeting noncompliant taxpayers, returning more to the treasury for less effort. New York City improved the efficiency of finding fraudulent tax returns by 40 percent. David Frankel, who was Commissioner of the New York City Department of Finance from 2009 to 2013, led an effort to use data to more efficiently identify taxpayers who failed to pay accurately. Using data from city, state and federal tax records, along with other data such as business licenses, the team looked for patterns—identifying similar businesses and their tax payment patterns to spot outliers who might have failed to pay taxes or paid too little. This increased the team’s ability to focus auditors on cases that were in fact fraudulent, while reducing the burden on compliers. In another example, the State of Maryland is 12 times more effective at identifying fraud with analytics. Using data on the risk profiles of individual returns and patterns of anomaly across returns allows the state to target which tax returns to audit. Before using data to prioritize, the state was auditing 110,000 tax returns a year and finding fraud in five to ten percent of those audits. Now, the data-driven approach enables half as many audits to find fraud 60 percent of the time. This makes more judicious use of the state’s auditing resources, recoups additional taxes from those who perpetrate fraud, and increases fairness as fraudsters are forced to pay their share. The approach has recovered nearly $40 million, compared with an average of $10-20 million before the data-driven approach was implemented.
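The Maryland figures are worth a quick back-of-the-envelope check. The sketch below simply restates the numbers from the example above and computes the per-audit effectiveness ratio behind the "12 times" claim (which holds against the low end of the old five-to-ten-percent hit rate).

```python
# Figures as reported: 110,000 audits a year with a 5-10 percent fraud
# hit rate before prioritization; half as many audits, 60 percent of
# which find fraud, after.
audits_before = 110_000
hit_rate_before_low = 0.05      # low end of the old 5-10 percent range
audits_after = audits_before // 2
hit_rate_after = 0.60

fraud_before = round(audits_before * hit_rate_before_low)  # cases found per year
fraud_after = round(audits_after * hit_rate_after)

# Per-audit effectiveness: 0.60 vs 0.05 is the source of "12 times"
effectiveness_ratio = hit_rate_after / hit_rate_before_low
```

So the state finds several times more fraud in absolute terms while running half as many audits, which is what frees auditing resources for other work.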
- Data unlocks hidden potential for cost savings in procurement. The State of New York analyzes data about its purchases and uses the insights from the data to increase volume from fewer suppliers to enable discounts. This strategy, known as strategic sourcing, coupled with procurement process efficiencies, saved the state $780 million over five years on purchases across 47 different categories. Examples of areas where strategic sourcing is beneficial to the state include office supplies, IT hardware and software, telecommunications, facilities, fleet and vehicle purchases and maintenance, advertising, travel, and medical products.
- Analytics can address staffing shortages and produce overtime savings. A state government customer service center was having difficulty serving its customers on days when sick calls were higher than average; staffing shortages were causing backlogs of work and processing delays. Analysis of historical sick-call data identified days that would be particularly challenging, including the day after the Super Bowl, the highest sick-call day of the year. Armed with this trend and predictive information, managers were able to proactively manage staffing levels and improve customer service. The City of Houston saved $10 million a year with an electronic time and attendance system by reducing unauthorized compensatory time off; replacing its paper-based system also increased transparency and accountability. The City of Dallas used analytics on police and fire overtime payments to determine that it could cover staff shortages more efficiently with new hires than with overtime.
Increasingly, leaders in government are recognizing the power of data to provide insights that drive better results. Across the United States, mayors are appointing Chief Data Officers and Chief Performance Officers and charging them with using data to deliver better service to the public. With the Operational Excellence project, we hope to advance the capacity of state and local government to optimize efficiency of operations, and to provide a forum for celebrating successes. If you know of examples of excellence in operational efficiency, please contact Devon_Ziminksi@hks.harvard.edu.