Each week we will bring you a summary of what happened this week on our site, on Twitter, and in the wider world of civic data. Suggest stories on Twitter with #ThisWeekInData.
Here on Data-Smart, Robert Burack wrote about Pittsburgh’s efforts to procure smart lighting to replace all 40,000 of its city-owned and -operated streetlights. After releasing a request for proposals (RFP) in 2012 to upgrade the city’s lights to LED, Pittsburgh was surprised to receive proposals offering additional possibilities, such as connected sensor networks. As a result, the city paused to rethink the effort and recently released a more open request for information (RFI), with the goals of providing equitable lighting across the city and phasing in smart streetlights that gather real-time data.
Also on Data-Smart, Sari Ladin profiled Los Angeles’ CleanStat initiative, which maps the cleanliness of every LA block and shares the data with residents using a story map on GeoHub, the city’s map-based open data portal. To gather cleanliness data, Los Angeles Sanitation’s five two-person crews drive over 22,000 miles every quarter to assess 42,000 blocks using video and geographic information system (GIS) tools. Since launching CleanStat about a year ago, the city has reduced the number of blocks rated unclean by 82% and those rated somewhat clean by 84%.
Sean Thornton published an article on Data-Smart about the power of analytics projects to turn large troves of data into public value. Cities have begun taking advantage of analytics’ growing affordability and accessibility, due in part to advances in computing and code-sharing as well as increasingly open attitudes toward releasing government data. For example, New Orleans used building and occupant data to predict which homes most needed smoke alarms.
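The New Orleans example boils down to risk-ranked outreach: score each address from available data, then send teams to the highest-scoring ones first. A minimal sketch of that idea, with entirely invented feature names, weights, and records (the city’s actual model and data are not described here):

```python
# Hypothetical sketch of predictive outreach targeting. The features,
# weights, and building records below are invented for illustration.

def alarm_need_score(building):
    """Combine simple indicators into a rough priority score."""
    score = 0.0
    if building["year_built"] < 1980:        # older housing stock
        score += 0.4
    if not building["owner_occupied"]:       # rentals assumed less likely to have alarms
        score += 0.3
    if building["reported_fire_nearby"]:     # prior incidents in the area
        score += 0.3
    return score

buildings = [
    {"id": "B1", "year_built": 1955, "owner_occupied": False, "reported_fire_nearby": True},
    {"id": "B2", "year_built": 2005, "owner_occupied": True,  "reported_fire_nearby": False},
    {"id": "B3", "year_built": 1978, "owner_occupied": True,  "reported_fire_nearby": True},
]

# Rank buildings so outreach teams visit the highest-risk addresses first.
ranked = sorted(buildings, key=alarm_need_score, reverse=True)
print([b["id"] for b in ranked])
```

A production system would learn the weights from labeled data rather than hand-coding them, but the output is the same kind of artifact: a prioritized list, not just a map.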
The Sunlight Foundation released a guide called Tactical Data Engagement outlining how governments can facilitate stakeholder use of data in ways that bring about community impact. Based on literature on open data impact and community stakeholder engagement as well as case studies and interviews with experts, Tactical Data Engagement provides a roadmap for low-cost interventions.
On GovTech, Stephen Goldsmith discussed the use of Internet of Things (IoT) technologies in tandem with insights from behavioral science to deliver message-driven nudges. IoT sensors can collect and deliver data in real time, providing residents with information that can prompt better choices. For example, by connecting traffic sensors to messaging systems, cities can alert residents during periods of high congestion and help ease traffic.
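The sensor-to-nudge pipeline described above is conceptually simple: compare live readings against a threshold and emit a message when it is crossed. A toy sketch, with invented corridor names and an arbitrary threshold (a real deployment would read from live sensor feeds and a resident messaging service):

```python
# Hypothetical sketch of a sensor-to-nudge pipeline; thresholds and
# corridor names are invented for illustration.

CONGESTION_THRESHOLD = 0.8  # fraction of road capacity in use

def nudges_for(readings):
    """Turn per-corridor occupancy readings into alert messages."""
    alerts = []
    for corridor, occupancy in readings.items():
        if occupancy >= CONGESTION_THRESHOLD:
            alerts.append(
                f"Heavy traffic on {corridor} - consider transit or an alternate route."
            )
    return alerts

readings = {"Main St": 0.92, "5th Ave": 0.55, "Bridge Rd": 0.81}
for message in nudges_for(readings):
    print(message)
```

The behavioral-science contribution is in the message wording and timing, not the plumbing; the code only decides when a nudge fires.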
The State of Maryland announced MD Think, a cloud-based data repository that will break down data silos between state agencies to provide integrated access to programs administered by agencies including the Department of Human Resources, the Department of Health and Mental Hygiene, and the Department of Juvenile Services. The state hopes that the tool, supported by $195 million in federal funding and $14 million from Maryland, will greatly improve service delivery by streamlining program operations and increasing agency productivity.
CityLab profiled Boston’s use of data analysis to track snow removal and direct plows to the areas in greatest need. Using a tool called SnowCOP, the public works office collects GPS data from its 700 trucks every minute and automatically maps it onto a visualization of all 30,000 city streets. That data is then paired with 311 call data to determine which streets or neighborhoods need another round of plowing.
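The pairing step is the interesting part: per-street plow activity derived from GPS pings, joined against 311 complaints, flags streets that residents are reporting but plows have not yet covered. A minimal sketch with invented street and truck names (SnowCOP’s actual data model is not public here):

```python
# Hypothetical sketch of SnowCOP-style prioritization: pair plow pass
# counts (derived from truck GPS pings) with 311 snow complaints. All
# data and field names are invented for illustration.

from collections import Counter

gps_pings = [  # (truck_id, street) pings, roughly one per truck per minute
    ("T1", "Elm St"), ("T1", "Elm St"), ("T2", "Oak St"),
    ("T3", "Oak St"), ("T2", "Maple St"),
]
complaints_311 = ["Maple St", "Maple St", "Elm St"]

plow_passes = Counter(street for _, street in gps_pings)
complaints = Counter(complaints_311)

def needs_replow(street, min_passes=2):
    """Flag streets with open complaints and too few plow passes."""
    return complaints[street] > 0 and plow_passes[street] < min_passes

streets = set(plow_passes) | set(complaints)
print(sorted(s for s in streets if needs_replow(s)))
```

Here Maple St is flagged (two complaints, one pass) while Elm St is not (one complaint, but already two passes), which is the triage logic the article describes.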
GCN highlighted a voice-activated virtual assistant, created by electronic health records firm Epic in partnership with tech company Nuance, that aims to improve service at the Department of Veterans Affairs. The virtual assistant, called Florence, guides patients through choosing a doctor and shows openings in that doctor’s schedule, promising to reduce physicians’ administrative load and make scheduling easier for users with impaired vision or limited motor skills.
The MIT Technology Review discussed a National Bureau of Economic Research study in which economists and computer scientists trained an algorithm to predict whether defendants are a flight risk based on their rap sheets and court records. Trained on data from hundreds of thousands of New York City cases, the algorithm predicts what defendants will do after release better than judges do. The project’s researchers estimate that the algorithm could cut crime committed by defendants awaiting trial by as much as 25 percent without changing the number of people in jail, or reduce the pretrial jail population by 40 percent while leaving the crime rate unchanged.
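The trade-off the researchers describe falls out of ranking: score every defendant, detain only the highest-risk ones up to a fixed capacity, and release the rest. A simplified sketch with invented risk scores (the study’s actual model and features are far richer):

```python
# Hypothetical sketch of risk-ranked pretrial release decisions. The
# defendants and predicted risk scores below are invented.

defendants = [  # (id, predicted probability of failing to appear)
    ("D1", 0.05), ("D2", 0.60), ("D3", 0.15),
    ("D4", 0.40), ("D5", 0.10),
]

def release_decisions(scored, jail_beds):
    """Detain the jail_beds highest-risk defendants; release the rest."""
    ranked = sorted(scored, key=lambda d: d[1], reverse=True)
    detained = {d for d, _ in ranked[:jail_beds]}
    return {d: (d in detained) for d, _ in scored}

decisions = release_decisions(defendants, jail_beds=2)
print([d for d, held in decisions.items() if not held])
```

Holding `jail_beds` fixed while improving the scores lowers the failure rate among those released; alternatively, shrinking `jail_beds` while matching the judges’ failure rate reduces the jail population, which is exactly the pair of estimates the study reports.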