Each week we will bring you a summary of what happened this week on our site, on Twitter, and in the wider world of civic data. Suggest stories on Twitter with #ThisWeekInData.
Here on Data-Smart, Jon Jay published the second installment of his series on tapping local data to combat drug overdoses. Jay analyzes police and EMS dispatch data from Cincinnati that can reveal when and where overdoses occur. Cities can use this data to create an effective early warning system for overdose spikes, identify chronic hotspots to target interventions, and assess performance across units.
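An early warning system like the one described can be as simple as comparing each day's overdose-related dispatch count to a rolling baseline. The sketch below is illustrative only (the function, window size, and threshold are our assumptions, not Cincinnati's actual method): it flags any day whose count exceeds the prior two weeks' mean by more than two standard deviations.

```python
from statistics import mean, stdev

def flag_spikes(daily_counts, window=14, threshold=2.0):
    """Flag days whose overdose-dispatch count exceeds the rolling
    baseline (mean + threshold * std dev of the prior `window` days).
    Hypothetical sketch; real systems would also adjust for seasonality."""
    spikes = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if daily_counts[i] > mu + threshold * sigma:
            spikes.append(i)
    return spikes

# Example: a stable baseline of ~5 calls/day with a sharp jump on day 16
counts = [5, 4, 6, 5, 5, 4, 6, 5, 5, 6, 4, 5, 5, 6, 5, 4, 18]
print(flag_spikes(counts))  # day 16 is flagged
```

The same per-location counts, aggregated over months rather than days, would surface the chronic hotspots the article mentions.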
Also on Data-Smart, Eric Bosco profiled the City of San Diego’s Chief Data Officer (CDO) Maksim Pecherskiy as part of our Becoming a Leading CDO series. Coming from a coding background and having served as a Code for America fellow, Pecherskiy runs the city’s data analytics team like a consulting shop. He led the development of the city’s open data portal, which emphasizes automation: pulling data directly from the city’s various systems, transforming it, and loading it into the cloud.
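The extract-transform-load pattern behind a portal like San Diego's can be sketched in a few lines. This is a minimal illustration, not the city's actual pipeline; the field names and CSV-to-JSON flow are hypothetical:

```python
import csv
import io
import json

def transform(raw_csv, keep_fields):
    """Extract rows from a source system's CSV export, keep only the
    fields approved for publication, and emit JSON ready to upload to
    cloud storage. A real portal pipeline maps each source system's
    schema explicitly and runs on a schedule."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return json.dumps(
        [{f: row.get(f, "").strip() for f in keep_fields} for row in rows]
    )

# Hypothetical export from an internal permitting system
raw = "id,address,status,internal_note\n1, 123 Main St ,open,draft\n"
print(transform(raw, ["id", "address", "status"]))
```

Dropping the `internal_note` column at the transform step is the point: automation only works if the pipeline, not a person, decides what is safe to publish.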
CityLab highlighted a study led by the SUNY College of Environmental Science and Forestry that sought to quantify the value of trees to 10 cities and understand the extent to which these benefits would increase if the cities went greener. To estimate the existing tree cover in Beijing, Buenos Aires, Cairo, Istanbul, London, L.A., Mexico City, Moscow, Mumbai, and Tokyo, researchers adapted the i-Tree model, which uses aerial photography to gauge the dollar value and environmental payoff of the urban canopy. Across the 10 cities, the researchers estimated an annual median payoff of $505 million and determined that payoffs track closely with density, meaning cities have much to gain by increasing investment in arboreal infrastructure.
CityLab also wrote about interactive ESRI maps created by humanitarian aid organization Direct Relief that display the vulnerable communities in Hurricane Harvey’s path. The mapmakers have used the Centers for Disease Control and Prevention (CDC) social vulnerability index to show the geographic distribution of households with elderly or disabled members, immigrant and limited English-speaking populations, and pockets of poverty. Often, these populations live in the areas most prone to flooding or lack the resources to evacuate, meaning interventions should target these communities in particular.
In preparation for Hurricane Harvey, a number of federal agencies took steps to ensure that data collection during the storm would be as effective as possible. NASA, the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS) worked together to facilitate the seamless collection and sharing of satellite data and on-site sensor data that indicate the volume of precipitation, the extent of flooding, and temperature changes. This information is crucial for emergency managers to understand the state of various areas and deploy responses accordingly. Read more at GCN.
The City of New Orleans has launched a citywide data-driven racial equity initiative called EquityNewOrleans in partnership with the W.K. Kellogg Foundation. The goal of EquityNewOrleans is to institutionalize racial equity in policy development, program creation, and service delivery via an inclusive process that focuses on two goals: engaging community residents and other stakeholders about equitable government and using a data-driven approach that creates a blueprint for structural and systemic change. Read more at Government Alliance on Race and Equity.
While urban renewal and decline receive much press, the majority of neighborhoods are actually staying much the same, CityLab reported. A new study by geographer Elizabeth Delmelle identifies seven basic types of American neighborhoods, the main contours of their change, and the kinds of metros where different types of neighborhoods predominate. While every large metro contains many, if not all, of the basic neighborhood types, many metros are dominated by a particular type of neighborhood, or a particular combination of types. The study uses advanced clustering and mapping algorithms to examine 18 key variables driving neighborhood change between 1980 and 2010, including race, housing type, and other socioeconomic conditions.
Sean Thornton published an interactive story map on Data-Smart that highlights Chicago’s Clear Water Project, an initiative to keep beaches safe from harmful bacteria through a combination of new water testing technologies, predictive modeling, and continued volunteer engagement. While standard water testing methods—time-consuming culture tests, rapid but expensive DNA tests, and promising but unproven predictive models—are each inadequate on their own, combining the three allows for more efficient testing.
The London School of Economics Impact Blog created a short open data starter guide intended to help researchers and governments make their data openly accessible and usable by others. The guide recommends updating forms to ensure subjects give informed consent to have their data shared, anonymizing all data, formatting data in plaintext, writing readmes that describe the data and analysis processes, future-proofing data, and hosting resources in a reliable place that won’t create access problems for users.
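One of the guide's recommendations, anonymization, often starts with replacing direct identifiers before release. The sketch below is a minimal illustration of one common technique, salted hashing (the function and field names are our own, not from the guide): it preserves the ability to link records across files without exposing the original ID.

```python
import csv
import hashlib
import io

def anonymize(raw_csv, id_field, salt):
    """Replace a direct identifier with a salted hash. Illustrative
    only: real anonymization also requires removing or coarsening
    quasi-identifiers such as birth dates and postcodes, and keeping
    the salt secret."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    for row in rows:
        digest = hashlib.sha256((salt + row[id_field]).encode()).hexdigest()
        row[id_field] = digest[:12]  # truncated for readability
    return rows

# Hypothetical survey export
raw = "subject_id,response\nalice@example.org,agree\n"
released = anonymize(raw, "subject_id", "keep-this-salt-secret")
```

The same subject hashed with the same salt yields the same token, so a second release can still be joined to the first; without the salt, the token cannot be reversed to the email address by simple lookup.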