Each week we will bring you a summary of what happened this week on our site, on Twitter, and in the wider world of civic data. Suggest stories on Twitter with #ThisWeekInData.
CityLab profiled a new report and map from the U.S. Department of Housing and Urban Development (HUD) that outlines the areas where the need for affordable housing is most dire. The report focuses on the sliver of the population with the “worst case needs”—renters who a) make at or below 50 percent of the area median income; b) do not receive housing assistance; and c) spend more than half of their income on rent and/or live in physically unsafe or deficient housing. This category of renters grew by 8 percent between 2013 and 2015.
Also on the topic of affordable housing, San Jose is using its IBM Smarter Cities Challenge grant to address the city’s housing crisis. With technical support from IBM, officials will seek to develop a technology infrastructure that matches residents with the housing they need while ensuring that the existing rental stock is monitored and well maintained. Projects will include a rental unit registry that allows the city to track rentals governed by its rent control ordinance and a website that will guide prospective residents to affordable housing. Read more at StateScoop.
GCN highlighted efforts by Michigan State University to create a voluntary registry of all Flint residents exposed to lead-tainted water. The registry will be used to understand relationships between participant health and community interventions over a four-year period. With this information, the city will be able to better determine what interventions are effective.
Here on Data-Smart, Jonathon Jay reviewed New York City’s RxStat initiative to combat opioid abuse and devised his own analysis of datasets in other areas. RxStat convenes public health and public safety representatives in monthly meetings to review recent opioid data from a variety of sources, allowing the city to more effectively coordinate responses across agencies. Perhaps most importantly, this program has provided law enforcement with access to mortality records, allowing the police department to target the types of drugs and users involved—a process other cities should emulate.
Also on Data-Smart, we published a transcript of Pittsburgh chief data officer (CDO) Laura Meixell’s presentation for the Civic Analytics Network’s webinar “The Power of Data Visualization in Cities.” Meixell outlines the development process for the city’s Burgh’s Eye View open data platform, a map that aggregates all of the city’s location-based data. The platform has been an iterative, customer-driven project for the city, starting as a tool for the police department and expanding to many other city agencies.
The United Nations’ Sustainable Development Solutions Network released the US Cities Sustainable Development Goals Index, which ranks the 100 most populous cities in the U.S. on their sustainability, Vice Impact reported. The Index covers 16 categories, including health, economic stability, and social justice, based on the sustainable development goals agreed to by the U.S. and 192 other nations in 2015. For example, Orlando, FL ranked first in the clean water and sanitation category, and Spokane, WA ranked first in affordable and clean energy.
GovTech reported on a program called beyond.uptake, a six-month fellowship from Chicago data science company Uptake that includes training sessions on methodologies, cybersecurity, machine learning, agile development, data visualization, and more. The program has helped train data scientists transitioning to roles within city government. By pairing the resources of a private data science company with people who work in municipal government or social organizations, the fellowship brings much-needed technical capacity to organizations attempting to serve the public.
Nesta published a piece arguing that innovations like open data and artificial intelligence could greatly benefit philanthropic organizations in their fundraising efforts. The article calls for opening data to make philanthropic initiatives more accountable, using machine learning to analyze patterns in successful funding campaigns, using AI to complete funding applications more efficiently, and building common data repositories so that applicants can reuse data across grant applications.