The Ash Center for Democratic Governance and Innovation at Harvard University's John F. Kennedy School of Government this week announced the 100 programs named Semifinalists in this year's Innovations in American Government Awards program. The Semifinalists will compete to be named Finalists and will have the chance to win one of two $100,000 grand prizes in Cambridge this spring. These programs advanced from a pool of more than 500 applications spanning all 50 states, and were selected by the Innovations Award evaluators as examples of novel and effective government action with significant impact that they believe can be replicated across the country and the world.
Below, we highlight some of the great data-related initiatives recognized as Semifinalists.
18F, General Services Administration
Built in 2014 in the spirit of America's top tech startups, 18F is a digital consultancy for the US government, inside the US government, working with federal agencies to rapidly deploy tools and online services that are reusable, cut costs, and are easier for people and businesses to use. 18F is a growing team of technology experts who build custom solutions to pressing problems and also help agencies buy better products from private vendors. In 2015, 18F doubled in size from 80 to more than 160 people as its product and consulting work increased. Unlike other technology teams in the federal government, 18F is a fee-for-service organization, helping customer agencies deliver on their mission-critical projects. In fiscal year 2015, the 18F team delivered over a dozen products for customer agencies and released a number of tools and platforms that make the team more efficient and effective.
BigApps Competition, City of New York, NY
The New York City BigApps Competition is the largest city-run civic tech competition in the country. The annual competition is the brainchild of the New York City Economic Development Corporation (NYCEDC). First announced in June 2009, the BigApps Competition was designed to be an annual contest spanning several months that would incentivize software developers and members of the public to use newly released city data to create web and mobile applications, with the chance to win cash prizes. BigApps aimed to diversify New York City's economy by strengthening the information technology and digital media sectors of the city's business community, to foster a culture of innovation, and to make city government more transparent, accessible, and accountable to its citizens. As part of the competition's initial launch, NYCEDC worked with more than 30 city agencies to release more than 170 datasets, including information on census figures, restaurant inspections, property sales, and traffic. BigApps empowered the public to use this newly released data to build applications that could improve life for New Yorkers. In the competition's first year, 85 applications were submitted and 10 winners were awarded a total of $20,000. Since then, the program has evolved to themed competitions that address specific problems. For instance, in 2015, BigApps called for participants to build products addressing four specific city challenges consistent with the mayor's priorities: affordable housing, zero waste, connected cities, and civic engagement. NYCEDC partnered with over a dozen city agencies, policy advocates, and tech experts to create a network of mentors that helped competition participants create impactful products addressing these issues.
Recognizing that previous competition winners were often left with limited support after the contest's conclusion, NYCEDC also provided the 2015 winners with four months of programming and support services tailored towards civic tech innovators in addition to a total of $125,000 in cash prizes.
Citywide Analytics Team, City of Boston, MA
The Citywide Analytics Team uses data to improve quality of life and enhance government operations in the city of Boston. By combining data analysis and visualization with hands-on engagement, the team works with departments to solve challenging urban problems and build a more effective government for Boston's residents. Since early 2015, the 20-person team has implemented projects like CityScore, which aggregates and displays key metrics from across the city, helping the mayor and department executives spot trends that need additional investigation and measure the impact of changes to process and policy. Other programs include using data to prevent firefighter injuries, improving traffic flow through a partnership with Waze's Connected Citizen program, and engaging the public through the Open Data to Open Knowledge project, a partnership with the Boston Public Library to re-imagine the city's data as a resource for public knowledge.
Computer Vision for Conservation, National Oceanic and Atmospheric Administration
There are only around 500 North Atlantic right whales alive today, making them one of the most endangered animals on the planet. Individuals can be identified by photographs taken from vessels and airplanes, and then compared to the North Atlantic Right Whale Catalog run by the New England Aquarium. Knowing the individual identity of a whale opens up many possible avenues of research and conservation management, including demographics, social structure, and informed disentanglement operations. The process of matching a photograph to the catalog can be time-consuming, and automating it using the latest in image-recognition technology would free up valuable time and resources, giving scientists more time and energy to devote to the conservation of these endangered whales. In November 2014, the National Oceanic and Atmospheric Administration (NOAA) contracted with Kaggle, a platform for predictive modeling and analytics competitions, to crowdsource a technology solution. The competition ran from August 2015 through January 2016 with a $10,000 prize pool sponsored by MathWorks, and NOAA Fisheries provided the right whale aerial photographs and associated data set. Data scientists competed to create an algorithm to match a photograph of a right whale to its unique individual identity. The winning solution, by software company Deepsense.io, relied heavily on convolutional neural networks to achieve 87% accuracy. This is very different from typical image-recognition approaches, which seek to count the number of individuals in a photograph and classify them to species; this solution classifies each whale to its unique individual identity. NOAA plans to use this algorithm to create software that automates the process of identifying whales, thereby freeing up valuable time and resources.
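The matching idea can be sketched in a few lines. This is not Deepsense.io's actual model: in the winning solution a convolutional neural network learns the features that distinguish individual whales. Here, hypothetical 8-dimensional feature vectors stand in for a CNN's embedding layer, and matching is a nearest-neighbor lookup by cosine similarity against the catalog.

```python
import numpy as np

# Hypothetical catalog: each known whale maps to a feature vector that, in the
# real system, would come from a CNN's embedding layer. Random vectors and
# made-up IDs are used purely for illustration.
rng = np.random.default_rng(42)
catalog = {whale_id: rng.normal(size=8) for whale_id in ("EG-1001", "EG-1102", "EG-2030")}

def identify(query_features, catalog):
    """Return the catalog ID whose feature vector is most similar to the query."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(catalog, key=lambda wid: cosine(query_features, catalog[wid]))

# A new "photograph" of EG-1102: its catalog features plus a little noise.
query = catalog["EG-1102"] + rng.normal(scale=0.05, size=8)
print(identify(query, catalog))
```

In practice the hard part is the feature extractor itself; once a network produces embeddings where the same whale's photos land close together, the catalog match reduces to this kind of similarity search.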
Congressional App Challenge, House of Representatives
The Congressional App Challenge is the first effort to leverage the power and reach of Congress to inspire students of all backgrounds to code, through app challenges hosted by Representatives. This challenge has unparalleled capacity to reach students nationwide, and represents the largest set of simultaneous challenges ever held. While companies and nonprofits may have widespread reach, no institution has the all-encompassing reach of the House of Representatives. In 2014, 80 Representatives hosted challenges. In 2015, that number grew to 116. These 116 Members hosted challenges in 32 states, the largest number of simultaneous, independent challenges ever undertaken. More than 1,700 students signed up to participate and over 500 original apps were submitted. Thanks to deliberate efforts targeted at inclusion, the challenge surpassed tech industry averages by several measures: 30 percent of participants were young women, and more than 10 percent and 13 percent identified as Black or Hispanic, respectively. This turnout demonstrates the true potential of this innovative government initiative to elicit interest in programming while addressing the critical problem of diversity.
Digital Democracy, California State University
Digital Democracy delivers a first-of-its-kind online, searchable database of California legislative hearings, enabling users to search video archives by keyword, topic, speaker, or date. Through advanced software technologies, inventories of dense, static bill text and vote counts give way to interactive multimedia clips that bring the lawmaking process to life. Prior to its inception, the IT staff within the California state legislature dismissed the concept as technologically infeasible and cost prohibitive; one state software developer estimated the cost to build the platform at $80–100 million. A rough proof-of-concept platform built by the Institute for Advanced Technology & Public Policy at California Polytechnic State University was demonstrated in June 2014 and caught the attention of the Laura and John Arnold Foundation, which provided $1.2 million in funding. Over the next six months, nearly 50 students built the full-scale platform. The site was released to the public in May 2015 and has been live without interruption since. Teams of students continue to develop and deploy new technology innovations to increase the site's effectiveness. Since launch, significant technology improvements have been made to increase the automation and accuracy of transcription and speaker identification. Additionally, students have developed and deployed new tools requested by users, including an e-mail notification system and a custom video player that enables the clipping, montaging, and social media sharing and embedding of key video moments. With a growing 'big data' set underlying the platform, students are currently developing analytics to explore trends and relationships between special interests and lawmakers.
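The keyword search at the heart of the platform can be illustrated with a toy inverted index: once hearings are transcribed, each word maps to the set of hearings (and, in the real system, timestamps) where it was spoken. The transcript snippets and IDs below are invented, not actual Digital Democracy data.

```python
# Toy inverted index over hearing transcripts. The real platform indexes
# automatically generated transcripts so users can jump to the moment a
# keyword was spoken; here we only track which hearing contains each word.
transcripts = {
    "hearing-1": "the committee discussed water rights and drought relief",
    "hearing-2": "testimony on housing costs and water infrastructure",
}

def build_index(transcripts):
    """Map each word to the set of hearing IDs whose transcript contains it."""
    index = {}
    for doc_id, text in transcripts.items():
        for word in set(text.split()):
            index.setdefault(word, set()).add(doc_id)
    return index

index = build_index(transcripts)
print(sorted(index["water"]))  # both hearings mention "water"
```

Production search engines add stemming, ranking, and per-word timestamps, but the lookup structure is the same: a query term resolves directly to the documents, and moments, that contain it.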
Federal Crowdsourcing and Citizen Science Initiative, Federal Community of Practice for Crowdsourcing & Citizen Science
The Federal Community of Practice for Crowdsourcing and Citizen Science (CCS) developed a collection of resources, designed by and for federal practitioners, focusing on two approaches to open innovation. Both approaches promote public engagement as a mechanism to address complex problems: crowdsourcing, in which organizations issue open calls for assistance from large groups of volunteer problem-solvers, and citizen science, in which public participants engage in any part of the scientific process. These approaches represent new types of collaboration and engage members of the public, many of whom might not otherwise be consulted, in research and solution development, allowing researchers to gain valuable data and insights. Such open innovation in the federal government often faces challenges of awareness, culture, and institutional understanding. CCS initiatives help the federal government pursue these new types of collaboration. CCS meetings and events have connected practitioners and built new capacity to develop broad federal understanding of the value of these approaches, as well as specific resources to support their implementation. Through a partnership with the White House Office of Science and Technology Policy, the Commons Lab at the Wilson Center, and the General Services Administration, CCS delivers these resources to the federal community and the public through a centralized, high-profile site at CitizenScience.gov. The site contains three dynamic components: a portal to CCS for federal workers, a how-to toolkit, and a catalog containing over 300 federal projects.
CCS mobilized 125 of its members to develop the toolkit, which offers guidance to federal practitioners on every aspect of a crowdsourcing or citizen science project, from design through data analysis. Complemented by case studies, the toolkit provides the resources needed to pitch, launch, maintain, and scale projects. The catalog complements the toolkit, giving agencies the opportunity to detail the opportunities, results, and benefits of projects ranging from tracking weather to transcribing historical records. These resources are continually updated and improved by federal citizen science and crowdsourcing project managers to reflect the most current information. CCS also worked closely with the White House to shape a memo providing federal agencies with high-level support and guidance to further expand their use of crowdsourcing and citizen science. After five years of momentum-building by CCS and others, on January 6, 2017, President Obama signed the American Innovation and Competitiveness Act, which for the first time gives clear, broad authority to all federal agencies to conduct citizen science and crowdsourcing projects. This bipartisan bill points to an even brighter future for these approaches in government.
Fiscal Stress Monitoring System, State of New York
New York State Comptroller Thomas DiNapoli's Fiscal Stress Monitoring System (FSMS) provides an objective and transparent annual fiscal stress assessment for 2,300 local governments in the state, using self-reported financial data. Scores and reports published annually give local stakeholders robust tools for decision-making on budgets and service delivery. Interest in this project grew out of the numerous municipal fiscal crises across the country in the late 2000s and concern from New Yorkers that a similar crisis could happen in their municipality or school district. Given its role in overseeing the fiscal affairs of local governments, the Comptroller's office convened an internal workgroup charged with developing a statewide, objective process to examine local government financial condition without new reporting requirements. After months of research, consultation with experts and constituents, statistical testing, and public comment, FSMS was introduced. Prior to implementation, some local officials voiced concerns that rating agency downgrades, public discord, and political grandstanding might result, leading the FSMS team to work to ease concerns and emphasize the benefits of the program through regular stakeholder meetings, informational webinars, a dedicated webpage, and a self-assessment tool. To date, the office has conducted 11 webinars for 1,300 local officials and 36 in-person FSMS trainings for 2,800 local officials. Internally, FSMS has led to more efficient and time-sensitive reviews of municipal reporting, and more robust verification. The communication process has also evolved, enhancing the way the Comptroller's office communicates with all local officials (even for non-FSMS purposes) to reduce costs and provide more timely information.
Inspection Service Delivery, County of Pima, AZ
Providing building permit inspections in Pima County in a timely and cost-effective manner is a challenge for inspectors grappling with the sprawling geography of a county about the size of New Hampshire, and with the large unincorporated area, home to 353,264 residents. To improve customer service and reduce the cost of delivering the service, Pima County proposed implementing computer-based auto-routing to achieve two goals: providing inspection clients with an estimated time of arrival (ETA), and reducing travel time and miles driven. A routing system was designed that automatically distributes the roughly 100 inspection appointments each day among six inspectors along the most efficient routes possible, equitably distributing the workload. A pilot test of the ArcGIS Vehicle Routing Problem (VRP) solver software, run over several months, compared the actual routes of inspectors with the auto-generated routes, adjusted the computer inputs, and corrected the county's GIS database of streets based on those comparisons. Inputs to the VRP include a calendar of inspector availability, each inspector's start location, the estimated time required to complete each scheduled inspection, inspection locations, and the assignment of geographic areas that require four-wheel-drive vehicles along with the inspectors who have those vehicles. The VRP also uses the street network and associated speed limits in the county's GIS system to calculate drive times. Using county data is essential, as commercially available mapping tools like Google Maps are not updated often enough to reliably reflect new subdivisions, which are busy locations for inspections. To mitigate the software's inability to incorporate customer requests for specific inspection appointment times, the team developed a complementary program for just-in-time remote video inspections using Skype and a calendaring app.
The client may schedule a remote inspection up to a few minutes prior to the requested inspection time; inspections are performed by inspectors in the office while the contractor or homeowner acts as the cameraperson.
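The core routing problem can be sketched with a greedy nearest-neighbor heuristic. This is a crude stand-in for a true VRP solution: the ArcGIS solver optimizes over the county's street network, time windows, and vehicle constraints, while the toy version below uses invented grid coordinates and straight-line distance purely to show how stops get sequenced.

```python
from math import hypot

# Toy inspection stops: (x, y) grid coordinates standing in for geocoded
# addresses. The real system routes over the county's GIS street network.
stops = {"A": (0, 5), "B": (2, 1), "C": (6, 4), "D": (1, 2)}

def nearest_neighbor_route(start, stops):
    """Visit stops greedily by distance from the current location.

    Returns the visiting order. A production VRP solver also balances load
    across inspectors and honors availability and vehicle constraints.
    """
    route, here, remaining = [], start, dict(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: hypot(remaining[s][0] - here[0],
                                                 remaining[s][1] - here[1]))
        route.append(nxt)
        here = remaining.pop(nxt)
    return route

print(nearest_neighbor_route((0, 0), stops))  # ['B', 'D', 'A', 'C']
```

The ETA feature then falls out naturally: once the order of stops is fixed, drive times along the route plus the estimated duration of each inspection yield an arrival window for every client.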
LinkNYC, City of New York, NY
Los Angeles Express Park, City of Los Angeles, CA
For years, economists have argued that parking rates for on-street meters should be set by the demand for those spaces, since on-street parking spaces require an investment by the government and provide value to the public, particularly to retail merchants. Because of the high cost of gathering demand data and the limits of parking meter technology, very little had been done to test these theories prior to the start of this century. By 2005, two important technical developments converged to make large-scale testing possible: new-technology parking meters provided real-time payment data, and vehicle sensors provided accurate parking space occupancy data. In 2007, the Los Angeles County Metropolitan Transportation Authority received a $212 million grant from the United States Department of Transportation as part of the national Congestion Reduction Demonstration initiative. The city of Los Angeles received $15 million from the grant to fund LA Express Park, an intelligent parking management system for the downtown area, with an additional $3.5 million budgeted by the city for this project. While infrastructure was being installed to support the project, the city engaged city council offices, business improvement districts, neighborhood councils, and other community groups to share details of the pilot and get feedback on how to best tailor policy to meet the needs of citizens. LA Express Park differs from other similar smart parking programs in several significant ways, including a contiguous project area, a more rigorous pricing algorithm, and its policy of making price changes in larger increments. Initiated as a one-year demonstration project, it has proved so successful that it is now part of the ongoing operations of the Los Angeles Department of Transportation.
Mobility Initiative, City of New York, NY
In January 2014, the New York City Police Department (NYPD) committed to changing its approach to policing in the city, implementing a Plan of Action blueprint designed to deliver improved police services in the city of New York and break down the barriers between two parties who should be natural allies: the police and the people they serve. One essential and enabling element of this strategy involved a revolution in NYPD technology: making the police more accessible to the community, and delivering tailored information and analytics to police officers where they need it most, in the field. To that end, the NYPD built out its "Platform for Transformation," a vision to put a smartphone in the hands of all 36,000 officers and place a tablet in every emergency response vehicle. By providing e-mail addresses and phone numbers to the entire uniformed workforce, the smartphones and tablets make officers, including the new Neighborhood Coordination Officers, directly accessible to the community. In addition, the devices come equipped with a number of custom-developed applications, which were created based on the ideas and feedback of officers in the field.
These applications enhance the NYPD's delivery of police services. They include a 911 dispatching app, which alerts officers to 911 calls, along with information about the location they are responding to, even before the calls come over the radio, decreasing response times; a search app, which provides enterprise search across all NYPD and certain state and federal databases, streamlining investigations; form-creation apps, which allow officers to take digital reports on scene in the field and will eventually make them available online to the public, paving the way for the NYPD to go paperless; blast messaging, which allows the NYPD to send critical informational bulletins, including officer safety alerts and pictures of missing persons, directly to officers' smartphones; and a training app, which enables distance learning and has the potential to fundamentally transform the Department's approach to educating its workforce.
OpenJustice, State of California
OpenJustice is a transparency initiative that embraces data-driven criminal justice reform. Using core assets of the California Department of Justice (DOJ)—law enforcement data—and in partnership with academia, nonprofits, and the tech sector, OpenJustice applies cutting-edge data science, data visualization, and open data to improve accountability and public policy. OpenJustice was developed as a start-up within government to address two goals: strengthening trust between law enforcement and the communities it is sworn to protect, and providing crucial data to help Californians understand how the state is doing, where it is succeeding, and where it can improve. The goal was to provide an open data portal where users — policymakers, researchers, advocates, and law enforcement — could download raw public safety data, along with data stories and visualization tools that would highlight important insights in an easily understood and interactive way. With no allocated budget, it was a classic bootstrapped effort. A few senior policy advisors, the web team, the Criminal Justice Statistics Center (CJSC), and experts from the Division of Law Enforcement (DLE) formed an informal working group to focus on the initiative, and partnered with academics from the University of California, Berkeley. After months of research and design, v1.0 of OpenJustice launched with three key datasets: deaths in custody, arrest rates, and law enforcement officers killed or assaulted. The project was one of constant iteration with the DOJ and external stakeholders. Because local law enforcement support was critical to the project's success, DOJ staff worked diligently to keep local agencies informed of their efforts and solicit input.
The biggest piece of feedback received was that releasing the data was an important step forward, but that the team should also paint a complete picture so the data would be understood in the appropriate context. This prompted the team to layer on information like population demographics and total calls for service.
Presidential Innovation Fellows, General Services Administration
The Presidential Innovation Fellows (PIF) program unleashes the principles, values, and practices of the innovation economy to address high-impact public sector challenges through the most effective agents of change we know: our people. These teams of government experts and private-sector doers take a user-centric approach to issues at the intersection of people, processes, products, and policy to achieve lasting impact. Inspired by "lean startup" methodology, PIF was designed to focus on high-impact projects that could quickly research users, build prototypes, test solutions, and iterate. On August 23, 2012, the first class of PIFs kicked off their tours of duty in government. Since then, what began as a grand experiment has yielded results through programs like Project RFP-EZ, which demonstrated that a streamlined bidding process for small government contracts could cut costs by 30 percent, which across the annual $80 billion federal IT budget would save taxpayers up to $24 billion per year. Project Blue Button was launched by young technologists working in the Departments of Health and Human Services and Veterans Affairs to help citizens access their own health information. To achieve and sustain these kinds of successes, PIF has adapted to challenges ranging from bureaucratic hurdles in government to growing pains from expansion. Since federal government hiring processes typically take months, the need to hire fellows in weeks caused tension. Furthermore, some federal agencies weren't prepared to onboard fellows quickly, which left fellows waiting weeks after they joined for laptops and email addresses. In response, the fellows and those running the program developed hacks allowing them to achieve their missions, such as identifying aspects of the hiring and background check process that could be run concurrently, and pre-ordering laptops and email accounts even though doing so was not standard procedure.
Once PIFs proved they could deliver value, the demand for fellows from agencies skyrocketed, with 35 agency projects competing to hire PIFs in the second round of the program. There was also increased interest from candidates, with over 2,100 applicants for the second round.
Project Comport: Police Data Initiative, City of Indianapolis, IN
The city of Indianapolis' leadership teams under several mayoral administrations have believed that transparency and the use of data and data analysis to drive decision-making make government more efficient and effective. When city leadership was introduced to Code for America (CfA) in early 2013, the city's then-mayor led the charge in applying for a partnership to assist with quality-of-life opportunities in the city's focus areas, create a data retrieval tool for the computer-aided dispatch and records management system, and build understanding of open data policies and portals. The city's project team, drawn from the Indianapolis Metropolitan Police Department and the Citizens Police Complaint Office, worked with Code for America fellows to create a data extraction tool for use with the Internal Affairs Pro data program. To aid site visitors' comprehension, context and definitions of departmental policies and procedures are provided along with the data. Project Comport is the tool created to marry data and context: it automatically extracts data daily for upload and review before the data goes live on the ProjectComport.org site.
Public Transit Performance: Analytics and Mobile, District of Columbia
The District Department of Transportation's (DDOT) Public Transit Performance: Analytics and Mobile project provides real-time bus reporting and analytics tools for DDOT's DC Circulator system using data gathered through affordable, consumer-grade smartphones. The District of Columbia is home to 681,170 residents and is frequented by millions of visitors. To accommodate the city's many transit users, DDOT created the DC Circulator system in 2005 to provide links to the District's many cultural, entertainment, and business destinations. The DC Circulator system did not have high-frequency reporting capabilities, which sometimes led to unreliable bus arrival predictions. In 2014, DDOT decided to tackle this issue by improving DC Circulator service reliability. To this end, DDOT installed generic smartphones inside DC Circulator vehicles to improve tracking of the system's fleet. Although such a tracking method existed as a proof of concept, many transit agencies were hesitant to adopt it and instead resorted to legacy "black box" solutions to track transit vehicles. DDOT embarked on a pilot project to track several DC Circulator buses through the installed smartphones. The vehicles were also tracked via the customary "black box" hardware, which ultimately proved to be a much less reliable tracking device.
Since the initial pilot, the entire DC Circulator fleet has been outfitted with smartphones, which report bus locations at three-second intervals. Additionally, these smartphones allow DDOT to gain insights into operational issues through headway reporting, bus-bunching detection, bus monitoring, and other customized business intelligence tools. Aside from the improved data collection and analytics, this system greatly reduces legacy costs as well. DDOT now has a bus tracking and analytical service based on data collected through affordable, consumer-grade hardware (smartphones); it is not tightly coupled to a specialized hardware vendor. Hardware maintenance has been considerably simplified too: replacing a phone is a relatively clear and simple task. Additionally, this initiative helps District residents and visitors by incorporating the real-time bus locations pulled from this project's data into a mobile app (rideDC Trip Planner) that offers rail and bus predictions and a DC Metro map featuring various transit options (rail, bus, bikeshare, and carshare) near a user's current location.
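The headway and bunching analytics above can be sketched from first principles: once buses report positions every few seconds, the gaps between consecutive buses passing the same stop can be computed and compared to the scheduled headway. The arrival times, the 300-second schedule, and the half-headway bunching rule below are illustrative assumptions, not DDOT's actual parameters.

```python
# Hypothetical arrival times (seconds since service start) of consecutive
# buses at one stop, derived in the real system from the smartphones'
# three-second GPS pings.
arrivals = [0, 310, 605, 700, 1320]

def headways(arrivals):
    """Gaps between consecutive arrivals at a stop, in seconds."""
    return [b - a for a, b in zip(arrivals, arrivals[1:])]

def bunching(arrivals, scheduled=300, tolerance=0.5):
    """Flag gaps shorter than half the scheduled headway (a common bunching rule)."""
    return [gap < scheduled * tolerance for gap in headways(arrivals)]

print(headways(arrivals))   # [310, 295, 95, 620]
print(bunching(arrivals))   # [False, False, True, False]
```

Here the third gap (95 seconds against a 300-second schedule) would be flagged as bunching; a dispatcher could then hold the trailing bus to restore even spacing.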
Radio Frequency Identification in Utility Street Cuts, City of Dayton, OH
Within the City of Dayton, there are thousands of utility street cuts ("cuts") in the roadways left by utility excavation. A utility street cut is an area of roadway that has been dug into by a utility company or its contractor to work on an underground utility and then restored at a later date. If a cut becomes a safety hazard, or fails, it is up to the utility company as the "owner" to make full repairs for the lifetime of the cut. Previously, identifying the owner of an unsafe or deteriorating cut could take anywhere from minutes to days. Typically, the city's utility inspector would head into the field to investigate the complaint on site. Once there, they would first confirm that the area was in fact a utility street cut, and not a pothole, and then look for any visible clues (manholes, water valves, etc.) to determine the owner. Failing that, they would search a Microsoft Access database on a laptop, by address, for any permits issued. If this yielded no results, they would either look in the office for old paper records or call the Ohio Utility Protection Services to mark all utilities in the area of the cut, which could take days and further extend the unsafe roadway hazard. In 2011, city engineers began to discuss an idea: embed RFID tags, programmed with the owner and utility permit number, in every cut during the restoration process in the field. In April 2013, after working with CDO Technologies, a local systems integrator, the team had developed the technology and materials needed to officially begin the project. Since inception, when a utility company or its contractor pulls a permit with the city, it is issued preprogrammed RFID tags to be placed within the asphalt during the restoration of the cut. The tags are durable and can be read from several inches below the pavement.
Now, when a complaint is received on a post-2013 cut, the utility inspector scans the cut with a handheld device, and the owner is displayed instantly on the screen. To date, the city has issued over 10,900 RFID tags to utility companies and their contractors with few obstacles observed, and has seen an increase in overall workmanship since inception.
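The scan-to-owner step reduces to a lookup against the permit records encoded when each tag is issued. The tag IDs, field names, and utility names below are invented for illustration; they are not Dayton's actual schema.

```python
# Hypothetical tag registry: each preprogrammed RFID tag maps to the utility
# permit it was issued under, so a field scan resolves straight to the owner.
tags = {
    "TAG-0481": {"owner": "Acme Gas", "permit": "UP-2019-0142"},
    "TAG-0912": {"owner": "Dayton Water", "permit": "UP-2021-0077"},
}

def lookup(tag_id):
    """Resolve a scanned tag to the responsible utility, or flag an unknown tag."""
    record = tags.get(tag_id)
    if record is None:
        return "unknown tag - fall back to permit records"
    return f"{record['owner']} ({record['permit']})"

print(lookup("TAG-0481"))  # Acme Gas (UP-2019-0142)
```

Pre-2013 cuts, which carry no tag, still fall through to the older permit-record search, which is why the unknown-tag branch matters in practice.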
Recreation Mapping Project, Bureau of Land Management
The Bureau of Land Management (BLM) initiated the Recreation Mapping Project to ensure that the public can go online from any device and access consistent, informative, and interactive maps of BLM's recreational opportunities. Information about recreational opportunities on public land had been inconsistent, difficult to locate, or nonexistent, and none of it was available in a mobile-friendly format. BLM did not have the ability to portray any geospatial information for recreational activities, and due to declining staff and budget, the capacity of agency employees to obtain and validate quality data and information has been limited. Beginning in November 2014, BLM staff and leadership met to develop a strategy by which the agency could enlist the recreational community and partners to help gather and showcase specific recreational opportunities, beginning with mountain biking. The International Mountain Bike Association (IMBA), with a broad user base and proven use of technology, was enlisted to help gather website content and crowd-sourced geospatial data for twenty exceptional mountain bike destinations showcasing a range of trail experiences. The BLM, IMBA, local partners, the City of Moab, and the mountain bike community engaged people by using social media to capture exciting trail photos and content. Participants shared photos of the Top 20 sites from BLM National's Flickr album or individual state pages, tweeted photos and content on state Twitter accounts, and shared posts from the My Public Lands Instagram account and Tumblr. While the BLM was able to align its limited resources for one small effort, it has had to develop new processes, data standards, and partnerships to grow this effort beyond 20 mountain bike trails.
Since the October 2015 rollout of the mountain bike websites, agency staff have been working to develop processes for gathering and validating crowd-sourced data from numerous partners, and procedures for merging it with corporate data and serving it back to the public, partners and other agencies. A strategy for highlighting additional recreational opportunities by area and by activity is in place and efforts are underway to showcase at least two more specific activities and a state-by-state “bucket list” of exemplary recreational activities and destinations by the fall of 2016.
Regulatory Roadmap Initiative, State of Washington
When state of Washington businesses asked the state Department of Commerce (Commerce) to help simplify regulatory requirements, the agency set out to apply Lean methods to state permitting processes. While businesses appreciated improvements at individual agencies, Commerce soon realized the problem was much more complex than any single regulation: businesses were spending large amounts of time researching all of the state and local regulations and then trying to navigate through them. For example, opening a restaurant can involve requirements from more than 17 different city, county, and state regulatory agencies. Guided by businesses' ideas of what would provide the most value, Commerce worked with the restaurant community, local jurisdictions, and regulatory agencies to develop a better approach. The result was an online "roadmap" that distills all local and state requirements into easy-to-understand, sequential worksheets and checklists for opening a new restaurant. Examples of typical restaurant start-ups and planning tools that identify "trigger issues" help business owners avoid costly regulatory surprises. The pilot Restaurant Roadmap, started in Seattle in 2013, is saving prospective restaurateurs time, money, and mistakes. The concept was adapted for other cities throughout 2015 and 2016. Building on this success, the roadmap approach is expanding into other industries. Commerce convened manufacturers to learn about their regulatory concerns, and heard once again that regulatory information was unpredictable and difficult to find. Manufacturers wanted access to technical details to quickly make feasibility decisions, before hiring a consultant or architect. Based on their input, Commerce produced a Manufacturing Roadmap that includes interactive tools to assess costs, timelines, and overall feasibility of potential facility sites.
Manufacturers reported that the roadmap could have saved them two months of combing through city codes and agency websites trying to understand if a potential site would pencil out. Like the restaurant pilot, the Manufacturing Roadmap is now being replicated in several other cities. A new roadmap for the construction sector is also under development.
Smart Chicago Collaborative, City of Chicago, IL
Smart Chicago was born in the conversations of the early to mid-2000s around closing the digital divide. The culmination of these conversations was the May 2007 report titled "The City that NetWorks: Transforming Society and Economy Through Digital Excellence". There were eight central recommendations in this report, including that the city should recruit committed civic leaders to organize and launch the Partnership for a Digital Chicago, a new nonprofit entity, housed at The Chicago Community Trust and led by corporate, philanthropic, city, community, and technology industry representatives. Its mission would be to ensure that all of Chicago achieves digital excellence and takes advantage of the social and economic opportunities that arise from universal use of digital technology. This idea, the Partnership for a Digital Chicago, became the Smart Chicago Collaborative. Smart Chicago's first undertaking was to help distribute Chicago's funds under the National Telecommunications and Information Administration's (NTIA) Broadband Technology Opportunities Program. Since 2011, Smart Chicago has created and implemented several nationally recognized community technology programs that fill gaps in the civic technology ecosystem and disseminate the benefits of technology to middle- and low-income Chicagoans: the Civic User Testing Group (CUTGroup), Smart Health Centers, Youth-Led Tech, the Chicago School of Data, the Chicago Health Atlas, and Connect Chicago, to name a few. Smart Chicago has also documented and published all of its methods through blogs and books, encouraging and sometimes directly assisting other cities seeking to replicate its success.
Smartphones Enable Smart Supervision, State of Oregon
Outreach Smartphone Monitoring (OSM) uses predictive technology to recognize changes in behavioral patterns by tapping into human/smartphone interaction. The program's premise is that the smartphone is the ideal vehicle for distributing resources to individuals and collecting the data needed to see what really reduces recidivism. Working with community supervision agencies, the Honorable District Judge Ann Aiken, and Mark Sherman of the Federal Judicial Center, the team designed an application to replace the ankle bracelet by incorporating a Bluetooth biometric wristband, remote blood alcohol testing using a Bluetooth breathalyzer, rehabilitative resources, and traditional electronic monitoring. The OSM smartphone and web application is now in use in over 20 states, with customer groups including drug courts, pretrial services, probation, reentry, juvenile supervision, treatment facilities, and DUI law firms.
Startup in Residence, City and County of San Francisco, CA
Startup in Residence (STIR) connects the public sector directly with innovative technology entrepreneurs to help solve challenges faced by city government, and to make government more accountable, efficient, and responsive. For 16 weeks, startups help departments unpack issues with data analysis and prototype solutions refined through user testing, gaining insight into civic needs so they can develop products that support critical community services. Announced in San Francisco in 2014, the program's first cohort drew nearly 200 startup applicants from 25 cities and countries, from which the six most promising were selected to collaborate with government agencies over 16 weeks to build new products and services. All six of these collaborations resulted in innovative products for government. One of the most exciting outcomes was a solution to guide blind and visually impaired airport customers to their gate and other services. The application was built by indoo.rs, a company from Vienna, in collaboration with the San Francisco International Airport and in consultation with Lighthouse for the Blind, an SF-based nonprofit that advocates for the blind and visually impaired. The airport installed nearly 500 iBeacons in Terminal 2 and shared detailed maps and resources down to the location of power outlets, and is planning to scale the technology and adapt the software into multiple languages. With lessons learned from the initial cohort and a three-year grant from the US Commerce Department, STIR was formally expanded in January 2016 into a regional program with Oakland, San Leandro, and West Sacramento. This multi-city collaborative has shared nearly 27 challenges for technologists and entrepreneurs to tackle.
After the 16-week program, the government agencies and startups can enter into a commercial arrangement through the usual competitive process, an RFP. Based on San Francisco's experience from 2014, that process has been streamlined from months or years down to weeks by making the call for startups itself serve as the RFP.
The Thingstitute, County of Montgomery, MD
The Thingstitute is a first-of-its-kind living laboratory for Internet of Things (IoT) technology, providing an unparalleled testbed for start-ups, established companies, and research institutions. Housed in Montgomery County's Offices of the County Executive, the Thingstitute designs and operates testbeds to enhance the quality of life for local residents. In 2014, the County launched its first IoT project, the Smart Community Alert Network (SCALE), as part of NIST's Global City Teams Challenge (GCTC). SCALE was envisioned as a testbed to pilot IoT technology that would help seniors aging in place live independently longer and have easier access to services. Interest in SCALE grew quickly, and it was eventually recognized by White House Chief Technology Officer Todd Park at a GCTC event. Based on this positive response, the County Executive decided to expand the project, and the Thingstitute was created. Announced in January 2015, the Thingstitute is the first initiative of its kind in a local government. Its sole focus is to create IoT testbeds that help attract the latest technology resources to the County through innovative pilot, prototype, and proof-of-concept projects that help improve the lives of County residents. To date, the Thingstitute is operating three projects: SCALE, Smart Agriculture (enhancing economic opportunities in small-scale agriculture), and the Smart Transit Spotlight project (improving the transit rider experience and exploring connected vehicle technology). The Thingstitute has also integrated community education into its mission. To address concerns regarding privacy and security, the Thingstitute works with residents to help build awareness and acceptance of smart city, IoT-based deployments. The Thingstitute delivers workshops and media that help explain what new technology can and cannot do. Most importantly, the testbeds make the technology real. Policymakers, residents, and community organizations can see, touch, and experience the technology firsthand.
This goes a long way towards building public trust in technology that could save lives if deployed.
Using Instagram to Address Blight, City of Mobile, AL
The City of Mobile is the first city in America to develop a comprehensive, digitally mapped inventory of every blighted residential structure. To conduct the city-wide survey, Mobile, led by the city's Bloomberg Philanthropies-funded Innovation Team, utilized Instagram to geo-locate blighted properties while documenting more than $83 million in market value lost to residents. Capitalizing on Instagram's capabilities, the City created a brand-new mobile app that allows rapid cataloging and city-wide assessment. With new data on the exact scope of the problem, the City can ensure the right resources are deployed at the right time to the right property. The data collected through this effort revealed that blighted properties comprise two percent of Mobile's housing stock and that 25 percent (13,188) of Mobile's homes sit within 150 feet of blight. Each of those homes loses an average of $6,300 in value, an $83 million loss city-wide. Historically, the Nuisance Abatement Ordinance has fallen short of producing dramatically better outcomes for neighborhoods in Mobile. Tougher penalties for owners of unsecured structures, which can become a haven for criminal activity and bring down surrounding property values, will remove incentives for abandonment and neglect. If the owner has taken no action after a violation, the City will take immediate steps to remedy the problem, with the costs borne by property owners rather than taxpayers. With stronger enforcement tools in place, the City will be able to free 2,600 homes from the effects of blight, restoring more than $10 million in real estate equity to local homeowners. Beyond gathering the raw data, Mobile has worked to understand why blight occurs and how it became so pervasive.
Over the next two years, the city will raise the profile of blight as an issue and look to businesses, nonprofits, and community groups to invest in neighborhoods and help residents keep their properties maintained. The Instagram initiative only helped define the blight problem, but it is rare that a city process can be completely changed at almost no expense; from that standpoint, the initiative has been hugely successful.
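Once every blighted structure is geotagged, the "within 150 feet" statistic reduces to a proximity test over coordinate pairs. A minimal sketch of that computation, using a flat-earth approximation that is adequate at block scale; the conversion constant and all coordinates below are illustrative assumptions, not the city's actual data:

```python
import math

# Rough feet per degree of latitude; an assumption adequate near Mobile, AL.
FEET_PER_DEGREE_LAT = 364_000

def distance_feet(lat1, lon1, lat2, lon2):
    """Approximate ground distance in feet via an equirectangular projection,
    fine for distances on the order of a city block."""
    mean_lat = math.radians((lat1 + lat2) / 2)
    dy = (lat2 - lat1) * FEET_PER_DEGREE_LAT
    dx = (lon2 - lon1) * FEET_PER_DEGREE_LAT * math.cos(mean_lat)
    return math.hypot(dx, dy)

def homes_near_blight(homes, blighted, radius_ft=150):
    """Count homes lying within radius_ft of any geotagged blighted structure."""
    return sum(
        1
        for home in homes
        if any(distance_feet(*home, *b) <= radius_ft for b in blighted)
    )
```

Running this test over the full housing inventory is what yields the 13,188-home figure the city reported, and pairing it with an average per-home loss gives the city-wide dollar impact.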
Village Green Project, Environmental Protection Agency
The Village Green Project provides local-level, robust, real-time air pollution measurements using low-cost monitoring sensor technologies housed in a community-friendly park bench. This solar- and wind-powered system is in demand by cities across the US because it provides real-time information on local air quality. Each Village Green station continuously measures two common air pollutants (ozone and fine particulate matter), as well as wind speed and direction, temperature, and humidity. The measurements are transmitted to a website every minute. The stations are currently all located in public environments, including elementary schools, public libraries, the National Zoo, a national park historic site, and a public children's garden. Not only do the stations engage the public in learning about local air quality, but they have also been shown to compare closely with higher-cost monitoring stations and to produce data suitable for research studies on air pollution trends. Engaging the public in learning about local air quality is just as important as the technological advances. While the first prototype was in its infancy, the Village Green research team engaged with the public through blogs and videos. Continuing outreach has been significant, with Environmental Protection Agency and partner social media efforts amplifying the project. In one case, the launch of the Philadelphia station had over 3 million impressions and nearly 300,000 accounts reached on Twitter, and the Village Green Project has been covered in media outlets ranging from Fast Company to local news stations.
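The per-minute transmission described above amounts to bundling each station's readings into a timestamped record. A hedged sketch of what one such record might look like; the field names and units here are assumptions for illustration, not the project's actual wire format:

```python
import json
from datetime import datetime, timezone

def minute_record(station_id, ozone_ppb, pm25_ugm3, wind_mps, wind_deg, temp_c, rh_pct):
    """Bundle one minute's readings into a JSON record of the kind a station
    might transmit to the project website (field names are assumptions)."""
    return json.dumps({
        "station": station_id,
        "utc": datetime.now(timezone.utc).isoformat(timespec="seconds"),
        "ozone_ppb": ozone_ppb,    # ozone, parts per billion
        "pm25_ugm3": pm25_ugm3,    # fine particulate matter, micrograms/m^3
        "wind_mps": wind_mps,      # wind speed, meters per second
        "wind_deg": wind_deg,      # wind direction, degrees
        "temp_c": temp_c,          # ambient temperature, Celsius
        "rh_pct": rh_pct,          # relative humidity, percent
    })
```

A record like this, posted once a minute, is also what makes the comparison against higher-cost reference monitors straightforward: both produce time series that can be aligned on timestamps.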
What Matters for Health, Community PlanIt, City of Boston, MA
Community PlanIt (CPI) is an online engagement game platform, designed by the Engagement Lab at Emerson College, that creates a new space for conversations within a community; it has been used around the world on local planning topics ranging from water quality to youth unemployment. Planning meetings are frequently beset by a lack of diversity, learning, and trust, and a surplus of single-issue activists, incivility, and misunderstandings; CPI augments existing offline engagement efforts by stepping up where face-to-face meetings often fall flat. Structured as a series of time-limited missions in which players complete an array of challenges and respond to questions, CPI provides a playful framing that lets planners guide citizens through the narrative of the planning process, creating opportunities along the way for learning, civil conversation, and meaningful input. Typical CPI games last three weeks and require about one hour of gameplay per week. The outcome of gameplay is data that is directly applicable to the planning process: CPI not only builds trust between citizens and organizations, but is itself a powerful data collection tool that allows meaningful analysis of citizen input to incorporate into the planning process. Players learn about local issues, connect with each other, and suggest solutions to problems. Each game ends in a face-to-face Game Finale, where players meet with each other and discuss the results of the game with planners and decision makers. A Tech for Engagement grant from the Knight Foundation provided seed funding to develop the platform in 2012, with implementations in Detroit (Detroit 24/7) and Philadelphia (Philadelphia 2035), and most recently with the city of Boston and the World Wildlife Fund (2016).
Since then, the game has been used in a wide array of contexts that go well beyond city master planning: from setting public health priorities in neighborhoods to addressing wastewater management at the regional scale, and from social media policy-setting in individual schools to tackling the issue of youth unemployment in developing countries at the national scale.
WindyGrid/OpenGrid, City of Chicago, IL
WindyGrid is a real-time situational awareness platform that allows Chicago to break down data silos and view activity across street operations, licensing, and public safety. It was recently released as an open source project that others can adopt for internal operations or to improve their open data portals. The initial version of WindyGrid, in the spring of 2012, existed only as a complicated database that combined a handful of data sources: 911 calls, crimes, and bus locations. Later iterations grew the number of data sources, but the system was not usable by anyone beyond trained database administrators. Ahead of hosting a NATO summit that would require significant work from dozens of city agencies, the City of Chicago developed a user-friendly, map-based interface. For less than $100,000, the city built an application that allowed a central query across public safety, licensing, transportation, social media, and weather data. Equivalent systems sold by large vendors typically cost far more and have longer implementation cycles than the city could afford or install ahead of the summit. In the spring of 2013, the city rolled out WindyGrid to other departments and continued to improve the application. More departments used WindyGrid to aid their operations, extending beyond the original public safety scope, and in turn the city implemented new feature requests to help these teams. In 2015, the City created WindyGrid 2.0, a significant rewrite composed entirely of open source components that introduced compatibility with mobile phones and tablets. At the same time, the City launched OpenGrid, the open source core that drives WindyGrid. Whereas WindyGrid is used internally, OpenGrid can be adopted by other cities and used for situational awareness. OpenGrid was also designed to be compatible with the rising number of open data portals.
Whereas WindyGrid focuses on internal city operations, OpenGrid gives the public the same ability to explore data. The OpenGrid platform is free, whether used for internal situational awareness or for navigating open data.
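Compatibility with open data portals comes down to speaking the portal's query API. Chicago's portal is Socrata-based, so a tool in the OpenGrid mold can compose SoQL queries against any published dataset. A minimal sketch of such a query builder; the dataset ID and field names below are illustrative assumptions, not verified OpenGrid behavior:

```python
from urllib.parse import urlencode

def build_soql_url(base_url, dataset_id, where, order_by, limit=100):
    """Compose a Socrata SoQL request URL for one dataset using the
    standard $where/$order/$limit query parameters."""
    params = urlencode({
        "$where": where,
        "$order": order_by,
        "$limit": limit,
    })
    return f"{base_url}/resource/{dataset_id}.json?{params}"

# Hypothetical example: recent open 311 service requests.
# The dataset ID and field names are assumptions for illustration.
url = build_soql_url(
    "https://data.cityofchicago.org",
    "v6vf-nfxy",
    "status = 'Open'",
    "created_date DESC",
)
```

Because the same query shape works against any Socrata-hosted portal, a viewer built this way ports between cities by swapping the base URL and dataset IDs.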