Data-Smart City Solutions

By Chris Bousquet

Our From Research to Results series highlights contemporary academic research with important practical implications for policymakers.

In today’s tech world, a decades-old philosophical question has reared its head to vex designers, company executives, and lawmakers alike: the trolley problem. The thought experiment asks us to imagine a runaway trolley barreling down the tracks toward five people who are tied up and unable to move; we can flip a switch to divert the trolley to a side track where one person is tied up. The experiment demands a decision, but what is the ethical choice? The contemporary version of the conundrum walks into even more contentious territory: should a self-driving car be programmed to kill its driver in order to save a group of kids who have wandered into the road, or hit the kids to save the driver?

This puzzle illustrates the host of pressing ethical questions that accompany many modern tech innovations, troubling regulators and companies alike. Technology has consequences, and governments and corporations must do their part to ensure that designers weigh those consequences and build in safeguards when developing products. At the same time, regulators want to avoid stifling innovation or imposing rules that will become obsolete as technologies change. The paper “Composite Ethical Frameworks for IoT and other Emerging Technologies” places itself in the middle of this debate, offering recommendations for crafting ethical guidelines to regulate emerging technologies.

The authors outline three potential approaches to developing ethical frameworks. The first is the deontological approach, which focuses on regulating the means of the engineering process and calls for specific principles to govern design. The second is the teleological approach, which emphasizes considering the consequences of technological decisions and developing procedures to ensure better outcomes. The third is the virtue ethics approach, which calls for technologists to develop ethical rules organically through experimentation and discourse.

The paper argues that the regulation of the Internet of Things (IoT) and other technologies should draw on some combination of these three approaches. The authors call for a better understanding of the end goals behind innovations and more informed discourse around “good engineering” in order to produce better outcomes for engineers, their institutions, and society at large. Governments and companies should encourage technologists to set baseline goals, research potential consequences, and experiment, collaborate, and iterate toward those outcomes.

The researchers emphasize that regulators should remain wary of imposing strict deontological structures on emerging technologies. Given the rapid changes in technologies like IoT, artificial intelligence (AI), and machine learning, regulations of specific practices may become obsolete in a matter of months and may stifle innovation. However, the paper maintains that conversations about deontological guidelines, which often take the form of a bill of rights or some other guiding document, are productive at all stages of technological maturity, because they begin to establish governing principles that can evolve over time.

With respect to IoT, AI, and other contemporary technologies, the authors argue that deontological guidelines should not be the first or only structure regulators consider. These technologies raise a number of teleological concerns about safety (as in the trolley problem), environmental side effects, bias, and job displacement through automation, among others.

In the realm of IoT specifically, the paper calls for consideration of teleological concerns in four areas. The authors argue that all IoT design should prioritize (i) ease of use, (ii) reliability, (iii) safety, security, and privacy, and (iv) offline use. These considerations ensure that users can operate these technologies effectively, even without internet access, and can trust the claims, data, and decisions that IoT systems make.

To shape guidelines based on these values, governments and companies should promote a virtue ethics approach that benefits from (i) a strong professional culture institutionalized in engineering associations and (ii) collaboration in larger, multi-stakeholder forums. The authors point to the value of discussion and collaborative problem-solving, as well as the power of social groups to shape behavior among technologists. They quote sociologist Emile Durkheim, who argues, “Now there is only one moral power—moral, and hence common to all—which stands above the individual and which can legitimately make laws for him, and that is collective power.” In addition to relying on the influence of these groups, governments can draw on the insights from these collaborations to create more direct, institutionalized regulations.

Professional groups are valuable for encouraging cooperative innovation around these teleological questions and for wielding collective power, but they often appeal only to the single stakeholder they represent. Multi-stakeholder forums, by contrast, engage many voices and develop more diverse, cross-sector solutions. The authors point to the IoT Dynamic Coalition at the United Nations Internet Governance Forum (IGF) as an example of effective multi-stakeholder collaboration. Working on ethical approaches to shape IoT policy and the market space, this group has sought input from academia, government, and the private sector to create a living document that is continually updated to integrate new recommendations and accommodate technological change. Governments may wish to convene similar groups to draw insights from a variety of sources, or to participate in existing forums.

The authors remind us that, because of the massive and ever-evolving implications of emerging technologies, the ethical responsibility of engineers and designers is at an all-time high. To manage these rapidly advancing technologies, governments and companies should continually revisit and revise established deontological codes and promote deliberation over teleological questions in collaborative communities.

About the Author

Chris Bousquet

Chris Bousquet is a Research Assistant/Writer for Data-Smart City Solutions. Before joining the Ash Center, Chris worked at the Everson Museum in Syracuse, NY and wrote for DC Inno in Washington, D.C., where he covered tech policy, cybersecurity, and startups. Chris holds a bachelor’s degree from Hamilton College.
