Society on Social Implications of Technology Standards Committee
Expert committee developing global technical standards to ensure the ethical, sustainable, and responsible use of emerging technologies and data

Blogs

These blogs and videos are intended for everyone interested in the work of the SSIT SC and the ethical and social implications of technologies.  All blog posts and videos represent the opinions of the authors and do not necessarily reflect the official policy or position of the IEEE, its societies and affiliates, or the authors’ institutions.  Submissions are welcome.

 

Why is SSIT involved in Standards Making?

Submitted by Dr. Beth-Anne Schuelke-Leech and Dr. Sara Jordan
SSIT Standards Committee

Published originally in the IEEE SSIT Newsletter, July 2019, https://newsletters.ieee.org/society/SIT/2019/July/ES_TA_1033_2019_07_SSIT_Newsletter_Email%20%281%29.html

Discussions of the societal implications of technology fill the pages of important magazines and blogs like Fast Company and Wired. Widely read and highly cited pieces point to the need that engineers, developers, designers, and programmers have for practical advice on how to think through and measure the social implications of the technologies they are creating and managing.

In developing and manufacturing any technology, engineers, technical developers, and managers use technical codes and standards. Standards are everywhere. They are frequently created by international organizations or professional industry associations. For example, the International Organization for Standardization (ISO) publishes quality management system standards (ISO 9001) and environmental management standards (ISO 14001). The American Society of Mechanical Engineers (ASME) develops codes and standards for mechanical systems, from elevators, escalators, boilers, and pressure vessels to piping and plumbing systems. The American Society of Civil Engineers (ASCE) creates model building codes. The Society of Automotive Engineers (SAE) creates many of the standards used in designing and manufacturing automobiles and other transportation vehicles. Compliance with standards indicates conformance to recommended industrial practices. It allows customers and other interested parties to trust the quality and technical performance of the product, service, or process that they are using.

IEEE has been involved in standards making since 1890, when standards were first recommended for self-induction [1]. IEEE now has almost 1,300 active standards and another 600 currently under development [2]. While IEEE has been a significant player in the development of technical standards for over a century, it has only recently begun to bring together the experts necessary to draft standards that are socially mindful and ethically informed. Since 2016, the Society on Social Implications of Technology (SSIT) has collaborated with the IEEE Standards Association (IEEE-SA) to create projects and working groups designed to develop standards that are as rigorously designed as those from IEEE's purely technical side but that are conscious of the broader implications of emerging technologies. Following the same procedures required of all standards making in IEEE, the SSIT Standards Committee (SSIT-SC) has formed four standards working groups to develop standards within the P7000 series. Closely associated with the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems [3], the P7000 series [4] brings the process of standards development to bear on issues pressing for consideration by experts in technology and its social implications.

The term “standard” holds many connotations, and it is often used haphazardly in general discussions of technology. For IEEE, a standard is a consensus-based document that outlines the expectations for performance of a product or process. For the SSIT-SC specifically, the consensus-based standards development process includes dealing with ethically contentious technologies (such as emulated empathy or facial recognition), as well as assisting in the recognition and incorporation of the ethical or social implications that may reasonably emerge from the processes of design, development, demonstration, deployment, or decommissioning of a product. There may be many issues to consider, but the goal of the standards development process is to provide practical recommendations and guidelines for engineers, technical developers, designers, and manufacturers throughout the world.

Note: See https://standards.ieee.org/develop/index.html for more details about the standards making process at IEEE.

[1] https://ethw.org/IEEE_Standards_Association_History

[2] https://www.ieee.org/standards/index.html

[3] https://ethicsinaction.ieee.org/

[4] http://sites.ieee.org/sagroups-ssit/useful-links/

 

Why should ethical and societal concerns be of interest to engineers and IEEE members?

April 17, 2019

By Beth-Anne Schuelke-Leech

For most engineers, exposure to the ethical and societal impacts of technologies occurs during a mandatory undergraduate course on the topic. After 1986, that course often included the case of Roger Boisjoly and the Challenger Space Shuttle disaster.1 Boisjoly believed that there was a danger in launching the space shuttle at low temperatures. However, he was overruled and the Challenger launched, resulting in the loss of the space shuttle and the seven astronauts on board. The Boisjoly and Challenger case provides a convenient opportunity to look back on a decision and analyze its ethical implications.

With a few notable exceptions, ethical and societal impacts have never been the focus of engineering work. We agree to professional and organizational codes of conduct that espouse the protection of public safety and compliance with appropriate laws and regulations. There is no question that these are essential components of our training, but they often seem tangential to our core activities, which rarely present us with the kind of stark safety choice that Boisjoly faced.

Many of the technologies currently under development and emerging in the marketplace have the potential to be disruptive to society. Artificial intelligence, machine learning, automation, and robotics have the potential to displace humans, leaving some workers struggling to find meaningful employment while freeing up others to focus on value-added activities. Autonomous vehicles have the potential to transform transportation, making it safer by reducing the number of traffic fatalities and injuries, but also eliminating the need for human truck drivers. Biometrics, biotechnologies, personalized medicine, and genetic engineering have the potential to improve and individualize healthcare, but may also result in manipulations of embryos and human life. Smart cities, ubiquitous connectivity, and the internet of things may improve the provision and convenience of services for citizens and customers, but this may come at the cost of privacy, diversity, and democracy.

However, it is not just computer technologies that can be disruptive. Biotechnologies and materials science are other examples of fields that are rapidly changing and can have significant impacts and benefits. Innovations and new technologies have been changing individuals and society for many generations. What is different is that many new technologies are less transparent and have the potential to create larger changes than in the past. Consider the recent case of the Boeing 737 MAX 8, in which engineers used software to compensate for hardware changes, assuming that pilots would not notice the difference between the aircraft's original behavior and the newer, simulated one; problems arose when a sensor malfunctioned and pilots could not get the plane to respond as they expected it to.2 Another example is the Volkswagen diesel emissions software, designed to detect a regulatory emissions test and engage an emission reduction system that otherwise did not operate.3 The engineers at VW clearly understood that what they were doing was illegal, but it is unclear whether they really understood how it was unethical, whether they grasped the societal impacts of their decision (i.e., increased emissions and associated health problems), or whether they acknowledged that these impacts were their responsibility. Likewise, Elizabeth Holmes of Theranos literally created a black box that was supposed to take a small drop of blood and run thousands of diagnostic tests from the convenience of the customer's home.4 The box never worked. It remains a question whether this was fraud or simply an overly ambitious entrepreneur who never quite got the technology to work.

Were these engineers and developers unethical?  It is not as easy a question as it first appears.  Ethics is a function of both the individual and the group that the individual is in.   Evaluating the ethical and societal impacts of one’s work and the technologies that are being developed is not an easy task.  Often engineers and developers are so focused on the tasks that they have been given that it is difficult to contemplate the longer-term impacts that this work may have.

The recently released IEEE Ethically Aligned Design,5 published as part of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, provides an excellent overview of some of the challenges and issues of incorporating ethical and societal impacts into the development of AI and autonomous systems. The first paragraph establishes the importance of the work:

As the use and impact of autonomous and intelligent systems (A/IS) becomes pervasive, we need to establish societal and policy guidelines in order for such systems to remain human-centric, serving humanity’s values and ethical principles. These systems must be developed and should operate in a way that is beneficial to people and the environment, beyond simply reaching functional goals and addressing technical problems. This approach will foster the heightened level of trust between people and technology that is needed for its fruitful use in our daily lives. (p. 2)

Technologies bring both opportunities and challenges. Our job is to figure out how to develop and deploy technologies in a way that truly benefits people and reflects our values and aspirations. The SSIT Standards Committee was created to help facilitate the incorporation of ethical and societal concerns into standards, as well as the development and oversight of standards that address specific ethical and societal issues. Our goal is to build bridges between technical experts and all stakeholders who see the value, opportunities, challenges, and issues with technologies. We all have a role to play in the successful, prosperous, sustainable, and ethical future of our society.

Notes

  1. Boisjoly, Roger M. (1987), “Ethical decisions: Morton Thiokol and the space shuttle Challenger disaster,” paper presented at the American Society of Mechanical Engineers Winter Annual Meeting, Boston, Massachusetts.
  2. Kaste, Martin (2019), “After Boeing Crashes, New Attention On The Potential Flaws Of Software,” retrieved from https://www.npr.org/2019/03/24/705966447/software-is-everywhere-but-its-not-always-an-upgrade on April 17, 2019.
  3. Ewing, Jack (2017), Faster, Higher, Farther: The Volkswagen Scandal, New York, NY: W.W. Norton & Co.
  4. Carreyrou, John (2018), Bad Blood: Secrets and Lies in a Silicon Valley Startup, New York, NY: Knopf.
  5. Available at https://ethicsinaction.ieee.org/