PANEL 8 /// INFLUENCED BY TECHNOLOGIES: ETHICAL ISSUES
CONVENOR: STEFANO CALBOLI
All inquiries about the panel should be sent to [email protected]
The panel aims to promote the investigation of ethical issues raised by employing technologies (e.g., social robots, wearable technologies, and virtual agents) to purposely influence users through typically concealed means, such as nudges and persuasive techniques. The literature on the ethics of nudging through technologies (see Borenstein & Arkin 2015; Yeung 2016), on one side, and on the ethics of persuasive technology (see Fogg 2003; Siegel et al. 2009), on the other, has revealed phenomena in need of in-depth investigation. In cases in which humans influence humans, nudges and persuasive techniques are influences primarily investigated within the behavioral sciences, the cognitive sciences, and social engineering. Insights from these disciplines appear extensive and precise enough to provide choice architects with means that effectively mold human behavior. However, the relevance of such insights when human–technology interactions are in focus deserves a more thorough investigation. In particular, the ethical implications of influencing technological tools call for further analysis, especially in cases in which their influences can easily remain unnoticed by those who are influenced. Evidence and ethical considerations regarding concealed influences exerted directly by humans may not apply, mutatis mutandis, to cases in which technologies are employed instead. This potential asymmetry is the rationale behind the panel.
The panel aims to develop an interdisciplinary research agenda connecting behavioral economics, the cognitive sciences, the behavioral sciences, social robotics, social engineering, and captology. Questions to be addressed include, but are not limited to, the following:
● Should influences exerted by technologies be expected to be as effective as those exerted by humans?
● Are there specific ethical challenges in place when influences are exerted by technologies?
● Does the use of technologies open new solutions to the ethical challenges associated with nudges and means of persuasion?
● What kinds of new knowledge or epistemic influences, if any, are typical of influences exerted by technologies?
● Could influencing technologies help us taxonomize influences and identify the ethical issues specific to each kind of influence?
● How should the responsibility for detecting influences exerted through technologies be shared between influencers and those who are influenced?
REFERENCES
Borenstein, J., & Arkin, R. (2015). Robotic nudges: The ethics of engineering a more socially just human being. Science and Engineering Ethics, 22(1), 31–46. https://doi.org/10.1007/s11948-015-9636-2
Fogg, B. J. (2003). Persuasive technology: Using computers to change what we think and do. Morgan Kaufmann Publishers.
Siegel, M., Breazeal, C., & Norton, M. I. (2009). Persuasive robotics: The influence of robot gender on human behavior. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2563–2568.
Yeung, K. (2016). 'Hypernudge': Big Data as a mode of regulation by design. Information, Communication & Society, 20(1), 118–136. https://doi.org/10.1080/1369118X.2016.1186713