When innovation goes wrong
Technological innovation is the lifeblood of our society, but at what point does it go wrong? According to Mary Shelley’s mythic story, Frankenstein, it’s when we’re no longer able to control our inventions. In Shelley’s myth, Victor Frankenstein learns this lesson after the machine-like creature he recklessly invents spins out of control and goes on a monstrous murdering spree. It was a terrifying vision, which is why Frankenstein monsters have captured Hollywood’s imagination ever since.
As mentioned in Part I of this article, "Runaway technologies," Frankenstein monsters, like other mechanical monsters in science fiction, are mythological metaphors. They serve as a warning about how technological innovation goes wrong: when we lose control of our machines, they turn into monsters run amok. In this way, Frankenstein monsters symbolize the unintended consequences of technological innovation.
Frankenstein monsters and risk society
As mythological metaphors for runaway technologies, Frankenstein monsters convey a prophetic social message. Innovation assuredly brings many benefits. Unfettered, however, it opens a Pandora's box of unforeseen risks with grave repercussions for all of society. Quite fittingly, German sociologist Ulrich Beck described this uncertain condition as "risk society."
According to Beck, Risk Society is a modern society forced to manage risks that it itself produces, often from accelerated technological transformations (Beck, 1992).
Consider, for example, the political controversy plaguing social media. Following the 2016 U.S. presidential election, researchers discovered that Facebook had been used to advertise and proliferate wide-ranging, divisive disinformation, including foreign propaganda designed to target and mislead voters. This was precisely an unforeseen risk: Facebook wasn't intentionally designed for these nefarious purposes, even if it was designed to serve ostensibly well-intentioned advertisers.
As Facebook's own intelligence and security experts wrote in a report titled "Information Operations and Facebook":
Facebook sits at a critical juncture. Our mission is to give people the power to share and make the world more open and connected. Yet it is important that we acknowledge and take steps to guard against the risks that can arise in online communities like ours. The reality is that not everyone shares our vision, and some will seek to undermine it.
Ethics of innovation
What are we to do in this brave new world? Alas, there’s no closing Pandora’s box. Coping with risk is the price we pay for innovation, and modern society depends on innovation. Therefore, if we can’t wish away the risks that come with innovation, we need to manage them. The question is, how?
Fortunately, Shelley's myth gives an inkling of how to manage the risks associated with innovative technologies. Essentially, it entreats us to think about the ethics of innovation. Frankenstein made a mechanical monster because he didn't consider how his invention could impact his friends, family, and neighbors. As a result, this runaway technology put everyone at risk. Indeed, it cost many lives.
And yet, it wasn’t some unworldly evil. Frankenstein himself created the problem.
Hence, if runaway technologies are mechanical monsters of our own making, preventing them means never innovating without considering the ethical implications of our inventions.
Ethical design vs. Frankenstein monsters
As someone with a background in technical communication and user experience, I like to think about technology ethics from a design perspective. Regarding the political controversy surrounding social media, for instance, here's a question we might ask about how big tech harvests user information, especially on behalf of advertisers:
How could we redesign social media to safeguard the personal data of users so they aren’t surreptitiously targeted by misleading propaganda or disinformation?
One possible solution is for regulators to require Facebook and other social media companies to act as fiduciaries, meaning they'd be obligated to protect personal data in the best interest of the user (as opposed to the commercial or ideological interests of advertisers). In short, social media would be designed to treat users as valued customers, not as resources for online ads, by respecting their privacy rights and sharing information only with trusted third parties.
Of course, that’s just one example, but by addressing such questions in theory, we can make innovation more ethical in practice. After all, morality resides not in technology per se but in each and every human being who designs and uses it. If we ignore this fact, we create Frankenstein monsters at our own risk.
References
Beck, Ulrich. (1992). Risk Society: Towards a New Modernity. London: SAGE Publications.
Shelley, Mary. (1994). Frankenstein. New York: Dover Publications, Inc.
If you have any thoughts on Frankenstein monsters, feel free to leave a comment below, or explore other Technology and Culture articles on this site.