
Critical Studies of Education & Technology: AI … Minding Our Language

As Melanie Mitchell argues in a recent article in Science, experts and non-experts alike have long struggled to find the right words to describe AI. As a result, AI is an area of technology that is full of metaphors, many of which convey the idea of AI tools being capable of doing things that humans do – training, recognising, making decisions and so on. Pertinently, many of the most frequently used terms within the AI community evoke human brains and thought processes – what Mitchell terms ‘AI-as-mind’ metaphors. These include the ideas of neural networks, machine learning and, of course, artificial intelligence.

During the early decades of AI development, these mind-based allusions could be seen as aspirational. Indeed, Steve Woolgar argued in 1985 that framing AI as a quest to develop ‘intelligent’ machines gave AI developers a ‘seemingly endless’ remit to develop new technologies. High-profile challenges like the Turing Test could be used to manipulate public perceptions of what machine-based intelligent behaviour might look like – teasing the idea that AI development was edging ever closer to technology that could go beyond simply appearing to be intelligent:

“The successful design of a machine which appears to work intelligently can be taken as the grounds for presuming that intelligence is, after all, something more than the machine manifests. Hence the need to build a machine which is ‘really intelligent’” (Woolgar 1985, p.565).

In the 2020s, however, we are now seeing commercial AI products that are explicitly designed to present themselves as sentient. We now have chatbots and genAI interfaces that ‘talk’ with their users and spew out outputs laden with first-person chat (‘what can I do for you?’) and other affective disclosures.

Mitchell makes the point that the use of such language matters – shaping how society is responding to the rise of genAI. The more we presume this technology to be thinking like a human, the more comfortable we feel affording human-related rights and responsibilities to the technology. Thus, we are now seeing organisations decide that it is acceptable to replace human counsellors with chatbots. We are now seeing legal debates over the moral rights of robots. We are now seeing credulous discussions of how AI software can ‘perform’ when taking an examination or an IQ test.

Mitchell reasons that this is a slippery slope – leading to the inappropriate (and sometimes dangerous) application of AI technologies in contexts where humans are essential. If nothing else, it also serves to downgrade, downplay and demean what it means for a human to do the things that we are becoming happy to outsource to AI. For example, if we see ‘writing’ as the capacity to regurgitate other people’s written text, or counselling as the capacity to conjure up empathetic-sounding phrases, then this diminishes what we expect from the humans who do these tasks.

All told, Mitchell warns of the need to push back against the sloppy ways of talking about AI that have captured the imagination of the public and policymakers alike, and to be careful to use metaphors that accurately frame AI as a computational exercise in simulating specific tasks and processes. The ways in which we talk about technology matter!

References

Mitchell, M. (2024). The metaphors of artificial intelligence. Science, 386(6723). DOI: 10.1126/science.adt6140

Woolgar, S. (1985). Why not a sociology of machines? The case of sociology and artificial intelligence. Sociology, 19(4), 557–572.

 

This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.

The views expressed by the blogger are not necessarily those of NEPC.

Neil Selwyn

Neil Selwyn is a Professor in the Faculty of Education at Monash University in Australia. He has worked for the past 28 years researching the integration of digital technology into education.