Scattered around the office where I work are various books that frequently come to hand as I track down definitions or explanations of things I do not know or cannot recall – information about metallurgy or various process stages of metalcasting, or historical details about these things, or clearer explanations of concepts that have not clicked with me from previous research. That’s right, I describe this process as ‘research’, though it’s a fairly loose application of a fairly expansive term, and there’s nothing remarkable about my effort in these matters.
These efforts are made so that I might fill in the various gaps in my knowledge. The results could never be understood as my own insights, creations, or accomplishments.
Yet these efforts are analogous to an emerging capability that Microsoft is making available to everyone, and encouraging us to understand as Artificial Intelligence. It is artificial, because it’s a work of artifice, but to understand it as intelligence is a stretch, and maybe a deception.
We have been living in the era of Artificial Intelligence (AI) for more than a decade, and its capabilities are easy to appreciate. In a metalcasting operation, it’s AI that helps to anticipate defects in mold filling or casting cooling, or numerous other process steps. AI speeds the work of designers and engineers. AI receives the requirements of a task and applies them to a set of data that an engineer or team of engineers could not possibly review and evaluate without conducting hundreds or thousands of hours of calculations and test runs. So, AI draws intelligence from those engineers. What is “artificial” here is the calculation and evaluation performed by the machine rather than by the engineers. The intelligence is the result of their work and input.
Microsoft has amplified the capabilities of its Bing search engine by linking that resource to another recent development called ChatGPT – a “content generation” function that expands on the capabilities we find in digital assistants. With ChatGPT, you can go well beyond asking your digital assistant the address of the nearest coffee shop or the score of Super Bowl III: you can propose highly specific questions and command tailored results – in 100 words or less, or including particular words, for example. It will deliver such detailed responses more or less instantly by scanning the world wide web for word associations and frequencies, and referencing language and grammar models to compose a specific reply to your specific query.
If you sense that ChatGPT could write a student’s homework or draft replies to school or job applications, you’re right. And that’s been at least part of the pitch from the developers.
Just to emphasize how emergent all this discussion is, ChatGPT arrived only last fall (2022), and instantly raised anxieties among people whose professional work is based in content generation – including writers and editors like me, as well as professional speakers – and content curators, like researchers, teachers, and publishers.
The upset it has caused is not unreasonable, and not unexpected. It is also more than a bit reminiscent of the protests of drivers who object to autonomous vehicles, of machinists who resist the arrival of robots and cobots, or of anyone over the past 175 years or so who has objected to a mechanized or automated system that would do a job at least as adequately, but more quickly or less expensively, than they can. Those writers and teachers, and others, realize they have been found out, and they don’t want to be replaced.
This is an ironically appropriate reaction because the entire proposition of Bing/ChatGPT is to replace individuals’ thought processes. That intention is defensible enough when it comes to one’s need to locate a nearby florist or dry cleaner – who has an up-to-date Yellow Pages anyhow? Whether the evolution of search engines first into digital assistants and now into custom content generators is the work of some programming wizard, or the outcome of the programs’ own inherent “learning” capability, it plays on the same unhealthy impulse in human nature to avoid effort and present others’ work as one’s own. Writers, researchers, teachers, speakers, and others have historically maintained at least a professional disdain for those caught misrepresenting others’ work as their own.
But still the objections to ChatGPT miss the greater threat it poses. It removes creativity and originality from the effort to solve problems. In effect, it removes the human factor.
The peril to all of us in the era of AI is to minimize the importance of intelligence, simply because it takes too much effort, while over-emphasizing the necessity for solutions. The “intelligence” in these applications is actually crowd-sourcing – drafting a solution based on commonality or popularity.
Back in my office, the books sit on shelves or stacked on the desk or a chair, ready to serve up the explanations I require. The intelligence is all there, not within me. And knowing the difference is encouraging.