Expert Column: The Classic Conundrum of ChatGPT

March 10, 2023

Artificial Intelligence inventions like ChatGPT and humans have a lot to learn about each other. That irony captures the contradictory eruptions of interest, hype, and concern about ChatGPT, a bot (short for robot) created and released by OpenAI only last November. ChatGPT uses technology somewhat inscrutably known as a Generative Pretrained Transformer. Most simply, think of it as a digital tool for writing prose, roughly comparable to using a calculator for math, an online search engine for research, or a word processor to ease the composition, formatting, and revision of text. At its current stage, the experimental writing bot predicts and suggests language, best suited for short responses to prompts submitted by users. ChatGPT operates by predicting, upon request, what the user wants to write, not unlike the way spell check and word processing anticipate desired spellings and words; and we have all experienced how that can be useful, annoying, and at times even embarrassing.

A bot is a software application programmed to perform tasks independently and to simulate human activity by interacting with other computer systems and people without guidance. Bots are designed to perform tasks more quickly, reliably, and accurately than humans can, especially for jobs that are routine, repetitive, and vast.  AI applications only appear to engage in human thought.  They are not actually thinking. At least not yet.

Businesspeople, professionals, and educators who learn how to make good use of new advanced networked computer tools like ChatGPT, while managing risks and mitigating potential misuse, will have a competitive advantage over those who ignore, outright reject, or fight the inevitable widespread adoption of relentlessly advancing technologies. 

Easier said than done, as academics are known to be slow adopters. It was so in ancient Greece, when Socrates opposed the then-new technology of writing itself, and it has been so continuously ever since. For reasons Socrates would recognize, many teachers and schools have already banned the use of ChatGPT, over concerns about cheating, interference with teaching and learning, and the fostering of bad study habits. Baby boomers remember when the now routine student use of calculators was suspect, and the digital natives of Gen Z still find many classrooms where teachers prohibit laptops.

Lawyers and law schools are also notoriously late adopters, perhaps more so than any other learned profession, often for understandable and even commendable reasons. Those schooled in law insist on weighing evidence before deciding. We prize the probative give-and-take of argument, which takes time to develop. We are comfortable with precedent and established practices, and we understand that change can be disruptive, have unforeseen consequences, and be unfair, especially to those who are unable to adapt or who relied on the status quo.

In law schools and other academic settings, keeping up with a world where, paradoxically, the only constant is continuously accelerating change can be encumbered by outdated conventional wisdom, regulatory constraints, market realities, and the unending annual cycle of the academic calendar. Educators do not have the luxury of putting their schools in drydock to scrape off barnacles and retrofit the institution. Instead, innovation and experimentation can only be attempted while operationally underway, and they involve navigating complex internal governance, legal requirements, and a fiercely competitive, highly transparent academic ecosystem that is acutely sensitive to the unexpected external shocks which frequently occur.

Even so, despite the challenges, we should push ourselves continuously to consider prudent change and to pursue creative solutions to the age-old challenge of making the best use of new technology in business, the professions, and every level of education. Consider, for example, the classic worry that innovations like ChatGPT promote cheating by students. Why not seize the opportunity to focus campus conversations on personal responsibility for intellectual integrity, and on the understanding that academic fraud is both wrong and self-destructive, because cheaters deprive themselves of getting more out of their education? Besides, students and teachers will quickly learn that there are many ways to tell when writing comes from a bot.

Then too, surely teachers can incorporate ChatGPT-type applications into lessons about how to write better than bots. After all, there are unprogrammable elements of the human condition that algorithms cannot do justice. Flesh-and-blood writers are, in the end, irreplaceable because of humanity’s artful penchant for understanding, wisdom, judgment, purpose, abstraction, creativity, poetry, metaphor, unpredictability, love, friendship, compassion, empathy, joy, inspiration, dedication, sacrifice, surprise, terror, sadness, grief, physical and mental pain, laziness, complacency, negligence, stubbornness, irrationality, deceit, meanness, bias, hatred, deviance, antisocial behavior, illness, death, and of course faith, hope, and charity. Our desires for relationships, liberty, privacy, and safety also make every one of us, because of, not despite, all our luminous foibles, more like Captain Kirk than Mr. Spock or IBM’s Watson.

Technology-driven change, hopefully for the better, is inevitable. Yet while we should be open to all the exciting things new inventions can do to improve every aspect of our lives, there are no easy answers about how and when to adopt new technology. ChatGPT is imperfect and was released while still under development. It is prone to repetition and irrelevant content, it can lead to unintended plagiarism given how it relies on massive databases of published material, and it is not yet particularly useful for many workplace communications involving teamwork, evaluations, and nuanced personal conversations. To be or not to be early adopters and improvers of ChatGPT: that is a classic question.


Allard is the founding Dean of the Jacksonville University College of Law. Previously he served as President and Dean of Brooklyn Law School. Throughout his career in government service, as a senior partner in some of the world’s most respected law firms, and as an innovator in higher education, he has been deeply involved in the impact of new technology on society, law, and policy. Allard’s expertise is reflected in his extensive writings and teaching, and in his work as a frequent speaker and commentator. He is particularly interested in historical comparisons with our contemporary age of digital and biomedical discovery.

Author

Nicholas Allard, Esq.

nallard@ju.edu
