Artificial Intelligence (AI) is becoming more advanced by the minute and evolving into a more integral part of society every day. Among the latest innovations is ChatGPT, an AI chatbot that produces humanlike conversational dialogue, drawing on the enormous body of text it was trained on and the prompts it is given. ChatGPT can research or draft anything from academic papers to company memos. This tool is also used in schools by both teachers and students.
As a recent college graduate who has watched those around me pick up ChatGPT, I can't help noticing that this incredibly impressive innovation can simplify day-to-day tasks like writing articles or emails (before you ask, yes, I wrote this myself). It can also make the workplace much more efficient. But should bosses credit work to an employee who gave a prompt to ChatGPT and received a document from the program? Should a teacher give a good grade to a student for a paper that amounted largely to asking AI to do their work?
Because it bears more directly on North Carolina policy, let's dive into the question of ChatGPT in schools in particular.
According to WRAL, certain school districts in North Carolina are embracing ChatGPT in their middle and high school curricula, most notably Wake County and Chapel Hill-Carrboro City. Others, like Granville County, are open to its use but have not yet developed an official policy.
ChatGPT’s advocates argue that the chatbot can bolster students’ experience in the classroom. To counter the worry that it hinders the development of necessary skills, some say that because this AI's outputs are not always entirely accurate, students can hone their critical thinking skills by deciding what is true and what is not after it spits out information. Millbrook High School history teacher Mark Grow told WRAL that ChatGPT can help students access information quickly in the classroom, creating an avenue for them to have “higher-level” conversations more quickly and become more “methodical curators of information.”
Other districts, like Wilson and Sampson counties, believe ChatGPT ought to be banned from school servers and excluded from the curriculum, arguing that it is nearly impossible to determine if students use it to cheat on assignments or tests and plagiarize academic papers.
It seems to me that those who are skeptical of the technology’s impacts are seeing the situation more clearly. ChatGPT largely does students’ work for them, eroding their work ethic in and out of the classroom. Perhaps just as worrisome, ChatGPT will replace the need for still-developing minds to learn research and writing skills.
Why spend hours working on a project or a paper when it can be done for you in seconds, with almost no risk of being caught? Why hone study skills when chatbots find answers to your questions in seconds? All of a sudden, students will find that their capacity for retention has noticeably weakened. Essentially, educators run the risk of trading vital skills like deep reading, research, writing, (true) critical thinking, memory and retention, and creativity for proficiency in AI usage.
The stifling of creativity is less talked about, but equally detrimental. If students use AI to write their work or do their research for them, their creativity is squandered every time. For example, if the same assignment is presented to a class of students and they all have ChatGPT do it for them, each one will receive a seemingly unique product, much as they would if they had done the work themselves. But, importantly, they did not do the work. Rather than let their unique perspectives shape their final product, they will watch as an AI chatbot instantaneously produces it for them.
Here is a little story from Business Insider to put this into perspective: Earlier this year, the military created a highly sophisticated AI robot capable of detecting humans approaching from relatively large distances. However, when human creativity was brought in to test the limits of the software, a group of Marines outsmarted the wildly expensive Pentagon tool eight times out of eight.
To avoid being detected, the Marines figured out that all they had to do was not walk like a human. Some of them did somersaults, while others giggled as they approached under a cardboard box or dressed up like trees. If human creativity is not cultivated the way it should be in middle and high school, and students instead become dependent on AI technology, they become no more impressive than this limited robot. In the worst case, over time they will not be capable of outsmarting, as the Marines did, the very technology they depend on.
Schools should not simply embrace this powerful technology and assume students will use it responsibly and won't be negatively affected by its easy answers. If schools want to teach students how to use it responsibly, I recommend offering extracurricular resources or optional electives for high school students (middle school seems far too young: the earlier it is introduced, the less time students have for authentic development).
More importantly, since certain schools are already taking the leap of embracing it, it is now more important than ever for North Carolina to keep moving forward on school choice. It is one thing to opt into a ChatGPT elective; it is another matter entirely to be forced to learn with it or to be surrounded by peers using it. Students and their parents ought to have the right to choose how they learn, and those who would rather hone the skills that ChatGPT will hinder deserve the ability to attend a school where they can.