Should AI be permitted in college classrooms? 4 scholars weigh in

One of the most intense discussions taking place among university faculty is whether to permit students to use artificial intelligence in the classroom. To gain perspective on the matter, The Conversation reached out to four scholars for their take on AI as a learning tool and the reasons why they will or won’t be making it a part of their classes.

Nicholas Tampio, professor of political science: Learn to think for yourself

As a professor, I believe the purpose of a college class is to teach students to think[1]: to read scholarship, ask questions, formulate a thesis, collect and analyze data, draft an essay, take feedback from the instructor and other students, and write a final draft.

Nicholas Tampio, Fordham University[2]

One problem with ChatGPT is that it allows students to produce a decent paper without thinking or writing for themselves.

In my American political thought class, I assign speeches by Martin Luther King Jr. and Malcolm X and ask students to compose an essay on what King and X might say about a current American political debate, such as the Supreme Court’s recent decision on affirmative action[3].

Students could get fine grades if they used ChatGPT to “write” their papers. But they would have missed a chance to enter a dialogue with two profound thinkers about a topic that could reshape American higher education and society.

The point of learning to write is not simply intellectual self-discovery. My students go on to careers in journalism, law, science, academia and business. Their employers often ask them to research and write about a topic.

Few employers are likely to hire someone to use large language models that rely on an algorithm scraping databases filled with errors and biases. Already, a lawyer has gotten in trouble[4] for using ChatGPT to craft a motion filled with fabricated cases. Employees succeed when they can research a topic and write intelligently about it.

Artificial intelligence is a tool that defeats a purpose of a college education – to learn how to think, and write, for oneself.

Patricia A. Young, professor of education: ChatGPT doesn’t promote advanced thinking

College students who are operating from a convenience or entitlement mentality – one in which they think, “I am entitled to use whatever technology is available to me” – will naturally gravitate toward using ChatGPT with or without their professor’s permission. Using ChatGPT and submitting a course assignment as your own creation is called AI-assisted plagiarism[5].

Patricia A. Young, University of Maryland, Baltimore County

Some professors allow the use of ChatGPT as long as students cite ChatGPT as the source. As a researcher who specializes in the use of technology in education[6], I believe this practice needs to be thought through. Does this mean that ChatGPT would need to cite its sources, so that students could cite ChatGPT as a type of secondary source according to APA style[7], a standard academic style of citing papers? What Pandora’s box are we opening? Some users report that ChatGPT never reveals its sources anyway.

The proliferation of free AI means students won’t have to think much while writing – just engage in a high level of copy and paste. We used to call that plagiarism. AI-assisted plagiarism ushers in the potential for a new era of academic misconduct.

The concern will come when students take higher-level courses or land a job and lack the literacy skills to perform on an exceptional level. We will have created a generation of functionally illiterate adults who lack the capacity to engage in advanced thinking – like critiquing, comparing or contrasting information.

Yes, students can and should use smart tools, but we need to hypothesize and measure the costs to human ingenuity and the future of the human race.

Asim Ali, instructor of information systems management: AI is another teacher

I teach information systems management, and in the spring of 2023, I had students use ChatGPT for an essay assignment and then record a video podcast discussing how AI will impact their careers. This semester I am being more intentional by providing guidance on the possibilities and limitations of AI tools for each assignment. For example, students learn that using generative AI on a self-reflection assignment may not help, but using AI to analyze a case study is potentially a great way to find insights they may have overlooked. This emulates their future jobs in which they may use AI tools to enhance the quality of their work product.

Asim Ali, Auburn University[8]

My experience with adapting to AI for my own course inspired me to create a resource for all my colleagues. As executive director of the Biggio Center for the Enhancement of Teaching and Learning, I oversee the instructional design and educational development teams at Auburn University. We created a self-paced, online course called Teaching With AI[9].

Now there are over 600 faculty at Auburn and hundreds of faculty at almost 35 institutions engaging with the content and each other through discussion boards and practical exercises.

I receive messages from faculty sharing ways they are changing their assignments or discussing AI with their students. Some see AI as a threat to humans, but discussing AI with my students and with colleagues across the country has actually helped me develop human connections.

Shital Thekdi, associate professor of analytics & operations: What can you do that AI can’t?

This semester, I will ask students in my Statistics for Business and Economics course to discuss the question, “What is your value beyond the AI tools?” I want them to reframe the conversation beyond one of academic integrity and instead as a challenge. I believe students must recognize that the jobs they imagine will exist for them could be eliminated because of these new technologies. So the pressure is on students to understand not only how to use these tools but also how to be better than the tools.

Shital Thekdi, University of Richmond

I hope my students will consider ethical reasoning and the role of human connections. While AI can be trained to make value-based decisions, individuals and groups have their own values that can differ considerably from those used by AI. And AI tools do not have the capacity to form human connections and experiences.

Students will remain vital contributors to business and society as AI tools develop. I believe it’s our responsibility as educators to prepare our students for a rapidly evolving cultural and technological landscape.

References

  1. ^ teach students to think (www.e-elgar.com)
  2. ^ Fordham University (www.fordham.edu)
  3. ^ Supreme Court’s recent decision on affirmative action (www.supremecourt.gov)
  4. ^ lawyer has gotten in trouble (www.nytimes.com)
  5. ^ AI-assisted plagiarism (doi.org)
  6. ^ specializes in the use of technology in education (scholar.google.com)
  7. ^ APA style (apastyle.apa.org)
  8. ^ Auburn University (www.auburn.edu)
  9. ^ Teaching With AI (aub.ie)

Authors: Nicholas Tampio, Professor of Political Science, Fordham University

Read more https://theconversation.com/should-ai-be-permitted-in-college-classrooms-4-scholars-weigh-in-212176

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more