- Artificial intelligence is becoming increasingly popular in educational settings.
- Initiatives such as Stanford University's CRAFT program are promoting AI literacy in schools.
- Concerns about the misuse of AI have led instructors to teach students how to use the technology ethically.
- This article is part of Build IT, a series about digital technology and innovation trends that are transforming industries.
Arguments about expanding the use of artificial intelligence are typically rooted in one of two ideas: “get ahead of the curve” or “if it ain't broke, don't fix it.” Regardless of which side of the debate one falls on, it is clear that the rapidly emerging technology isn't going away anytime soon.
AI is now everywhere, including healthcare, marketing, social media, music, hair care, and more. The technology has gained traction across industries, and professionals, including educators, are starting to take advantage of it.
A report from Allied Market Research predicts that the AI-in-education market will reach $88.2 billion globally by 2032. North America captured the highest revenue share in 2022 and is projected to retain its dominance in the global market, the report said.
Last month, Arizona State University announced a partnership with OpenAI to “increase student success” by bringing ChatGPT into the classroom. ASU will be the first higher education institution to collaborate with OpenAI, according to a press release from the school.
While some optimists focus on the benefits of AI in education, others worry that its use in the classroom could lead to cheating and misinformation.
This is where AI literacy comes in handy.
Projects such as Stanford's CRAFT initiative, which offers classroom resources for teaching about AI, are working to close knowledge gaps and ease anxiety.
Created within the Stanford Graduate School of Education, CRAFT is a collaboration of Stanford education researchers, software developers, and curriculum developers. The goal is to give high school teachers easy access to AI resources that can be adapted to their subjects and lesson plans.
Victor Lee, an associate professor at the Stanford Graduate School of Education who contributed to the development of CRAFT, said the initiative's team is building knowledge across multiple subject areas to “demystify and clarify the areas in which AI will impact the way we work.” “We're working on really contextualizing it” in the classroom, he told Business Insider.
“People who aren't very familiar with AI and what it's being used for are especially concerned and are being very cautious,” Lee said. “Providing more specific knowledge about AI, both examples of how it can be used productively and how it can exacerbate long-standing problems, will help alleviate those concerns.”
Lee recognized the importance of young students learning the basics of AI, especially as more workplaces lean toward AI technology. Resources like CRAFT can help address disparities that widen when underserved students lack equal access to AI initiatives.
Teaching students to use AI tools ethically and carefully
Matthew Lutz, a professor at Montgomery College, said that when educators teach AI literacy, they need experience with the platforms they are using in their classrooms.
Before bringing AI into the classroom, Lutz used AI in his own writing with Copy.ai, which has tools to generate and simplify content, and Jasper, an AI writing service that helps with marketing copy. At Montgomery College, professors can decide whether to use AI in their classrooms, and Lutz chose to do so.
“If we use it, they want us to teach students how to use it ethically,” Lutz told BI.
“My idea was to prepare students for the 21st-century workplace, and I knew that teaching them how to use AI effectively would be part of that experience,” he added.
Lee also told BI that AI literacy in the classroom “requires an awareness of where AI is effective and where special vigilance is needed.” AI use cases vary by subject matter, so “at this point, it's important to focus on specific applications rather than broad generalities,” Lee said.
One of the tools Lutz has introduced into the classroom is Perplexity AI, a search engine that uses natural language to answer questions. The tool helps English composition students, despite their inexperience, find peer-reviewed research and learn how to properly cite sources, he said.
He also designed assignment prompts to be open-ended and to elicit creative thinking so that AI cannot answer them for students. For example, he had students compare the rhetoric of two TED talks, a task that requires independent thinking and makes it unlikely that an AI could do the work.
ChatGPT has worked particularly well for Lutz, who uses it to help his students meet their goals and improve their writing skills, he added.
Last semester, he fed papers into the AI tool and asked it to identify mistakes his students had made. “I actually used the feedback from ChatGPT to teach lessons about transition sentences and specific usage of punctuation that my students were struggling with, especially commas, semicolons, and ellipses,” he said.
Concerns about AI in the classroom
Lutz said that while he is pleased with how AI is helping students improve their academic performance, “the ethical concerns behind AI are very legitimate.” These include the racism and bias that can be embedded in AI systems.
Erin Reddick is one of many who share these concerns. She launched ChatBlackGPT, an AI chatbot that draws insights from Black, African American, and African sources to “ensure that advances in technology are ethically grounded and beneficial to all communities.”
Another concern is that there are no federal laws regulating AI and relatively few state-level laws that address AI at all. Without laws to enforce the responsible and ethical use of AI, it will be difficult to prevent misuse and abuse of the technology.
“Is it ethical for teachers to use AI to grade student work? Probably not, but who would actually say, 'This test can be written by AI, but it can't be graded by AI'? Who's drawing that line?” Reddick said.
Still, Reddick said she is aware that more jobs require some level of AI knowledge, which means educational institutions have a responsibility to introduce AI to their students. However, people need to understand that AI is just a tool, and it is important to apply critical thinking when interacting with it, she added.
“I encourage people to never blindly enter a conversation with generative AI,” Reddick said. “You have to be intentional about the information you want to receive; otherwise, it can have too much influence on what you came there for in the first place.”
The future of AI in education
A 2023 report from education publisher and platform Houghton Mifflin Harcourt found that 38% of educators plan to implement AI tools in the 2023-24 school year.
What's tempering teachers' enthusiasm for AI is a lack of knowledge about how to safely and productively incorporate it into the curriculum. But programs like CRAFT aim to bridge this gap and support more educators across the country.
Lee told BI that Stanford plans to develop an internal apprenticeship program to help students “engage more deeply with the topic of AI in education.”
He said the school also wants to expand CRAFT's teacher co-design fellowship, in which fellows develop AI-literacy lessons. This will “allow more voices from regions across the country facing different needs and situations to shape the product,” he added.