In January 2023, shortly after the public release of ChatGPT, the New York City Department of Education (DOE), the largest school district in the country, moved swiftly to ban the tool's use in New York City schools. In framing the policy, Schools Chancellor David Banks cited fears that the emerging technology would hamper student learning and creativity, accelerate plagiarism, and essentially wreak havoc on teaching and learning.
Similar policies were enacted in other large districts, including Los Angeles and Baltimore. Concerns about potential abuse or misuse of the tool cast a dark shadow over artificial intelligence generally, and over the use of generative AI tools by teachers and students in particular.
Just four months later, in May, the New York City policy was reversed. ChatGPT was made available on school devices and networks, and teachers were encouraged to begin exploring easy ways in which generative AI tools could help students learn and teachers teach. In explaining the reversal, Banks wrote in Chalkbeat that “The knee-jerk fear and risk overlooked the potential of generative AI to support students and teachers, as well as the reality that our students are participating in and will work in a world where understanding generative AI is crucial.”
As we think about the role that AI should play in our schools, we must hear Banks' comments as a call to action. The potential for AI to “support students and teachers” is real. The assertion that AI will play a key role in the future lives of our students is real as well. To fulfill our mission as educators, it is imperative that we work to incorporate these technologies into our schools in every way we can to enrich the learning environment and prepare our students for the future.
Of course, the concern that these technologies will spawn an erosion of traditional notions of academic integrity is also very real. But I fear that our well-founded concerns about cheating, plagiarism, and academic integrity will stunt our ability to fully embrace what is perhaps the most critical technological innovation we have ever seen.
So what is our path forward? How can we leverage AI to enhance teaching and learning and to prepare our students for the future without eroding our commitment to the highest standards of academic integrity? As director of technology at SAR in Riverdale, New York, I believe wholeheartedly that the answer must begin with school faculty and administrators working to learn before we legislate. We need to dive into ChatGPT, Microsoft Copilot, Google Bard, OpenAI's DALL·E 2, Curipod, MagicSchool.ai, Canva AI, and the ever-growing list of online tools that leverage this new AI technology. We must work to understand the endless potential of these tools as well as the guardrails and limitations built into them. We need to rethink the assignments we give, the writing we assign, and the rubrics we use for grading. We need to stay steadfastly attuned to the evolution of these tools and find ways to work them into our curricula and our assignments.
Yes, it’s critical that schools and individual teachers craft clear guidelines surrounding the use of generative AI tools in their classrooms. But it is virtually impossible to do so without having hands-on experience with these technologies and seeing, firsthand, how different they are from anything else we have encountered before.
Here are some examples of how teachers and students are already using generative AI tools to enhance the teaching and learning experience. Of course, context is critical. Some of these examples might serve an important educational purpose in one classroom, while the same work might compromise the pedagogy of another; it all depends on the learning objectives of each context. I’ve chosen to share examples that can apply to a range of teachers and students:
- Rewrite content for learners with different interests and abilities. Take existing content and ask the AI tool to rewrite it to match a specific interest, learning style, or reading level. To do this, paste the content into the tool and use a prompt like “Rewrite this history class content to make more sense to a ninth-grade student who loves sports. Use lots of sports metaphors in your writing.” Then continue to refine your prompt, in conversation with the tool, until you feel the writing suits your objective.
- Let the AI interview you to help light a spark. Instruct the tool to “Ask me 10 questions about my students, my class, the Gemara text we’re learning and the school where I teach to help me create a relatable assignment prompt.” (You can even paste in the Gemara your class is learning in Aramaic!) The process of answering these questions will help you refine your thinking and help the bot draft a prompt that fits with your objectives.
- Use a tool like DreamStudio to generate images that represent pesukim in Tanach. These visualizations, a mode of interpretation, can help clarify the text and bring it to life for visual learners.
In each of these examples, using the AI tool effectively necessitates a back-and-forth process; it requires the user to work in conversation with the tool. The prompts are just a starting point.
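For readers comfortable with a bit of scripting, the same conversational refinement can be sketched in code. The example below is only an illustration, assuming the OpenAI Python SDK (version 1.0 or later), an API key in the environment, and an illustrative model name; the identical back-and-forth can be carried out, with no code at all, in the ChatGPT web interface or any of the tools mentioned above.

```python
# A minimal sketch of the "rewrite for a specific learner" idea, assuming the
# OpenAI Python SDK. The model name and prompt wording are illustrative, not
# a prescription; adapt both to your own content and objectives.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

history_passage = """Paste the history class content you want adapted here."""

# The running message list is what makes this a conversation: each follow-up
# request is appended so the model sees the full back-and-forth.
messages = [
    {"role": "system", "content": "You are helping a teacher adapt classroom material."},
    {
        "role": "user",
        "content": (
            "Rewrite this history class content to make more sense to a "
            "ninth-grade student who loves sports. Use lots of sports "
            f"metaphors in your writing.\n\n{history_passage}"
        ),
    },
]

response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
draft = response.choices[0].message.content
print(draft)

# Refine in conversation: keep the draft in the history and ask for a revision.
messages.append({"role": "assistant", "content": draft})
messages.append({"role": "user", "content": "Good start. Simplify the vocabulary for a ninth-grade reading level."})
revision = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(revision.choices[0].message.content)
```

The detail worth noticing, whether you work in code or in a chat window, is that the entire exchange is resent with each request; the refinement works precisely because the tool sees your earlier prompts and its earlier drafts.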
So, are there guidelines we should follow for how, how much, and when to use AI? While no two people will have the same answers to these questions, we should have some guideposts to direct our use. Here are some guidelines adapted from MagicSchool.ai, which has become one of the most popular AI tools for teachers.
- Check for bias and accuracy. AI might occasionally produce biased or incorrect content. Always double-check everything before sharing.
- Use AI for initial work, but make sure to add your own touch and contextualize appropriately.
- Your judgment matters — see AI-generated content as a place to begin, not an end point.
- Know the limits. Each tool is “fed” a finite data set, often with a knowledge cutoff. For example, MagicSchool’s knowledge stops at 2021, so it cannot reference anything more recent.
- Protect privacy. Don’t include personal details like names or addresses in public tools.
This is an exciting and potentially daunting time for schools and teachers. As teachers, administrators, and educational technology professionals, we can work together to lean into the emerging technology of generative AI and embrace it wherever we can in the service of our students, their learning, and the collective future of our community.
Rabbi Avi Bloom is the Director of Technology at SAR High School. Avi received a master’s degree in Jewish Education from the Azrieli Graduate School of Jewish Education and Administration at Yeshiva University and semikha from RIETS, and earned his BA in Psychology from Yeshiva College. Avi also serves as national facilitator of Prizmah’s Technology Director network and is a member of the New York State Association of Independent Schools Education and Information Technology committee.
Machon Siach was established in 2015 with a legacy gift from Marcel Lindenbaum, z”l honoring the memory of his wife Belda Kaufman Lindenbaum, z”l.