May 27, 2024

Linking Northern and Central NJ, Bronx, Manhattan, Westchester and CT

AI and the Future of (Jewish) Education

Rabbi Reuven Spolter, a creative and talented Israeli educator, recently posed some questions about artificial intelligence and the future of Jewish education, and of education in general. “Do you calculate simple math in your head, or do you need to use a calculator to answer simple mathematical equations?” The answer may well depend on your age. Older people tend to do simple math in their heads because that is how they were taught. Once calculators became pervasive, people grew dependent on them for even the simplest calculations.

If AI gives us all the answers, then we never really learn about anything ourselves or develop any skills ourselves. Some AI enthusiasts even question the purpose of reading. Do we really need to know how to read? Traditional educators respond that we don’t want to raise future generations of illiterates who only know how to ask their phones what they want to know or need. Should we be concerned about this trend?

For the past four years, Rabbi Spolter has been developing Kitah, a digital tool designed to encourage reading and learning, not replace them. Sadly, statistics indicate that many of our Jewish day school children are functionally illiterate in Hebrew reading and comprehension, despite parents spending hundreds of thousands of dollars on Jewish education.

Programs like Sefaria are wonderful tools. But might they replace students’ ability to read, research and comprehend on their own? Have we begun to ask about the cost of this? Have we even noticed that it’s happening? As AI continues to develop at breakneck speed, educators must ensure that while students learn to use it as a learning aid, it does not replace the core skills they need to thrive and grow on their own.

If we’re not careful, in 10 years our students will be experts at finding the app that explains a Mishnah or pasuk, but there’s no guarantee they’ll be able to read the original text, or truly understand what they’re reading.

Students need to excel at things AI can’t do—and that means more creativity and critical thinking and less memorization. Education is what makes us human. It drives intellectual capacity and the prosperity of nations. It has developed the minds that took us to the moon and eradicated previously incurable diseases. This special status of education is why AI tools such as ChatGPT are likely to severely disrupt how education happens. We need to continue to build education systems that nurture and value our unique human intelligence.

We are being deceived into believing these AI tools are far more intelligent than they really are. A tool like ChatGPT has no understanding or knowledge. It merely strings words together based on statistical probabilities to produce useful text. It is an incredibly helpful assistant, but it is not knowledgeable or wise. It has no concept of how any of the words it produces relate to the real world.

AI could be a force for tremendous good within education. It could release teachers from administrative tasks, giving them more opportunities to spend time with students. However, we are woefully ill-equipped to benefit from the AI that is flooding the market. It does not have to be like this. There is still time to prepare, but we must act quickly and wisely.

AI has been used in education for more than a decade. AI-powered systems can analyze student responses to questions and adapt learning materials to meet their individual needs. AI tools can also enhance teacher training and support. To reap the benefits of these technologies, we must design effective ways to roll out AI across the education system, and regulate it properly.

Staying ahead of AI will mean radically rethinking what education is for, and what success means. Human intelligence is far more impressive than any AI system we see today.

Intensive research is underway to understand the vast potential of AI, including generative AI, to transform education as we know it. Researchers at Stanford University—from education, technology, psychology, business, law and political science—have joined industry leaders in sharing cutting-edge research and brainstorming ways to unlock the potential of AI in education in an ethical, equitable and safe manner. Topics being studied include natural language processing applied to education; developing students’ AI literacy; assisting students with learning differences; informal learning outside of school; fostering creativity; equity and closing achievement gaps; workforce development; and avoiding potential misuses of AI with students and teachers.

Great teachers remain the cornerstone of effective learning. Yet teachers receive limited actionable feedback to improve their practice. AI presents an opportunity to support teachers as they refine their craft. AI language models can serve as practice students for new teachers; some programs are capable of demonstrating confusion and asking adaptive follow-up questions. AI can provide real-time feedback and suggestions to teachers (e.g., questions to ask the class), creating a bank of live advice based on expert pedagogy. AI can also produce post-lesson reports that summarize classroom dynamics. Potential metrics include student speaking time or identification of the questions that triggered the most engagement. Research finds that when students talk more, learning improves.

Is generative AI comparable to the calculator in the classroom, or will it prove a more detrimental tool? Today, the calculator is ubiquitous in middle and high schools, enabling students to quickly perform complex computations, graph equations and solve problems. However, it has not resulted in the removal of basic mathematical computation from the curriculum: Students still know how to do long division and calculate exponents without technological assistance. Writing, on the other hand, is a way of learning how to think. Could outsourcing much of that work to AI harm students’ development of critical thinking?

AI has the potential to support learners’ self-confidence. Teachers commonly encourage class participation by insisting that there is no such thing as a stupid question. However, for most students, fear of judgment from their peers holds them back from fully engaging in many contexts. Children who believe themselves to be behind are the least likely to engage in these settings.

Interfaces that utilize AI can offer constructive feedback that does not cause the same self-consciousness. Students are therefore more willing to engage, take risks and be vulnerable.

Teachers know that learning happens through powerful classroom discussions. However, only one student can speak up at a time. AI has the potential to help a single teacher carry on 25 unique conversations, one with each student.

AI is not a panacea. There are significant risks. While ChatGPT spits out answers to questions, these responses are not designed to optimize student learning. AI models are trained to deliver answers as fast as possible, but that is often in conflict with what would be pedagogically sound, whether that’s a more in-depth explanation of key concepts or a framing that is more likely to spark curiosity to learn more.

Sometimes AI can produce a coherent text that is completely erroneous, especially in math, which requires precision, not just process. The chatbot can produce perfect sentences that show top-quality teaching techniques, such as positive reinforcement, but fail to get to the right mathematical answer.

The rapid progress of AI may also dim the job prospects of students who have spent many years learning how to code. Many students may no longer know what they should be focusing on, or may not see the value of their hard-earned skills. The full impact of AI on education remains unclear at this juncture. Things are changing, and now is the time to get it right.

Dr. Wallace Greene is a veteran educator who dabbled in an AI program to teach Gemara in the ’80s.
