Six Steps to Build AI Accountability
Key Points
- Why pay attention to Generative AI at all? Because, in the right context and with the right guardrails, it may bring productivity gains for teachers, and some level of personalized learning for students.
By: Art Kleiner and Juliette Powell
Accountability with AI is critical, especially in schools and especially right now. Many educators are skeptical of AI in general, and of generative AI (GenAI) in particular. They may even see firsthand the biases and inaccuracies inherent in chatbots and content creation tools. However, the push to use the tools in schools is real – and alarming. No matter how pressed for money schools may be, GenAI is not a substitute for any level of human contact. The idea that the tools will evolve and improve is not a valid argument, because they do not always improve, and we do not know how they will change.
Why, then, pay attention to Generative AI at all? Because, in the right context and with the right guardrails, it may bring productivity gains for teachers, and some level of personalized learning for students. It will give young learners new tools with which to express themselves and connect constructively. It will also be misused: in bullying, false identification, plagiarism, and many other ways. These tools reflect and amplify all the positive and negative qualities of the people who use them. Finally, they are going to be part of every student’s environment, and using them in schools will provide a safer way to learn them and some more thoughtful habits for using them.
The U.S. Department of Education recently released its own principles for “AI and the Future of Teaching and Learning.” Based on those and on our own research for The AI Dilemma: 7 Principles for Responsible Technology, here are six points that school leaders can use to develop their approach to responsible AI.
- Emphasize humans in the loop. Above all, don’t delegate teaching to AI; don’t shut kids in with a chatbot as their primary teacher. AI provides a seductive illusion of control, but real education requires consistent human-to-human contact. Use the tools, and work with the tools, but always with human presence and awareness.
- Embrace “creative friction.” Digital technology is typically designed to reduce mental tedium, but that frictionlessness can backfire, especially in a school setting. Quality use of GenAI in education requires conscious attention to its practices. Bring together groups of people with diverse perspectives (ideally including students) to decide what you will and will not do.
- Prioritize trust – especially with data. In schools, this means learning how to verify that data is used in trustworthy ways. Digital systems for student evaluation are often mistrusted because they reflect long-standing biases. The “cold data” – quantitative statistics about student performance – can often place students from vulnerable groups into special ed paths which they don’t fit, and which short-change their future. “Warm data,” as Nora Bateson calls it, should be part of every decision. This might include stories and observations that can be used to truly see children and help them realize their potential.
- Open the closed box. Aim for AI projects to be explainable, so that other people can question and learn from them. Provide visibility into the logic of the algorithm and the model of any student data project, including why the data was collected, and how it could be safeguarded. Train students and teachers until explainability becomes second nature. It won’t always be easy, because machine learning, by its nature, doesn’t always track its sources or reasoning. Learn to recognize how different assumptions, reflected in the model, can lead to different outcomes.
- Hold stakeholders accountable. As we’ve seen with social media, digital technology can be used to bully others. Students (and sometimes teachers) can use AI to create deepfakes and false information; some will be tempted to plagiarize. Make it clear why boundaries are necessary. Point out that the same GenAI program used for a class assignment may deliver the same draft to others. Misusers of AI may not always be caught, but they should know that these are high-stakes tools, to be handled with at least as much care as a car.
- Reclaim data rights for students and parents. This will be difficult. Like all institutions and organizations, schools are used to collecting personal data and choosing how they use it – within legal guidelines. With GenAI tools, students will create and collect their own data: about who they are, where they go, who they spend time with, what they look up, and what they think and feel. They should have control over how this data is used and be conscious about how it is shared.
In developing practices like these for GenAI in K-12 schools, educators are not just creating safeguards for particular applications. They are establishing risk awareness and safe innovation as a way of life for a generation of young people.
Juliette Powell and Art Kleiner are co-authors of The AI Dilemma: 7 Principles for Responsible Technology.
Juliette Powell is an author, a television creator with 9,000 live shows under her belt, and a technologist and sociologist. She is also a commentator on Bloomberg TV/Business News Networks and a speaker at conferences organized by the Economist and the International Finance Corporation.
Art Kleiner is a writer, editor and futurist. His books include The Age of Heretics, Who Really Matters, Privilege and Success, and The Wise. He was editor of strategy+business, the award-winning magazine published by PwC.