Daniel Buck of the Thomas B. Fordham Institute writes:
The bulk of commentary and school district policy relating to AI and education focuses almost exclusively on questions regarding cheating. What does it mean for a student—or an education columnist, for that matter—if a chatbot can write a ten-page essay in a matter of seconds? How can teachers assign any homework and know for certain who or what is actually completing it? These are real questions with which the education sector must wrestle.
But other risks of AI, and thorny questions looming on the horizon, are worryingly overlooked. Student privacy is perhaps chief among them. This concern became real to me when I read the following hypothetical example from renowned education scholar John Hattie:
You are a sixteen-year-old girl on your way to school. Just before you get to the school gates, you waver and decide you would rather go meet your friends at the mall. Your Robo-Coach has access to the messages you exchanged with your friends the previous evening, it can sense your elevated heart rate as you stop to consider the alternative course of action, and it notes from your GPS location and lack of speed that you have stopped at an unusual place. From the combined data, it assesses an 87 percent probability that you will not attend school, and it interjects with a timely and personalized nudge (in a soothing versus commanding tone; male versus female voice; using an emotional versus factual plea, etc.) to push you through the school gates.
Read more at the Thomas B. Fordham Institute.