Putting the 'AI' in PLC
Part Meeting, Part Machine

Meghan Hargrave, Douglas Fisher, and Nancy Frey writing for ASCD, "Bringing Artificial Intelligence to the PLC Table":
In this ongoing analysis, it seems that team members would rather not disrupt the community by challenging ideas presented by their colleagues. When AI is used as a thought partner, the results are profoundly different. The end products are stronger because the team spends more time refining ideas, engaging with student data, and developing content based on the rich discussions had.
Look, I love this idea. I think any tool to help teachers facilitate conversations between one another in order to benefit teaching and learning is exceptionally important. I am all in on making PLC meeting time as efficient as it possibly can be. I've been in too many PLCs that have little direction, provide minimal analysis, and/or take too little action. It becomes a colossal waste of everyone's time.
The authors here highlight the planning and implementation of day-to-day strategies, which is invaluable in its own right. The elephant in the room, though, is how educators will use AI to analyze student data, which in my opinion is one of the most time-consuming aspects of these meetings.
The authors briefly address this idea near the end of the piece:
AI can also analyze data using an open-ended prompt such as, “What is this student already doing well, and what should they do next?”
Quick Note: For those who don't know, I am a rather large proponent of ensuring the safety and security of student data and privacy in schools. In fact, this topic was the first article I published on this platform years ago. We are actively living out the Wild West of the LLM era. There is an open sea of possibilities for its use and few regulations to restrict future advancements. We need to be careful that educators know not to simply dump student data/work into for-profit platforms with shady privacy policies.
But even if we assume that every teacher properly removes all personally identifiable information from these data reports, the expectations of educators also need to be appropriately dialed in. This technology cannot and should not do the thinking for you. It's a great buddy and brainstormer, but when utilizing such a broad prompt for the sake of analysis, teachers cannot just take the response as gospel.
I find so many of my colleagues treating an LLM as a person. Anthropomorphizing this tool with pronouns like "he" or "she," or discussing what ChatGPT "thinks," is astonishingly prevalent, years after its introduction.
It's not thinking. It's not all-knowing. It's just guessing, and giving you what you may want to hear1. One part of me knows intuitively that this is just a shorthand for communication, but the other part is terrified that these words may start to shape our perceptions of LLMs.
The authors enunciate this point beautifully:
Of course, while chatbots are capable of producing human-like responses that can enhance the level of discourse, they cannot replace the insights of experienced educators.
Hargrave et al. provide a great use case for AI beyond the classroom, with solid guidelines. It is well worth a read and worth keeping in your back pocket as you approach meetings using AI. As educators and schools adopt discerning practices for AI usage, I surmise that the potential to become more productive and less exhausted professionals is on the horizon. But we cannot become complacent. We need to keep ourselves in the loop and make the critical decisions needed to set our students up for future success and improvement.
If our robot overlords have taken over by the time you are reading this, then I truly apologize for this ill-thought-out statement. ↩