Is ChatGPT safe to use with K-12 students?
For Australian teachers starting back at school for the year, ethical questions about whether AI is 'good' or 'safe' will soon make way for pragmatic questions about student access.
EdTech 101: Check the TOS
When a question recently popped up in a Queensland English teaching group about how people might use ChatGPT for learning and teaching this year, I wondered in the comment thread whether students would even be allowed to access it. Like most Australian educators, I've been on summer break, so although the question had previously crossed my mind, I hadn't followed it up.
The time had come to check the TOS. Sites like YouTube and Twitter ask users to be at least 13 years old, so I expected ChatGPT to require at least the same.
It was a jaw-hits-floor moment, reading point one of OpenAI's Terms of Use, on 'Registration and Access':
‘You must be 18 years or older and able to form a binding contract with OpenAI to use the Services. If you use the Services on behalf of another person or entity, you must have the authority to accept the Terms on their behalf. You must provide accurate and complete information to register for an account…'
Had we really all just nattered away for over a month about how AI was going to impact education without checking the Terms-of-frikken-Service?
Yes, yes, yes, I know…
Yes, I know… it can still be used by learners in higher education. (Though not by some first-year undergrads who are still under 18 - keep that in mind!)
Yes, I know… it is still a promising productivity and creativity tool for educators in their work.
And yes, I knoooow… we can still model the use of AI tools without asking students under 18 to make accounts, and y e s, I k n o w many students will make accounts of their own accord no matter what we say.
But ultimately, I had started thinking about ways to actively and purposefully incorporate AI writing into the creative writing process for my year 10s this term. And now I can't do that. And I'm kinda bummed about it. Many others would have been starting to make similar plans, and we shouldn't act like students being unable to make accounts is no big deal.

Think of the students!
I can’t be the only one who feels sorry for the students in all this. When someone tells me it doesn’t matter that the tools are for users 18+, because teachers can still use them for productivity, the teenager in my heart shoots little dagger eyes. So, the grownups can use this hack to help write their lesson plans, but I can’t use it to help write my essay?! That sucks!
The way that my feed often alternates between posts about how we’re going to detect student ‘cheating’, and posts about how teachers can use the tools to ‘improve their effectiveness’, is telling. In terms of framing this as a learning tool, the discourse is just not there yet.
Now that’s out of the way, predictions
I am curious about what OpenAI tools being restricted (in theory) to users 18+ will mean for educators and education leaders who are currently forming opinions (and policies) about how AI tools might be used for learning.
Having worked in Queensland, Australia for 12 years, I can offer a prediction that take-up here in K-12 education will be tentative, and restricted at every turn. ChatGPT is already blocked in Department schools, but that's not just about the age restriction. In QLD, student data is not permitted to be stored offshore, meaning that requiring students to make an OpenAI account here is out of the question for the foreseeable future, no matter their age.
However, parents and carers, older family members, private tutors, and others will have no such restriction, and many will already have accounts. So another prediction is that most students will have access to this tool at home from day 1 of this year, either with an account of their own, or by using an adult's account under supervision.
Just today, I showed ChatGPT to my 7-year-old. Her first ever prompt, by the way, was:
Why do farts stink.
Classic.
The last prediction I can confidently make is this: unless school assessment is ready to change, not much else will. Until then, these tools will be framed as 'unsafe' frequently enough to account for why we haven't engaged with them. Tell me if you agree in the comments!
Terrific post Kelli. I too had been thinking of ways that the tech could be used for good instead of evil. Considering that kids often falsify their age when signing up for social media, I have no doubt that many will do this to gain access. My initial thought for detecting cheating is to head in the direction of teaching them how to cite in text properly. Use the AI if you wish, but validate the information with a citation or two 🤔
Hi Kellie
Thanks for this fantastic post which I am going to be sharing with the Teacher Librarian community. I completely agree that until assessment tasks change, and there is a change in emphasis from content to process, tools like ChatGPT will be considered unsafe and seen as a threat to traditional ways of working rather than an opportunity. This change is needed regardless of the fact that we can't design learning experiences where K-12 students actually use the platforms; because as you say, we know that they will, and we do need to recognise (finally!) that the time of content regurgitation as assessment of learning is over.
Obviously it takes time to redesign tasks and rethink pedagogy, and I feel heartily for teachers who are the ones who bear the brunt of constant curriculum and pedagogy redesign; it's important work, but work that is so often unseen in terms of their role, and unrecognised in terms of the time, effort and expertise required.
Of course, this is where I feel the TL role could play a huge part. If every school had a qualified full time teacher librarian, they could be working with teachers to develop quality assessment tasks and redesign pedagogical approaches, providing professional learning, building awareness of AI, algorithms etc into digital literacies programs....so much!!
Thanks again for sharing your thoughts...
Kay.