Dr. Rob Wessman, Vice President of Ethics & Safety at SchoolAI

Starting today, Dot will no longer appear with a face in student and teacher interactions. In its place: a simple abstract circle. A clear signal that this is an AI tool, not a person.
We chose to make this change proactively because we believe student well-being comes first and that students should know when they are talking to AI.

What we've learned
AI that is genuinely responsive can be a game-changer for learning. It can help a student stay in a problem longer, feel supported through a struggle, and try again after getting something wrong.
But responsiveness in learning is not the same as responsiveness in human relationships, and that distinction matters enormously for students' developing brains.
Researchers have a name for what happens when people form one-sided emotional bonds with media figures: parasocial relationships. The concept dates back to the 1950s, when scholars noticed that television viewers felt genuine warmth, loyalty, even grief toward personalities who had no awareness of them at all. Children are especially susceptible. They form real bonds, even with fictional characters, and those bonds shape their behavior and their sense of trust.
Television characters, though, are passive. AI is not. AI responds, remembers, and adapts. When that AI also has a name, a face, and a consistent personality, the conditions for parasocial attachment intensify. The research here is still emerging. We are a technology company run by educators, and we've been watching this effect carefully. For us, waiting for the research to tell us conclusively that something is harmful is waiting too long. The question we kept returning to was simple: are we being clear enough with students about what kind of thing they're talking to?
We decided the honest answer was no. It's not enough just to tell a child that they are talking to AI. It needs to be communicated in multiple formats that resonate with developing brains, from the way it interacts to the visual design.
What we believe
Three beliefs guide every product decision we make:
Student well-being comes first: Not as a value we talk about, but as the actual bar for every product decision.
Human relationships are irreplaceable: Our AI has firm boundaries designed to promote human connections, and we actively equip educators to hold the relationships that matter most.
Students must know when they're talking to AI: We have a responsibility to go above and beyond so they can grasp what that actually means.
What's changing

The most visible change is the avatar: Dot's face is replaced by a simple abstract circle in all student and teacher interactions. Same creativity, same name, but much more clearly AI.
The less visible changes may matter more. We've continued strengthening the pedagogical stance built into every Dot interaction: what educators call the "warm demander." When a student is stuck, Dot scaffolds and stays Socratic, asking the next question rather than providing the answer.

When a student expresses emotion, Dot acknowledges what it observes without pretending to feel it: "It looks like this is frustrating," not "I know how you feel." And when a conversation moves toward something that belongs in a human relationship, Dot redirects to the trusted adult in the room.
Dot is not the student's friend or confidant. It is a rigorous, attuned learning tool that is honest about what it is.
A note on the change
We know some educators and students genuinely liked Dot's friendly face, and that familiarity has value, especially for younger learners. We don't take that lightly. But we believe clarity about what AI is serves students better in the long run.
If your students ask about the change, here's a simple framing: Dot is still here to help you learn. We changed how it looks so it's easier to remember that Dot is a tool built to help you think, not a person. Your teachers, classmates, and other trusted adults are the people who know you and care about you.
What's not changing

Dot remains our mascot. You'll see the Dot you know in our brand and community, just not in learning spaces. Your instructions still shape how AI shows up for your students, and the teacher-directed, personalized learning experience remains.
Why acting now is the right thing
Kids are encountering AI at a formative age, before they have frameworks for understanding it. We believe the companies building AI tools for children have a responsibility to help build those frameworks: through honest design, holding ourselves to the highest standards, and being proactive about potential harm.
We are proud to build this alongside the countless educators who continue to help inform our product to best serve students.
Questions? Reach out to our team. We're glad to talk through what this means for your school or district.
