By Chris Thompson
Last month, a company called Latitude, which studies how people will read, watch, listen, and learn in the future, conducted a survey to explore what the future of education might look like. Their question: what if robots taught children?
The survey of 348 children from Australia, Europe, South Africa, and the U.S. was designed to answer the question, “What if robots were part of your everyday life – at school and beyond?” The questions, created in partnership with the LEGO Learning Institute, were framed as a series of “narrative prompts” meant to tease out complex and interesting answers about how children might best interact with artificial learning guides. “This really works with kids, and without it, we wouldn’t have gotten the richness we did,” says Ian Schultze, Latitude’s Director of Technology and Business Development.
The children were asked to finish the following stories:
“When I got to school this morning, my teacher surprised me by giving me a robot to help me with my schoolwork and…”
“My learning group or classroom finished its work before class ended, so my teacher let us leave early with the class robot and…”
“I made friends with a robot today, so I invited it to come home with me after school and…”
Some of the results [PDF]:
- 64% of kids described robots as if they were “natural, human-like companions: as humanoid peers that could speak and communicate with ease, came ‘pre-loaded’ with smarts and useful knowledge, and were social naturals.”
- One-third of kids explicitly described their robots’ physical form as human-like, and 29% specified that their robots’ primary mode of interaction was speech.
- 75% of kids’ robots “acted patient and supportive in educational contexts.”
- 20% of kids saw robots as a natural part of their peer groups.
- 38% of kids want a robot to play with.
- 38% of kids want a robot to learn with.
The results were more than just a role-playing game. Without knowing it, the children were describing their ideal teachers and learning environment. And overwhelmingly, children asked for teachers who didn’t use shame or scolding if they answered a question wrong or didn’t understand a subject. Their ideal robot learning partner understood if a child wasn’t ready to move on from long division, and patiently went over the subject as long as it took.
“There’s a little bit of shame involved in being wrong or being behind,” Schultze says. “Robots can get that out of the way, because they’re not judgmental. There’s a sense that technology is almost coequal, as opposed to a sort of master-servant relationship.”
The survey also suggested that children have an almost eerily comfortable intimacy with robots, treating them as play partners and considering play, learning, and social behavior as parts of the same continuum. Though the survey is a speculative game, the prospect of computers and robots becoming part of children’s daily education may not be as far off as we think. Programs like School of One and Carpe Diem in Arizona are using online tutors and computer-assisted instruction, and Japanese and Korean schools have been experimenting with robots in secondary education.
But there are lessons from this survey, Schultze says, that suggest children crave a tutor. Someone – or something – that listens and learns with them, not at them.
“They’re using robots as a projective tool about what they do and don’t like about the current learning process,” Schultze says of the children’s attitude. “At a very simple level, they’re answering questions about how to better drive their own learning.”