May 29, 2024

Q&A: How AI affects kids’ creativity

UW News

[Image: Three AI-generated panels from one child’s visual story, captioned “Marty gets bitten by a rattlesnake and gordon ramsey takes marty to the hospital”; “The delorean saves Marty and gorgen ramsey from the tornado in digital art style”; and “The car Delorean becomes the president in digital art style.”]

A UW-led team held six sessions with a group of 12 Seattle-area kids ages 7 to 13 to explore how the kids’ creative processes interacted with AI tools like ChatGPT and Dall-E. Here, one of the kids created a visual story using Dall-E, a text-to-image model developed by OpenAI. Newman et al./CHI 2024 (AI-generated image)

Shortly after artificial intelligence models including Midjourney and OpenAI’s Dall-E went public, AI-generated art started winning competitions: one in digital art, another in photography. Concern rumbled that AI could replace artists — and even, by some metrics, be more creative than humans. But simultaneously, people were exploring these tools as ways to augment their creative processes, not replace them.

University of Washington researchers grew curious about how AI might affect creativity in children specifically, so they worked with a group of 12 Seattle-area kids ages 7 to 13 to explore how the kids’ creative processes interacted with AI tools. They found that for the kids to integrate generative AI into their creative practices meaningfully, they often needed support from adults and peers.

The researchers presented their findings May 14 at the ACM CHI Conference on Human Factors in Computing Systems.

UW News spoke with the study’s lead author, Michele Newman, a UW doctoral student in the Information School, about the study, the importance of support and the particular creativity of kids.

What was the impetus for this research?

Michele Newman: Before coming to UW, I was working on a project using natural language processing — AI, essentially — to measure creativity in elementary school children. When ChatGPT came out, I was at the UW working with KidsTeam, a program where adults and kids co-create technology products for children, and I really wanted to see what effects GPT might have on children’s creativity.

So much of the early experience around this new technology was fearful. People were saying, “Don’t use it to teach, it’s going to harm kids.” Many schools banned it. So part of the impetus of the project was trying to see what a middle stance looks like: one where the technology isn’t harming kids or taking jobs, but is supporting and building meaningful experiences for them. How can we look to the future and build ethical and meaningful practices around this technology?

How did you go about designing the study? And why did you use those design considerations?

MN: In KidsTeam the primary methodology is co-design, where kids are treated as equal partners when designing technologies. So one of our approaches was just putting the kids in front of technology — OpenAI’s ChatGPT and Dall-E, or Google’s music generator Magenta — to see what they do. What are their considerations? Where are they getting frustrated? What does it mean to have a tool that can actually kind of do the creation for you? A lot of creativity research talks about how process is very important. That’s kind of where the person’s individuality comes out. So we wanted to see the kids develop their creative processes.

We also gave the kids a more structured experience. It’s one thing to just look at a piece of technology and say, “Here’s what it can do.” It’s another thing to say, “Use this specific software to write a story.” In the sessions, we balanced the open-ended approach with more directed exploration and had kids use techniques like comicboarding, where they make comics about potential good and bad uses of AI.

What findings were the most interesting to you?

MN: Maybe the most important and practical finding is how clearly these systems are not built for children. The kids might know a lot about, say, a video game like Genshin Impact. If the AI system doesn’t know anything about it, the kids might conclude they’re smarter than the system. So there’s a mismatch between what children expect these systems to be able to do and what they can actually do. This type of technology is generally built with adults in mind. Likewise, children’s language just isn’t the same as adults’. Things like this really become an issue for kids trying to creatively express themselves.

The title of this paper is “I want it to talk like Darth Vader,” which is a quote from one of the kids. He was writing a story about Star Wars, and he turned to us and said, “I want it to talk like Darth Vader. I want it to be able to be customized.” He suggested that it would help him write a better Star Wars story. Obviously, you could prompt ChatGPT to talk like Darth Vader, and we helped walk him through that. But those aren’t things that the kids necessarily understand right away. They need extra instruction around that. Children’s creativity is unique. Because of their development and their experiences, they have different needs than adults do. They’re still building and understanding social norms, and what it means to create.

I was also fascinated by the kids’ ethical considerations.

MN: Yeah, when we asked about some typical things like cheating, the kids tended to reiterate things they’d heard, like “I shouldn’t use it to cheat.” But when we asked them about things like whether their friend should use AI to write a birthday card for them, they started to have really nuanced takes. Some started asking how much the friend is using it. Is it to write the whole card, or just to help? Every kid starts to have different ideas. So then we’re considering how to foster an individual child’s expression.

We asked one 11-year-old how he’d feel if his favorite book series was written by AI instead of an author, and he said it would “dismantle” the joy of reading for him. We often don’t think about kids having these deep, existential questions about what it means to be an artist. But they are. They’re asking whether they lose some authenticity when AI rather than a friend writes a birthday card. Over the course of the study, we saw them changing and developing as they used these systems. By the end, it was great to hear them saying things like, “I don’t think this really expresses what I’m saying.”

But they started making certain types of adjustments to their creative process and their goals, which for me sometimes raised a red flag. Sometimes they’d add extra context to get it to do what they wanted. But other times they might try an idea and quickly say, “It’s not working, so I’m just going to change the idea.” That’s a hard problem. But we can’t just make systems that solve all these issues, because every kid’s process is different. Sometimes you do need to learn to give up on an idea. That can be part of the creative process. So the question with AI is how do you support kids and give them knowledge of their individual creative processes? Creativity is always happening in a larger context. The interaction is not just about inputting a prompt. It’s working iteratively with the system while being supported by peers and adults. And those networks of support make a meaningful experience with these systems much more likely.

Additional co-authors on this paper were Jin Ha Lee, a professor in the iSchool; Jason Yip, an associate professor in the iSchool; Grace Shin, an undergraduate in the iSchool; Ilena B. Dalla Gasperina, Matthew Kyle Pedraja and Maia Song, undergraduates in human centered design and engineering at the UW; Ritesh Kanchi, an undergraduate in the Paul G. Allen School of Computer Science & Engineering; Rannie Li, who completed this research as an undergraduate in interaction design and psychology at the UW; and Kaiwen Sun, a doctoral student at the University of Michigan. This research was funded in part by the U.S. Institute of Museum and Library Services.

For more information, contact mmn13@uw.edu.
