To see what makes AI hard to use, ask it to write a pop song


In the end most teams used smaller models that produced specific parts of a song, like the chords or melodies, and then stitched these together by hand. Uncanny Valley used an algorithm to match up lyrics and melodies that had been produced by different AIs, for example.

Another team, Dadabots x Portrait XO, did not want their second chorus to be an exact repeat of the first, but couldn’t find a way to direct the AI to vary it. In the end the team used seven models and cobbled together different results to get the variation they wanted.

It was like assembling a jigsaw puzzle, says Huang: “Some teams felt like the puzzle was unreasonably hard, but some found it exhilarating, because they had so many raw materials and colorful puzzle pieces to put together.”

Uncanny Valley used the AIs to provide the ingredients, including melodies produced by a model trained on koala, kookaburra, and Tasmanian devil noises. The people on the team then put these together.

“It’s like having a quirky human collaborator that isn’t that great at songwriting but very prolific,” says Sandra Uitdenbogerd, a computer scientist at RMIT University in Melbourne and a member of Uncanny Valley. “We choose the bits that we can work with.”

But this was more compromise than collaboration. “Honestly, I think humans could have done it equally well,” she says.

Generative AI models produce output at the level of single notes—or pixels, in the case of image generation. They don’t perceive the bigger picture. Humans, on the other hand, typically compose in terms of verse and chorus and how a song builds. “There’s a mismatch between what AI produces and how we think,” says Cai.

Cai wants to change how AI models are designed to make them easier to work with. “I think that could really increase the sense of control for users,” she says.

It’s not just musicians and artists who will benefit. Making AIs easier to use, by giving people more ways to interact with and shape their output, will make them more trustworthy wherever they’re deployed, from policing to health care.

“We’ve seen that giving doctors the tools to steer AI can really make a difference in their willingness to use AI at all,” says Cai.
