Neurography by Mario Klingemann, Training photos by Jaclyn Campanaro

Computers have a language. Not zeros and ones or binary code—though that’s a language, too—but a visual vernacular that helps humans make the connection that, yes, a computer was here and it made its mark. It comes in all forms: the perfect crispness of an illustration drawn on a Wacom pad; the trippy swirls of a Google Deep Dream image; the fuzzy imperfection of an AI-generated font. “We call it the computer accent,” said Claire Evans, lead singer of the synth-pop band YACHT.

For their recently released album, Chain Tripping, Evans, her bandmates, and a cast of creative AI experts explored how this so-called computer accent can be put to artistic ends. Sure, this is nothing new; you can’t get much more “computer accent” than Kraftwerk, whose “Computer Love” was released way back in 1981, to name the most obvious example. But YACHT’s album takes things up a notch: its liner notes read like a white paper on creative coding, with experiments that include AI-generated artwork, band photos made with machine learning tools, and wacky AI fonts that explore how computers interpret letterforms. The band also partnered with Lisbon studio Counterpoint to create Bit Tripper, an online tool that generates AI-made bit fonts.

The band began thinking about Chain Tripping a few years ago, during the initial surge in creative AI projects. Musicians and artists had been experimenting with machine learning for years by then, but the technology and the tools were, by most accounts, designed for people in the computer science community. “It’s still a little wonky,” Evans said, adding that when she started the project, she had only a cursory understanding of the tools that would be required to make the album happen. The way she and her bandmates viewed it, making an album was a way for them to work through the questions they had around the increasingly ubiquitous technology. “The only way for us to wrap our heads around it is to immerse ourselves in it totally until we’ve internalized what we think are the most important lessons,” she said.

“We call it the computer accent.”

The band wanted to push the limits of the available tools to produce something creative, but it couldn’t be just music. For YACHT, an album is an excuse to explore a topic as an entire package, from sound to visuals, and machine learning was a particularly robust technology to play with. They knew from the start who they’d need to enlist. “There are very few people who have both the cognitive capacity to code at the level necessary to build and generate machine learning models and the specific kind of aesthetic madness that it takes to make interesting art,” she said. “It’s just too much brain for one person, which means that making art with machine learning forces collaboration between technologists and artists across disciplines.”

Evans and her bandmates enlisted a grab bag of creative AI practitioners to help them create the cover and liner notes for Chain Tripping. Artist Tom White made the colorful, abstract blobs on the cover and in the liner notes by training an algorithm on thousands of images of objects drawn from the band’s lyrics (a satellite, planet Earth, boots, an eyeball). Mario Klingemann created “neurographs” for the band’s photos by training machine learning models on photos and video, yielding ghostly, glitched-out images in which the band members’ faces look smudged across a screen. The header text for the liner notes is from Barney McCann, whose AI font (which you might recognize) looks like a computer-written ransom note. And the song names are spelled out in an AI-generated bit font made with the aforementioned Bit Tripper, developed with Counterpoint. All of this is contrasted with a perfectly legible text typeface called Computer Modern.

Created with Bit Tripper

Something like Bit Tripper is designed to let anyone explore this strange in-between space of not-quite-human, not-fully-computer. The tool lets people make their own bit font, the product of training a machine learning algorithm on a set of 3,000 bit fonts pulled from the internet. The algorithm processed every character, learning its features and style—essentially, what makes an A, B, or C in a bit font look like an A, B, or C. From those feeder fonts, the AI was able to generate more than a million character variations of the letterforms, all of which appear to be just a little off. “The current state of creative AI is very, very imperfect,” said Tero Parviainen, a co-founder of Counterpoint. “It doesn’t quite capture the thing you’re looking for, but in those failures there’s a lot of interesting stuff.”
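The article doesn’t publish Bit Tripper’s actual model, but tools like it typically learn a compressed representation of glyphs and then sample points between learned letterforms to produce variations that are “just a little off.” As a toy stand-in for that idea (the glyphs and the `blend_glyphs` function here are hypothetical, not Counterpoint’s code), this sketch blends two hand-drawn 5×5 bitmap A’s pixel by pixel and thresholds the result into a new in-between character:

```python
# Toy analogue of generating "in-between" letterforms: blend two bitmap
# glyphs of the same letter and threshold the mix back into pixels.
# Real tools train a generative model on thousands of fonts; this just
# illustrates why interpolated characters come out slightly "off."

A_BLOCKY = [
    "01110",
    "10001",
    "11111",
    "10001",
    "10001",
]

A_NARROW = [
    "00100",
    "01010",
    "01110",
    "01010",
    "01010",
]

def blend_glyphs(g1, g2, t, threshold=0.5):
    """Mix two same-sized bitmap glyphs; t=0 gives g1, t=1 gives g2."""
    out = []
    for row1, row2 in zip(g1, g2):
        mixed = ""
        for p1, p2 in zip(row1, row2):
            value = (1 - t) * int(p1) + t * int(p2)  # weighted pixel mix
            mixed += "#" if value >= threshold else "."
        out.append(mixed)
    return out

# Halfway between the two A's: recognizable, but not quite either font.
for line in blend_glyphs(A_BLOCKY, A_NARROW, 0.5):
    print(line)
```

A learned model does this in a latent space rather than in raw pixels, which is where the more interesting failures come from, but the basic move—sampling between examples rather than copying any one of them—is the same.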

“In a few years when the systems are way more sophisticated, we’re going to miss the little failures, the places where the system has misunderstood us in ways that helped us reevaluate things like beauty and meaning.”

It’s that ambiguous cusp between legible and illegible, right and wrong, that most interests YACHT. “That’s the sweet spot—right when it’s about to fall apart,” said Evans. She predicts that someday we’ll miss the glitchy strangeness that was a feature of computer-generated artwork. In the age of artificial intelligence, the computer accent is eroding bit by bit as computer scientists work to make algorithms speak with a realistically human inflection. “In a few years when the systems are way more sophisticated, we’re going to miss the little failures, the places where the system has misunderstood us in ways that helped us reevaluate things like beauty and meaning,” Evans said. “We’ll be nostalgic for that.”