
This is part I of the Processing oral history. Read part II here.
Computers and design have long been intertwined. This relationship dates back to the 1960s, when computational pioneers used code to create novel forms. Since then, programming’s grip on designers has only tightened, even as it’s been obscured by the user-friendly interfaces of modern creativity software.
The bridge between technology, design, and art has always been in the exploration of tools, and how those tools enable code to become a medium for new forms of expression. In the late 1990s, this dynamic was at play at the Massachusetts Institute of Technology (MIT), in a lab called the Aesthetics + Computation Group. Started by John Maeda, the ACG was an update to and continuation of the Visible Language Workshop, a research group founded by Muriel Cooper in 1975 that explored how computation could be used in the field of graphic design.
Cooper was concerned with how computers could be leveraged to expand the notion of traditional publication design and its accompanying elements, like typography. When Maeda joined the Media Lab in 1996, he pushed at the edges of Cooper’s work, exploring what code could enable formally, in the contexts of both art and design.
Cooper and Maeda established a long lineage of designers and artists who were interested in pushing the boundaries of what code could create. Among them were Ben Fry and Casey Reas, two research assistants in Maeda’s group. During their time at MIT, Fry and Reas began to question how programming was taught to visually minded students. They wondered: How could they make programming more accessible to designers and artists? And what would it look like for code to become both a creative medium and part of the creative process itself?
In the early 2000s, Fry and Reas started building a piece of software that would let people code in a simplified environment using a variation of the Java language. The software, called Processing, was designed as a digital sketchbook where novice and experienced coders alike could create interactive graphics. Processing was more than a tool, though. It was a community—one that would eventually be built by thousands of people who have contributed code to the open-source environment over the last 20 years.
What started in 2001 as a side project has since become a worldwide environment used by thousands of people to explore the creative possibilities of coding. Here is the story of Processing, in the words of the people who created it.

Casey Reas: co-founder of Processing, artist, professor in the department of Design Media Arts at UCLA
Ben Fry: co-founder of Processing, founder of Fathom design studio
Dan Shiffman: Associate Arts Professor at New York University’s Interactive Telecommunications Program
Casey Reas: I don’t think Processing would exist without John Maeda. The story starts there. Ben and I both came to MIT to study with John specifically because he was bridging ideas of computer science and design together. At that time, he was the person who was making those connections, and that’s what Ben and I were both interested in. We were both John’s research assistants in the Aesthetics + Computation Group, where we were working with John on Design By Numbers (DBN), which was a programming language and environment and system to teach designers the basics of coding.
Ben Fry: DBN was under-featured, and intentionally so. Like, can you have a programming language that’s eight different commands and a limited screen of 100 x 100 pixels, and only grayscale? It was a super minimal environment that was meant to just get people thinking about using computation at all.
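For readers who never used it, a Design By Numbers program really was that small: a 100 x 100 page, grayscale values from 0 to 100, and a handful of commands. A sketch of what one looked like (reconstructed from memory of the published DBN examples, so treat the exact spelling and comment syntax as illustrative rather than authoritative):

```
// fill the 100 x 100 page with white (0 = white, 100 = black)
paper 0
// set the pen to black
pen 100
// draw a diagonal line across the page
line 0 0 100 100
```

Three lines, and something appears on screen — which was the whole point.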
Casey: I remember there were a few moments that were really pivotal around teaching DBN. One was at the AIGA headquarters [in 1999]. We did a workshop for designers—nontraditional students of all different experience levels. Ben, Elise Co, and I also taught a few-day workshop at RISD. Those workshops really opened my eyes. We could sit with a group of people who had never coded before—people who were designers—and within an hour, they were making stuff. But at the same time, there were also these questions of like, “Oh, I want to use color.” “Oh, I want it to be bigger than 100 x 100 pixels.” And that was really the start of what Processing became.
Ben: Casey and I kept making an experimental version of DBN that looked the same, but underneath you could do OpenGL and 3D graphics and all these other much more complicated things. There were secret commands to make it a larger drawing canvas or add color or switch over into using Python instead of the DBN language. We had all these odd variants that we would hack in as things to try out. But in the end, we didn’t want to mess with DBN. There was something really wonderful about its simplicity, and there wasn’t a good way to have DBN ladder up into something more complicated.
Casey: Processing tried to take the minimal aspects of DBN but also allow it to extend to the point where it was no longer purely a learning environment, but actually a full design and studio environment.
Ben: What we were after was: Can we take what is so nice about DBN where you write a little bit of code, hit “run,” and something happens, and can we make something that’s more sophisticated like the Java language? Can we wrap that up in a way that still feels as efficient and friendly and responsive?
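That "write a little bit of code, hit run, and something happens" loop is still what a basic Processing sketch looks like. A minimal example (a generic modern sketch, not code from the early releases): `setup()` runs once, `draw()` runs every frame, and the editor's single button compiles and plays it.

```processing
// setup() runs once when the sketch starts
void setup() {
  size(400, 400);    // open a 400 x 400 pixel window
  background(255);   // white background
}

// draw() runs continuously, once per frame
void draw() {
  // draw a small circle wherever the mouse is
  ellipse(mouseX, mouseY, 20, 20);
}
```

There is no class declaration, no main method, no compile step visible to the user — the wrapping in Java that Fry describes happens behind the "run" button.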

Casey: In the early days, the Processing editor had the exact same user interface as DBN. I did the DBN user buttons and just moved them over to Processing. Over the years we’ve really simplified it, so now it’s just really “run” and “stop,” as opposed to like 50 buttons you can find on some of these “professional” programming environments.
Dan Shiffman: Having the icon be basically the play button was a big innovation of the friendly language of Processing. It isn’t “compile”—it’s “play.”
Casey: That was intentional. If you were learning computer programming at that time, you were taking classes in a computer science department. You were only working with text and math, and for visual people—people who care about the sensation of aesthetics—those classes meant you would either stay for a few weeks and leave, or you would stay for a year and be in pain.
Ben: I took some computer science classes during undergrad, and there was a friend of mine in the design department who was also taking one of the courses. I was so excited because he was super talented, this really brilliant person. And I’m like, great, I’ll have somebody else who’s in this class and we’re gonna be able to kind of commiserate or whatever. We got a couple weeks in, and he dropped out of it.
Dan: That’s a common story.
Ben: It’s this sort of thing that’s really burned in my memory. I remember bumping into him while I was on my way to class, and him being like, “Man, I just can’t do it.” And it killed me because we’re learning all this other bullshit about data structures and algorithms and whatever, and it’s so far from what he wants to do.
Casey: We saw a potential for learning how to code in a more essential and foundational way—that was really the vision of Processing. I don’t really think that was happening at the time. Instead of, to say it in a really crude way, dumbing down coding so designers can understand it, we thought designers and artists and architects can really do a great job with this. It just needs to be presented to them in a way that is motivating for them and engaging. And the web was just extremely essential. The idea that you could export your work, throw it up on a server, and anybody who had a network connection anywhere in the world could see it. Putting the code people have made right there, so everybody can access it and share and learn, was something we picked up from the browser automatically allowing you to view HTML.
“We saw a potential for learning how to code in a more essential and foundational way.” — Casey Reas
Ben: For me, learning how to code was entirely based on the willingness of other people to share their stuff. That was during a time before “open source” was a term that was coined, but there was just sort of a norm around it. If you were making stuff, you were probably going to share it. There was a willingness to do that. Later that started getting wrapped up in the open-source movement. There was a certain amount of “paying it forward” but also just going back to the idea of, how do you set a baseline for the community around the willingness to share? How do we try to get away from the obfuscation of things and closer to the explanation of them?
By 2001, Reas had graduated from MIT and was teaching, while Fry was working on a Ph.D. Both were developing Processing into a scalable coding environment on the side. What started as sketches in their notebooks became an amalgam of Java and new programming elements that people could deploy as sketches, the term Fry and Reas used to describe a Processing program. The change in terminology was both symbolic and practical. They wanted Processing to feel accessible, while at the same time enabling experienced programmers to use the language as a way to break free from the typically rigid coding environment. This novel approach to coding caught the attention of Shiffman, a recent graduate of New York University, who was interested in bringing the Processing language to the school’s Interactive Telecommunications Program (ITP).
Casey: The traditional way of writing computer code is to figure everything out and then to transfer that into source code. The idea we pulled from the arts was: You don’t really often know where you’re going. The only way that you figure out where you’re going is to walk the walk, and to go down all the different paths.
Ben: In the visual arts, you can sit with your sketchbook and scribble things out. If you screw up or do the wrong thing, you can flip to the next page and start again. And that’s necessary because as you start working through ideas, a lot of them suck.
“How do we try to get away from the obfuscation of things and closer to the explanation of them?” — Ben Fry
Casey: The goal was to make the creation process a little bit more informal. To get people to say, “I’m going to sit down for an hour, and I’m going to make five sketches, and then they’re all there in my sketchbook.”
Ben: The sketching aspect of it is essential to actually working out the idea because you can’t, unless you’re a genius, just completely conceive of everything in your head and have it come out and work all the time, right off the bat. We wanted this [software] to be able to give folks a way to do half a page of code and see what happens. Because if you wind up having all these other steps and you’ve invested all this time into just getting something on the screen, that makes you work in a really rigid way.
Ben: I think it works well for people who are getting started, but even if you actually understand this stuff well, there’s no reason to further burden yourself with all this extra nomenclature. We wanted to make sure that the things that should be simple or things that are going to be used a lot were really efficient and really clear.
Dan: I recall at the time that the two different tools I was using were Director and C++. I don’t know that I ever really liked Director, although I appreciated it. The interface drove me crazy. There was a timeline and a stage and all that stuff. I just wanted to play with math algorithms and visualize them. Or I was working with C++, and that also drove me crazy because it was really difficult to debug. Processing just landed in this perfect sweet spot between the two. It was kind of a right place, right time thing in that I was graduating, and I proposed (or maybe Red [Burns] suggested it to Dan O’Sullivan, I don’t remember) something like: I see Lingo being used, and I see C++ being used in this graduate program at NYU. I think Processing would be really great to take over. It’s never been about a competition—it’s a rising tide lifts all boats kind of thing—but being able to basically port the curriculum that used Director to Processing was one of the first things that I did [when I joined the faculty].
Ben: I remember around 2004, there was a transition from a lot of folks being like, “Oh yeah, Processing. I’ve heard of that; I might try it out.” To like, “Oh, Processing. I actually used that for my thesis.” And I was like: Did you graduate? Is everything okay? There was sort of an inflection point of it suddenly being in use, and we probably owed ITP a great deal for that.
Casey: When I was traveling, everywhere I would go, I would find all these people using Processing.
Ben: It just kind of grew very steadily from there. It’s sort of funny because we have a chart of user growth, and it’s this very straight line going steadily upward. Over the years there were various points in the history where we were like, “okay, this is as big as this audience will get,” and then it just kind of kept growing.
Casey: There were a number of institutions really early on who I would say took a chance, and it usually came from a student, or a young faculty member who wanted to do a workshop. This was way before there were classes around Processing, and a lot of the software development happened through needing to get out a new release for a workshop. Sometimes Ben and I were doing a workshop together, and we’d release the software at 3 a.m., and then be in class at 9 a.m. to teach. It was largely pushed forward by necessity.
Read part II of the Processing oral history.