Date: 2012 – 2014
Tools: Rhino, Grasshopper, Human, Lunchbox, FabTools, Illustrator
Although the use of parametric 3D models has become common, the use of parametric tools to produce evocative and engaging 2D representations remains comparatively underdeveloped among architects, designers, and artists. Standard parametric modeling programs like Revit, although highly capable of handling relational geometries, are rather limited in allowing for representational experimentation. Programs such as Processing have enabled the non-programmer to enter more easily into the world of procedural design, yet they still require working in traditional, text-based programming environments. And graphic design tools such as Illustrator are enormously powerful for creating evocative visual representations, yet their workflows are highly manual and allow for little parametric functionality. The parametric modeling plugin Grasshopper offers a potential middle ground that provides parametric control, the ability to manage complexity, and a highly intuitive graphical user interface.
This research was initiated through an interest in the body of work produced by the Minimalist, Op Art, and Conceptual Art movements of the 1950s through the 1970s. Artists such as Ellsworth Kelly, Sol LeWitt, Eva Hesse, Bridget Riley, and many more explored the construction of drawings and paintings through procedural techniques. Since this period, a number of other artists, both analog and digital (Casey Reas, Jared Tarbell, etc.), have explored generative rule-sets in the production of their work. Yet in most cases the work is heavily based on scripting, a technique difficult for many architects, designers, and artists to learn, even in accessible environments like Processing. This project explores how more intuitive digital techniques can be leveraged to give artists, designers, and architects opportunities to develop experimental and generative drawings and diagrams.
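To give a concrete sense of what a generative rule-set can look like, here is a minimal, hypothetical sketch in Python (not part of the Grasshopper workflow described here, and not any particular artist's method): it encodes a LeWitt-style instruction, "divide a square into a grid and draw one of four line orientations in each cell," and emits the result as an SVG drawing. All names and parameters are illustrative.

```python
import random

def generative_grid_svg(cells=10, size=400, seed=1):
    """Illustrative rule-set: per grid cell, draw one randomly chosen line
    (horizontal, vertical, or one of two diagonals). Returns an SVG string."""
    random.seed(seed)              # fixed seed: same rules produce the same drawing
    step = size / cells
    lines = []
    for i in range(cells):
        for j in range(cells):
            x, y = i * step, j * step
            choice = random.choice(["h", "v", "d1", "d2"])
            if choice == "h":      # horizontal line through cell midpoint
                seg = (x, y + step / 2, x + step, y + step / 2)
            elif choice == "v":    # vertical line through cell midpoint
                seg = (x + step / 2, y, x + step / 2, y + step)
            elif choice == "d1":   # diagonal, top-left to bottom-right
                seg = (x, y, x + step, y + step)
            else:                  # diagonal, bottom-left to top-right
                seg = (x, y + step, x + step, y)
            lines.append('<line x1="%.1f" y1="%.1f" x2="%.1f" y2="%.1f" '
                         'stroke="black"/>' % seg)
    return ('<svg xmlns="http://www.w3.org/2000/svg" width="%d" height="%d">'
            % (size, size)) + "".join(lines) + "</svg>"

svg = generative_grid_svg()
```

Changing the seed, grid density, or the set of allowed orientations yields an entire family of drawings from the same instruction, which is the essential appeal of the procedural approach described above.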
The work has been conducted both as a solo project and in collaboration with colleagues and students. I would like to thank Andrew Heumann for his excellent Human plugin for Grasshopper, which has enabled much of this work. In addition, I would like to thank Adam Marcus, who co-taught the first workshop exploring these ideas with me in 2013. Finally, I would like to thank all of the students and attendees in the courses and workshops I have taught on this topic; your ideas and questions have contributed to the exploration of the work.