
Review> The New Normal: Penn Symposium Explores Generative Digital Design

[Editor’s Note: The following review was authored by Gideon Fink Shapiro and Phillip M. Crosby.]

A generation’s worth of experimentation with generative digital design techniques has seemingly created a “new normal” for architecture. But what exactly are the parameters of this “normal” condition? On November 14th and 15th, Winka Dubbeldam, principal of Archi-Tectonics and the new Chair of the Department of Architecture at the University of Pennsylvania, called together some of contemporary architecture’s most prominent proponents of generative digital design techniques for a symposium, The New Normal, examining how these techniques have transformed the field over the past twenty years. According to Ms. Dubbeldam and her colleagues in Penn’s post-professional program who organized the symposium, digital tools have “fundamentally altered the way in which we conceptualize, design, and fabricate architecture.” Participants were asked not only to reflect upon the recent past, but also to speculate on future possibilities.

Even among this select group of practitioners, the shared enthusiasm for digital techniques does not imply an affinity of beliefs or approaches. While Patrik Schumacher (who, notably, lectured at Penn one week later) would have us believe that parametric techniques will triumphantly lead to a New International Style, what the New Normal symposium revealed was not a singular orthodoxy, but rather a rich multiplicity of approaches. On the one hand, one perceives a renewed sense of craftsmanship in which computation and robot-assisted fabrication can “extend the potential of what the hand can do,” in the words of Gaston Nogues of Ball-Nogues Studio. On the other hand, ever-increasing computational and 3D-modeling power have nourished a whole field of virtual “screen architecture” that follows in the tradition of conceptual and utopian proposals.

In his opening keynote address, Neil Denari discussed several contemporary artists—from Gerhard Richter to Tauba Auerbach—who use or misuse tools to elicit unexpected results. Similarly for architects, the computer should be seen as a filter or intermediary tool between author and work, rather than a seamless executor of authorial will. More pointedly, Roland Snooks of Kokkugia asked, “What are the behavioral biases of digital design tools?” He then suggested that contemporary architects might need to invent and design their own tools (software plug-ins and algorithms) in parallel with the architecture. Simon Kim of IK Studio went so far as to attribute to machines an agency once reserved for humans. And Francois Roche of New-Territories Architects said, “We have to torture the machine” to stretch its conventional functions, teasing out new “erotic bodies” and “ways to tell a story” through playful cunning.

Lou Reed, David Bowie, and Jimi Hendrix were all invoked, but not by the speaker who wore sunglasses during his talk—Jason Payne of Hirsuta. Citing previously published remarks by Jeffrey Kipnis and Greg Lynn, Payne urged architects to test the assumed limits of their digital instruments, just as Hendrix pushed the limits of his guitar by playing it upside-down and incorporating electronic feedback in his radical performance of “The Star-Spangled Banner” at Woodstock in 1969. However, Payne cautioned, as he cued a slide of Eddie Van Halen, the pursuit of technical virtuosity alone can lead to manneristic excess. Indeed, what made Hendrix’s Woodstock performance great was not only his innovative guitar work but also his subversive and liberating rendition of the national anthem at a time of social upheaval, sharpened by his insider-outsider status as an African-American rock star. The point is that instrumentation cannot be cleanly separated from the substance of a work or from the social conditions in which it is produced.

Tobias Klein gave voice to the digital zeitgeist in declaring, “We [human beings] are soft, malleable data sets.” Yet if everything is now data, including bodies and buildings, how and to whose advantage is that data analyzed and applied? Selection criteria are inevitably human constructs that may take the form of artistic judgment, energy metrics, economic models, or political values. Ben van Berkel of UNStudio hinted at the conundrum of data analysis in his concluding keynote, in which he listed “different scales at which information comes together”—namely the diagram, the design model, and the prototype. But alas, the Dutch architect, an acknowledged master of the diagram, did not elaborate on how, exactly, his office wrangles messy information into a clear design mandate.

One notable absence from the slate of participants in the symposium was a critic or historian to situate the New Normal within both the history of architectural practice and the wider milieu of contemporary culture. While one of the most prominent theorists of generative design, Manuel De Landa, made important contributions to the discussions, his comments focused not on situating the discourse, but instead on the artistic repurposing of non-linear, morphogenetic tools developed by scientists to create more personalized digital form-finding devices. Also lacking were the voices of women, who numbered only three out of twenty speakers and moderators, including Ms. Dubbeldam.

What the relentless experimentation among the symposium’s participants suggests is that, while there may be a new normal for the practice of architecture, it has yet to become normative—and that is a sign of its vitality.
