1 Introduction

In most models of increasing automation in production, robots replace the human performance of formerly manual tasks with gains in efficiency, speed, and precision. Such a reading frames the design process as an almost exclusively human cognitive act, which is then manifested in physical form through 3D design software and digital fabrication tools. The promise of this separation and streamlined process is the removal of intermediary segments from the production pipeline, granting the designer more authority and control. However, it also removes the human from the tangible execution of the design. Designer intervention in the production process as it happens is one strand to be discussed here. Another is the role the physical material can take on when sensor feedback is included in the fabrication process. Once the material state is known, the progression of its manipulation can be updated in response to material processes that go beyond the immediate manipulation and are incorporated into the design. The process can thus become one of embodied computation. Extending this concept through the application of augmented digital and material interfaces enables a form of augmented materiality.

With the rising number of experimental projects engaging architectural robotics, it becomes increasingly necessary to outline a range of methodologies and motivations by which these projects can be placed within a broader discourse. In this chapter, we outline a subset of the approaches and areas of interest for the workshop that are presently practiced and explored at Princeton University’s Embodied Computation Lab and by Greyshed. Rather than beginning with the framework of a specific physical procedure, the intent is to provide a conceptual and methodological scaffold upon which an array of procedures can be assembled. By establishing generalizable principles through a variety of research experiments, a conceptual “morphospace” (Menges 2012) is defined which provides a specific region of research to be navigated within the context of architectural robotics.

2 Informed Operator Fabrication

The rising popularity of industrial robotics in architecture runs in parallel to the increasingly expansive set of “open source and bespoken software applications” (McGee et al. 2012) which make these tools easier for designers to program. Such frameworks (BootTheBot, HAL, KUKA|prc, Lobster, Mussel, Onix, PyRapid, Rhino2krl, superMatterTools, etc.) enable “highly informed” (Bonwetsch et al. 2006) operability with minimal development time for the end user. As high levels of complexity become easier to achieve, their intricacies can become more difficult to comprehend. While in some cases the fabrication process acts inherently as an infographic (for example, the linear feed of a 3D printer or raster-based milling/etching/plotting serves as a physical progress bar for its completion), there is generally no clear correspondence between the human operator’s assumptions about the movement of a CNC machine and its actual execution. Even with the simple addition of a path optimization algorithm, the machine’s movements become significantly less predictable than with standard tabular Cartesian movement, and it becomes virtually impossible for the operator to know the position or order of subsequent moves (Fig. 1).

Fig. 1 The bed depth or build height in 3D printing acts as an infographic for its state of completion; in comparison, the correlation is unclear in robotic movements such as drilling
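To make this unpredictability concrete, consider the effect of even the simplest path optimizer. The following Python sketch is purely illustrative and not drawn from any of the frameworks listed above: a greedy nearest-neighbor reordering of drill points already yields a visit order with no obvious relation to the order in which the holes were modeled.

```python
import math

def nearest_neighbor_order(points, start=(0.0, 0.0)):
    """Greedily reorder 2D drill points to shorten total travel."""
    remaining, current, order = list(points), start, []
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        order.append(nxt)
        current = nxt
    return order

holes = [(0, 50), (80, 10), (5, 5), (60, 60), (90, 55)]
print(nearest_neighbor_order(holes))
# The optimized visit order bears little relation to the order in which
# the holes were modeled, so the operator cannot anticipate the next move.
```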

In order to satisfy the intent to unify the design and fabrication process “into an open system, where design decisions can be made while the physical manufacturing process is in progress” (Dörfler et al. 2012, p. 83), it is imperative not only that the designer can influence in-progress fabrication, but that they are capable of perceiving its peculiarities. By adding a system of callouts and interactive overlays which sync the physical actions of the fabrication process with the information of the digital model and its numerical control, the human operator becomes better informed as to the global significance of any single machine movement. As the absolute positions of the robot and of any robotically machined part are known by default, three-dimensional overlays can be achieved without necessarily requiring the complexity of computer-vision-based tracking. Whether delivered by digital projection, tablet interface, smartphone, or AR headset, guides which indicate information such as future toolpaths, positions of I/O triggers, registration marks, part-to-whole relationships, assembly instructions, or points of possible interjection should evolve as an integrated byproduct of the machine code (Fig. 2). The most basic example is simply embedding messages into the robot code that appear on the control pendant, prompting action or requesting inputs. That which guides the machine should also guide the user.

Fig. 2 Toolpaths are displayed on the tablet and can be modified on the fly
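As a minimal sketch of embedding such messages, the following Python post-processor emits a simplified KUKA-style KRL program in which operator prompts and pauses are generated from the same toolpath data that drives the machine; the exact message syntax is an illustrative assumption, not a faithful rendering of any vendor’s API.

```python
def emit_guided_program(toolpath, checkpoints):
    """Generate KUKA-style KRL where operator guidance is a byproduct
    of the same toolpath data that drives the machine."""
    lines = ["DEF guided_job()"]
    for i, (x, y, z) in enumerate(toolpath):
        if i in checkpoints:
            lines.append(f"; OPERATOR: inspect part, then press start (move {i})")
            lines.append("HALT")  # pauses execution until the operator resumes
        lines.append(f"LIN {{X {x}, Y {y}, Z {z}}}")  # linear motion command
    lines.append("END")
    return "\n".join(lines)

print(emit_guided_program([(0, 0, 100), (50, 0, 100), (50, 50, 20)], {2}))
```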

3 Interaction

The integration of the design and fabrication process into a continuously interactive workflow stems from the basic desire to reconsider the role of the human designer in the face of increasingly complex automation in fabrication. Rather than taking the stance of the Luddite by treating automation technologies solely as a shift away from human intuition and production, it is necessary to think about these elements as developing in parallel, co-dependently. The concern, however, is not without merit: a variety of robotic systems are quite simply existing vehicles with the human element removed from the loop: the earth digger or crane precedes the industrial robot, the drone and quadcopter are pilotless planes/helicopters, and the autonomous vehicle is a driverless car. In attempting to develop intelligent machines and operations which perform functions traditionally controlled by humans, we are encouraged to rethink our own role in these processes rather than completely severing the ties.

While the mechanical and numerical control of the robot might sometimes surpass that of the human, there are still numerous instances where the computerized system benefits greatly from the augmentation of human skills, such as image processing or spontaneous decision making (Branson et al. 2010; Willman et al. 2012; http://www.darpa.mil/NewsEvents/Releases/2012/09/18.aspx). By keeping the “human in the loop,” the intuition and cognition of the operator augment the skills of the robot, just as the robot augments those of the designer. In such a loop, it is necessary to compress and “interlace” (Dörfler et al. 2012) the design and fabrication sequence such that there is a continuous exchange of information: essentially, operating on the scale of Byte to Robot rather than File to Factory.
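The distinction between the two scales can be sketched as follows: rather than post-processing a complete file, each target is computed, sent, and acknowledged one move at a time, so that feedback from the previous move can shape the next. The transport and message format below are hypothetical and imply no real controller protocol.

```python
import json
import socket

def stream_targets(host, port, next_target):
    """Byte-to-Robot streaming: next_target(feedback) returns the next
    move (or None to stop) and may use feedback from the previous one."""
    feedback = None
    with socket.create_connection((host, port)) as sock:
        stream = sock.makefile("rw")
        while True:
            target = next_target(feedback)       # design decision per move
            if target is None:
                break
            stream.write(json.dumps(target) + "\n")
            stream.flush()
            # Block until the controller reports completion and sensor data,
            # interlacing design and fabrication into one loop.
            feedback = json.loads(stream.readline())
```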

In using interactive techniques to bridge “the divide between embodied input and embodied output” (Willis et al. 2010), we utilize human control and sensibility to provide a level of logical determinacy while simultaneously embracing the indeterminacy associated with improvisation. Interaction is therefore a means to augment machine logic while imbuing the artifact with the aura of manual manipulation and the proportionality of human gesture. This approach echoes that of Oskar Schlemmer and Johannes Itten, who, in the wake of the second industrial revolution, emphasized the role of human intuition in design as a balance between the Apollonian regulations associated with mechanical production and the free, rhythmic, Dionysian movement of the human form. As Schlemmer writes, “the initial impulse should be emotion, the stream of the unconscious, free, unfettered creation… If mathematical proportions and measurements are called in, they should function as a regulative.” (Schlemmer 1972)

Digital technologies have the potential to reunite the architect with his or her a priori material intuition and design impulses while filtering those decisions through “regulative” computational tools which keep structural, proportional, or other coded guidelines in check. This is essentially the digital extrapolation of the idea that “all beautiful lines are drawn under mathematical laws organically transgressed.” (Ruskin 1894) Such organic transgression can vary in influence, from simple noise generation (Fig. 3a), to gestural formation (Fig. 3b), to a combination of human input with natural or material phenomena (Fig. 6), all of which benefit from an underlying algorithmic control. “Computers let us improvise better, with more notational density, with more variations possible in real time, and with that particular merger of continuity and notation so difficult to achieve in material media.” (McCullough 1996, p. 271)

Fig. 3 Left a Human as noise generator: the robot’s level of precision while drawing is linked to the real-time values of an EEG headset, such that the drawing becomes messier as the human loses focus. Right b [AR]chitecture: gesturally defining a loft surface and a brick wall simultaneously using the Kinect (Johns 2012)
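The noise-generator example of Fig. 3a can be reduced to a small sketch. Assuming a normalized attention value in [0, 1] from the EEG headset (an assumed input format), the human state modulates the amplitude of improvisational noise while a regulative clamp keeps each point within coded limits:

```python
import random

def regulated_point(ideal_xy, attention, max_jitter=5.0, bounds=(0.0, 100.0)):
    """Perturb an ideal point by human 'noise', then clamp to coded limits."""
    lo, hi = bounds
    amplitude = (1.0 - attention) * max_jitter   # less focus, more noise
    x, y = (c + random.uniform(-amplitude, amplitude) for c in ideal_xy)
    # Regulative step: free, unfettered movement is kept within
    # mathematical limits, echoing Schlemmer's "regulative".
    return (min(max(x, lo), hi), min(max(y, lo), hi))

print(regulated_point((50.0, 50.0), attention=0.3))  # attention is an assumed input
```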

4 Controlled Manipulation and Dynamic Components

The high degree of control provided by the robotic manipulator presents a means by which to experiment with more volatile unknowns such as human improvisation or material indeterminacy. In approaching robotic fabrication exercises, we generally focus upon one or several qualities, or robotic virtues, which lend themselves to the rethinking of traditional design/construction techniques through the assurance of certain stabilities. These properties are speed, strength, stamina, patience, precision, and synesthesia.

The predictability of these qualities serves to enable, among other things, procedural processes which engage dynamic, stochastic, or embodied material properties. One example of such an approach is the “Procedural Landscapes” project of Gramazio and Kohler, which engages the material properties of sand (http://www.dfab.arch.ethz.ch/web/e/lehre/208.html). Another is our experimentation with “Buoyant Extrusion”, in which complex geometries are created by synchronizing relatively simple robot movements with the extrusion of thermoset polymers into a similarly buoyant medium (Fig. 4).

Fig. 4 Path followed by robot (red) versus material result of extruding silicone rubber (blue) into liquid soap
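One flavor of the synchronization involved can be sketched as a simple calculation: assuming the goal of a constant bead diameter, the volumetric extrusion rate must track the robot’s travel speed. The formula and names below are illustrative and are not the project’s actual control code.

```python
import math

def extrusion_rate(p0, p1, move_time, bead_diameter):
    """Volumetric flow (mm^3/s) for a constant-diameter bead between
    two robot targets traversed in move_time seconds."""
    length = math.dist(p0, p1)                       # travel distance (mm)
    bead_area = math.pi * (bead_diameter / 2.0) ** 2  # bead cross-section
    return length * bead_area / move_time

# e.g. a 100 mm move in 4 s with a 3 mm bead:
print(extrusion_rate((0, 0, 0), (100, 0, 0), move_time=4.0, bead_diameter=3.0))
```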

5 Embodied Computation

In an extended framework, such material-based procedural experiments can be situated in the concept of embodied computation, and boundaries between design process and design artifact can be redrawn.

Embodied computation offers the possibility of shifting part of the execution of formal manipulation away from a top-down process, in which material is manipulated in a static manner, towards one in which robotic or human actions are only part of the operation and the changes they trigger complete the process.

This opens up different forms of “material in the loop” possibilities. One is continuous dynamic interaction, such as the robotic arm swirling a liquid in a mold so that the distribution of the liquid across the surface results from the combination of gravity and the centrifugal forces controlled by the arm. A camera setup with the timed extraction of a single frame every half second allows for the visualization of a single, recurring liquid feature. This demonstrates a simple design interaction with a liquid form generated through the embodied computation of the material and guided by a numerically controlled actuator (Fig. 5).

Fig. 5 Clockwise from left a Robotic swirling. b Fluid form reemerges every 7th frame. c Simple and repeating three-point toolpath indicated in red
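The timed frame extraction described above can be sketched with OpenCV. The half-second interval follows the text; the camera index, frame count, and the rest of the setup are assumptions rather than the project’s actual capture code.

```python
import time
import cv2

cap = cv2.VideoCapture(0)            # camera watching the swirling mold
frames, t_next = [], time.monotonic()
while len(frames) < 20:              # e.g. ten seconds of observation
    ok, frame = cap.read()
    if not ok:
        break
    if time.monotonic() >= t_next:   # keep one frame per half-second interval
        frames.append(frame)
        t_next += 0.5
cap.release()
# Sampling at the liquid's period of recurrence (every 7th kept frame in
# Fig. 5b) makes a single recurring fluid feature visible across frames.
```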

This raises interesting challenges for linking the digital state of the model with the physical one. The simulation challenge here is not a precise predictive model but one that allows for constant synchronization between the physical and digital states of the design process. The approach expands computation beyond digital processes executed into physical form, to include the stochastic behavior of the material in the form-giving process.
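Schematically, such synchronization replaces prediction with correction: the simulation provides only a coarse forecast, and after every action the digital state is re-anchored to measurement so that model error cannot accumulate. All callables in the sketch below are illustrative stand-ins for robot, simulation, and sensor interfaces.

```python
def run_synced(actions, execute, simulate_step, measure, correct):
    """Constant synchronization instead of precise prediction."""
    state = measure()                             # initialize from the physical
    for action in actions:
        predicted = simulate_step(state, action)  # coarse, non-predictive model
        execute(action)                           # robotic or human trigger
        observed = measure()                      # sense the material outcome
        state = correct(predicted, observed)      # pull the model back to reality
    return state
```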

Embodied computation can be extended to connect the design process with the designed artifact itself. The work of D’Andrea’s group at ETH Zurich shows an example in which a computationally controlled quadcopter recovers algorithmically from a drastic change to its physical body (such as the partial trimming of its rotor blades) by learning on the fly from the changed feedback it receives through its sensors (Mueller and D’Andrea 2011, 2012).

6 Augmented Materiality

6.1 Concept

The robotic manipulation of dynamic or stochastic materials has demonstrated the potential to produce novel constructs which take form through the application of the principles of embodied computation. Such constructs, while generally repeatable within certain tolerances, prove inherently difficult to generate from a specified design intent. As the resultant form does not develop as a direct parallel of a digital model, but rather as the result of a material reaction to a designed trigger, the outcome of such procedural experiments can be largely unpredictable or unexpected. While the richness of the indeterminate material reaction is desired, it is necessary to explore means by which we may direct these reactions towards a result which more closely approximates the intent of the human designer. Through visualization techniques which make the computer’s algorithmic processes and material simulations apparent to the designer, and interactive tools which allow him or her to manipulate both the digital model and its material manifestation simultaneously, the result of such experiments can be brought closer to the embodied inputs associated with design intent. This type of workflow requires a quadripartite balance between the influence of the human designer, the robotic manipulator, the material properties, and the computer simulation, where no entity can operate without accounting for its impact upon the others.

The concept of augmented materiality is understood as the encapsulating framework for such workflows. Essentially, it is a system in which interactive techniques enable the guided, real-time manipulation of stochastic material systems, affording a degree of improvisation while maintaining the connection to the “highly informed” potential of digital models and tools. Augmented materiality can be understood as a means to imbue material craftsmanship with the qualities of digital fabrication, such that algorithmic and robotic control act as additional material attributes. For the sake of this definition, we recognize craftsmanship as “simply workmanship using any kind of technique or apparatus, in which the quality of the result is not predetermined.” (David Pye, cited in McCullough 1996, p. 202)

Augmented materiality stems from the concept of “Digital Materiality”, which “evolves through the interplay between digital and material processes in design and construction.” (Gramazio and Kohler 2008, p. 7) Augmented materiality engages this interplay while focusing upon the human position in this dynamic: through sensory feedback and the physical overlay of digital information, it compresses the process of digital/material conversion into a feedback loop which supplements algorithmic and manipulative processes with human improvisation and intent.

6.2 Sample Application

The concept of augmented materiality is illustrated in the Mixed Reality Modeling project, which uses a robot-mounted heat gun (the material manipulator) to iteratively melt material away from a block of wax (the material). The robot is equipped with a 3D scanner and RGB camera (the sensors), which provide a colored point cloud of the physical material and interface. The human user can indicate desired structural load forces on the wax by placing physical blocks, which are scanned and automatically placed in the corresponding locations of the digital model. The software then evaluates the structural necessity of each region of the wax block through topology optimization, accounting for the user-placed loads and support conditions. Following this calculation, the robot proceeds to heat and melt away the areas that are least structurally necessary. As neither the amount of wax that will melt away during each cycle nor the direction in which it will flow and accrue is precisely known, the process requires iterative scanning and recalculation. This iterative manipulation with the “material in the loop” simultaneously allows for “human in the loop” modifications: at any point in the process, the human can shift the loading conditions, indicate desired void areas by coloring on the wax, or physically modify the wax. Through the combination of user tracking and digital projection, the human operator is constantly informed of the three-dimensional calculations of the software and the projected toolpaths and operations of the robot (Johns 2014) (Figs. 6, 7).

Fig. 6 Prototypical augmented reality interface: projection of digital information onto the physical artifact

Fig. 7 Left topologically optimized digital model versus materially informed result. Right iteratively melted form with two loading points, three supports, and user-indicated void
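The iterative loop of the Mixed Reality Modeling project can be summarized in a schematic sketch; every callable below is a hypothetical stand-in for the project’s scanner, optimizer, projector, and robot interfaces rather than its actual implementation.

```python
def melt_cycle(scan, optimize, project, melt, target_volume):
    """One pass per iteration: material and human both stay in the loop.
    scan()     -> (model, loads, voids) from point cloud + RGB camera
    optimize() -> structural-necessity field via topology optimization
    project()  -> AR overlay informing the operator of the next operation
    melt()     -> robot-mounted heat gun triggers an indeterminate reaction"""
    while True:
        # Rescan every cycle: melt amount and flow are not precisely known.
        model, loads, voids = scan()
        if model.volume <= target_volume:
            return model
        necessity = optimize(model, loads, voids)
        region = necessity.least_necessary()  # least structurally needed wax
        project(model, region)                # operator may intervene here
        melt(region)
```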

6.3 Generalization

In a generalized context, processes which engage augmented materiality must provide a means for embodied interaction by the human user (through the physical manipulation of objects or one of the increasingly large variety of intuitive human interface devices), and a means to inform the user of the operations of the digital model and its physical manifestation (augmented reality). The physical process must inform the digital model through a network of sensors, and the digital model must inform the physical process through a means of digitally controlled manipulation (robots or CNC devices of any form). This manipulation should not be entirely determinate in its effects but, according to the principles of embodied computation, should serve as a trigger for a more complex material reaction. The selected physical medium should therefore exhibit some level of indeterminacy or stochastic behavior, such as fluid dynamics, erosion, plant growth, or animal behavior. The designer is thus given the capability to iteratively influence and craft stochastic systems while benefiting from the informed control of computer algorithms, which work to keep coded parameters such as structural stability, volume, or program within established bounds (Fig. 8).

Fig. 8 Conceptual framework for augmented materiality and embodied computation
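The framework of Fig. 8 can be restated as a single feedback cycle. The sketch below is one possible schematic of the quadripartite balance, with every interface a hypothetical placeholder rather than a prescribed implementation.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class AugmentedMaterialityLoop:
    sense: Callable        # sensors: physical state -> digital model
    human_input: Callable  # embodied interaction: gestures, placed objects
    regulate: Callable     # algorithms keeping coded parameters in check
    display: Callable      # augmented reality overlay back to the designer
    actuate: Callable      # robot/CNC trigger for an indeterminate reaction

    def cycle(self):
        model = self.sense()                 # material informs the model
        intent = self.human_input(model)     # designer informs the model
        plan = self.regulate(model, intent)  # e.g. stability, volume, program
        self.display(model, plan)            # model informs the designer
        self.actuate(plan)                   # material reaction re-enters via sense()
```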

7 Conclusion

Recent work in robotic fabrication serves to augment the human with the precision, power, and speed of an automated process. The claim here is that this notion of augmentation should be expanded to include the augmentation of the material being manipulated. The inclusion of sensing and feedback, reporting on the state of the physical artifact as it is being changed, allows for a closer fusion of human and robot actions. The concept of embodied computation is introduced as an extension of the design sequence to include physical and material reactions which continue to occur after, and in reaction to, the specified trigger. Augmented materiality is then presented as the human occupation of and influence upon such a cycle, enabled through an interactive and digitally mediated interface.