TangiBlocks

Idea/Vision

TangiBlocks is an AR embodied interface that enables low-abstraction design-to-fabrication workflows. It explores the potential of a largely overlooked area within digital design and fabrication, at the intersection of embodied interfaces and non-representational human-computer interaction. TangiBlocks provides an alternative to traditional, widespread, commercially available digital tools and interfaces that rely on high-level symbolic communication between user and computer. At its core, it enables a design-to-fabrication workflow where the user can move seamlessly from mental representation to digital representation to physical materialization, without having to transcode their design ideas into different abstract encapsulations.

Motivation

Fabrication can be defined as a technologically mediated, bidirectional transaction between a human agent and a material formation. In a fabrication process, the human agent produces a change in the material formation via physical actuation, and the material formation induces a cognitive change in the human agent via information transferal. Technology mediates this exchange, providing means of physical manipulation and communication, as well as data generation, processing, and storage. The employment of different technologies produces different fabrication processes that allow specific human-material transactions and confer diverse degrees of agency to human agents and material systems.

In digital design and fabrication, the human-material entanglement and its technological mediation operate at many scales and levels. Some authors argue that, generally speaking, digital technologies are so embedded within design and fabrication processes that the human agent's agency is greatly reduced. The first theoretical approximations to this phenomenon can be traced back to the late 1980s, in authors like Donald Schön and Malcolm McCullough. However, it wasn't until the mid-2000s that digital design and fabrication technologies were advanced and pervasive enough for the phenomenon to become really noticeable in design practice. Since then, the topic has remained relevant for theoreticians and practitioners in design and HCI.

The core of this issue lies in the relationship between humans and the technologies that assist digital design and fabrication processes, i.e. computers and digital machinery. Under that understanding, the problem can be dissected in terms of the physical and logical interactions between them. The image below shows an abstract space defined by embodiment level (i.e. physical interaction) and representational abstraction (i.e. logical interaction). There, different digital design and fabrication case studies have been distributed based on the role the body plays in the human-material transaction, and on the nature of the representations employed in the human-computer and human-machine interactions.

Original landscape of digital design-to-fabrication in terms of bodily involvement and representational abstraction. See referenced projects in bibliography.

This landscape can be divided into four large quadrants. Starting from the upper-right area and moving clockwise, these regions are:

  1. Embodied fabrication: Design-to-fabrication workflows where the body actively engages with both the material and computer systems. Strictly speaking, all human-computer and human-machine interfaces are embodied (our bodies are our only means of providing input), but some make use of the body more explicitly and deliberately, exploiting its advantages vis-à-vis computational systems. This type of workflow still relies on high-level abstractions for human-computer interaction. Examples include Smart Tools, gesture-based fabrication interfaces, and tangible interactive fabrication interfaces.
  2. Traditional digital fabrication: Design-to-fabrication workflows that make use of high-level abstractions and do not engage the human body in any special way. This approach represents most of the mainstream, traditional digital design and fabrication technologies: industrial robotic arms, 3d printers, CNC routers, etc.
  3. Material agency: Design-to-fabrication workflows where the material plays an important role through stochastic behaviors that are not represented within the computer, or are represented at a very low level. Examples include open-ended fabrication processes where machines carry out a set of instructions and material behavior determines the actual final outcome, and, in more elaborate cases, similar processes where machines are equipped with sensors that enable feedback loops between material and computer systems.
  4. Low-abstraction embodied fabrication: Design-to-fabrication workflows where the body is actively involved and the interactions between humans, computers, and digital machines are mediated by low-abstraction representations. Although I cannot provide a precise definition at this point, these low-level abstractions depend not so much on the medium as on the set of signs and signifiers they are transmitted through, and on how much they depart from the actual object they represent (I will expand on this point in the following section). As the graphic shows, this region remains a largely unexplored area in digital design and fabrication.

Theoretical Background

In order to better understand the possibilities latent in the space of embodied, low-abstraction design-to-fabrication workflows, it is essential to develop a formal representation of what design-to-fabrication is from a representational perspective.

Original simplified model of a digital design-to-fabrication process.

The image above shows a simplified formal model of a digital design-to-fabrication process. The model separates the process into five different representations that span from the original mental representation to the final physical object.

  1. Mental representation: The start of the design process, consisting of an idea to be materialized. This representation is made of images, semantic descriptions, and all sorts of non-symbolic mental constructions.
  2. Analog representation (1): All mental representations reside in a space somewhere inside our (extended, embodied, or distributed) minds. The only vehicle these representations can use to exit this space is our body. Therefore, this analog representation refers to the bodily set of actions that translate the mental representation into a digital representation. One example could be the motion of our arms when moving a mouse to draw on a computer screen; or the air vibrations produced by our vocal cords in a natural speech interface; or the motions of our eyes in a gaze-tracking interface; or all of the above in a multi-modal interface.
  3. Digital representation (1): The high-level set of representations the computer uses to communicate with the user. Note that this does not refer to low-level representations (e.g. the bytes that constitute a piece of information) but to the actual high-level representation they build up to. Examples include a two-dimensional drawing in a CAD environment; a three-dimensional model of the same drawing; a parametric model of that model; or a graph representation of the parametric model. In sum, these are high-level representations intelligible to humans.
  4. Digital representation (2): The low-level representation that the computer uses to communicate with a digital fabrication machine. These representations are often hidden from the user, although concurrent fabrication processes attempt to make them visible.
  5. Analog representation (2): The low-level representation that produces a digitally fabricated physical artifact as an outcome. This is the actual set of motions and physical and chemical transformations that mediate between a virtual object, as encoded in its digital representations, and a physical object in the real world.
  6. Physical object: The final product of the design-to-fabrication process. Although the object is not a representation in and of itself, it can only be apprehended by a user as one.

Evidently, different human-computer interfaces and different fabrication techniques allow for many variations of this simplified model. Below, three examples are provided. In all three scenarios the design idea is to build some sort of assembly out of brick-like blocks, with a robotic arm for their construction. In the first example, the human-computer interface is a traditional mouse. Therefore, the analog representation of the design idea is the set of movements of the hand and the mouse, while the digital representation is an abstract two-dimensional drawing. The second example makes use of a natural speech interface to construct a parametric model of the block assembly. Thus the analog representation is the vocalization of information and the actual sounds emitted, while the digital representation is a set of relationships between the different blocks. The third example is TangiBlocks, an embodied interface where the user manipulates real blocks to manually "create" a digital block assembly.

Examples of design-to-fabrication workflows.
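To make the comparison concrete, the sketch below encodes the design stages of the three example workflows as sequences of representations. It is purely illustrative: the stage names follow the model above, but the numeric abstraction levels are hypothetical values chosen only to show how the leaps between representations could be compared.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    """One representation in the design-to-fabrication model above."""
    name: str          # which stage of the model this is
    medium: str        # the concrete form the representation takes
    abstraction: int   # illustrative level: 0 (none) to 10 (highly symbolic)

# Hypothetical encodings of the three example workflows (design stages only).
mouse = [
    Stage("mental", "imagined block assembly", 0),
    Stage("analog (1)", "2d mouse trajectories in space", 9),
    Stage("digital (1)", "abstract 2d CAD drawing", 7),
]
speech = [
    Stage("mental", "imagined block assembly", 0),
    Stage("analog (1)", "vocalized natural-language description", 6),
    Stage("digital (1)", "parametric relationships between blocks", 5),
]
tangiblocks = [
    Stage("mental", "imagined block assembly", 0),
    Stage("analog (1)", "hands placing scaled physical blocks", 1),
    Stage("digital (1)", "3d block model projected back via AR", 2),
]

def largest_leap(workflow):
    """Largest jump in abstraction between consecutive representations."""
    return max(abs(b.abstraction - a.abstraction)
               for a, b in zip(workflow, workflow[1:]))

for label, wf in [("mouse", mouse), ("speech", speech),
                  ("TangiBlocks", tangiblocks)]:
    print(f"{label}: largest abstraction leap = {largest_leap(wf)}")
```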

What is relevant to note in these examples (and what this design-to-fabrication model is useful for) are the leaps in abstraction from one representation to another. In the first example, the mental representation of a block assembly has absolutely nothing to do with the actual analog representation, a set of 2d trajectories of the mouse in space. The leap in abstraction from one representation to the other is therefore enormous. In the second example, the mental representation of the brick wall is much closer to its natural-language representation. Although there is still a great deal of abstraction between the two, natural language directly translates an important part of what the brick assembly is; for instance, that block A is next to block B, and that block C is above both of them. However, in order to do that, it relies on high-level representations (in this case, linguistic signs).

What TangiBlocks tries to address from a design-interface standpoint is precisely the question of how these abstractions and representations affect design processes. Our mental representations are complex and multilayered, yet at least part of them is very close in form to the actual, physical object. So how is the process of design affected by gaps between one set of representations and another, sometimes with huge leaps of abstraction in between? The point is not to dismiss the enormous power that abstraction lends to the creative exercise of designing. Rather, the idea is to speculate on what a design process with very low abstraction between one representation and the next would be like. In the third example, therefore, the analog representation consists of actually building a wall assembly with the user's own hands, producing a scaled version of the final object. Other than the size and perhaps the material, no other abstractions mediate the mental and analog representations. Furthermore, the digital representation is simply a 3d model, very close in abstraction to the mental representation as well. Moreover, this digital representation can be projected back into space at real size by means of AR, which reduces the level of abstraction not only with respect to the mental representation, but also to the final, physical object. Last but not least, the digital model is fabricated concurrently as the user designs it, which yet again reduces the abstraction (this time temporal and material) between mental representation and physical manifestation.

Beyond the discussion about representations and abstractions, TangiBlocks addresses two very relevant issues in contemporary digital design and fabrication. The first relates to the above-mentioned closure of the temporal and conceptual gaps between design and fabrication; the second relates to the role of the body in design processes.

In architecture and other design-related disciplines, there has been a growing demand for flexible digital fabrication systems that perform well in unpredictable creative design and fabrication processes. These flexible systems could challenge the rigid and deterministic predominant digital fabrication praxes by increasingly merging design and fabrication workflows (Carpo 2011) through adaptive processes that allow real-time change and improvisation (Willis et al. 2011, Bard et al. 2014, Dubor et al. 2016). This would help reduce the separation between design as an immaterial process and fabrication as the predetermined, automatic step that follows (Stein 2011). Additionally, researchers in digital design and fabrication have become increasingly interested in creative workflows that actively engage the human body. Examples of this trend include immersive design interfaces, gesture-based fabrication machines, and wearable human-computer and human-machine interfaces. These approaches contest the deficient use of bodily experience in traditional digital design workflows with creative practices where embodied cognitive processes inform novel ideas and support decision-making (Treadaway 2009). Furthermore, the notion of implicit embodied knowledge informing digital design and fabrication processes is radically different from, yet complementary to, mainstream ideas about the interplay between data (i.e. explicit symbolic knowledge) and materiality enabled by digital fabrication tools.

Related Work

TangiBlocks is grounded in three different areas of inquiry within the larger field of digital design and fabrication: embodied interfaces for design and fabrication, AR in robotic fabrication, and robotically-fabricated block assemblies. A few selected case studies are presented in each category.

Embodied interfaces for design and fabrication

Embodied interfaces are an area of research that has received great attention in recent years (Vandoren et al. 2008, Vandoren et al. 2009, Payne 2011, Rivers et al. 2011, Willis et al. 2011, Lau et al. 2012, Zoran et al. 2013, Pinochet 2014). Out of this large body of work, one example will be described here. Nakano and Wakita (2014) present an augmented solid modeller using boolean operations with tangible objects. Their project makes use of 3d-printed primitives that can be combined physically to produce boolean operations in a virtual space. While their work emphasizes the direct translation from physical action to virtual representation, they do not speculate on the implications of such a workflow from a representational perspective, nor do they consider concurrent design and fabrication as part of their interface.

AR in robotic fabrication

Several projects in robotic fabrication have made use of AR to visualize the robot's path in space, or simply to overlay relevant information on a physical model. In that regard, Johns et al.'s (2014) research "Design approaches through augmented materiality and embodied computation" is relevant to TangiBlocks. This theoretical work argues that "processes which engage augmented materiality must provide a means for embodied interaction from the human user, and a means to inform the user as to the operations of the digital model and its physical manifestation". As part of their research they developed several experiments using AR and robotic fabrication. In one of them, they equipped the robotic actuator with a heat gun, which was used to melt a block of wax. As the wax melted, its change of shape was registered using RGB-D sensors. A computer then ran an FEA simulation over the digitized object (there were weights located on top of the wax block that could be moved in real time), and the results were projected back onto the block of wax. While this work shares many aspirations with TangiBlocks, it did not explore in depth the embodied involvement of the human in the process.

Robotically-fabricated block assemblies

Many research groups have developed robotically-fabricated, architectural-scale block assemblies. Examples include Pritschow et al. (1996), Helm et al. (2012), and Dörfler et al. (2016). However, these works mostly focus on the robotic automation of a construction process, rather than on issues related to concurrent design and fabrication, the use of embodied interfaces, or abstraction and representation in HCI and HMI. One project with a different approach is the Endless Wall (Kondziela et al. 2011). Here, a robotic arm is equipped with RGB-D sensors to track a human user. The user draws on the floor using masking tape, and the robot follows up by building a block assembly using the drawn curve as a guide. The Endless Wall thus operates both as an embodied design interface and as a concurrent fabrication process. However, it relies on a line as an abstraction of the block assembly, and largely delegates the fabrication task to the robot.

Design and Implementation

TangiBlocks aspires to produce a non-representational design interface, which in the case of block assemblies means having the user assemble the wall block by block. While a series of blocks with magnets for easy snapping was developed, limitations in AR tracking did not allow me to achieve this. Instead, I produced a more abstract interface where the user moves key blocks around an AR canvas, and the computer completes the block aggregation.

TangiBlocks consists of the following hardware and software platforms:

Hardware

  1. Guide blocks: Three-dimensional cardboard AR tracking blocks used to define key blocks on the wall assembly.
  2. Control blocks: Three-dimensional cardboard AR tracking blocks used to parametrically control attributes of the block assembly (e.g. height, size of blocks).
  3. HMD: Ideally, TangiBlocks would make use of a head-mounted display such as the Microsoft HoloLens. In this first experiment, however, a laptop equipped with an external webcam was used.

Software

While ideally TangiBlocks would run natively on the HMD, at this point in time it runs simultaneously across several software platforms.

  1. Unity: Unity 5.5.2f1 was selected as the platform for visualization due to its robustness and flexibility.
  2. Vuforia: Vuforia 6.1 was chosen as the AR library to track the three-dimensional cardboard AR tracking blocks in the real world.
  3. Rhinoceros + Grasshopper: Rhino and Grasshopper were chosen as the platform to store and parametrically manipulate the virtual model created by the user. Different attributes of the block assembly and their respective parametric spaces were defined beforehand (e.g. height, percentage of overlap between blocks, rotation and scale of the structure as it rises from the ground, etc.). An interesting parameter of the block assembly is the virtual block to be used in the digital model: while the AR trackers are simple cuboid prisms, the actual 3d model can reference much more complex geometries with a similar bounding box, which can also be adjusted parametrically.
  4. Node.js and SocketIO: In order to connect Unity and Rhinoceros and stream data and meshes between them, a library developed by Junichiro Horikawa was used (see the sketch below).

Rhino-to-Unity.
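For illustration, here is a minimal sketch of how mesh data could be pushed from the Rhino side to a Socket.IO relay of the kind Horikawa's library provides. This is not the actual library code: the relay address, event name, and payload format are assumptions, and the standalone rhino3dm and python-socketio packages stand in for the live Rhino/Grasshopper environment.

```python
# pip install rhino3dm "python-socketio[client]"
# Stand-in for the Grasshopper side of the pipeline: build a mesh and
# push it to a Socket.IO relay that a Unity client would listen to.
import rhino3dm
import socketio

# In the real setup the mesh comes from the Grasshopper definition that
# generates the block assembly; here we build a single quad for brevity.
mesh = rhino3dm.Mesh()
mesh.Vertices.Add(0.0, 0.0, 0.0)
mesh.Vertices.Add(1.0, 0.0, 0.0)
mesh.Vertices.Add(1.0, 0.5, 0.0)
mesh.Vertices.Add(0.0, 0.5, 0.0)
mesh.Faces.AddFace(0, 1, 2, 3)  # one quad face

def serialize(m):
    """Flatten the mesh into plain lists a Unity client can rebuild from."""
    verts = []
    for i in range(len(m.Vertices)):
        p = m.Vertices[i]
        verts.append([p.X, p.Y, p.Z])
    faces = [list(m.Faces[i]) for i in range(len(m.Faces))]
    return {"vertices": verts, "faces": faces}

sio = socketio.Client()
sio.connect("http://localhost:3000")      # assumed address of the relay
sio.emit("mesh-stream", serialize(mesh))  # assumed event name
sio.disconnect()
```

On the Unity side, a matching Socket.IO client would listen for the same event and rebuild a mesh from the vertex and face lists; the key design choice is keeping the payload a plain, engine-agnostic structure so either end can be swapped out.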

Usage Scenario

The image below describes the envisioned usage scenario of TangiBlocks. In its current version, only parts one and two are fully implemented, while part three is under development. Part four has not been developed yet.

TangiBlocks’ envisioned usage scenario.

The set of animations below captures the use of TangiBlocks as a narrative.

First, a user interested in working with block assemblies, either as a final architectural product or just as a proxy during design tinkering, develops a mental representation of their design object: the type of block, the overall aggregation, etc. It is a rough mental model to start from.

Initial thinking.

Then, using TangiBlocks, the user designs their base AR block using an interface built for that purpose, then prints, laser-cuts, and glues the blocks.

Block fabrication.

Afterwards, the user uses the interface to explore the design space of their original mental representation. The aim of the interface is to allow quick iteration between design possibilities using the body as the main design driver. The user not only uses their arms and hands in a way that mimics how the actual block assembly would be built, but also experiences the object situated in real-world space, both at a reduced and at a 1:1 scale. The reduced scale allows for an omniscient experience, where the user observes and affects the design object as an object. In this design mode, several analytical (i.e. abstract, representational) visualizations can be presented: structural performance derived from FEA, shadow studies using solar simulation, CFD analysis, etc. The 1:1 scale allows for an immersive experience of the design object, where the designer gets the opportunity to visit their own design from within. Although this is something that can be done in any CAD package, doing it in space, traversing the design with one's own body, adds a much richer layer of experience. While the object scale affords a design experience that is intellectual, rational, and objective, the 1:1 scale fosters a phenomenological design process, inspired by affect and subjectivity. To the best of my knowledge, there are not many design interfaces that exploit this possibility of designing between scales. Please note the 1:1 visualization has not been developed as of this writing.

Creating simplified AR model with blocks.
Parametrically adjusting the model and visualizing it.

The result of the process is a 3d model from which the code necessary for the robot to fabricate the assembly can be obtained. In the simplest scenario, the robot would build the assembly after the user has finished modeling it. A more complex scenario would feature a robot that can retrace its actions to reflect real-time changes in the digital model. Finally, the most advanced scenario would feature a human-robot collaborative process where the user could modify the real, robotically built structure and have the changes propagated back to the virtual model.

Effect of the control blocks on the block assembly (blue is block overlap, red is scaling, yellow is rotation, and green is using different blocks).
Series of actual 3d models created with TangiBlocks. The models change in rotation along the horizontal axis and in percentage of block overlap along the vertical axis.
Series of actual 3d models created with TangiBlocks. The models change in rotation along the horizontal axis and in brick size along the vertical axis.
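As an illustration of the kind of parametric model driven by the control blocks, the sketch below generates block placements for a simple running-bond wall from a handful of parameters (overlap between courses, per-course rotation, and scale). It is a hypothetical reconstruction of the idea under my own parameter names, not the actual Grasshopper definition.

```python
import math
from dataclasses import dataclass

@dataclass
class BlockPlacement:
    x: float         # block center, world coordinates
    y: float
    z: float
    rotation: float  # rotation about the vertical axis, in degrees
    scale: float     # uniform scale factor for the referenced block geometry

def generate_wall(courses=10, blocks_per_course=8, block_w=1.0, block_h=0.5,
                  overlap=0.5,            # fraction each course is shifted
                  twist_per_course=2.0,   # degrees of rotation added per course
                  scale_per_course=0.0):  # scale change added per course
    """Generate block placements for a simple running-bond wall."""
    placements = []
    for c in range(courses):
        rot = c * twist_per_course
        scale = 1.0 + c * scale_per_course
        shift = (c % 2) * overlap * block_w  # offset alternate courses
        rad = math.radians(rot)
        for b in range(blocks_per_course):
            lx = b * block_w + shift  # position along the course axis...
            placements.append(BlockPlacement(
                x=lx * math.cos(rad),  # ...rotated about the wall origin
                y=lx * math.sin(rad),
                z=c * block_h,
                rotation=rot,
                scale=scale))
    return placements

wall = generate_wall(overlap=0.5, twist_per_course=3.0)
print(len(wall), "blocks; top course rotated", wall[-1].rotation, "degrees")
```

Because the output is just a list of placements, the same structure could be fed to the AR visualization (instancing an arbitrary block geometry at each placement) or translated into robot targets for fabrication.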

Contributions

This work has three main contributions:

  1. From a theoretical standpoint, the development of a model that dissects digital design-to-fabrication workflows in terms of bodily involvement and the level of abstraction of their representations; and the development of a formal representation of design-to-fabrication processes in terms of the different representations involved in them.
  2. At a practical level, the development of an augmented-reality interface for 3d modeling using (relatively) low-level abstractions.
  3. At a technical level, the development of a pipeline connecting Rhinoceros/Grasshopper with Unity with the goal of 3d modeling using AR.

Conclusion and Future Work

TangiBlocks is an embodied augmented-reality 3d modeling system that makes use of low-abstraction representations for human-computer interaction. A formal model has been developed to understand the space TangiBlocks occupies in comparison to other digital design-to-fabrication workflows. Future work will focus on the concurrent fabrication aspect of the project, which was left aside in this first implementation due to time and resource constraints. Further refinement of the theoretical model is also required to fully understand the possibilities and shortcomings of embodied, low-abstraction design-to-fabrication workflows.

Acknowledgements

I would like to thank Yujie Hong for her feedback on some of these ideas; Pattie, Judith, Xin, Arnav, and Oscar for their feedback and inspiration; and Junichiro Horikawa for his GH-to-Unity interface and his help in troubleshooting it.

Citations/Bibliography

Ardiny, Hadi, Stefan John Witwicki, and Francesco Mondada. 2015. “Are Autonomous Mobile Robots Able to Take Over Construction? A Review.” International Journal of Robotics 4 (3): 10–21. http://ijr.kntu.ac.ir/article_13385_ba582e6a1d4952a0510eada09b2c65a3.pdf.

Dierichs, Karola, and Achim Menges. 2012. “Functionally Graded Aggregate Structures: Digital Additive Manufacturing With Designed Granulates.” ACADIA 12: Synthetic Digital Ecologies, no. 2011: 295–304.

Gu, N., S. Watanabe, H. Erhan, M. Hank Haeusler, and W. Huang. 2014. “Augmented Solid Modeller Using Boolean Operations with Tangible Objects,” 117–26.

Helm, Volker, Selen Ercan, Fabio Gramazio, and Matthias Kohler. 2012. “In-Situ Robotic Construction: Extending the Digital Fabrication Chain in Architecture.” ACADIA 12: Synthetic Digital Ecologies [Proceedings of the 32nd Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA)], 169–76.

Lau, M., M. Hirose, A. Ohgawara, J. Mitani, and T. Igarashi. 2012. “Situated Modeling: A Shape-Stamping Interface with Tangible Primitives.” Proceedings of the 6th International Conference on Tangible, Embedded and Embodied Interaction, TEI 2012, 275–82. doi:10.1145/2148131.2148190.

Payne, Andrew. 2011. “A Five-Axis Robotic Motion Controller for Designers.” Proceedings of ACADIA 2011, 162–70.

Peng, Huaishu, Amit Zoran, and François V Guimbretière. 2015. “D-Coil: A Hands-on Approach to Digital 3D Models Design.” Proceedings of the ACM CHI’15 Conference on Human Factors in Computing Systems 1: 1807–15. doi:10.1145/2702123.2702381.

Pritschow, G., M. Dalacker, J. Kurz, and M. Gaenssle. 1996. “Technological Aspects in the Development of a Mobile Bricklaying Robot.” Automation in Construction 5 (1): 3–13. doi:10.1016/0926-5805(95)00015-1.

Rivers, Alec, Ilan E. Moyer, and Frédo Durand. 2012. “Position-Correcting Tools for 2D Digital Fabrication.” ACM Transactions on Graphics 31 (4): 1–7. doi:10.1145/2185520.2335439.

Treadaway, C. P. 2009. “Hand E-Craft: An Investigation into Hand Use in Digital Creative Practice.” Creativity and Cognition, 185–94. doi:10.1145/1640233.1640263.

Vandoren, Peter, Luc Claesen, Tom Van Laerhoven, Johannes Taelman, Chris Raymaekers, et al. 2009. “FluidPaint: An Interactive Digital Painting System Using Real Wet Brushes.” Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces, 53–56. doi:10.1145/1731903.1731914.

Vandoren, Peter, Tom Van Laerhoven, Luc Claesen, Johannes Taelman, Chris Raymaekers, and Frank Van Reeth. 2008. “IntuPaint: Bridging the Gap between Physical and Digital Painting.” 2008 IEEE International Workshop on Horizontal Interactive Human Computer System, TABLETOP 2008, 65–72. doi:10.1109/TABLETOP.2008.4660185.

Vasey, L., T. Grossman, H. Kerrick, and D. Nagy. 2016. “The Hive: A Human and Robot Collaborative Building Process.” SIGGRAPH 2016 – ACM SIGGRAPH 2016 Talks, 1–2. doi:10.1145/2897839.2927404.

Willis, K. D. D., C. Xu, and K. J. Wu. 2011. “Interactive Fabrication: New Interfaces for Digital Fabrication.” Proceedings of the Fifth …, 69–72. doi:10.1145/1935701.1935716.