EUGENIE++: Exploring Tangible & Gestural Interaction Techniques

From Minority Report, a fictional (but very awesome) depiction of a gesture-based interaction system.

How are you interacting with this webpage right now? Chances are, you navigated to this page by typing on a keyboard or clicking with a mouse; you may have used a touch screen, a simple example of tangible interaction. Generally, the range of computer interaction methods available to you isn’t very broad: you can use touch or a mouse, but both boil down to selecting and “clicking”.

The Eugenie team’s goal is to expand and build upon these methods by designing, testing, and evaluating new interaction techniques: ways of inputting and manipulating data that extend beyond ubiquitous mouse- or touch-based systems.

Google Glass, a real system that combines tangible, gestural, and audio input. (Our lab works with these, too!)

This summer, we focused on exploring new interaction techniques using active tangible tokens: physical objects that can sense and react to changes in their environment and to the ways in which they are manipulated. A mouse, for example, while tangible (i.e. you can hold it in your hand), is not active: it doesn’t change in any way based on how you use it. For our active tokens, we used Sifteo cubes: small micro-computer blocks with screens and sensors that can detect their orientation, rate of acceleration, and proximity to one another.
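To make “active” concrete, you can think of each token as a small bundle of sensors plus a display. Here is a minimal C# sketch of the kinds of events such a token raises; the names are illustrative, not the Sifteo SDK’s actual API:

```csharp
using System;

// An illustrative sketch of an active token's capabilities; these names
// are ours, not the Sifteo SDK's actual API.
public enum TiltDirection { None, Left, Right, Forward, Back }
public enum Side { Top, Bottom, Left, Right }

public interface IActiveToken
{
    // Raised when the cube is tilted in some direction.
    event Action<TiltDirection> Tilted;

    // Raised when another cube is placed against one of this cube's sides.
    event Action<IActiveToken, Side> NeighbourAdded;

    // Raised when the cube's screen is pressed.
    event Action Pressed;

    // Show an image (e.g. a biological part's glyph) on the screen.
    void Display(string imageName);
}
```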

Sifteo cubes, our active tangible tokens.

In addition to the Sifteo active tokens, we also 3D printed passive tokens to act as constraints. The 3D printed blocks served as casings for our active tokens; we designed them to hold the Sifteo cubes without obscuring the screens, and to implicitly convey information about how the tokens should be used.

To develop and test interaction techniques involving tokens and constraints, we decided to build upon a design tool that we developed last year for the synthetic biology competition, iGEM. The application, Eugenie, which won a Gold Medal at iGEM 2013, is (put simply) a multi-touch application that allows synthetic biologists to (1) explore and search for biological parts from multiple databases, (2) specify the structure of biological constructs, (3) specify behaviour between biological parts and/or constructs, and (4) view and prune results. To support collaboration, we built the application for the Microsoft SUR40 (using C# and the Surface SDK 2.0).
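This post doesn’t go into Eugenie’s internals, but a hypothetical C# sketch of the kind of data those four phases manipulate might look like this (the names are ours, not Eugenie’s real classes):

```csharp
using System.Collections.Generic;

// A hypothetical sketch of the data each phase manipulates; Eugenie's
// real classes are not shown in this post.
public class Part                  // phase 1: explored and searched from databases
{
    public string Name;            // e.g. "pTet"
    public string Category;        // e.g. "promoter"
}

public class Construct             // phase 2: the structure of a construct
{
    public List<Part> Parts = new List<Part>();   // ordered arrangement of parts
}

public class Rule                  // phase 3: behaviour between parts/constructs
{
    public string Operator;        // e.g. "NOT"
    public object[] Operands;      // the parts or constructs the rule relates
}

// Phase 4 (viewing and pruning) operates over the candidate constructs
// that satisfy the accumulated rules.
```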

Below is a short video of Eugenie:

We added interaction with Sifteo cubes and 3D printed parts to Eugenie to create a second version: Eugenie++. The tangible interface replaces the first (exploration) and second (specification of structure) phases of the Eugenie application.

In the exploration phase, users interact solely with the Sifteo cubes. Our Sifteo application contains a small database of common biological parts. By tilting the cubes, users may scroll through a given category of parts. Neighbouring the cubes vertically allows users to view more specific categories (e.g. neighbouring cube B under cube A, which displays “promoters”, would load a subcategory of promoters in cube B). Pressing the screen locks a part to the cube: until the cube is pressed again, it is associated only with the displayed biological part.
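Putting that together, here is a rough C# sketch of the exploration-phase logic, reusing the hypothetical IActiveToken and Part types from above; the real Sifteo application differs in its details:

```csharp
using System.Collections.Generic;

// A sketch of the exploration-phase behaviour; not real Sifteo SDK calls.
public class ExplorationCube
{
    private List<Part> category;   // parts in the category currently loaded
    private int index;             // index of the part being displayed
    private bool locked;           // a locked cube ignores tilting

    public ExplorationCube(IActiveToken token, List<Part> initialCategory)
    {
        category = initialCategory;

        // Tilting scrolls through the current category of parts.
        token.Tilted += direction =>
        {
            if (locked) return;
            if (direction == TiltDirection.Right)
                index = (index + 1) % category.Count;
            else if (direction == TiltDirection.Left)
                index = (index + category.Count - 1) % category.Count;
            token.Display(category[index].Name);
        };

        // Pressing toggles the lock: until pressed again, the cube stays
        // associated with the part currently shown.
        token.Pressed += () => { locked = !locked; };
    }

    // Called when this cube is neighboured under another cube, which
    // loads a subcategory of whatever the upper cube displays.
    public void LoadSubcategory(List<Part> subcategory)
    {
        category = subcategory;
        index = 0;
    }
}
```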

In the specification phase, users combine the Sifteo cubes with the 3D printed blocks to specify structure. For example, to create a “NOT X” relationship, the user combines a cube locked to part X with a NOT constraint block. The constraints are designed to implicitly limit user error: the blocks have puzzle-piece-like extrusions and indents that only fit together in certain ways.
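As a rough illustration, a constraint block paired with a locked cube maps naturally onto a rule; the sketch below reuses the hypothetical Rule and Part classes from above, not Eugenie++’s real types:

```csharp
// A hypothetical mapping from a physical pairing to a rule.
public static class ConstraintPairing
{
    public static Rule Combine(string constraintName, Part lockedPart)
    {
        // e.g. Combine("NOT", partX) represents a "NOT X" relationship.
        return new Rule
        {
            Operator = constraintName,
            Operands = new object[] { lockedPart }
        };
    }
}
```

Because the puzzle-piece geometry rejects invalid pairings physically, the software only ever has to interpret well-formed combinations.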

Users interact with tokens and constraints on the Surface bezel.

Once the user finishes defining the structure using the blocks, they then “stamp” the construct onto the Microsoft Surface. The SUR40 uses computer vision and can therefore detect certain shapes, patterns, and objects. The cubes and constraints are tagged with byte tags: patterned images, each of which corresponds to a given object. Eugenie++ uses this information to determine the structure of the object placed on the Surface, and it adds the corresponding rules to the ruleset.
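Conceptually, the stamping step is a lookup from recognised tag values back to physical objects. The sketch below is a simplified stand-in; the actual Surface SDK input types and Eugenie++’s rule-composition logic are not shown in this post:

```csharp
using System.Collections.Generic;

// A simplified stand-in for the stamping step: mapping recognised byte
// tag values back to physical objects and folding them into the ruleset.
public class TagRegistry
{
    // Each byte tag value is registered against the cube or constraint
    // block it is attached to.
    private readonly Dictionary<byte, object> tokensByTag =
        new Dictionary<byte, object>();

    public void Register(byte tagValue, object token)
    {
        tokensByTag[tagValue] = token;
    }

    // When a construct is stamped, recover the physical objects from the
    // recognised tags; composing them into Rule objects (e.g. by their
    // positions on the Surface) is omitted here.
    public void OnStamp(IEnumerable<byte> recognisedTags, List<Rule> ruleset)
    {
        foreach (byte tag in recognisedTags)
        {
            object token;
            if (!tokensByTag.TryGetValue(tag, out token))
                continue;   // ignore unknown tags
            // ... translate the token arrangement into rules and
            //     append them to ruleset.
        }
    }
}
```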

Here’s a video of the Eugenie++ project’s conception and development, from start to finish:

Preliminary user testing suggested that our interface was fairly intuitive: most users inferred the behaviour and meaning of the constraints from their physical characteristics alone. Most users also reported finding the experience fun and informative.

We published a work-in-progress extended abstract and presented a poster of our results at UIST 2014 in October. We are also excited to announce that a peer-reviewed paper on our results was recently accepted to TEI 2015!

Stay on the lookout for future posts about our experience at UIST, as well as TEI in January 2015.