
Bringing Haptic General-Purpose Controls
to Interactive Tabletops

by Malte Weiß

We are living through the transition from traditional desktop computers, with their screen, keyboard, and mouse, to the era of surface computing. Since the introduction of the iPhone five years ago, multi-touch devices have become increasingly ubiquitous in our everyday lives. Multi-touch interaction, which lets users manipulate digital interfaces directly with their fingers, is now a default feature of new smartphones and tablets.

Interactive Tabletops


Fig. 1: Interactive tabletop.

My PhD thesis deals with the next manifestation of touch-based interfaces: interactive tabletops, large horizontal multi-touch surfaces. These devices provide a collaborative setting in which multiple users simultaneously interact with digital content using their fingers or physical objects (Fig. 1). They combine an intuitive way of input with the benefits of collocated collaboration. Over the past two decades, interactive tabletops have gained much interest in the research community, and the first commercial products have been released, such as Microsoft PixelSense or the SMART Table. It is likely that in only a few years interactive tabletops will be part of our everyday environment.

Issue: Lack of Haptic Feedback

Multi-touch interaction is a natural and intuitive way of input. An interactive tabletop requires neither a keyboard nor a mouse: a user triggers a virtual control simply by touching or dragging it. The learning curve for touch interaction is low, which makes these devices approachable for novice users.


Fig. 2: Multi-touch surfaces
lack haptic feedback.

The great benefit of multi-touch surfaces is their flexibility: the software can dynamically change the graphical user interface (GUI) depending on the current context. On the downside, however, touch surfaces provide only very limited haptic feedback. When interacting with a touch screen, users face a planar surface that offers no haptic features. They cannot feel the interface. Virtual buttons do not provide the "click" sensation of a physical keyboard's keys. A virtual slider cannot guide the user's motion like its physical counterpart (Fig. 2).

Since interactive surfaces do not address the sense of touch, they require visual focus. For example, typing on a touch screen without looking leads to typing errors, as users' hands tend to drift over time. Yet when users visually focus on input controls instead of the data they manipulate, they may miss other events. This is one reason why operating touch screens while driving is not allowed in Europe. Due to the size of the finger, touching an on-screen control is also less precise than using a real physical control. Finally, the lack of physical controls significantly disadvantages visually impaired users.

The ultimate goal of my thesis is to bring haptics back to interactive tabletops. More precisely: to provide general-purpose controls that combine the flexibility of graphical user interfaces with the benefits of physical controls.

SLAP Widgets: Translucent Physical General-Purpose Controls


Fig. 3: Prototypes of SLAP Widgets.

Fig. 4: SLAP Knob.

Our SLAP Widgets are the first step towards this goal. SLAP Widgets are physical general-purpose controls, such as knobs, sliders, and keyboards, that can be used to manipulate digital data on interactive tabletops (Fig. 3). Made of acrylic and silicone, they exploit the table's back-projection to change their visual appearance on the fly (Fig. 4).


Fig. 5: SLAP Keyboard.

An example is the SLAP Keyboard (Fig. 5). Let's say a user wants to edit text on the surface. She can just grab a silicone SLAP Keyboard and "slap" it onto the surface. Our table detects the keyboard and immediately shows labels beneath each key. The user can now start typing. The shape of the keyboard provides haptic feedback while its translucency allows dynamic relabeling. For example, a user can change the key layout from German to U.S. at run-time, as sketched below.
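To make the relabeling concrete, here is a minimal Python sketch of how the table might project a layout beneath the detected keyboard. All names (LAYOUTS, draw_label), the layout data, and the staggering offset are hypothetical placeholders, not the actual SLAP implementation:

# Simplified letter rows only; real keyboard layouts differ in more keys.
LAYOUTS = {
    "us": ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"],
    "de": ["QWERTZUIOP", "ASDFGHJKL", "YXCVBNM"],
}

def project_labels(origin, key_size, layout, draw_label):
    """Render one label beneath each key of the detected keyboard.

    origin:   (x, y) of the keyboard's top-left key in table coordinates.
    key_size: width/height of one key in the same coordinates.
    draw_label(char, (x, y)) is assumed to draw via the back-projection.
    """
    ox, oy = origin
    for row_idx, row in enumerate(LAYOUTS[layout]):
        for col_idx, char in enumerate(row):
            # Offset each row slightly, mimicking a staggered key layout.
            x = ox + col_idx * key_size + row_idx * key_size * 0.3
            y = oy + row_idx * key_size
            draw_label(char, (x, y))

Switching the layout from German to U.S. then amounts to calling project_labels again with layout="us"; the physical keys stay the same while the projected labels change.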

SLAP Widgets are passive controls; they do not require any electronics, batteries, or cables. Therefore, they are low-cost, easy to build, and robust. The hardware setup is simple: a projector beneath the surface renders the graphical user interface, while an infrared camera inside the table detects touches and widgets. Each SLAP Widget is mounted on a set of paper-based markers. The specific arrangement of markers communicates a widget's type, position, and state to the camera, and thereby to our software.
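The following Python sketch illustrates how such a marker footprint could be decoded. The footprint table, the distance values, and the assumption that each widget carries exactly three markers are invented for illustration; the actual SLAP tracking differs in detail:

import math

# Hypothetical footprint table: widget type -> sorted pairwise marker
# distances in mm. Three distinct edge lengths keep the orientation
# unambiguous. Values are invented for illustration.
FOOTPRINTS = {
    "knob":     [28.0, 35.0, 45.0],
    "slider":   [25.0, 60.0, 65.0],
    "keyboard": [40.0, 90.0, 98.5],
}

def classify_widget(markers, tolerance=2.0):
    """Match three marker positions from the IR camera against the
    known footprints. Returns (type, position, angle_deg) or None."""
    a, b, c = markers
    dists = sorted(math.dist(p, q) for p, q in [(a, b), (b, c), (a, c)])
    for wtype, ref in FOOTPRINTS.items():
        if all(abs(d - r) <= tolerance for d, r in zip(dists, ref)):
            cx = sum(p[0] for p in markers) / 3.0
            cy = sum(p[1] for p in markers) / 3.0
            # Orientation: angle from the centroid to the marker that
            # lies farthest away from it.
            far = max(markers, key=lambda p: math.dist(p, (cx, cy)))
            angle = math.degrees(math.atan2(far[1] - cy, far[0] - cx))
            return wtype, (cx, cy), angle
    return None

A moving part, such as a slider's handle, can carry an additional marker whose position relative to the base markers encodes the widget's state.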

Our studies have shown that the physical SLAP Widgets can significantly outperform pure on-screen controls in terms of task completion time and accuracy.

Madgets: Actuated Controls for Bidirectional Interaction

While the lack of electronic parts inside SLAP Widgets is a clear benefit, it also limits their flexibility. The software can change the visual appearance of a control but not its physical position and state. This can lead to inconsistencies between a control's physical and visual state. For example, imagine a user employing a physical slider for video navigation. When she starts playback, the slider cannot follow the current video position because SLAP Widgets are passive physical objects. The position of the slider's handle becomes incorrect and misleading.


Fig. 6: Madgets Knob.

Fig. 7: Electromagnetic array.

To ensure bidirectional communication between user and system, we developed Madgets: magnetic widgets. Madgets are SLAP Widgets with permanent magnets attached to their base (Fig. 6). By synthesizing electromagnetic fields, we can apply repelling or attracting forces to every magnet and thereby move a control across the surface or reconfigure its physical state. This is made possible by an array of electromagnets mounted directly beneath the surface (Fig. 7). We can change the power and polarization of every single electromagnet and thus apply arbitrary forces to the control's permanent magnets. You can find a demonstration of electromagnetic actuation in this video.
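As a rough illustration, one actuation step might look like the following Python sketch. The driver call set_coil, the grid dimensions, and the control strategy are assumptions made for the sake of the example; the real Madgets controller computes forces far more carefully:

GRID, PITCH = 12, 20.0  # assumed: 12 x 12 coils, 20 mm apart

def actuate_towards(magnet_pos, target_pos, set_coil):
    """Pull a permanent magnet one step towards target_pos by
    energizing the neighboring coil in the right direction.

    set_coil(row, col, duty, polarity) is a hypothetical driver call:
    duty in [0, 1] sets the coil power, polarity in {-1, 0, 1}.
    """
    mx, my = magnet_pos
    tx, ty = target_pos
    dx, dy = tx - mx, ty - my
    if abs(dx) < PITCH / 2 and abs(dy) < PITCH / 2:
        return  # close enough to the target; nothing to do
    # Index of the coil adjacent to the magnet, towards the target.
    col = int(mx / PITCH) + (1 if dx > 0 else -1 if dx < 0 else 0)
    row = int(my / PITCH) + (1 if dy > 0 else -1 if dy < 0 else 0)
    col = max(0, min(GRID - 1, col))
    row = max(0, min(GRID - 1, row))
    # Polarity opposite to the magnet's bottom pole attracts,
    # i.e., pulls the widget towards that coil.
    set_coil(row, col, duty=1.0, polarity=-1)

In a closed loop, the table's camera would re-detect the widget after every step and call actuate_towards again until the target is reached.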


Fig. 8: Radiobutton Madget.

Fig. 9: Bell Madget.

Madgets allow applications to move, rotate, and reconfigure physical controls on the surface. Concepts known from desktop GUIs, such as load/save and undo/redo, can now be implemented in the world of physical tabletop controls. Since the electromagnetic fields reach beyond the surface, physical objects can also be actuated in the third dimension. Examples are the radio buttons in Fig. 8 and the bell prototype in Fig. 9, which both make use of vertical actuation. Quickly repolarizing the electromagnets lets the permanent magnets vibrate, which provides a useful output channel (see the sketch below). Although Madgets are still passive objects that require neither electronic parts nor batteries, we can transfer electric or mechanical power to them, e.g., via induction (Fig. 10) or via the principle of electric motors (Fig. 11).


Fig. 10: Induction Madget.

Fig. 11: Gear Wheel Madget.
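A vibration pattern could be generated along the following lines; again, set_coil is the same hypothetical driver call as in the actuation sketch, and the default frequency is an arbitrary example value:

import time

def vibrate(set_coil, row, col, frequency_hz=50, duration_s=0.5):
    """Make the permanent magnet above coil (row, col) vibrate by
    flipping the coil's polarity at the given frequency."""
    half_period = 1.0 / (2 * frequency_hz)
    polarity = 1
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        set_coil(row, col, duty=1.0, polarity=polarity)
        polarity = -polarity            # quick repolarization
        time.sleep(half_period)
    set_coil(row, col, duty=0.0, polarity=0)  # switch the coil off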

FingerFlux: Haptic Feedback Above the Surface

Not all interactive surfaces allow the use of physical controls, e.g., small or vertical screens like those we find in cars (Fig. 12). Yet electromagnetism enables haptic feedback without the need for tangible objects. The magnetic fields exerted by our electromagnetic array reach well beyond the surface. We used this effect to develop a novel method for 2.5D haptic feedback: FingerFlux.


Fig. 12: Touch screen in car.

Fig. 13: FingerFlux concept.

FingerFlux combines an electromagnetic array behind a multi-touch screen with a permanent magnet attached to the user's fingertip (Fig. 13). If a user approaches an activated electromagnet with her finger, the magnetic field exerts, depending on the polarization, either an attracting or a repelling force on the permanent magnet. This force is transferred to the user's skin and creates a haptic sensation. The principle is fundamentally different from existing haptic feedback methods, such as vibration motors in mobile devices: first, FingerFlux provides feedback above the surface; second, FingerFlux can apply both attracting and repelling forces.
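The resulting force can be approximated with a crude point model, as in the following sketch. The falloff exponent and the constant k are invented for illustration and are not measured values from our prototype:

def fingertip_force(coil_pos, finger_pos, duty, polarity, k=2.0e-4):
    """Rough estimate of the force (in N) on the fingertip magnet
    above an energized coil. Positive values repel the finger,
    negative values attract it.

    coil_pos:   (x, y) of the coil in meters, at surface height 0.
    finger_pos: (x, y, z) of the fingertip magnet in meters.
    The 1/d**4 falloff mimics two nearby dipoles; k lumps together
    coil current, windings, and magnet strength.
    """
    dx = finger_pos[0] - coil_pos[0]
    dy = finger_pos[1] - coil_pos[1]
    dz = finger_pos[2]                  # height above the surface
    d2 = dx * dx + dy * dy + dz * dz    # squared distance
    if d2 < 1e-6:
        return 0.0                      # avoid blowing up at contact
    return polarity * duty * k / (d2 * d2)   # ~ 1/d**4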


Fig. 14: Vibration feedback above surface.

Fig. 15: Guiding the user.

A straightforward application of FingerFlux is to add a haptic feedback channel to visual touch screen interfaces, enabling users to feel the interface before touching it. For example, using repelling electromagnetic forces, on-screen buttons can be rendered as "haptic bumps". Quickly repolarizing electromagnets creates vibration feedback on the finger, e.g., to warn the user before she presses a critical button (Fig. 14). Furthermore, applying attracting fields from one direction and repelling ones from the opposite direction allows an application to guide the user's hand across the surface (Fig. 15). This is especially useful for visually impaired users. Finally, FingerFlux alleviates the problem of drifting when operating touch screen buttons without looking: our user studies show that attracting electromagnetic fields beneath on-screen buttons significantly reduce drift. A sketch of this idea follows below.
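Here is a minimal sketch of the drift-reduction idea, assuming the finger position is tracked while hovering and reusing the hypothetical set_coil driver and coil pitch from the Madgets sketch; the button table is likewise invented:

import math

PITCH = 20.0  # assumed coil spacing in mm, as in the actuation sketch
BUTTONS = {"play": (60.0, 100.0), "stop": (120.0, 100.0)}  # mm, assumed

def attract_to_nearest_button(finger_xy, set_coil, snap_radius=25.0):
    """If the hovering finger is within snap_radius of a button,
    energize an attracting coil beneath that button so the finger is
    gently pulled back on target instead of drifting away."""
    name, (bx, by) = min(BUTTONS.items(),
                         key=lambda kv: math.dist(finger_xy, kv[1]))
    row, col = int(by / PITCH), int(bx / PITCH)   # coil under the button
    if math.dist(finger_xy, (bx, by)) <= snap_radius:
        set_coil(row, col, duty=1.0, polarity=-1)  # attract the finger
    else:
        set_coil(row, col, duty=0.0, polarity=0)   # release
    return name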

Future Trends

There is an unstoppable trend towards replacing conventional controls with touch interfaces, even in safety-critical systems such as cars. Thanks to advancing technology and falling prices, it is almost certain that interactive tabletops will reach the consumer market within the next five to ten years. Improving the haptics of touch screens is a crucial goal to ensure an interaction that is as efficient as with keyboard, mouse, and other conventional controls.

My thesis contributes to solving the issue of limited haptic feedback. Although my research is based on the design, implementation, and evaluation of prototypes, commercial developments are just around the corner. The concept of SLAP Widgets is also being transferred to mobile devices. For example, the commercially available Touchfire keyboard (released in 2011) is a well-engineered implementation of the SLAP Keyboard concept. In the more distant future, it is imaginable that biohacking, i.e., the implantation of user interfaces into the body, might play a greater role. For example, members of the body modification scene have already experimented with implanted permanent magnets in order to gain a kind of sixth sense. If in-body interfaces became an established technology, FingerFlux would be the next logical component for interactive surfaces.

If you are interested in further details, have a look at my thesis.
