CanHaptics Lab 3: Communicate something with Haply

Task: create a vocabulary of 3 words and communicate them using only haptic feedback

Rúbia Guerra
7 min read · Mar 12, 2021

Goals

In this lab, we aim to explore different ways of communicating a small vocabulary using haptic feedback.

Materials

My code for this assignment can be found here.

Tools

Haply device, Processing Development Environment.

tl;dr

In this exercise, Hannah E. and I worked together to represent the words sticky, bouncy, bumpy, and sandy using different haptic rendering features. We had the chance to try our sketch with four users (thanks, Kattie, Unma, Spencer, and Preeti!). Also, thanks, Hannah, for providing the hand-drawn sketches and videos!

Summary of our vocabulary, rendering features and user reactions

Sketching words

Hannah and I used this lab as a way to test some haptic feelings for our project. After some brainstorming, we came up with 4 words*, which were implemented as 4 modes, each activated by a key press (1–4; a key-handling sketch follows below):

1. Sticky or syrupy
2. Curvy or bouncy
3. Bumpy or spaced
4. Sandy or grainy

*We thought that creating an extra word was a requirement for working together, oops! In the end, it was fun to explore another rendering mode.
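As a rough sketch, the mode switching might look like the snippet below, where setupMode() is a hypothetical helper (not part of our actual code) that clears the world and builds the selected scene:

int mode = 1;

void keyPressed() {
  if (key >= '1' && key <= '4') {
    mode = key - '0'; // map the character '1'-'4' to the integer 1-4
    setupMode(mode);  // hypothetical helper: clears the world, builds the mode
  }
}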

Word #1: Sticky

From our project iteration, we learned that high damping causes our avatar to move slower for the same amount of force applied to the end effector, which resembles the feeling of wading in a dense, viscous liquid. Playing around with this concept, we sketched three regions with high damping:

Sketch for mode 1

Following how the water layer was implemented in Haply’s Maze Physics example, we decided to change the damping of the avatar itself to create the illusion of passing through a sticky region. Our reasoning behind sketching three regions with increasing damping was to recreate the feelings of moving in water (damping = 700), honey (damping = 800), and mud (damping = 950, to signify that mud feels a bit denser than the other two). We arrived at these values by incrementally increasing the damping factor until we perceived a consistent drag when moving the end effector around.

if (s.h_avatar.isTouchingBody(l1)) {
  s.h_avatar.setDamping(700); // water
} else if (s.h_avatar.isTouchingBody(l2)) {
  s.h_avatar.setDamping(800); // honey
} else if (s.h_avatar.isTouchingBody(l3)) {
  s.h_avatar.setDamping(950); // mud
}
Mode 1: feeling “stickiness” through viscous layers
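For context, each sticky layer (l1, l2, l3) can be modeled as a fisica sensor body that the avatar passes through while isTouchingBody() reports contact, as in the Maze Physics water layer. A minimal sketch for one layer, with illustrative dimensions, position, and color:

FBox l1 = new FBox(9, 3);      // layer size in world units (illustrative)
l1.setPosition(12, 8);         // illustrative position
l1.setFill(150, 150, 255, 80); // translucent, water-like color
l1.setSensor(true);            // sensor: detects contact but exerts no contact force
l1.setNoStroke();
l1.setStatic(true);
world.add(l1);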

Word #2: Curvy/bouncy

For our second word, we tried to represent the concepts of curvy or bouncy. Our initial thought was to use guided movement to represent parabolas in space, as if you drop an object and it bounces around until it stops.

Initial concept of a bouncy movement (source)

This idea came up before we tried tuning a PID controller for Lab 4. Given the difficulty of creating a stable guided movement, and not wanting to require our users to tune the controller for their own devices, we decided to explore other alternatives. We pivoted to using solid, static objects that push the end effector in the opposite direction after a collision:

Sketch for mode 2

Each element of our “bouncy” wall consists of an impenetrable circle. We chose this shape because tracing circles should also communicate the feeling of something bouncy and curvy. Since we were using the fisica package, we could model impenetrable objects by creating bodies with setSensor(false):

c1 = new FCircle(10);  // circle diameter in world units
c1.setPosition(2, 12);
c1.setSensor(false);   // solid: collisions push the avatar back
c1.setStatic(true);    // fixed in place
world.add(c1);         // the body must be added to the world to take effect
Mode 2: “bounciness” with large static circles

Word #3: Bumpy/spaced

For our third word, we aimed to simulate a texture using static objects instead of damping and force components. We came up with the notion of something that feels bumpy and spaced, like running your hand across the spaced bars of a fence or over a surface with small bumps.

Inspiration for recreating a bumpy texture with small, static objects (source)

Starting from the previous example, we planned to add several small circles, spaced far enough apart that the user feels a depression between two circles, but close enough together that the avatar can’t slip through between bumps.

Sketch for mode 3

Adding the circles one by one would take a long time, so we decided to create multiple bodies in a loop. We realized this would be possible when Hannah remembered that in Lab 2 she was able to add multiple walls to her maze using the same variable.

for (int x = -10; x < worldWidth + 10; x++) {
  b1 = new FCircle(0.5); // small bump
  b1.setPosition(x, 10); // one bump per world unit along a horizontal row
  b1.setFill(0);
  b1.setDensity(500);
  b1.setSensor(false);   // solid, so the avatar collides with each bump
  b1.setNoStroke();
  b1.setStatic(true);
  b1.setName("mode_3");  // shared name, used later to remove all bumps at once
  world.add(b1);
}

One challenge of this strategy is removing all objects from the screen when switching modes, since the variable “b1” only stores a reference to the last object created. We worked around this issue by implementing a function that removes bodies from the fisica world based on the object’s name.

void removeBodyByName(String bodyName) {
  ArrayList<FBody> bodies = world.getBodies();
  for (FBody b : bodies) {
    // getName() returns null for unnamed bodies, so call equals() on the argument
    if (bodyName.equals(b.getName())) {
      world.remove(b);
    }
  }
}
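Switching away from mode 3 then reduces to a single call:

removeBodyByName("mode_3"); // clears every bump created by the loop above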
Mode 3: textures with small static objects

Word #4: Sandy/grainy

Finally, we wanted to play around with textures. Discovering how to simulate different textures will be useful for our final project, since we plan on incorporating these feelings into our notation elements, such as notes and staves. We wanted to intensify the bumpy feeling we achieved with small forces in our project iteration, to recreate what it would feel like to touch sandpaper.

Sketch for mode 4

At first, we attempted rendering texture using a randomly generated image:

Randomly generated pixel intensities, varying from 0–255 (source)

Our idea was to set forces proportional to the gradient of color intensity as the Haply moves through the region. We tried retrieving the pixel intensity using loadPixels() and pixels[loc], where loc corresponds to the location of the end effector on the screen.
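A minimal sketch of that idea, where the gain k, the running variable lastIntensity, and the end effector’s screen coordinates (eeX, eeY) are illustrative assumptions rather than our actual code:

loadPixels();
int loc = int(eeX) + int(eeY) * width;     // end effector's pixel index (bounds checks omitted)
float intensity = brightness(pixels[loc]); // pixel intensity, 0-255
// force proportional to the change in intensity as the end effector moves
fEE.x = k * (intensity - lastIntensity);
lastIntensity = intensity;

However, we quickly learned that this approach is computationally very costly and slows down the simulation significantly. As a workaround, we experimented with random() to generate the forces applied to the Haply: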

if (s.h_avatar.isTouchingBody(l4)) {
  PVector xDiff = (posEE.copy()).sub(posEELast);
  posEELast.set(posEE);
  // threshold is very low, just to capture movement
  if (xDiff.mag() < threshold) {
    s.h_avatar.setDamping(700); // damping "softens" the random forces
    fEE.x = random(-1, 1);      // small random force components
    fEE.y = random(-1, 1);
  }
}

With this approach, we were able to achieve a dynamic, grainy texture that resembles “static” TV noise. When the forces were too high (e.g. random(-1.5, 1.5)), my whole desk would vibrate. In contrast, when the range of forces was too low (e.g. random(-0.5, 0.5)), the texture was too subtle to be perceived. At random(-1, 1), the forces are strong enough to create the texture we aimed for, but not so strong that the experience becomes unpleasant. The damping factor also plays an important role, slowing down the movement and “softening” the effect of the force feedback.

In my experience, the use of random forces and damping creates the illusion of physically rendering “static” noise
Mode 4: texture rendering with damping and force feedback

Evaluation

We were able to test our words with four users (one became available after Hannah had finished her blog post). Three of them have experience with haptics through the CanHaptics course or their own research, while the other (eval. #3) is a novice who has never had formal haptics training. Aside from explaining how to switch between modes, we gave our users complete freedom to explore the space. Comparing our evaluators’ responses, we were pleased that our haptic rendering choices at least partially conveyed the words we first proposed. After talking to our participants, the results indicate that we were likely most successful in communicating modes 1 and 4, while modes 2 and 3 elicited different feelings among the evaluators.

The only visual cues present on our interface were instructions on how to switch among the modes and an indicator of the current mode. In the evaluation table, “chipper” corresponds to “wood chipper”.

Reflections

When Lab 3 was originally due in mid-February, Hannah and I spent a few hours on a Zoom call sketching possible ideas. At that point, we came up with the words bouncy (using guided movement), star (using a star-shaped static object), and heavy (our plan was to explore the concept of thrust by pushing an object into a dense liquid). After defining our vocabulary, we realized that we had little idea of how to implement any of the words, except for star. Even then, creating the static star-shaped FPoly object took hours.

Fast-forwarding to the current version of Lab “3”, we decided to (almost) start over. We reframed our old ideas into concepts that could also work as haptic experiences for our music notation viewer project. We were able to pivot during our coding sessions much more quickly than before, since we now have a better idea of how the mechanisms in the fisica package and the Haply API work. Also, the lectures on different haptic rendering strategies were quite helpful in giving us an intuition for what can be replicated with grounded force feedback.

One thing I missed throughout our labs and project iterations is more complete documentation of the Haply API, particularly examples of haptic rendering. Although we are starting to understand how to render different physical experiences with the Haply (such as textures), it would have been extremely helpful to see other implementations and understand how they work. In this sense, I hope the content we are generating through our lab blogs can serve as a stepping stone for future adventurers in the haptic world.
