CanHaptics Team Crescendo Project Iteration 1: Multimodal Music Notation

My team and I are creating a music notation viewer with haptic, visual, and auditory feedback. This is my take on what happened in our first development iteration.

Rúbia Guerra
Mar 8, 2021

There are three main components that go into building our multimodal music notation viewer: the visuals, the sounds, and the haptic interactions using a Haply device. Since we are using Processing as a development environment for the Haply, and it doesn’t natively offer a music notation viewer, we split our team efforts into crafting audio-visual and haptic components. While Juliette R. and Sabrina K. tackled a way to neatly display scores on the screen and make them sound good, Hannah E. and I ventured into sketching possible haptic feelings for our music elements.

Initial sketch of the system

Goals

In this iteration, we wanted to tackle the base notation elements: staves and notes. We also intended to start looking at how the users will move through the score.

  • Staff: what should the staff feel like, if anything at all?
Staves
  • Notes: what should notes feel like? How far apart should they be? And how many notes can we display at once?
Notes
  • Guided movement: should the user be guided through the notes on the screen? If so, what should that movement feel like?
Initial sketch of how the user could interact with the notation viewer. This idea is based on using force feedback to guide the user vertically through the notes, while the user has freedom to move horizontally through the screen

Materials

Haply device, sketching tools (for me, pen and paper)

Tools

Processing Development Environment, fisica package.

tl;dr

Hannah and I worked together on Zoom for over 10 hours on haptic sketches. My direct contribution was sketching and implementing note and staff feelings (including using force feedback), moving the notes on the screen, and a (failed) attempt at enabling consistent guided movement using the Haply. Hannah worked on the mockup GUI, note implementation, note movement, and spacing the elements on the screen. Our code can be found here.

Goal 1: feeling the staves

Our main idea for the staff was to achieve something that felt like subtle bumps, just to nudge the user into knowing that they are on top of a bar or that they just passed over one. As Hannah nicely described it: we were not trying to make them feel like strumming a guitar, but rather a subtle, small groove. We were aiming for feelings of bumpiness or stickiness.

We started by playing around with Haply’s Maze Physics example in an attempt to explore how the viscous layer was implemented.

Base code: Haply’s Maze Physics example. We were interested in the viscous layer, represented in purple/blue color at the bottom part of the image.

Working remotely and not being able to experience things on the same device meant that we had to rely on visuals and a common vocabulary to describe what we were feeling. After playing around with our base code, we settled on trying two parameters related to the viscous layer: setDensity() and setDamping().

/* Set viscous layer */
l1 = new FBox(27, 4);
l1.setPosition(24.5/2, 8.5);
l1.setFill(150, 150, 255, 80);
l1.setDensity(100); // parameter of interest
l1.setSensor(true);
l1.setNoStroke();
l1.setStatic(true);
l1.setName("Water");
world.add(l1);

...

/* Viscous layer codes */
if (s.h_avatar.isTouchingBody(l1)) {
  s.h_avatar.setDamping(400); // parameter of interest
} else {
  s.h_avatar.setDamping(4);
}

The first words we tried to achieve were: mud, honey, water, sticky, sluggish, drag. Like mud and water, we wanted to achieve the feeling of moving through layers with noticeably distinct densities. We started by creating a few layers with different densities and making small changes to setDensity (100, 150, 200, …), but soon realized we weren’t feeling a significant difference. We cranked the values up to 700~900 and were able to feel a very slight difference, although still not sticky enough.

At this point, we turned to setDamping and repeated the same process. This parameter controls how much the translational movement of our avatar is dampened when moving inside a layer. Again, we were only able to achieve the words we had proposed when setDamping ranged from 700~900. At 700, it felt close to waving your hand around underwater, and at 900~950, it felt like submerging your hand in a puddle of thick mud and moving it around.

Now that we had a noticeable feeling, Hannah resized our layers and stacked them as thin lines, to represent how staves look in a score.

First mockup of our staves with haptic feedback
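For reference, this is roughly how such a set of stave lines can be built with fisica: five thin sensor boxes stacked vertically, so the avatar can pass through them while the damping switch shown above kicks in. In our actual sketch the lines are individual bodies (l1 through l5); the loop, dimensions, and spacing below are only an illustrative sketch, not our final values.

/* Staff: five thin sensor lines (dimensions and spacing are illustrative) */
FBox[] staffLines = new FBox[5];
float staffTop = 6.0;     // y of the top line, in world units
float lineSpacing = 1.0;  // vertical gap between lines

for (int i = 0; i < 5; i++) {
  staffLines[i] = new FBox(27, 0.2);                            // long, thin bar
  staffLines[i].setPosition(24.5/2, staffTop + i*lineSpacing);
  staffLines[i].setFill(0, 0, 0);
  staffLines[i].setSensor(true);   // the avatar passes through; we only change damping
  staffLines[i].setStatic(true);   // lines stay in place
  staffLines[i].setNoStroke();
  world.add(staffLines[i]);
}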

Goal 2: feeling the notes

To get a feeling for the notes, we started brainstorming words to describe the feelings we wanted to achieve. Our word list included: drag, sluggish, and obstacle.

In drag or sluggish, the idea was to amplify what we feel for the staves so that the user has a clear sense of entering the note. As with the staves, this was accomplished by playing around with setDensity and setDamping. We also ended up with values between 700~900 for our notes, represented as circles slightly bigger than our avatar.

Mockup for notes that feel “sluggish”
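A minimal sketch of one such note, assuming the same sensor-plus-damping pattern we used for the staves (the size, position, and damping value below are placeholders within the ranges mentioned above):

/* A "sluggish" note: a sensor circle slightly bigger than the avatar */
FCircle c1 = new FCircle(1.5);  // diameter, placeholder value
c1.setPosition(10, 8);          // placeholder position on the staff
c1.setFill(0, 0, 0);
c1.setSensor(true);             // no collision: the avatar can enter the note
c1.setStatic(true);
world.add(c1);

/* In the haptics loop: slow the avatar down while it is inside the note */
if (s.h_avatar.isTouchingBody(c1)) {
  s.h_avatar.setDamping(800);   // somewhere in the 700~900 range we settled on
} else {
  s.h_avatar.setDamping(4);
}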

In obstacle, we wanted to convey the feeling of “hitting” a note. We thought making an impenetrable core would give a sense that the user is hitting something, and help them understand where the note is in relation to the visual space. However, we quickly learned that by having an obstacle in the center, we forced the Haply to move around the note, without ever being able to fully reach it. Since the end effector is pushed away from the center, either upwards or downwards, this could add extra noise when a user is trying to understand the positioning of a note in relation to a bar or a groove. We decided to abandon this idea.

Mockup for notes that feel like an “obstacle”, or “impenetrable”
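For completeness, the discarded obstacle variant differs mainly in the sensor flag: with setSensor(false) the physics engine resolves collisions, so the end effector gets deflected around the note instead of passing through it. This is again a hedged sketch with placeholder values, not our repository code:

/* An "obstacle" note: a solid core the avatar collides with */
FCircle obstacle = new FCircle(1.5);  // placeholder size
obstacle.setPosition(14, 8);          // placeholder position
obstacle.setFill(0, 0, 0);
obstacle.setSensor(false);            // collisions on: the end effector is pushed around the note
obstacle.setStatic(true);             // the note itself is not knocked away
world.add(obstacle);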

Putting things together

After some discussion, we agreed that feeling both the staff and the notes at all times might be overwhelming. To give the user some freedom, we added a switch for toggling the staff feedback on and off:

if (toggle_lines && (s.h_avatar.isTouchingBody(l1) || s.h_avatar.isTouchingBody(l2) || s.h_avatar.isTouchingBody(l3)
    || s.h_avatar.isTouchingBody(l4) || s.h_avatar.isTouchingBody(l5))) {
  s.h_avatar.setDamping(700);
} else if (!toggle_lines) {
  s.h_avatar.setDamping(4);
}
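The toggle_lines flag itself is just a boolean flipped by the user. The key binding below is an assumption for illustration; the actual switch in our code may be wired differently (for example through the GUI):

/* Toggle staff feedback from the keyboard (key choice is illustrative) */
boolean toggle_lines = true;

void keyPressed() {
  if (key == 't' || key == 'T') {
    toggle_lines = !toggle_lines;
  }
}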

We also started thinking about how it would feel if the notes were moving on the screen. To that end, we implemented an option to make notes move in a constant horizontal translation:

if (moving) {
  x = c1.getX(); // c1 corresponds to note 1, c2 to note 2, etc.
  y = c1.getY();
  c1.setPosition(x - 0.005, y);

  x = c2.getX();
  y = c2.getY();
  c2.setPosition(x - 0.005, y);
}
Example of notes moving horizontally

At this point, we had the bare bones of feeling notes and staves, but we were far from satisfied with what we had achieved. We realized that our model had two major flaws:

  1. The feeling was inconsistent depending on the direction of movement towards the note. Sometimes we didn’t feel anything when reaching a note or a staff from certain angles — which also varied inconsistently between the two of us.
  2. Depending on where they were placed on the screen, some notes produced no feeling. This again varied inconsistently between Hannah and me.

I played around with many other parameters trying to solve this problem:

  • setFriction: what if instead of damping, we could give the notes a certain “texture”? I tried using friction at 0, 10, 100, 1000 on the note objects but didn’t feel any difference. I played around for a while until we learned that in fisica, the friction parameter only works for non-static bodies, since setFriction actually means “loss of velocity at friction”.
  • setStatic: in an attempt to make friction work, I tried to set our notes as non-static objects. This quickly failed since we weren’t able to keep them in a fixed spot.
  • setSensor: I also tried enabling collisions (setSensor(false)) to make friction work. However, by enabling collisions, we wouldn’t be able to move the end effector across the note, only around it. For the same reason that the obstacle implementation was discarded, we also decided to keep collisions disabled.

Playing with forces

What if instead of relying only on the physical properties of the elements (density and damping), we could nudge the end effector every time it passes over a key element? I remembered trying Juliette’s code for Lab 3, and in one of her communication modes, she implemented something that for me felt constrained and checkered, like small square-shaped grooves that hold the end effector in place, but that are shallow enough that you can jump from groove to groove.

I went back to her code and tried to understand how that was done. My initial idea was to make notes feel like small circular bumps by pushing the end-effector away from the note:

if (s.h_avatar.isTouchingBody(c1)) {
  PVector loc = new PVector(c1.getX(), c1.getY()); // loc: location of c1
  PVector xDiff = (posEE.copy()).sub(loc);
  if (xDiff.mag() < threshold) {
    force.add(xDiff.mult(-1/F)); // F: scaling factor for the nudge strength
  }
}

I wanted the force to be dynamic, in the sense that it would always lightly nudge the end effector in the direction opposite to the movement. I also wanted the force to be applied only if the user was within the visual space of a note. This implementation had a few problems:

  1. The Haply’s encoder position considers the center of the screen as (x=0, y=0), while the rest of the elements in our simulation have the top left corner as (x=0, y=0). This means that points outside of the center of the screen often would not satisfy our threshold, and again the feeling among different notes became inconsistent. To solve this problem, I shifted the encoder’s relative center when calculating the distance between the end effector and a note (see the helper sketch after this list): xDiff = new PVector(posEE.x - c1.getX() + worldWidth/2, posEE.y - c1.getY() + worldHeight/2);
  2. The encoder reading is inconsistent enough among Haply devices that Hannah and I had different experiences using a threshold of the same magnitude. Hannah was able to feel notes at positions that I wasn’t, and vice-versa.
  3. Related to the last point: having the magnitude of the force depend on the encoder reading also resulted in inconsistent experiences among the notes.
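To keep that coordinate shift in one place, a small helper like the one below (my own sketch here, not code from our repository) converts the offset between the end effector and a fisica body into a single vector before measuring the distance; worldWidth and worldHeight are the world dimensions already used in our sketch.

/* Offset from the end effector to a body, accounting for the different
 * origins: posEE is centered on the screen, while fisica bodies use the
 * top left corner as (0, 0). */
PVector eeToBody(FBody b) {
  return new PVector(posEE.x - b.getX() + worldWidth/2,
                     posEE.y - b.getY() + worldHeight/2);
}

// usage: if (eeToBody(c1).mag() < threshold) { /* apply the nudge */ }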

One easy way to create a stable and reliable feel across the notes was to apply a “static” force to the end effector. I chose to make the forces point towards the top left corner of the screen, since I believe the movement in our notation viewer will mostly occur in the opposite direction, or at least from left to right. The magnitude of the force still needs to be adjusted, as the “sweet spot” that makes notes feel bumpy enough but not too strong seemed to vary between Hannah and me. I also coupled the force model with translational damping to slow down the movement inside a note and highlight the feeling of stickiness.

if (s.h_avatar.isTouchingBody(n1)) {
  s.h_avatar.setDamping(900); // slow the movement down inside the note
  PVector xDiff = new PVector(posEE.x - n1.getX() + worldWidth/2,
                              posEE.y - n1.getY() + worldHeight/2);
  if (xDiff.mag() < threshold) {
    // constant nudge towards the top left corner of the screen
    fEE.x = -1;
    fEE.y = -1;
  }
} else {
  s.h_avatar.setDamping(4);
}

In parallel, Hannah was fixing our GUI mockup. We obtained our final mockup for iteration 1 by joining both the visual and the interaction parts.

Final mockup for Project Iteration 1

Goal 3: guided movement

Since both Hannah and I spent the majority of our time attempting to create a pleasant experience with the notes and the staff, the amount of time devoted to this goal was compromised. I tried to make some progress by playing around with my code from Lab 4, but I have yet to find a combination of PID values that reliably enables stable and smooth movement between notes. Moreover, after a lengthy discussion with my team, we realized some of us had different ideas of how guided movement should be incorporated as a feature in our final notation viewer. For this reason, we decided to take a step back and brainstorm ideas in this direction that can be implemented during the next iteration.
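For the next iteration, the rough shape of what I have been experimenting with is a PD-style force that pulls the end effector toward the next note. The sketch below only illustrates the idea: the gains are placeholders and are exactly the part I have not managed to tune yet, the guideToward helper is hypothetical, and the coordinate shift is the same one discussed above.

/* PD-style guidance toward a target note (gains are untuned placeholders) */
float kP = 0.05, kD = 0.01;
PVector lastError = new PVector(0, 0);
float dt = 1.0/1000.0;               // assumed haptics loop period

void guideToward(FCircle note) {
  PVector target = new PVector(note.getX(), note.getY());    // note, in world coordinates
  PVector eeWorld = new PVector(posEE.x + worldWidth/2,
                                posEE.y + worldHeight/2);     // end effector, same frame
  PVector error = PVector.sub(target, eeWorld);
  PVector dError = PVector.sub(error, lastError).div(dt);
  fEE.set(PVector.add(PVector.mult(error, kP), PVector.mult(dError, kD)));
  lastError.set(error);
}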

Reflection

For me, this project iteration felt like bringing together our experiences from all of the course labs so far. Lab 1 was useful for coming up with quick mental sketches that represent haptic feelings (a bucket of water, a pool of mud). Lab 2 helped us understand the virtual space and play around with the positioning of bars and notes relatively painlessly. Lab 3 was very helpful in having us think of nouns and adjectives that describe the feeling we are trying to convey. Finally, Lab 4 was a good starting point for understanding how to implement guided movement, although we were not able to advance much on this front.

We (or at least I) envisioned tackling guided movement and trying out other notation components within this iteration. I didn’t anticipate how long we would spend trying to craft the feeling of bars and notes, and how many things we would have to try until we achieved something that felt good. Though we achieved less than I expected in quantity, I am satisfied that we were able to create a pleasant experience, and it was rewarding to receive positive feedback from our other team members, Juliette and Sabrina.

Moving forward, some questions come to my mind:

  • Using guided movement, should the Haply guide the user horizontally, as well as vertically? Or should the notes move towards the user, and the Haply only guides the user upwards and downwards according to the changes in pitch?
  • Should we represent the staff feeling even in regions where the staff isn’t visible? Would that help the user discern more quickly between notes that are too low or too high?
  • How should different types of notes feel?
