Team Crecendo: Iteration 2

Team Crecendo is creating a music notation system to help build intuition about pitch through haptic, auditory, and visual feedback. Here is what my team and I accomplished in iteration 2.

Hannah Elbaggari
13 min read · Mar 29, 2021

Team goals

After iteration 1, we aimed to bring our design from segmented prototyped parts into one more cohesive sketch. We aimed to implement Twinkle, Twinkle, Little Star with note textures and experimented with approaches for nudging the user toward the correct notes. For this iteration we focused on the overall feel of the system's features and split the team by time zone: Rubia and I aimed to implement texture and distinguishable note durations through haptic feedback.

My contributions were:

  • Prototype of pen end effector to help communicate height
  • Iterations on textures/forces for notes

Rubia and I worked together via Zoom to test and discuss our approaches to textures. We used the Lab 3 prompt to help brainstorm how we wanted the notes to feel: bumpy, sandy, or like a dip. Source code.

End effector

I conducted a small user study with one person who did not have any experience with haptics but was knowledgeable about music notation. This informal evaluation revealed that the end effector may be getting in the way of communicating pitch through height changes. With the Haply-provided end effector, the fingers grip the end effector while the wrist rests on the table, similar to using a computer mouse. Because these are finer motor movements, height changes from one note to the next are more difficult to perceive. The participant said they would prefer something that moves the whole arm or wrist, since these larger movements engage more of the body and make height changes easier to notice.

I aimed to prototype a joystick- or pen-style end effector. This approach maps well to the conceptual model of tracing over sheet music with a pencil to gain intuition about how to read the song.

Prototype 1:

I took the Haply end effector off and inspected the way it connected to the arms. I found a dime sitting on my desk, and it fit magnetically over the end effector mount well. To test whether this would work for a pen or joystick, I hot-glued on the plastic piece of an eraser holder (shown below, left). This was appropriate for an initial sketch, but this end effector was not very compatible with force feedback since it would slip off the arms if pulled too hard. I then brainstormed ways I could 1) keep the end effector in place while pulling and pushing against forces and 2) better replicate the pen-tracing action through a joint at the end of the end effector.

Prototype 2:

I made a hole in a small plastic tube to fit the ball of the Haply end effector through. This allowed for movement against forces without sliding off, as well as more dynamic wrist and arm movements while interacting with the system (shown below, right). This echoed the cognitive model of pen and paper better, with more fluid, natural movements. After sending photos of both prototypes to my teammates, I realized I would need a way for them to replicate the end effector as well. Since I had used one-off, hard-to-find materials, I aimed to build an easily replicable version so we could share the experience while designing remotely.

Left: Prototype 1 of a pen end effector. Eraser holder hot glued to a dime. Right: Prototype 2 of pen end effector. Eraser holder inside plastic tube with original Haply end effector ball connected to the end.

Prototype 3:

Prototype 3 took replicability into account, since for the prototype to work well my teammates would need access to the modified end effector. I experimented with cutting off the end of a very common, inexpensive marker (shown below). I wrote up instructions and a materials list and shared them with my teammates. The movement can be seen in the videos in the texture section. Overall, the pen end effector did not allow the wrist to rest on the table, thus requiring more wrist and arm movement. We hope this communicates height better than the finer finger movements of the Haply end effector.

In a following meeting we discussed how the pen end effector actually feels compared to the Haply one. While the forces felt slightly dampened with the pen end effector, after adjusting the forces in the code a bit it felt remarkably similar to the Haply end effector. This is based on my own evaluation of the feeling, so I am excited to see whether my teammates can construct the prototype and give more input about the overall feel.
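One way such an adjustment can look is a small gain applied to the forces before they are sent to the device. The sketch below is purely illustrative; the PEN_GAIN constant and adjustForPen() helper are hypothetical, not the exact change we made.

// Hypothetical compensation for the slightly dampened feel of the pen end effector.
// The constant and helper are illustrative only, not from our codebase.
final float PEN_GAIN = 1.2; // value tuned by feel

PVector adjustForPen(PVector rawForce) {
  return rawForce.copy().mult(PEN_GAIN);
}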

Prototype 3 end effector. Materials are a marker with top cut off and the original Haply end effector ball connected to the end.

Texture

In our previous iteration, Rubia and I experimented with different force feelings for the notes and staff lines. The problem with this approach was that while the haptic effects communicated the locations of the notes and staff lines, the forces were distracting and created a sort of avoidance illusion when a note was hit (shown in the video below). This general consensus was shared by all team members, so we aimed to compare it against a more subtle, textured-note approach.

Sandy

From our Lab 3, Rubia and I experimented with random() to simulate a grainy, sandy texture (shown below). This served as our first implementation and attempt at texture within the system:

PVector force(PVector posEE, PVector velEE) {
  // Vector from the note's physics position to the end effector
  PVector posDiff = (posEE.copy().sub(getPhysicsPosition()));
  final float threshold = 0.005;

  // Outside the note: no force
  if (posDiff.mag() > threshold) {
    return new PVector(0, 0);
  } else if (this.state == NoteState.NOT_PLAYING) {
    this.state = NoteState.START_PLAYING;
  }

  // Inside the note: random force in each direction simulates the grainy, sandy texture
  return new PVector(random(-1.5, 1.5), random(-1.5, 1.5));
}

As you can hear in the video, the sound of the motors helps accentuate the texture. The video has only the note haptics turned on, with no system sound or staff line forces yet; we isolated the notes to focus first on how they feel individually. The texture came through in the overall feel as well as in some of the sound from the motors. Alongside the audio and visual channels, the note location was communicated more effectively than with our previous approach.

Dip

In an attempt to address the avoidance illusion we were facing with the earlier forces, we aimed to make the notes with an open shape (half and whole notes) feel like going down into the note, like a dip. To do so, we experimented with different ways to draw this ring of forces. In our group meeting we discussed what this might look like visually:

Our first approach was inspired by some of the texture discussions from lecture. We thought it might be possible to assign a force setting to each greyscale pixel, but upon further testing this approach took so long to run that it broke both my and Rubia's systems, so we implemented the forces a different way.
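For context, this is roughly the idea we had in mind. The sketch below is illustrative only (the image name, workspace bounds, and greyscaleForce() helper are hypothetical), not code we actually ran:

// Hypothetical sketch of the greyscale-pixel-to-force idea (not what we shipped):
// a darker pixel under the end effector would mean a stronger force.
PImage heightMap;

void setup() {
  size(800, 400);
  heightMap = loadImage("notes_greyscale.png"); // hypothetical asset
  heightMap.loadPixels();
}

PVector greyscaleForce(PVector posEE) {
  // Map the end effector's physics position to a pixel coordinate
  // (the workspace bounds here are illustrative)
  int px = constrain(int(map(posEE.x, -0.1, 0.1, 0, heightMap.width - 1)), 0, heightMap.width - 1);
  int py = constrain(int(map(posEE.y, 0.0, 0.1, 0, heightMap.height - 1)), 0, heightMap.height - 1);
  float grey = brightness(heightMap.pixels[py * heightMap.width + px]); // 0 (black) to 255 (white)
  float magnitude = map(grey, 0, 255, 1.5, 0); // darker pixel, stronger force
  return new PVector(magnitude, magnitude);
}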

Brainstorming, we came up with a few possibilities. Initially, Rubia had the idea of a radial vector field for texture. We used the equation listed on LibreTexts to do so.
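For reference, the general form of the radial field we were working from (our paraphrase; see LibreTexts for the exact notation), with (x, y) the offset from the note centre and r its magnitude, is:

$$\vec{F}(x, y) = m \, \frac{(x,\, y)}{r^{3}}, \qquad r = \sqrt{x^{2} + y^{2}}$$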

PVector force(PVector posEE, PVector velEE) {
  PVector posDiff = (posEE.copy().sub(getPhysicsPosition()));
  final float threshold = 0.005;

  if (posDiff.mag() > threshold) {
    return new PVector(0, 0);
  } else if (this.state == NoteState.NOT_PLAYING) {
    this.state = NoteState.START_PLAYING;
  }

  // Radial vector field: force components scale with position over distance cubed
  float m2 = 1;
  float fx = m2*(posEE.x/pow(posDiff.mag(), 3));
  float fy = m2*(posEE.y/pow(posDiff.mag(), 3));

  // Cap the forces so they stay noticeable but not overwhelming
  fx = constrain(fx, -1.5, 1.5);
  fy = constrain(fy, -1.5, 1.5);
  return new PVector(fx, fy);
}

After implementing this radial vector field and testing the feel, I noticed that the forces did not actually feel circular, but more like a directional pull out of the note. This was because we had not considered a 3D field versus a 2D field: we had been aiming for a 3D field in the overall feel, but upon further discussion we realized it should actually be plotted as a 2D field to simulate the feel of a dip over a flat surface.

Left & middle: Radial vector field image and equation from LibreTexts. Right: sine and cosine gif.

Rethinking our approach, we aimed to find a way to implement something like a simulated 3D sine wave on a 2D field — somewhat like a torus. I suggested exploring a sine wave to accomplish this.

In this implementation (shown below), Rubia and I aimed to find the point at which the open space in the middle of a half or whole note felt realistic enough to distinguish it from a quarter or eighth note. The goal was to make sure the force values always combined to a magnitude of 1 (the force we found to feel noticeable but not overwhelming when using the Haply); since the components are the cosine and sine of the same angle, their combined magnitude is always exactly 1.

PVector force(PVector posEE, PVector velEE) {
  PVector posDiff = (posEE.copy().sub(getPhysicsPosition()));
  final float threshold = 0.005;

  if (posDiff.mag() > threshold) {
    return new PVector(0, 0);
  } else if (this.state == NoteState.NOT_PLAYING) {
    this.state = NoteState.START_PLAYING;
  }

  // Unit-magnitude force along the angle between the end effector and the note centre
  float fx = cos(atan(posDiff.x/posDiff.y));
  float fy = sin(atan(posDiff.x/posDiff.y));

  fx = constrain(fx, -1.5, 1.5);
  fy = constrain(fy, -1.5, 1.5);

  return new PVector(fx, fy);
}

The video below shows this implementation compared to the other textured notes:

Sandy + dip

To assign the different force implementations to the different note types, we used the note classifications that were already determined. Juliette made these quite easy to find, so we used the following implementation to combine the note forces:

PVector force(PVector posEE, PVector velEE) {
  PVector posDiff = (posEE.copy().sub(getPhysicsPosition()));
  final float threshold = 0.005;

  if (posDiff.mag() > threshold) {
    return new PVector(0, 0);
  } else if (this.state == NoteState.NOT_PLAYING) {
    this.state = NoteState.START_PLAYING;
  }

  float fx = 0;
  float fy = 0;
  switch (getText()) {
  case "\ue1d2": // whole
  case "\ue1d3": // half (stem up)
  case "\ue1d4": // half (stem down)
    // Open note heads get the "dip": unit-magnitude force scaled up slightly
    fx = 1.15*cos(atan(posDiff.x/posDiff.y));
    fy = 1.15*sin(atan(posDiff.x/posDiff.y));
    break;
  case "\ue1d5": // quarter (stem up)
  case "\ue1d6": // quarter (stem down)
  case "\ue1d7": // eighth (stem up)
  case "\ue1d8": // eighth (stem down)
    // Solid note heads get the "sandy" random texture
    fx = random(-1, 1);
    fy = random(-1, 1);
    break;
  }

  fx = constrain(fx, -1.5, 1.5);
  fy = constrain(fy, -1.5, 1.5);
  return new PVector(fx, fy);
}

I used this version in the informal evaluation I ran. Notes were either "sandy" or a "dip", and the staff lines did not have any feeling yet. As you can see in the video below, the audio, visual, and haptic channels synchronized well and communicated interaction with the notes at this stage.

Aside from the comment about changing end effectors, the participant gave us some excellent feedback on the differences between the force styles. They stated that the notes felt a bit too different from one another, which made it difficult to imagine the entire piece of music as unified. This is not what we intended to communicate (actually, quite the opposite), so I experimented a bit with how these forces could be different yet still feel cohesive.

From a little bit of playing around with the implementation above, I modified the dip notes to use random() so that all of the notes share a sandy texture. Both the solid sandy texture and the sandy dip were working well, but I was not able to test this with a user to confirm whether the notes were cohesive yet distinguishable. One problem we faced was oscillation from the notes when the end effector was no longer moving: because of the random() approach, the forces were constantly changing and moved the end effector unpredictably when the user stopped on a note. We thought this might be problematic, so we added a small condition, if (velEE.mag() > 0.00), before the switch statement to pause the forces while the end effector is not in motion. A sketch of this sandy-dip variant is shown below.
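The sketch below illustrates the variant rather than the exact final code: the hollow-centre thresholds are taken from the randomGaussian() implementation further below, and the rest mirrors the switch statement above, so treat the exact values as illustrative.

PVector force(PVector posEE, PVector velEE) {
  PVector posDiff = (posEE.copy().sub(getPhysicsPosition()));
  final float threshold = 0.005;
  float fx = 0;
  float fy = 0;

  if (posDiff.mag() > threshold) {
    return new PVector(0, 0);
  } else if (this.state == NoteState.NOT_PLAYING) {
    this.state = NoteState.START_PLAYING;
  }

  // "Sandy dip": open note heads keep a hollow centre, but the texture inside
  // the ring uses random() like the solid notes, so all notes feel like the same sand.
  if (velEE.mag() > 0.00) { // pause the texture while the end effector is still
    switch (getText()) {
    case "\ue1d2": // whole: larger hollow centre
      if (posDiff.mag() > 0.0025) {
        fx = random(-1, 1);
        fy = random(-1, 1);
      }
      break;
    case "\ue1d3": // half (stem up): smaller hollow centre
    case "\ue1d4": // half (stem down)
      if (posDiff.mag() > 0.0015) {
        fx = random(-1, 1);
        fy = random(-1, 1);
      }
      break;
    default: // quarter and eighth notes: solid sandy texture
      fx = random(-1, 1);
      fy = random(-1, 1);
      break;
    }
  }

  fx = constrain(fx, -1.25, 1.25);
  fy = constrain(fy, -1.25, 1.25);
  return new PVector(fx, fy);
}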

Rubia also explored different ways she could implement a more cohesive yet distinguishable note variation. She did so by using abs(randomGaussian()) to create a more controlled, directional generation of random forces. The difference from the previous sandy implementation was subtle, but the texture generation was more controlled since it follows a normal distribution (implementation shown below).

PVector force(PVector posEE, PVector velEE) {
  PVector posDiff = (posEE.copy().sub(getPhysicsPosition()));
  final float threshold = 0.005;
  float fx = 0;
  float fy = 0;

  if (posDiff.mag() > threshold) {
    return new PVector(0, 0);
  } else if (this.state == NoteState.NOT_PLAYING) {
    this.state = NoteState.START_PLAYING;
  }

  if (velEE.mag() > 0.00) { // only generate texture while the end effector is moving
    switch (getText()) {
    case "\ue1d2": // whole
      if (posDiff.mag() > 0.0025) { // larger hollow center
        fx = -velEE.x/abs(velEE.x + 0.001) * 2 * abs(randomGaussian()); // added 0.001 to ensure not dividing by 0
        fy = -velEE.y/abs(velEE.y + 0.001) * 2 * abs(randomGaussian());
      }
      break;
    case "\ue1d3": // half (stem up)
    case "\ue1d4": // half (stem down)
      if (posDiff.mag() > 0.0015) { // smaller hollow center
        fx = -velEE.x/abs(velEE.x + 0.001) * 1.5 * abs(randomGaussian());
        fy = -velEE.y/abs(velEE.y + 0.001) * 1.5 * abs(randomGaussian());
      }
      break;
    case "\ue1d5": // quarter (stem up)
    case "\ue1d6": // quarter (stem down)
      fx = -velEE.x/abs(velEE.x + 0.001) * abs(randomGaussian());
      fy = -velEE.y/abs(velEE.y + 0.001) * abs(randomGaussian());
      break;
    case "\ue1d7": // eighth (stem up)
    case "\ue1d8": // eighth (stem down)
      fx = 0.75 * randomGaussian();
      fy = 0.75 * randomGaussian();
      break;
    }
  }

  fx = constrain(fx, -1.25, 1.25);
  fy = constrain(fy, -1.25, 1.25);
  return new PVector(fx, fy);
}

Staff

Rubia and I also adjusted the feel of the staff lines. We noted from our previous iteration that we had not focused much on the staff lines; because our focus was so heavily on the notes, we did not realize that the staff line forces might interfere with our note force generation. Below is a video of how the notes and the staff lines interacted when we combined the forces:

With the staff lines and the note forces together, we are still facing some issues, shown in the video below:

The oscillations seen in this video were a result of forces from the notes on top of the forces from the lines.

private PVector staffForce(PVector posEE, PVector velEE, PShape line) {
  PVector linePos = getPhysicsPosition(line);
  PVector force = new PVector(0, 0);

  // Include dead band to prevent oscillations if you're staying on the line
  if (abs(posEE.y - linePos.y) < 0.0005 && velEE.mag() > 0.01) {
    // Push back against the direction of motion, vertically only
    force.set(velEE.copy().normalize().rotate(PI));
    force.x = 0;
    force.setMag(2);
  }

  return force;
}

This video was taken with the staff line magnitude reduced a bit as well, force.setMag(1.2), but we still faced this issue while moving through the notes. It also made it particularly difficult to move along a line to other notes of the same pitch.
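For reference, here is a minimal sketch of how the two force sources stack on each simulation step. The Note type, the collection names, and the fEE output vector are assumptions for illustration; force() and staffForce() are the methods shown above.

// Illustrative sketch of combining note and staff line forces each step
// (type, collection, and variable names are hypothetical, not our exact code).
PVector totalForce = new PVector(0, 0);

for (Note note : notes) {
  totalForce.add(note.force(posEE, velEE));        // sandy / dip note textures
}
for (PShape line : staffLines) {
  totalForce.add(staffForce(posEE, velEE, line));  // staff line dead-band force
}

// Both force sources act on the end effector at once, which is where the
// oscillations in the video come from.
fEE.set(totalForce);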

While I was testing the combination of these two features in our system, I tried to categorize why this may be haptically confusing:

  • Even with the audio and visual feedback channels, the haptics are not actually distinct enough from one feature to the next. With the haptics of the lines and the different note types together, there is almost too much to attend to, making it hard to register that these features are different.
  • The staff lines are quite close to one another. Because of this, when you move the end effector off one feature, the next is already there to feel, creating an overload of haptic information.

We will aim to address some of these issues in our next iteration.

Reflection

Overall, this iteration was about unifying the system with the previously developed prototypes and textures. Transferring some of our textures from the fisica package to the Crecendo prototype proved challenging; I spent quite a bit of time with the code to understand Juliette and Sabrina's contributions from the past iteration as well. Once I understood the different aspects of the code, it became easier to bring a replication of our Lab 3 work into some of the note feelings.

Some of the approaches Rubia and I took were a flashback to math classes. I don't have a background in physics and it has been a while since I have used my "math brain", so connecting on different ideas and reviving that math brain over a Zoom call was useful. It was also helpful for Rubia and me to connect this way, since running Processing and Zoom at the same time causes some issues.

One huge improvement in this iteration was getting some outside input. Sabrina conducted a user study which revealed the importance of testing our prototypes, and my informal user study also redirected our approach quite a bit. As designers, it is important to ground our designs in theoretical, conceptual models, but it is equally important to test and confirm those models from an outside point of view. I think it is possible our team has suffered from being too close to the project. It is important to sometimes take a step back and get some sort of evaluation, especially in the sketching stages; this gives us more perspective and lets us look at the bigger picture more often. Because we are designing for users with dyslexia, I wonder if it is possible to find such users to try this out, though I think that will also be a challenge due to restrictions.

Either way, I am excited for our next iteration and ready to combine these different approaches. I hope to continue creating these distinct features in the system while continuing to test with users. Hopefully the rest of the team is able to test out the pen end effector and compare these approaches so we can converge on how to proceed in our final iteration.
