Team Crecendo: Iteration 1
To give a little overview, Team Crecendo aims to use the Haply to enhance the experience of reading and learning to play music. We will create a multimodal interface that guides users through musical notation via touch, audio, and visual channels. Our goal is to help people build intuition about music notation without having to look at the staff (the lines of the music).
Our initial motivation for this idea comes from our group member, Sabrina. As a child, Sabrina had severe dyslexia, which inhibited her ability to read music; specifically, she could not make sense of the positions of the notes on the staff. A multimodal haptic approach would have given Sabrina a non-visual way to build intuition about pitch and note positions. Learning of Sabrina's experience, our team believes this approach to learning music could help others with learning disabilities, like dyslexia, gain intuition about music notation.
For the logistics of our approach, we split the work by time zones: Sabrina and Juliette in EST, and Rubia and me in PST. Sabrina and Juliette focused on the software frameworks and the audio/visual details of the haptic interface. Rubia and I focused on the haptic interaction and sketches to implement the “feel” of the different features.
Rubia and I coded in tandem, with over 10 hours of Zoom calls outside of regular group meetings to discuss in real time what we were feeling. We shared code back and forth, discussed what felt best, and worked through the problems we faced.
Individually, I worked on: construction of the mockup GUI (staff lines and avatar), note implementation (not including forces), moving the notes (first iteration), and spacing of elements. Code can be found here.
Approach
We began our approach to guided music notation by conducting a few group meetings to collaboratively sketch. Our sketches took various forms: we started with a rough sketch in a Zoom meeting, and as discussions and ideas kept flowing, we arrived at a more solidified idea of what our approach might actually look like.


In our split group, Rubia and I wanted to create a GUI to test staff lines and notes. We touched on the idea of doing a guided movement task, but we prioritized getting the feeling of the elements right first, so we did not get to guided movement yet.
Mockup GUI
To test the interactions with the different features, I created a mockup GUI for exploring the feeling of the elements. We used the Fisica package to implement the haptic and visual elements.
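For context, a minimal sketch of that setup looks something like this; the Haply device coupling is omitted, and all names and values are illustrative rather than our exact code:

```java
import fisica.*;

FWorld world;
FCircle avatar;

void setup() {
  size(1000, 650);
  Fisica.init(this);        // Fisica must be initialized with the sketch
  world = new FWorld();
  world.setGravity(0, 0);   // no gravity: the avatar should only follow the hand

  // avatar standing in for the Haply end effector
  // (the device coupling that drives it is omitted here)
  avatar = new FCircle(20); // diameter in pixels; illustrative value
  avatar.setPosition(width / 2, height / 2);
  world.add(avatar);
}

void draw() {
  background(255);
  world.step();             // advance the physics simulation
  world.draw();             // render all bodies
}
```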
The first step was to place the staff lines. Using the idea of a density layer from the Haply maze demo, Rubia and I discussed how we would like this to feel. Ideally, we wanted each staff line to have some amount of feeling so that, while exploring, the user would know where the notes sat within the staff lines. If the staff feeling was too harsh, it would feel like jumping through the lines or strumming guitar strings, which we did not want. We brainstormed the words that would best explain this: mud, honey, water, sticky, bumpy. What we ended up implementing was a subtle, small groove feeling that closely matched the idea of sticky lines.

To do so, I started out with the lines pretty far apart: the lines were large and the avatar small. This implementation was also missing one of the staff lines, oops. Although this was a good layout for testing the feeling, what I wanted to work towards was the feeling for when the lines were closer together and multiple octaves could be represented on the same screen.
After consulting with Rubia, we moved the lines closer together. This meant lowering the force of the lines a bit so they wouldn't feel too “bumpy” for our intended feel. The code snippets below describe how we implemented the staff lines and the interaction.
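As a rough sketch of one way to build that groove feel, assuming the maze demo's sensor-layer pattern (spacing and damping values here are illustrative, not our exact code):

```java
float topLineY    = 150;  // y of the highest staff line (illustrative)
float lineSpacing = 40;   // gap between adjacent lines (illustrative)
ArrayList<FBox> staffLines = new ArrayList<FBox>();

void buildStaff() {
  for (int i = 0; i < 5; i++) {       // all five lines this time
    FBox line = new FBox(800, 6);     // wide, thin box
    line.setPosition(width / 2, topLineY + i * lineSpacing);
    line.setStatic(true);             // lines never move
    line.setSensor(true);             // penetrable: no hard collision
    line.setFill(0);
    world.add(line);
    staffLines.add(line);
  }
}

// called every frame from draw(): sticky feel while crossing a line
void applyStaffFeel() {
  boolean onLine = false;
  for (FBox line : staffLines) {
    if (abs(avatar.getY() - line.getY()) < 6) { onLine = true; break; }
  }
  avatar.setDamping(onLine ? 40 : 0); // higher damping reads as a subtle groove
}
```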



We moved the staff lines closer together and changed the GUI to more closely resemble musical notation.
Lastly, Rubia and I discussed the size of the avatar. In our first iteration of the staff lines, the avatar was quite small and difficult to move accurately within the negative spaces. Moving the lines closer together made the avatar easier to direct towards objects like notes. We decided the avatar should be slightly smaller than a note, so the user feels as though they are moving through the music “to scale”. This may be adjusted as we finalize the visual representation of the sheet music in our next iteration.
Notes
Next, I implemented some notes on the staff lines. The notes actually took a different form at first; we played around with their feeling a lot.

To implement the feeling of hitting a note, Rubia and I brainstormed what might feel realistic. The words that best explained this were: drag, sluggish, obstacle. We initially thought that making notes impenetrable in the centre would be a realistic depiction of hitting an object (shown left). After a few test rounds and much reflection, we scrapped the idea: with the small impenetrable box, it felt more like going around a note than hitting it. Thus, we implemented the notes as circles with a set force (shown in the video below). While Rubia worked on the feel of the notes, I worked on implementing them in the space. Below is the template for the notes.
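A rough sketch of that template, assuming a sensor-circle pattern with the drag supplied through damping (the exact force mechanism and values here are illustrative):

```java
// A note is a penetrable circle: no impenetrable centre, so the avatar can
// pass through it, with the "set force" drag tuned separately.
FCircle makeNote(float x, float y) {
  FCircle note = new FCircle(30);   // slightly larger than the avatar
  note.setPosition(x, y);
  note.setStatic(true);             // notes sit still (movement comes later)
  note.setSensor(true);             // penetrable: no hard collision at the centre
  note.setFill(0);
  world.add(note);
  return note;
}
```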

I oriented each note in reference to the staff lines. By doing so, it will be easier to place more notes within the musical notation, since spacing and placement matter for notes in music.
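A hypothetical helper showing that orientation, reusing the topLineY and lineSpacing values from the staff sketch above:

```java
// Staff position 0 is the top line; each step of 1 moves a half line-gap
// (line to space), so notes land exactly on lines or in spaces.
float noteY(int staffPosition) {
  return topLineY + staffPosition * lineSpacing / 2.0;
}

// e.g. a note in the second space from the top:
FCircle exampleNote = makeNote(300, noteY(3));
```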
Initially, I played around with the setDensity of the notes to make them feel distinctly different from the staff lines. In the end, it turned out that the density didn't result in much feeling from the Haply, and the damping had more influence. To pin down which features we were actually changing in the elements, we referred to the documentation (linked above) of the Fisica package. Damping resists translational movement within the world, which essentially shaped how it felt when the avatar hit the note body or staff lines. I did not have time to play around with removing the density, but will do so in the next iteration to clean up the template.
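For reference, the two Fisica knobs we compared, with illustrative values:

```java
note.setDensity(10);   // mass per area; barely perceptible through the Haply
note.setDamping(0.8);  // resists translational movement; this carried the feel
```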
Moving Notes
In our ideation (seen above in the approach section), we want the notes to move and the user to hit them with their end effector. I took on the first iteration of the note movement. Movement was important to implement at this stage of our design because, while the static elements are important to feel, we also want the user to be able to get a sense of the overall “shape” of the music.
We cannot display an entire piece of music on the screen, so we are taking a zoomed-in approach, focusing on one or two bars of music at a time. Because of this, for tempo-driven guidance through the music, the notes will have to move.
I implemented the note movement across the screen. In the image below on the left, if a note is moving, it increments across the screen; when it reaches the end of the visible world, it starts back at the beginning. This got the notes moving along the staff line and mimicked a continuously moving score.

We kept this same structure for moving the notes and having them repeat in this iteration, but we now use a different variable to standardize note names. With the keyPressed function we are able to move and stop the notes, which gives the user control when they want to explore the pitch and location of the notes.
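A sketch of that behaviour, assuming hypothetical names (notesMoving, noteSpeed) and a single note for brevity:

```java
boolean notesMoving = false;
float noteSpeed = 2.0;            // pixels per frame; stands in for tempo

void updateNotes() {              // called every frame from draw()
  if (!notesMoving) return;
  float x = exampleNote.getX() + noteSpeed;
  if (x > width) x = 0;           // wrap back to the start of the bar
  exampleNote.setPosition(x, exampleNote.getY());
}

void keyPressed() {
  if (key == ' ') notesMoving = !notesMoving;  // space toggles movement
}
```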
In our next iteration we will ideally have the notes move in time with a tempo. The way the notes currently move across the screen is not ideal for a tempo-driven approach, since the position scale in the Fisica package differs from the way we would like to represent tempo. This approach worked for simulating tempo, but more work on the visual information presented to the user is needed to show where along a sheet of music the system currently is.
Reflection
Overall, I feel this iteration was a great lifting-off point for moving forward as a group. With our group split into two logistically, communication somewhat broke down, and it was difficult to know how “into the problem” other people were. At one point during a meeting we realized we were developing with different mental models of the same system. Both were models we plan on working towards, but some assumptions had not been fully articulated. Having each person work on a specific area helped build a foundation, but now I feel that joining all of our individual contributions together is going to be a challenge.
Rubia and I spent more time than we predicted on getting smaller things working, like how we want to format the different elements, as well as on rich discussion about the feel of everything. Even though we were still in the sketching phase, exploring all of our ideas was not as quick as we had hoped. For example, Rubia and I noticed that even when set to the same force output, our Haplys felt different, one stronger than the other. Without being on a call, I don't know that we would have caught this difference. I am happy with the final result for this iteration, but at some point there had to be a choice of how and why we want things to feel a certain way, and to go from there. We sent our code off to Sabrina and Juliette for a quick evaluation and got very positive responses on the overall feel.
As for logistics, being on a Zoom call felt necessary for two people to stay on the same page and code in tandem. We were constantly passing code back and forth and iteratively building on each other's work. As we combine our work with the rest of the group, I think we need to discuss how we are going to do this in detail, since between four people (and multiple time zones) it may be difficult.
As we move into our haptic sketching and iterating stage, I aim to keep in mind these things we have yet to explore/implement:
- Octaves and how we want to represent them (spatially and haptically)
- Note duration (types of notes)
- Guided movement towards notes (in reference to the overall shape of the music)
- How to represent multiple measures and/or a whole piece of music
For our next steps, we have a brainstorm session scheduled in which we will discuss the different ways we are all imagining the multimodality of our system working. This will be a takeoff point for our haptic sketches, where we will implement multiple ideas and narrow them down as we go.