Hand Music Mashup – User Testing

For context, here’s my 75% complete project:

and here’s my 90% complete project:

As I was developing my project, I was constantly user testing it myself before testing it on others, since I would know best how to break my own project. This resulted in many enhancements to the UI, the design, and how it functions overall:

    • I realized early on that there would be no ‘buttons’ to be clicked; instead, there are areas that trigger a specific action when hovered over for some time. (When I refer to buttons in this post, I mean these hover-click areas.)
    • Volume control on either side was initially done by hovering anywhere vertically in the two halves of the screen, which proved to be clunky. I have now limited volume control to 1/8th of the screen in the left corner for the karaoke track and 1/8th of the screen in the right corner for the acapella.
    • I needed to make the cursor easier for the user to follow. I found out how to leave a trail behind a moving object in Processing from http://www.science.smith.edu/dftwiki/index.php/Creating_a_trail_of_moving_object_in_Processing, as I couldn’t figure out how to do it in the draw function because the background kept resetting. I learned that for this to work there shouldn’t be a background reset; instead, you draw a semi-transparent rectangle the width and height of the screen each frame.
    • Initially I assumed the user would understand volume control just by scroll-hovering on either side, but I added a volume visualizer because I felt it wasn’t clear to the user how exactly the volume would increase or decrease, or what the current volume was.
    • Integrating the Kinect into the project was troublesome, as it was very buggy. Initially, I made the cursor move with the KinectTracker class that Daniel Shiffman made, but the cursor would move all around when I went away from the Kinect. I changed the KinectTracker class to return a PVector at the center of the screen when it doesn’t find any rawDepth lower than the threshold, so the cursor stays put when no hand is present.
    • I made some optimizations to the Kinect tracking class, as it didn’t work well with my project. One thing I did was set up two integer variables, handArea and totalArea, which were incremented in the for loop that checks depth against the threshold: handArea increases only when the raw depth is less than the threshold, while totalArea increases on every iteration. I then allowed the cursor to move only if handArea/totalArea was less than 0.065 (a value found through testing), because a hand should occupy less than 6.5% of the Kinect’s total visibility.
    • I also made the cursor move back to a specific point away from all buttons when no depth below the threshold is detected. This reduces accidental clicking of buttons.
    • I made the hover time for every function about 0.5–2 seconds in order to reduce unintentional clicking.
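The hover-click behaviour described above can be sketched in plain Java as a small dwell-timer class (a hypothetical helper, not my exact project code): the “button” only fires once the cursor has stayed inside its rectangle for the full dwell time, which is what cuts down unintentional clicks.

```java
// Hover-to-click ("dwell") button: fires only after the cursor has stayed
// inside the rectangle for dwellMillis. Assumed names and layout are illustrative.
public class DwellButton {
    final float x, y, w, h;   // button rectangle
    final int dwellMillis;    // required hover time before the "click" fires (0.5–2 s)
    int hoverStart = -1;      // time the cursor entered; -1 = not hovering

    public DwellButton(float x, float y, float w, float h, int dwellMillis) {
        this.x = x; this.y = y; this.w = w; this.h = h; this.dwellMillis = dwellMillis;
    }

    boolean contains(float cx, float cy) {
        return cx >= x && cx < x + w && cy >= y && cy < y + h;
    }

    /** Call once per frame with the cursor position and current time in ms;
     *  returns true only on the frame the dwell completes. */
    public boolean update(float cx, float cy, int nowMillis) {
        if (!contains(cx, cy)) { hoverStart = -1; return false; }
        if (hoverStart < 0) hoverStart = nowMillis;   // just entered
        if (nowMillis - hoverStart >= dwellMillis) {
            hoverStart = -1;   // reset so it doesn't re-fire every frame
            return true;
        }
        return false;
    }
}
```

In a Processing sketch this would be driven from draw() with the Kinect cursor position and millis().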
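The restricted volume zones could look something like this (a minimal sketch under my own assumptions about the mapping; the strip widths follow the 1/8th-of-screen rule above, and I assume top of screen = full volume):

```java
// Volume control confined to the outer 1/8th strips of the screen:
// left strip controls the karaoke track, right strip the acapella.
public class VolumeZones {
    /** Which track a cursor x controls: 0 = karaoke (left strip),
     *  1 = acapella (right strip), -1 = neither. */
    public static int zone(float x, float width) {
        if (x < width / 8f) return 0;
        if (x > width * 7f / 8f) return 1;
        return -1;
    }

    /** Map vertical cursor position to a 0..1 volume; top of screen = full volume. */
    public static float volumeFor(float y, float height) {
        float v = 1f - y / height;
        return Math.max(0f, Math.min(1f, v));   // clamp in case the cursor leaves the screen
    }
}
```

Hovering anywhere in the middle 3/4 of the screen now leaves both volumes untouched, which is what makes the control less clunky.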
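The cursor-trail trick works because each semi-transparent rectangle dims everything already drawn by a constant factor, so old cursor positions fade out over a few frames instead of vanishing instantly. A tiny model of that compositing (just the math, not the drawing code):

```java
// Why a translucent full-screen rect leaves a fading trail: each frame it
// composites black over the old frame, multiplying remaining brightness by (1 - alpha).
public class FadeTrail {
    /** Brightness left of a previously drawn pixel after n frames of
     *  overlaying a black rect with the given alpha (0..1) instead of
     *  calling background() to hard-reset. */
    public static float brightnessAfter(float start, float alpha, int frames) {
        float b = start;
        for (int i = 0; i < frames; i++) b *= (1f - alpha);
        return b;
    }
}
```

With a low alpha the trail lingers for many frames; alpha = 1 behaves exactly like a background() reset.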
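Putting the Kinect fixes together, the tracking logic can be sketched as below. This is a simplified stand-in for Shiffman's KinectTracker, not my actual modified class: it counts handArea and totalArea in the depth loop, rejects frames where the close region is too large (a whole body rather than a hand), and parks the cursor at the screen centre when nothing valid is found.

```java
// Simplified hand tracker over a raw Kinect depth image (illustrative names).
public class HandGate {
    public static final float MAX_HAND_RATIO = 0.065f; // hand < 6.5% of the view (tuned by testing)

    /** Returns {x, y} of the tracked hand in screen coordinates, or the screen
     *  centre when no pixel is below the depth threshold, or when too much of
     *  the frame is below it. */
    public static float[] track(int[] rawDepth, int depthW, int depthH,
                                int threshold, float screenW, float screenH) {
        long sumX = 0, sumY = 0;
        int handArea = 0, totalArea = 0;
        for (int y = 0; y < depthH; y++) {
            for (int x = 0; x < depthW; x++) {
                totalArea++;                           // counts every pixel
                if (rawDepth[y * depthW + x] < threshold) {
                    handArea++;                        // counts only "close" pixels
                    sumX += x; sumY += y;
                }
            }
        }
        if (handArea == 0 || (float) handArea / totalArea >= MAX_HAND_RATIO) {
            return new float[] { screenW / 2f, screenH / 2f };  // park cursor at centre
        }
        // average of close pixels, scaled from depth image to screen coordinates
        return new float[] { (float) sumX / handArea * screenW / depthW,
                             (float) sumY / handArea * screenH / depthH };
    }
}
```

The centre fallback is what stops the cursor from jittering around (and hover-clicking buttons) when the player steps away from the Kinect.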

User Testing 1
