100 Days Of ML Code — Day 065

Jehoshaphat I. Abu · Sep 12, 2018 · 2 min read

Recap From Day 064

On Day 064, we looked at how the gesture follower works. You can catch up using the link below: 100 Days Of ML Code — Day 064.

Today, we’ll look at another method that has been designed to handle expressive variations in gesture execution. This means it is better able to understand the intention behind expressive changes in a gesture and to use that information in real-time performance, for instance to control sound synthesis parameters.

Working with time

Gesture Variation Follower

The method is the Gesture Variation Follower (GVF). As we said, the idea is to follow, or track, the variations in gesture execution instead of considering them as noise. More precisely, if we slow down while executing the gesture, GVF is able to estimate the decrease in speed dynamically.

Similarly, if we perform a gesture bigger than the one recorded, GVF is able to estimate the increase in relative size dynamically. These variations are dynamic in the sense that we could start a gesture faster than the recorded template and then finish slower.

This is a very important feature because it means that the extracted variations are not relative to the gesture globally, but to the gesture as it continuously changes; consequently, they can be used as continuous controls while the gesture is being executed.
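To make this concrete, GVF is built on particle filtering: a cloud of hypotheses about where we are in the template, and how fast we are moving through it, is updated with every incoming sample. Below is a minimal, illustrative sketch of that idea in Python, tracking only a speed variation against a synthetic 1-D template. The names, noise values, and the simple Gaussian likelihood are my own assumptions for illustration, not the actual GVF implementation.

```python
import math
import random

random.seed(0)

# Hypothetical recorded template: a 1-D ramp of 50 samples.
template = [t / 50 for t in range(50)]

# Live gesture: the same shape executed twice as fast (25 samples).
live = [template[min(2 * t, len(template) - 1)] for t in range(25)]

N = 500  # number of particles
# Each particle is a (phase, speed) hypothesis about the live gesture.
particles = [(0.0, random.uniform(0.5, 2.5)) for _ in range(N)]
weights = [1.0 / N] * N

def likelihood(value, phase):
    """Gaussian likelihood of an observed sample given a template phase."""
    idx = min(int(phase), len(template) - 1)
    err = value - template[idx]
    return math.exp(-err * err / (2 * 0.1 ** 2))

for obs in live:
    # Propagate: advance each particle's phase by its speed, with jitter.
    particles = [(p + s, max(0.1, s + random.gauss(0, 0.05)))
                 for p, s in particles]
    # Weight particles by how well they explain the observation.
    weights = [w * likelihood(obs, p) for w, (p, _) in zip(weights, particles)]
    total = sum(weights) or 1e-12
    weights = [w / total for w in weights]
    # Resample to concentrate particles on likely (phase, speed) states.
    particles = random.choices(particles, weights=weights, k=N)
    weights = [1.0 / N] * N

# The mean particle speed is a running, continuously updated estimate of
# the speed variation; for this synthetic example it should settle near 2,
# since the live gesture runs twice as fast as the template.
est_speed = sum(s for _, s in particles) / N
print(est_speed)
```

Because the estimate is refreshed on every sample, it could be mapped directly to a synthesis parameter (say, playback rate) while the gesture is still in progress, which is exactly the continuous-control use described above.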

Below is a video demonstration of GVF.

That’s all for day 065. I hope you found this informative. Thank you for taking time out of your schedule and allowing me to be your guide on this journey. And until next time, be legendary.

References

https://www.kadenze.com/courses/machine-learning-for-musicians-and-artists-v/sessions/working-with-time
