100 Days Of ML Code — Day 042

Jehoshaphat I. Abu
Aug 20, 2018 · 2 min read

Recap From Day 041

In day 041, we continued with video features: comparing pixels from one frame to the next. We saw that if we are interested in how things are moving in front of the camera, rather than trying to identify what is in front of the camera, there are a few simple things that might work. One of the simplest is to compare pixels from one frame to the next.
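As a quick refresher, here is a minimal sketch of that frame-differencing idea using NumPy. The function name, the toy frames, and the threshold of 30 are illustrative choices, not from any particular library:

```python
import numpy as np

def frame_difference(prev_frame, next_frame, threshold=30):
    """Compare pixels between two consecutive grayscale frames.

    Returns the fraction of pixels whose intensity changed by more
    than `threshold` -- a crude measure of how much motion occurred.
    (threshold=30 is an arbitrary choice for 8-bit images.)
    """
    diff = np.abs(prev_frame.astype(np.int16) - next_frame.astype(np.int16))
    return float(np.mean(diff > threshold))

# Two toy 4x4 "frames": a bright 2x2 patch shifts one pixel to the right.
prev_frame = np.zeros((4, 4), dtype=np.uint8)
prev_frame[1:3, 0:2] = 255
next_frame = np.zeros((4, 4), dtype=np.uint8)
next_frame[1:3, 1:3] = 255

motion = frame_difference(prev_frame, next_frame)
# 4 of 16 pixels changed, so motion == 0.25
```

This only tells us *that* pixels changed, not *where* they went, which is exactly the gap optical flow fills.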

Today, we will continue from where we left off in day 041.

Video Features Continued

Optical Flow

Optical flow or optic flow is the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer and a scene. By estimating optical flow between video frames, we can measure the velocities of objects in the video. In general, moving objects that are closer to the camera will display more apparent motion than distant objects that are moving at the same speed.

[Source](https://cdn.hashnode.com/res/hashnode/image/upload/v1632827136169/OCfroch6p.jpeg)

If we are interested in knowing a bit more about how things are moving, we can use optical flow. Optical flow looks at two subsequent frames of video and tries to match each pixel in one frame to a pixel in the next frame. So ideally, the pixel showing the bicycle tyre above will be matched to the pixel showing the same bicycle tyre at some later point in time, even if the bicycle moved.

Optical flow estimation is used in computer vision to characterize and quantify the motion of objects in a video stream, often for motion-based object detection and tracking systems.

[Source](https://cdn.hashnode.com/res/hashnode/image/upload/v1632827139272/kLvu6ajIo.png)
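To make the pixel-matching idea concrete, here is a toy brute-force block-matching estimator in NumPy. This is a deliberately simple stand-in for real optical flow algorithms (such as Lucas-Kanade or Farneback); the function name and parameters are my own illustrative choices:

```python
import numpy as np

def block_match_flow(prev_frame, next_frame, block=2, search=2):
    """Estimate optical flow by brute-force block matching.

    For each `block` x `block` patch in prev_frame, search a window of
    +/- `search` pixels in next_frame for the best-matching patch
    (smallest sum of absolute differences) and record its displacement.
    Returns an (H/block, W/block, 2) array of (dy, dx) vectors.
    """
    h, w = prev_frame.shape
    flow = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            patch = prev_frame[by:by + block, bx:bx + block].astype(int)
            best_sad, best_dy, best_dx = None, 0, 0
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # candidate patch falls outside the frame
                    cand = next_frame[y:y + block, x:x + block].astype(int)
                    sad = np.abs(patch - cand).sum()
                    if best_sad is None or sad < best_sad:
                        best_sad, best_dy, best_dx = sad, dy, dx
            flow[by // block, bx // block] = (best_dy, best_dx)
    return flow

# A bright 2x2 patch moves two pixels to the right between frames.
prev = np.zeros((8, 8), dtype=np.uint8)
prev[2:4, 2:4] = 255
nxt = np.zeros((8, 8), dtype=np.uint8)
nxt[2:4, 4:6] = 255

flow = block_match_flow(prev, nxt)
# The block containing the patch reports a displacement of (dy=0, dx=2).
```

Real estimators are far more robust (they handle texture-less regions, sub-pixel motion, and noise), but the core idea is the same: find where each patch went.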

There are lots of simple ways to compute good features with optical flow. For instance, we could take the average speed of motion across all pixels, or within particular regions. We could also take the average direction of motion across pixels, or within particular regions. Or we could do a windowed analysis like we did with audio, and look at how the speed or direction has recently changed over time.
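The first two of those feature ideas can be sketched in a few lines of NumPy. This assumes the flow field is an H x W x 2 array of (dy, dx) vectors; the function and key names are illustrative:

```python
import numpy as np

def flow_features(flow):
    """Summarize a flow field (H x W x 2 array of (dy, dx) vectors)
    into simple scalar features: average speed and average direction."""
    dy, dx = flow[..., 0], flow[..., 1]
    speed = np.sqrt(dy ** 2 + dx ** 2)       # magnitude of each vector
    direction = np.arctan2(dy, dx)           # angle in radians, per cell
    return {
        "mean_speed": float(speed.mean()),
        "mean_direction": float(direction.mean()),
    }

# Uniform rightward motion of 3 pixels per frame.
flow = np.zeros((4, 4, 2))
flow[..., 1] = 3.0

feats = flow_features(flow)
# mean_speed == 3.0, mean_direction == 0.0 (i.e. rightward)
```

For the windowed analysis, you would simply compute these scalars for every consecutive frame pair and then summarize how they change over a sliding window, just as we did with audio features.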

Awesome. That’s all for day 042. I hope you found this informative. Thank you for taking time out of your schedule and allowing me to be your guide on this journey. And until next time, be legendary.

Reference

- https://www.kadenze.com/courses/machine-learning-for-musicians-and-artists-v/sessions/sensors-and-features-generating-useful-inputs-for-machine-learning
- https://en.wikipedia.org/wiki/Optical_flow
- https://www.mathworks.com/discovery/optical-flow.html
