Despite everything going on within the frame, video remains a passive experience: you can watch, but you can't reach in and mess with the objects on screen. Until now, that is. An MIT researcher has pioneered new technology that lets you "touch" objects in recorded footage, which are then simulated to respond as if you'd fiddled with them in the real world.
So how do you predict which ways an already-recorded object will move when tweaked? The system, called "Interactive Dynamic Video" (IDV), needs less than five minutes of footage to map an object's movement possibilities. It does so by analyzing how the object shifts when intentionally jostled: in the video example below, a researcher slams the table on which a humanoid figure is resting, which lets the system see how the figure vibrates across different frequencies. From those vibration patterns, it extrapolates how the object should behave when viewers reach in with their cursors and jostle it in the video.
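To get a feel for the frequency analysis described above, here is a minimal sketch of one small piece of the idea: recovering the dominant vibration frequency from a tracked point's motion over time. This is a toy illustration using NumPy's FFT, not the actual IDV pipeline; the function name, fake motion trace, and frame rate are all assumptions made up for the example.

```python
import numpy as np

def dominant_frequency(signal, fps):
    """Return the strongest non-DC frequency (Hz) in a 1-D motion trace."""
    # Remove the mean so the DC component doesn't dominate the spectrum.
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

# Fake a tracked pixel's displacement: a damped 3 Hz wobble,
# as if the object were jostled once and left to ring down.
fps = 60
t = np.arange(0, 4, 1.0 / fps)
displacement = np.exp(-0.5 * t) * np.sin(2 * np.pi * 3.0 * t)

print(dominant_frequency(displacement, fps))  # peak should land near 3 Hz
```

In the real system, motion like this is measured for many points across the frame, and the resulting vibration modes are what let the software predict how the object deforms when pushed in a new direction.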