
Did you see that it is only stabilizing rotation? If it looks so stable, that's because the lens of the iPhone is at the exact center of the wheel. If you were to move your phone along the X or Y axis, I don't think those movements would be stabilized the way they could be with a software stabilizer. I guess the main target of this app is people turning their device while shooting a video.
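Rotation-only stabilization works because a pure camera rotation moves every pixel by a single homography, independent of scene depth. A minimal sketch (the intrinsics `K`, the 5-degree roll, and the pixel coordinates are made-up illustration values, not anything from the app):

```python
import numpy as np

# Hypothetical camera intrinsics: focal length in pixels, principal point.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])

def roll_rotation(theta):
    """Rotation about the camera's optical (z) axis, i.e. a roll."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def stabilizing_homography(R):
    """A pure camera rotation R moves pixel x to K R K^-1 x, so warping
    the frame by the inverse homography undoes the rotation exactly."""
    return K @ np.linalg.inv(R) @ np.linalg.inv(K)

# A pixel observed after the camera rolled by 5 degrees...
R = roll_rotation(np.deg2rad(5.0))
x_original = np.array([800.0, 500.0, 1.0])
x_rotated = K @ R @ np.linalg.inv(K) @ x_original
x_rotated /= x_rotated[2]

# ...is warped back to its original position, regardless of scene depth.
x_stab = stabilizing_homography(R) @ x_rotated
x_stab /= x_stab[2]
print(np.allclose(x_stab[:2], x_original[:2]))  # True
```

No such depth-independent warp exists for translation, which is why a lens at the wheel's center (zero translation) is the easy case.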


Translation of the camera changes the perspective, i.e., the relative positions of different objects in the image. There is no way a piece of software can get rid of that, because the information isn't there in the first place.
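The depth-dependence of translation is easy to see with two points under a pinhole projection. In this sketch (focal length, depths, and the 10 cm slide are all arbitrary illustration values), a near point shifts more than a far one, so their separation in the image changes, which no single warp of the frame can undo:

```python
import numpy as np

f = 1000.0  # hypothetical focal length in pixels

def project_x(point, cam_x):
    """Pinhole-projected image x-coordinate of a 3D point (X, Y, Z)
    for a camera translated cam_x metres along the x axis."""
    X, Y, Z = point
    return f * (X - cam_x) / Z

near = (0.5, 0.0, 2.0)   # a point 2 m from the camera
far = (0.5, 0.0, 10.0)   # a point 10 m away

# Image-space separation of the two points before and after
# sliding the camera 0.1 m sideways:
before = project_x(near, 0.0) - project_x(far, 0.0)
after = project_x(near, 0.1) - project_x(far, 0.1)
print(before, after)  # 200.0 160.0 -- the relative layout changed
```

The 40-pixel change in separation is the parallax: it depends on the depths of the points, which a single frame does not record.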


The information is there in sequences of images through parallax precisely when the camera translates. We know how to do 3D reconstruction of scene geometry from a single moving camera.

https://www.youtube.com/watch?v=fLnd9ucUu9Y

Skip to 1:15 to see a textured reconstruction.

The first real-time example was Davison et al.'s MonoSLAM paper (2007), I think.
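The core of that reconstruction is triangulation: once the camera's motion between two frames is known, the two pixel observations of a point pin down its 3D position. A minimal linear (DLT) triangulation sketch with synthetic data (the intrinsics, baseline, and scene point are invented for illustration):

```python
import numpy as np

f = 1000.0  # hypothetical focal length in pixels
K = np.array([[f, 0.0, 0.0], [0.0, f, 0.0], [0.0, 0.0, 1.0]])

def camera_matrix(t):
    """Projection matrix for a camera translated by t (no rotation)."""
    Rt = np.hstack([np.eye(3), -np.asarray(t, float).reshape(3, 1)])
    return K @ Rt

def pixel(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of a point seen at pixels x1 and x2."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

X_true = np.array([0.3, -0.2, 5.0])       # unknown scene point
P1 = camera_matrix([0.0, 0.0, 0.0])       # first frame
P2 = camera_matrix([0.2, 0.0, 0.0])       # camera slid 20 cm sideways

X_est = triangulate(P1, P2, pixel(P1, X_true), pixel(P2, X_true))
print(np.allclose(X_est, X_true))  # True
```

SLAM systems like MonoSLAM do this jointly with camera-pose estimation over many features and frames; the sketch above is just the two-view geometric core.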


Keep in mind that SLAM assumes a static scene. Stabilizing a video of a moving target means there's still going to be parallax to deal with, and warping it away produces "shadow" artifacts (see the cool video on Oculus time warp: https://www.youtube.com/watch?v=WvtEXMlQQtI)


People are now managing to segment parts of the scene by motion, so that moving cars and pedestrians, for example, are known to be distinct from the camera's ego-motion. The static scene assumption is not strict any more.
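One simple geometric basis for that segmentation: points on the static scene must satisfy the epipolar constraint of the camera's ego-motion, while independently moving points generally violate it. A toy sketch (pure sideways ego-motion, invented point coordinates, normalized image coordinates, so the essential matrix is just the skew matrix of the translation):

```python
import numpy as np

def skew(t):
    """Skew-symmetric cross-product matrix [t]_x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Ego-motion: the camera translates 0.2 m along x between frames
# (no rotation), so the essential matrix is E = [t]_x.
t = np.array([0.2, 0.0, 0.0])
E = skew(t)

def normalized(point, cam_t):
    """Normalized image coordinates of a 3D point for a camera at cam_t."""
    p = point - cam_t
    return np.array([p[0] / p[2], p[1] / p[2], 1.0])

def epipolar_residual(p_before, p_after):
    """|x2^T E x1|: ~0 for static points, large for moving ones."""
    x1 = normalized(p_before, np.zeros(3))
    x2 = normalized(p_after, t)
    return abs(x2 @ E @ x1)

static = np.array([0.5, 0.1, 4.0])
moving_before = np.array([-0.3, 0.0, 3.0])
moving_after = moving_before + np.array([0.0, 0.5, 0.0])  # object itself moved

print(epipolar_residual(static, static))               # ~0: fits ego-motion
print(epipolar_residual(moving_before, moving_after))  # large: flagged as moving
```

Real systems combine cues like this with learned per-pixel segmentation, but the residual against the estimated ego-motion is the underlying signal.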


Don't forget that an iPhone has sensors other than the camera sensor, so it knows when the device is being twisted or translated. You would need some depth information to recover the image, but luckily that's exactly what you get if the camera is moving.
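If the translation between two frames is known (say, estimated from the IMU, with the caveat that double-integrating accelerometer readings drifts quickly in practice), depth falls out of the classic stereo relation. All numbers here are invented for illustration:

```python
f = 1000.0        # hypothetical focal length in pixels
baseline = 0.05   # assumed 5 cm of camera translation between frames

# The same feature observed at these pixel x-coordinates in the two frames:
x1, x2 = 420.0, 445.0
disparity = x2 - x1   # 25 px of parallax

# depth = f * baseline / disparity: nearer points shift more.
depth = f * baseline / disparity
print(depth)  # 2.0 (metres)
```

This is the same parallax information the software-only approaches above exploit; the sensors just provide the motion estimate (and the metric scale) for free.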





