
It is stabilizing, just not in the way (or as much as) I imagined, which is the issue. I tried to describe it; I'm not really sure how else to. But it's not the kind of output I'd expect from a deshaking filter. I'd expect black margins to bleed in, for example, but I don't see that here.


>This filter generates a file with relative translation and rotation transform information about subsequent frames, which is then used by the vidstabtransform filter.

The first pass is meant to find the translation/rotation, then I assume the next pass cancels it out.
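That matches the documented two-pass workflow: vidstabdetect only analyses motion and writes per-frame transforms to a file, and vidstabtransform reads that file back, smooths the transforms, and applies the inverse correction. A minimal sketch (file names are placeholders; the commands are echoed here rather than run):

```shell
# Pass 1: analyse only -- write per-frame motion transforms to transforms.trf.
pass1='vidstabdetect=shakiness=5:result=transforms.trf'
# Pass 2: read the transforms back, smooth them, and apply the correction.
pass2='vidstabtransform=smoothing=20:input=transforms.trf'

echo "ffmpeg -i shaky.mp4 -vf $pass1 -f null -"
echo "ffmpeg -i shaky.mp4 -vf $pass2 stabilized.mp4"
```

The `-f null -` in pass 1 discards the video output, since only the transforms file matters at that stage.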


Maybe my terminology is wrong, but what I'm talking about is basically what you see with the biker and pen here, or the washing machine at 3:44:

https://www.youtube.com/watch?v=I6E6InIQ76Q&t=9s

They stay stable in the center while the rest of the frame is transformed. This necessarily introduces black/white crop borders of all kinds of shapes and sizes into the image, but the result is incredibly smooth and doesn't lose any of the frame. It also requires no noticeable blurring at all, from what I can tell. But that doesn't seem to be quite what's happening with these commands.


I believe what you're looking for is what Reddit's popular bot u/stabbot does.

I was able to find a comment thread where a user posted some ffmpeg scripts to replicate the behavior. I'm on mobile currently, so I can't verify they do what you're looking for, but here's a snippet.

# PART 1 [defaults: shakiness=5:accuracy=15:stepsize=6:mincontrast=0.3:show=0]
ffmpeg -i shaky-input.mp4 -vf vidstabdetect=shakiness=5:accuracy=15:stepsize=6:mincontrast=0.3:show=2 dummy_crop.mp4

# PART 2 (asterisks restored; markdown swallowed them in the original comment)
ffmpeg -i shaky-input.mp4 -vf "scale=trunc((iw*0.90)/2)*2:trunc(ow/a/2)*2" scaled_crop.mp4

# PART 3 [add -strict -2 only if Opus audio] [unsharp default: '5:5:1.0:5:5:0.0']
ffmpeg -i scaled_crop.mp4 -vf vidstabtransform=smoothing=20:input="transforms.trf":interpol=no:zoom=-10:optzoom=2,unsharp=5:5:0.8:3:3:0.4 stabilized_crop-output.mp4

https://www.reddit.com/r/stabbot/comments/9f7ayj/comment/e5x...
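For what it's worth, PART 2 is just a 10% downscale that keeps both dimensions even (most encoders reject odd frame sizes). With the multiplication operators written out explicitly (they tend to get eaten as markdown italics), the filter is:

```shell
# Shrink to 90% of the input width, keep the aspect ratio via ow/a,
# and round both dimensions down to even numbers with trunc(x/2)*2.
scale='scale=trunc((iw*0.90)/2)*2:trunc(ow/a/2)*2'
echo "ffmpeg -i shaky-input.mp4 -vf \"$scale\" scaled_crop.mp4"
```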


Stabbot is open source. It's written in Python and calls out to ffmpeg with vidstabdetect/transform: https://gitlab.com/juergens/stabbot

I don't think vidstab can do exactly what's being asked for here. I've never seen it rotate the frame completely to keep a rotating pen stabilized, or move the frame inside a larger canvas to keep a moving subject centered. I think you'd have to do that manually in a video editor.


Yeah that kind of looks like it. Thank you! I'll try it out.


Despite the above example apparently applying an actual blur filter, you might still get a bit of blur on videos in general because of the motion blur inherent in the source, which becomes more obvious once the motion is removed. You can't really do much about that except refilm the video with a shorter shutter speed.


You can also reuse pixels from previous and following frames instead of using solid color borders.
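In ffmpeg's vidstab, at least the "previous frames" half of that is built in: vidstabtransform's crop option decides what fills the exposed borders, with crop=keep (the default) reusing pixels remembered from earlier frames and crop=black filling with solid black. A sketch with placeholder file names:

```shell
# crop=keep (the default) pads the compensated frame with pixels
# remembered from earlier frames instead of solid black borders.
xform='vidstabtransform=input=transforms.trf:smoothing=20:crop=keep'
echo "ffmpeg -i shaky.mp4 -vf $xform stabilized.mp4"
```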



