We've all been there. You have something whose motion you can obviously follow, and yet no matter what you try, Mocha refuses to cooperate. Its "manual tracking" functionality seems to work, but then when you export, it's like you didn't do anything at all. You'd track it by hand, but it has some scaling or rotation or whatever that makes it hard to follow. You can easily reposition a copy of the shape to follow the footage, and if you had tracking data, a-mo would be more than happy to do the calculations for you, but... argh.
Introducing DeriveTrack, a new entry in the Aegisub-Motion submenu. Derive acts as a sort of inverse to the usual Apply macro: where Apply takes the position, scale, and rotation data in an After Effects tracking file and uses it to generate tracked typesetting, Derive instead looks at the position, scale, and rotation of already-tracked typesetting in order to produce After Effects tracking data, presumably to then be applied to other, not-yet-tracked TS.
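To make that inversion concrete, here's a minimal sketch of the idea in Python (purely illustrative; the actual macro lives in Aegisub-Motion's own codebase and surely does more bookkeeping than this): read \pos, \fscx, and \frz from one tracked line per frame, then print them back out in the Adobe After Effects keyframe-data layout that Apply already consumes. The regex tag parsing, the hard-coded fps/resolution defaults, and the rotation sign flip are all assumptions of the sketch, not a description of what DeriveTrack actually does internally.

```python
import re

# Individual override tags the sketch cares about; a real implementation
# would lean on Aegisub's own tag handling rather than regex soup.
POS_RE = re.compile(r"\\pos\(\s*([-\d.]+)\s*,\s*([-\d.]+)\s*\)")
SCALE_RE = re.compile(r"\\fscx\s*([\d.]+)")
ROT_RE = re.compile(r"\\frz?\s*([-\d.]+)")

def parse_line(text):
    """Pull (x, y, scale, rotation) out of one tracked line's override tags."""
    pos, scale, rot = POS_RE.search(text), SCALE_RE.search(text), ROT_RE.search(text)
    if not (pos and scale and rot):
        raise ValueError("line is missing \\pos, \\fscx or \\frz")
    return (float(pos.group(1)), float(pos.group(2)),
            float(scale.group(1)), float(rot.group(1)))

def derive(lines, fps=23.976, width=1280, height=720):
    """Turn one-line-per-frame typesetting transforms into AE keyframe data.

    fps/width/height are placeholders; the real macro can read them from the
    script and the loaded video. Scale is reported relative to the first
    frame (first frame = 100%), since Apply rescales whatever it is applied
    to relative to the starting value anyway.
    """
    samples = [parse_line(t) for t in lines]
    _, _, s0, r0 = samples[0]
    out = [
        "Adobe After Effects 6.0 Keyframe Data",
        "",
        f"\tUnits Per Second\t{fps}",
        f"\tSource Width\t{width}",
        f"\tSource Height\t{height}",
        "\tSource Pixel Aspect Ratio\t1",
        "\tComp Pixel Aspect Ratio\t1",
        "",
        "Position",
        "\tFrame\tX pixels\tY pixels\tZ pixels",
    ]
    for frame, (x, y, _, _) in enumerate(samples, 1):
        out.append(f"\t{frame}\t{x}\t{y}\t0")
    out += ["", "Scale", "\tFrame\tX percent\tY percent\tZ percent"]
    for frame, (_, _, s, _) in enumerate(samples, 1):
        pct = 100.0 * s / s0
        out.append(f"\t{frame}\t{pct:.2f}\t{pct:.2f}\t100")
    out += ["", "Rotation", "\tFrame\tDegrees"]
    for frame, (_, _, _, r) in enumerate(samples, 1):
        # \frz and AE rotation spin in opposite directions, hence the flip.
        out.append(f"\t{frame}\t{r0 - r:.2f}")
    out += ["", "End of Keyframe Data"]
    return "\n".join(out)
```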
General usage is as follows:
- Spend upwards of 30 minutes struggling to get Mocha to do its job (optional, recommended)
- Give up (implied)
- Draw a shape that traces a contour (or just some edges or corners; anything you can track with your eyes, really) from the base content
- Track the shape manually on each frame (multi-frame events are fine)
- Select the tracked range
- Run DeriveTrack
- Wonder why it uses the log window instead of a proper dialog
- Copy all the output text to the clipboard
- Apply the tracking data exactly as if you'd gotten it from Mocha (see the sketch after this list)
- Bask in the satisfaction of a sign well tracked (optional, may require positive outlook on life)
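For the curious, here's roughly what that end-to-end flow looks like using the derive() sketch from above. The tracked lines and their tag values are invented purely for illustration; the real macro prints an equivalent block to the log window rather than returning a string.

```python
# Two consecutive frames of a manually tracked rectangle; the \pos/\fscx/\frz
# values here are made up for the example.
tracked = [
    r"{\pos(612.4,338.9)\fscx100\fscy100\frz0}m 0 0 l 40 0 40 20 0 20",
    r"{\pos(615.1,337.2)\fscx101.3\fscy101.3\frz-0.4}m 0 0 l 40 0 40 20 0 20",
]

data = derive(tracked)
print(data)
# The printed block starts with "Adobe After Effects 6.0 Keyframe Data" and
# ends with "End of Keyframe Data" -- paste it into Aegisub-Motion's Apply
# just as you would paste data exported from mocha.
```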
Happy tracking!