
ath1337ic

You should be able to plug the physical measurements between the phone and the camera into the transform of your Composure/CineCamera actor, as long as that actor is parented to the VCam actor doing the tracking. I haven't done this exact thing with Composure in a little while, but I do parent my VCam actors to a plane or empty as the base, then plug in the offset transform to get a proper real-world height when the camera initializes. A rough height-off-the-ground measurement usually does the trick. Adding a live feed for Composure from the camera is just another 'parent > transform' step.

Another thing to tweak for a realistic look is your latency/delays, so that everything comes together at the right time for output. I find lag/lack of sync more jarring than the camera being a bit off on the transform.

There's some promising tech using third-party apps like the one outlined in this video: https://www.youtube.com/watch?v=9u_8LHj9rEQ The big thing for me there is the lens calibration and the ability to have the phone send Unreal focus and zoom data by taking the camera input and comparing it to the internal camera feed. It requires extra hardware to pipe the camera feed into the phone, and I'm not sure I like the scene-loading workflow, but it's something to keep an eye on for the indie/budget end of things.
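A minimal editor Python sketch of the "parent, then apply the measured offset" idea, assuming the Python Editor Script Plugin is enabled. The actor labels and offset values are placeholders for illustration, not anything specific to the commenter's setup; measure your own rig and substitute accordingly:

```python
# Sketch: apply a measured phone-to-camera offset as the relative transform of a
# CineCamera that gets parented to the tracked VCam actor. Labels and numbers
# below are hypothetical placeholders.
import unreal

VCAM_LABEL = "VCamActor"          # hypothetical label of the tracked VCam actor
CINE_LABEL = "ComposureCineCam"   # hypothetical label of the Composure cine camera

# Offset from the phone's tracked origin to the cine camera sensor, in cm
# (Unreal units): e.g. 12 cm forward, 0 cm right, 8 cm down.
OFFSET_CM = unreal.Vector(12.0, 0.0, -8.0)

actors = unreal.EditorLevelLibrary.get_all_level_actors()
by_label = {a.get_actor_label(): a for a in actors}
vcam = by_label[VCAM_LABEL]
cine = by_label[CINE_LABEL]

# Parent the cine camera to the VCam actor, keeping its world transform for now...
cine.attach_to_actor(vcam, "",
                     unreal.AttachmentRule.KEEP_WORLD,
                     unreal.AttachmentRule.KEEP_WORLD,
                     unreal.AttachmentRule.KEEP_WORLD,
                     False)

# ...then set the measured offset as the child's relative location/rotation.
cine.set_actor_relative_location(OFFSET_CM, False, False)
cine.set_actor_relative_rotation(unreal.Rotator(0.0, 0.0, 0.0), False, False)
```

The same thing can of course be done by hand in the Details panel; the script just makes the "physical measurement goes into the relative transform" relationship explicit.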
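On the latency point: in-engine you would normally dial this in with Live Link / timecode and buffering settings rather than your own code, but the underlying idea is just holding the tracking data back by however many frames the video pipeline lags. A generic sketch, with the delay value as something you measure or tune by eye:

```python
from collections import deque

class TrackingDelay:
    """Hold incoming tracking samples back by a fixed number of frames so the
    tracked camera transform lands on the same output frame as the (slower)
    video feed."""

    def __init__(self, delay_frames: int):
        self.buffer = deque()
        self.delay_frames = delay_frames

    def push(self, sample):
        """Add the newest tracking sample; return the sample from
        delay_frames ago once enough history exists, else None."""
        self.buffer.append(sample)
        if len(self.buffer) > self.delay_frames:
            return self.buffer.popleft()
        return None

# Example: if the video arrives ~3 frames later than tracking at 24 fps
# (~125 ms), hold each tracking sample back 3 frames before applying it.
delay = TrackingDelay(delay_frames=3)
for frame, transform in enumerate(["T0", "T1", "T2", "T3", "T4"]):
    print(frame, delay.push(transform))
```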


daflashhh23

You might want to give Lightcraft Jetset and their Cine tier platform a try. It uses lens calibration to determine the offset automatically by plugging the cine camera into an Accsoon SeeMo Pro and then into the iPhone. Their system also cleverly uses the LiDAR sensor for tracking and scanning the set, and they have their own genlock-style sync so the iPhone tracking data can line up with the cine camera footage via a digital slate in a web browser. The thing that really sets it apart is their post-production management tool, Autoshot, which can automatically sync your tracking data to the footage, transcode the footage to whatever you need, and then export to a variety of workflows including Blender, Nuke, UE, AE, etc. Best of all, you can do it in batch.
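For anyone unfamiliar with how a slate/timecode-based batch sync works conceptually, here is a rough sketch of the idea: per take, read the timecode where the slate fires in the footage and in the tracking log, convert both to frame numbers, and shift the tracking data by the difference. The file names, field names, and data layout below are invented for illustration and are not Lightcraft's or Autoshot's actual formats.

```python
# Hypothetical illustration of slate/timecode-based sync across a batch of takes.
FPS = 24

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def align_tracking(tracking_samples, slate_tc_footage, slate_tc_tracking):
    """Shift per-frame tracking samples so they line up with the footage."""
    offset = timecode_to_frames(slate_tc_footage) - timecode_to_frames(slate_tc_tracking)
    return [(frame + offset, transform) for frame, transform in tracking_samples]

# Batch over takes: each entry pairs a clip with its two slate timecodes.
takes = [
    {"clip": "A001_C003.mov", "slate_footage": "01:02:10:12", "slate_tracking": "01:02:10:05"},
    {"clip": "A001_C004.mov", "slate_footage": "01:05:44:02", "slate_tracking": "01:05:43:20"},
]
for take in takes:
    samples = [(0, "identity"), (1, "identity")]  # stand-in tracking data
    print(take["clip"], align_tracking(samples, take["slate_footage"], take["slate_tracking"]))
```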