I'm trying to figure out a way to pipe ARKit's video camera feed into RenderSettings.customReflection so I can fake reflections in AR! Any ideas?
The main components I'm starting from are found in Unity's own ARKit plugin. First is the [YUVShader](https://bitbucket.org/Unity-Technologies/unity-arkit-plugin/src/890282116ef7f5324d9d29fda24b2eafb3baae68/Assets/UnityARKitPlugin/Plugins/iOS/UnityARKit/Shaders/YUVShader.shader?at=default&fileviewer=file-view-default).
This shader seems to combine the two textures that make up the video feed, which arrives in a biplanar YCbCr (also called YUV) format for some reason ([more on that here](https://developer.apple.com/documentation/arkit/displaying_an_ar_experience_with_metal)).
[Then there's a component that uses this shader.](https://bitbucket.org/Unity-Technologies/unity-arkit-plugin/src/890282116ef7f5324d9d29fda24b2eafb3baae68/Assets/UnityARKitPlugin/Plugins/iOS/UnityARKit/Helpers/UnityARVideo.cs?at=default&fileviewer=file-view-default)
Together they magically put the feed from the device camera behind everything in the scene.
So, what I would like to do is also pipe this texture into something that standard shaded objects will properly pick up. My guess is that RenderSettings.customReflection is the right hook, although I have no idea whether Unity will actually behave correctly if a shader or command buffer updates that texture every frame.
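Here's a rough, untested sketch of what I'm imagining. It assumes I can get hold of a material using the plugin's YUVShader (the `yuvMaterial` field here is hypothetical), blits the converted camera image into a temporary render target, and naively copies that same image onto every face of a cubemap assigned as the custom reflection — a real version would presumably need to orient/crop per face:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class CameraFeedReflection : MonoBehaviour
{
    // Material using the ARKit plugin's YUVShader, with the Y and CbCr
    // textures already bound (as UnityARVideo does) — an assumption on my part.
    public Material yuvMaterial;

    const int FaceSize = 128;

    RenderTexture faceRT;
    Cubemap reflectionCube;

    void Start()
    {
        faceRT = new RenderTexture(FaceSize, FaceSize, 0, RenderTextureFormat.ARGB32);
        reflectionCube = new Cubemap(FaceSize, TextureFormat.RGBA32, false);

        RenderSettings.defaultReflectionMode = DefaultReflectionMode.Custom;
        RenderSettings.customReflection = reflectionCube;
    }

    void Update()
    {
        // Convert the biplanar YCbCr camera textures to RGB.
        Graphics.Blit(null, faceRT, yuvMaterial);

        // Copy the same image onto all six cubemap faces (naive; no
        // per-face orientation). Formats/sizes must match for CopyTexture.
        for (int face = 0; face < 6; face++)
            Graphics.CopyTexture(faceRT, 0, 0, reflectionCube, face, 0);
    }
}
```

No idea if updating the cubemap via `Graphics.CopyTexture` every frame like this plays nicely with Unity's reflection probe system, which is really the heart of my question.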
Any tips at all would be very much appreciated! Thank you!