I need to stream a user’s camera on iPhone/Android through a streaming service (Dolby.IO). In addition, I want to display the user’s own video in my app as well. Think FaceTime, where you see your own video as well as the person you are talking to. The streaming service takes a RenderTexture to stream. My application is Portrait mode only. The problem is that the video from the phone camera comes in rotated 90 degrees. I copy a WebCamTexture into a RenderTexture to display and send the video. I found a way to rotate the quad that displays the video so it shows correctly on the phone in Landscape mode, but that still leaves two problems:
- The RenderTexture still sends the video to the stream flipped 90 degrees.
- Once I flip it, it’s a landscape video while the person is holding their phone in portrait. I need the video to look like it would if you opened your camera app and took a video in portrait mode.
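For context on the rotation itself: Unity does expose the information needed, just not automatically. `WebCamTexture.videoRotationAngle` reports how many degrees the raw camera image is rotated, and `WebCamTexture.videoVerticallyMirrored` says whether it is also flipped. A rough sketch of using both to orient a preview quad (assuming `webCamTexture` is created and playing elsewhere, and this script sits on the quad):

```csharp
using UnityEngine;

// Counter-rotates a preview quad so the camera image appears upright.
// Assumes webCamTexture is created and playing elsewhere.
public class CamPreviewOrientation : MonoBehaviour
{
    public WebCamTexture webCamTexture;

    void Update()
    {
        // Degrees the raw camera image is rotated relative to the device's
        // natural orientation (this can change as the device rotates).
        int angle = webCamTexture.videoRotationAngle;

        // Rotate the quad the opposite way so the preview looks upright.
        transform.localEulerAngles = new Vector3(0f, 0f, -angle);

        // Some devices (often the front camera) deliver a mirrored image.
        float yScale = webCamTexture.videoVerticallyMirrored ? -1f : 1f;
        Vector3 s = transform.localScale;
        transform.localScale = new Vector3(s.x, yScale * Mathf.Abs(s.y), s.z);
    }
}
```

This only fixes the on-screen preview, not what lands in the RenderTexture; that needs one of the approaches below.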
Here is some of my code:
webCamTexture = new WebCamTexture(thedevice.name, resolution.width, resolution.height); // raw camera feed
renderTexture = new RenderTexture(resolution.width, resolution.height, 16, RenderTextureFormat.BGRA32); // handed to the streaming SDK
quad.GetComponent<Renderer>().material.mainTexture = webCamTexture; // preview quad shows the raw feed
//more code here
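One shader-free way to fix what goes into the stream is to render the already-rotated quad with a dedicated off-screen camera whose targetTexture is the RenderTexture you hand to Dolby.IO. The stream then receives exactly what that camera sees: an upright portrait frame. A sketch, where `captureCamera` and the "CameraFeed" layer are assumptions (a camera and layer you create for this purpose):

```csharp
// Sketch: an off-screen camera renders the counter-rotated quad into the
// RenderTexture that the streaming SDK consumes. captureCamera, quad and
// the "CameraFeed" layer are assumed to exist in your scene.
// Note the swapped dimensions: a portrait frame is taller than it is wide.
renderTexture = new RenderTexture(resolution.height, resolution.width, 16, RenderTextureFormat.BGRA32);

captureCamera.orthographic = true;
captureCamera.cullingMask = LayerMask.GetMask("CameraFeed"); // only sees the quad
captureCamera.targetTexture = renderTexture;                 // renders into the stream texture
quad.layer = LayerMask.NameToLayer("CameraFeed");            // hide the quad from the main camera
```

Because the quad is counter-rotated to look upright, whatever this camera captures is already in portrait orientation, so the RenderTexture no longer goes out rotated.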
I’ve read you can use custom shaders to rotate the WebCamTexture, but I have not worked with shaders much, and every solution I have tried has not worked. I don’t understand how this is still a thing. It seems like Unity should read the orientation of the device and manage the camera image accordingly. Or at least provide a simple way to do it that does not involve reading the rotation of the camera and adjusting the transform of an object (which does not solve the RenderTexture problem anyway).
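If you would rather avoid shaders entirely, the rotation can also be done on the CPU each frame. It costs a GetPixels32/SetPixels32 round trip per frame, so treat this as a prototyping sketch rather than a production path (a rotation shader or the off-screen camera approach is cheaper):

```csharp
using UnityEngine;

// Rotates a landscape WebCamTexture 90 degrees clockwise into a portrait
// Texture2D on the CPU. A prototyping sketch only; for production, a shader
// or an off-screen capture camera is far cheaper per frame.
public static class WebCamRotator
{
    public static void RotateInto(WebCamTexture src, Texture2D dst)
    {
        int w = src.width, h = src.height;              // landscape source (w x h)
        Color32[] pixels = src.GetPixels32();
        Color32[] rotated = new Color32[pixels.Length]; // portrait destination (h x w)

        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                // Source (x, y) -> destination (y, w - 1 - x): 90 degrees clockwise.
                rotated[(w - 1 - x) * h + y] = pixels[y * w + x];

        dst.SetPixels32(rotated); // dst must be h pixels wide and w pixels tall
        dst.Apply();
    }
}
```

The destination texture must be created with swapped dimensions, e.g. `new Texture2D(webCamTexture.height, webCamTexture.width, TextureFormat.RGBA32, false)`, and depending on what `videoRotationAngle` reports on a given device you may need the counter-clockwise mapping instead.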
Any suggestions on how to solve this?
Here is a picture of what happens. I am holding the phone in Portrait mode, and I need the video to be portrait as well. Instead it is landscape and squashed in. The white space will hold the incoming video of the other person.