The main problem is transferring your camera actor's rotation, especially pitch. The rotation issue is caused by gimbal lock, so it is very difficult to isolate a single axis of rotation without affecting the others; getting an output from 0° to 360° on one axis is also a problem, even with quaternions. Quaternions are not available in blueprints by default, except for one node (Make Quat), but they exist in the engine, so you can create your own nodes (see a Marketplace example of such a library); even so, the result can be tough to achieve. The simplest solution, instead of using rotation, is to use transform vectors. So let's begin with the initial setup.
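Before diving in, the gimbal-lock claim above is easy to verify numerically. This plain-Python sketch (yaw, pitch, roll applied as Z-Y-X rotations; an illustration, not engine code) shows that at 90° of pitch two different yaw/roll combinations collapse into the same orientation:

```python
import math

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_to_matrix(yaw, pitch, roll):
    """Rotation matrix for intrinsic yaw (Z), pitch (Y), roll (X), in radians."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    rz = [[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]]
    ry = [[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]]
    rx = [[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]]
    return matmul(rz, matmul(ry, rx))

# At pitch = 90° the yaw and roll axes align: yawing by 0.7 rad
# is indistinguishable from rolling by -0.7 rad.
a = euler_to_matrix(0.7, math.pi / 2, 0.0)
b = euler_to_matrix(0.0, math.pi / 2, -0.7)
```

One rotational degree of freedom is lost at that pitch, which is why recovering a clean single-axis 0° to 360° value from Euler angles breaks down there; the three basis vectors we will pass instead always encode the full orientation.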
Create a blueprint actor and add a SceneCaptureComponentCube and a Camera at coordinates (0, 0, 0).
Create a new Cube Render Target and assign it as the Texture Target of the SceneCaptureComponentCube in the blueprint.
Change the Texture Target resolution to the desired one in its settings. The aspect ratio is 2:1, and Size X is the vertical resolution, in this case 2048, so the total resolution will be 4096 × 2048 px (4K).
We need to set up transform vectors to pass from our camera to the target material output, so first let's create them. Add three new variables, X, Y and Z, and make them Vector type.
The next step is to set their values from our Camera actor, and we need to update them every tick: get the Forward Vector, Right Vector and Up Vector from our Camera actor and connect them to X, Y and Z respectively.
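For intuition, here is what those three vectors look like for a camera with a given yaw and pitch and no roll, in a Z-up convention like Unreal's (an illustrative sketch; the blueprint simply reads the vectors the engine already provides):

```python
import math

def camera_basis(yaw, pitch):
    """Forward/Right/Up vectors (the X, Y, Z variables in the blueprint)
    for a camera with the given yaw and pitch in radians, roll = 0.
    Z-up convention similar to Unreal's; illustrative only."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    forward = (cp * cy, cp * sy, sp)
    right = (-sy, cy, 0.0)
    up = (-sp * cy, -sp * sy, cp)
    return forward, right, up
```

Even when the camera looks straight up (pitch = 90°), where extracting Euler angles degenerates, these three vectors stay perfectly well defined and mutually orthogonal, which is exactly why we transfer them instead of rotation values.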
We need to output the result that the Camera captures to our TextureRenderTargetCube. Let's do it by sampling our cubemap capture result onto a Plane through a material and rendering that as a video or image sequence.
Create a new blueprint actor and add a Camera and a Plane.
Set up the Plane position and the Camera settings so the Plane fits the camera frustum perfectly. For me, the Plane coordinates were these:
The Scale of 2 on X is to respect our 2:1 aspect ratio. The Camera settings were these:
I chose the Orthographic Projection Mode so I wouldn't have to worry about the angle of view, set the Ortho Width to 200 based on my Plane's X scale, and turned on Constrain Aspect Ratio.
Let's create the material for our Plane that will render our TextureRenderTargetCube and receive our X, Y, Z vectors.
Create a new material, add a TextureSampleParameterCube and choose as its texture the CubeRenderTarget we created earlier for our SceneCaptureComponentCube.
You may get the error "Cube sample needs UV input": by default the node doesn't know which texture coordinates you will use. Let's connect our Plane coordinates (TexCoord), remapping them from square texture space (float2) into the spherical direction format used by cube sources such as HDR textures or cube targets (float3) with the UVToLongLat function.
In case you don't have this function, this is what it does:
And this is the Custom node code, with one UV input and a float3 output:
float2 Angles = float2(2 * PI * (UV.x + 0.5f), PI * UV.y);
float s = sin(Angles.y);
float3 Direction = float3(s * sin(Angles.x), cos(Angles.y), -s * cos(Angles.x));
return Direction;
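As a sanity check outside the engine, the same long/lat conversion can be written in plain Python (illustrative only; the HLSL above is what actually runs in the material):

```python
import math

def uv_to_longlat(u, v):
    """Map an equirectangular UV in [0, 1] to a direction on the unit
    sphere, mirroring the Custom node code above."""
    ang_x = 2.0 * math.pi * (u + 0.5)  # longitude, with the 0.5 wrap offset
    ang_y = math.pi * v                # latitude: 0 = top pole, 1 = bottom
    s = math.sin(ang_y)
    return (s * math.sin(ang_x), math.cos(ang_y), -s * math.cos(ang_x))
```

Every output is a unit vector, and v = 0.5 lands on the equator, which is what lets the cube sampler treat the result as a lookup direction.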
The next step is to add our X, Y and Z vectors into the material to transform our UVs. We will use the material function called InverseTransformMatrix and create three material Vector Parameters, X, Y and Z, connecting them into BasisX, BasisY and BasisZ. We also need to define our starting coordinates, which will be our Texture Coordinates converted to a spherical direction by the UVToLongLat function, and connect the Transformed Vector output into the TextureSampleParameterCube UV input.
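Numerically, what InverseTransformMatrix does with an orthonormal basis boils down to three dot products. A hypothetical Python sketch of that step (not engine code):

```python
def inverse_transform(d, basis_x, basis_y, basis_z):
    """Express direction d in the orthonormal basis (basis_x, basis_y,
    basis_z). For an orthonormal basis the inverse matrix is the
    transpose, so each output component is just a dot product."""
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    return (dot(d, basis_x), dot(d, basis_y), dot(d, basis_z))
```

With the camera's Forward/Right/Up vectors plugged into BasisX/BasisY/BasisZ, the long/lat direction coming out of UVToLongLat is re-expressed relative to the camera, so the cube lookup rotates with it.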
Don't forget to apply this material to our Plane.
The last and most important part is to connect the Forward, Right and Up vectors from our Camera actor to the material parameters X, Y and Z. First, in the blueprint with the output Plane, add a new variable, choose our SceneCubeCapture blueprint as its type (a reference) and make it public by clicking the Eye icon:
This will let us pick in our scene the particular SceneCubeCapture blueprint that we created before and access its variables, such as the X, Y and Z vectors.
Next we create a Dynamic Material Instance to be able to change our material parameters X, Y, Z. On the BeginPlay event we create a Dynamic Material Instance with our Plane static mesh as the target and, as the Source Material, the Plane material we created before.
Now we just need to get the X, Y, Z vectors from our Scene Cube Capture reference and set their values on our material parameters X, Y and Z, every tick. This is done with a Set Vector Parameter Value node for each vector, using the parameter names X, Y and Z.
You will get an error here because we forgot to connect the Target material where our parameters live, so let's fix it:
We connect our Dynamic Material Instance as the Target, so that on every Event Tick the material parameters X, Y and Z are updated from the Forward, Right and Up vectors of the Camera actor in our Scene Cube Capture reference blueprint.
That's it. Place the SceneCubeCapture blueprint with the Camera actor into your scene, along with the CubeTextureRender blueprint from which you will render the output. Don't forget to set the Scene Cube Capture reference to our SceneCubeCapture.
The final step is to animate your SceneCubeCapture actor as needed using Sequencer and output the result from your CubeTextureRender camera as a video or image sequence.
You need to press Play or Simulate to see the rotation of our Camera actor in the CubeTextureRender Plane material, because all the logic that transfers those values runs in blueprints.
Don't be scared of the pitch-rotation distortion; in a proper 360° video it will look fine. This is an example, in Play mode, of the CubeTextureRender blueprint's Plane material rendering our Camera actor animation.
Now you need to render this as an image sequence or video and apply the metadata for 360° videos, for example following the YouTube guide.