The Google Cardboard SDK for Android is still somewhat scary to me because of its OpenGL nature. It offers many convenience functions like automatic side-by-side rendering and lens distortion correction, but you have to be at least somewhat familiar with low-level OpenGL ES 2.0.
For this Android experiment I wanted to render a 360° equirectangular video inside a sphere at a decent framerate. Streaming a video to a texture directly seemed quite complex and poorly documented, so after some investigation I decided to go with an OpenGL library or engine.
Choosing a 3D engine
There are a number of OpenGL libraries available out there, and I didn’t put as much time into choosing one as I probably should have. I ended up with the open source Rajawali engine. The engine’s example projects are mostly out of date, but the video texture example looked easy enough:
// inside RajawaliRenderer#initScene()

// set up the world sphere
Sphere sphere = new Sphere(1, 24, 24);
sphere.setPosition(0, 0, 0);

// invert the sphere normals by mirroring along the z-axis;
// a scale factor of "1" is too small and results in rendering glitches
sphere.setScaleX(100);
sphere.setScaleY(100);
sphere.setScaleZ(-100);

// create a texture from the media player video
StreamingTexture videoTexture = new StreamingTexture("video", mediaPlayer);

// set a material with the video texture
Material material = new Material();
material.setColorInfluence(0f);
try {
    material.addTexture(videoTexture);
} catch (ATexture.TextureException e) {
    throw new RuntimeException(e);
}
sphere.setMaterial(material);

// add the sphere to the scene
getCurrentScene().addChild(sphere);
As you can see, StreamingTexture takes a regular Android MediaPlayer instance as an argument. That’s very convenient, because you can control the media player with its high-level functions like start(), pause() or seekTo() and have the changes reflected on your texture!
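For illustration, creating such a MediaPlayer might look like this. This is a minimal sketch, not code from the original project: R.raw.video is an assumed local resource and context a Context you have at hand; any data source the MediaPlayer supports would do.

// minimal sketch of the MediaPlayer setup (assumed resource R.raw.video)
MediaPlayer mediaPlayer = MediaPlayer.create(context, R.raw.video);
mediaPlayer.setLooping(true);
// playback control is immediately reflected on the texture
mediaPlayer.start();
mediaPlayer.seekTo(30000); // jump 30 seconds into the video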
To reflect the changes in real time you have to override the RajawaliRenderer#onRender() function like this:
@Override
protected void onRender(long elapsedRealTime, double deltaTime) {
    super.onRender(elapsedRealTime, deltaTime);
    if (videoTexture != null) {
        // update the texture from the video content
        videoTexture.update();
    }
}
Voilà! That’s all it takes to render a 360° video in OpenGL. Of course, head tracking and side-by-side rendering are still missing; that’s covered in the next step.
Google Cardboard SDK
In this step we will connect the Cardboard SDK’s convenience functions with Rajawali. I am implementing CardboardView.StereoRenderer here while extending the RajawaliRenderer class.
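As a rough sketch, the class declaration could look like this; the name VideoRenderer and the constructor taking the MediaPlayer are placeholders of mine, not taken from an official example:

public class VideoRenderer extends RajawaliRenderer implements CardboardView.StereoRenderer {

    private final MediaPlayer mediaPlayer;

    public VideoRenderer(Context context, MediaPlayer mediaPlayer) {
        super(context);
        this.mediaPlayer = mediaPlayer;
    }

    // Cardboard SDK callbacks and Rajawali overrides go here
}

First we will handle surface changes that were detected by the Cardboard SDK: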
@Override
public void onSurfaceChanged(int width, int height) {
    // tell Rajawali that the Cardboard SDK detected a size change
    super.onRenderSurfaceSizeChanged(null, width, height);
}

@Override
public void onSurfaceCreated(EGLConfig eglConfig) {
    // pass the OpenGL config on to Rajawali
    super.onRenderSurfaceCreated(eglConfig, null, -1, -1);
}

@Override
public void onRendererShutdown() {
    // tell Rajawali about the shutdown
    super.onRenderSurfaceDestroyed(null);
}
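The CardboardView.StereoRenderer interface also requires onNewFrame() and onFinishFrame(). In this setup they can stay empty, since the head orientation is applied per eye in onDrawEye() below:

@Override
public void onNewFrame(HeadTransform headTransform) {
    // nothing to do here, the per-eye transform is handled in onDrawEye()
}

@Override
public void onFinishFrame(Viewport viewport) {
    // nothing to do after both eyes have been rendered
}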
But most important is the onDrawEye method. It gives us an Eye instance holding the current field of view, position and orientation of the camera rendering the current eye. These parameters have to be applied to Rajawali’s current camera. As a last step we tell Rajawali to render with the updated camera parameters.
@Override
public void onDrawEye(Eye eye) {
    // Rajawali camera
    Camera currentCamera = getCurrentCamera();

    // Cardboard field of view
    FieldOfView fov = eye.getFov();

    // update the Rajawali camera from the Cardboard SDK
    currentCamera.updatePerspective(fov.getLeft(), fov.getRight(), fov.getBottom(), fov.getTop());
    eyeMatrix.setAll(eye.getEyeView());

    // orientation
    eyeOrientation.fromMatrix(eyeMatrix);
    currentCamera.setOrientation(eyeOrientation);

    // position
    eyePosition = eyeMatrix.getTranslation().inverse();
    currentCamera.setPosition(eyePosition);

    // render with Rajawali
    super.onRenderFrame(null);
}
Of course all helper variables have to be declared and instantiated:
/** position and rotation of eye camera in 3d space as matrix object */
private Matrix4 eyeMatrix = new Matrix4();
/** rotation of eye camera in 3d space */
private Quaternion eyeOrientation = new Quaternion();
/** position of eye camera in 3d space */
private Vector3 eyePosition;
Wrap it up
That’s it! The heavy lifting is already done, but there are some things to consider. You have to create and start the MediaPlayer instance and wire up the Activity, your custom Renderer, the MediaPlayer and the CardboardView. These steps are easy to figure out and well documented. Finally, it is good practice to pause the MediaPlayer instance when the activity is paused by the Android system, to save resources.
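As a sketch of that wiring (names like VideoActivity, activity_video and cardboard_view are my own placeholders, and VideoRenderer is the sketched renderer class from above):

public class VideoActivity extends CardboardActivity {

    private MediaPlayer mediaPlayer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_video);

        // create the media player for the 360° video resource
        mediaPlayer = MediaPlayer.create(this, R.raw.video);
        mediaPlayer.setLooping(true);

        // wire up the Cardboard view with the custom renderer
        CardboardView cardboardView = (CardboardView) findViewById(R.id.cardboard_view);
        cardboardView.setRenderer(new VideoRenderer(this, mediaPlayer));
        setCardboardView(cardboardView);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mediaPlayer.start();
    }

    @Override
    protected void onPause() {
        super.onPause();
        // save resources while the activity is not visible
        if (mediaPlayer.isPlaying()) {
            mediaPlayer.pause();
        }
    }
}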
Please feel free to comment if you have any questions or additions!
X41
November 19, 2015 — 2:18 pm
but where is the demo video? :O
there needs to be a demo video!
Miracula
November 21, 2015 — 9:14 pm
There is no 360° video I own the rights to, so no demo video at the moment 🙁
dameng
May 23, 2016 — 10:54 am
But how do you use gestures to control the camera?
Miracula
May 23, 2016 — 8:21 pm
What do you mean? With this example code you can look around by moving your head. The Google Cardboard SDK does some sensor fusion and provides an orientation matrix. Look at the onDrawEye method above to see how the orientation is applied to the Rajawali camera.
Alex
October 14, 2016 — 2:04 pm
Hello,
Your example is really helpful, thank you so much for writing this.
I have implemented your example code in my project, but my app crashes when launching the activity.
Do you plan on posting a full code version of the above so I can compare what is going wrong? Thanks.
Miracula
October 14, 2016 — 9:15 pm
I created a Gist with the essential files. However, the Cardboard SDK I used is nearly one year old and out of date by now. I’m sure the current SDK will break the code in some way.
Alex
October 17, 2016 — 3:37 pm
Great! Managed to get that working now, thank you so much!
I did have to add some abiFilters to the build.gradle, and there seems to be something strange with the sphere/head movement being inverted, but I’m sure it just requires a little adjusting.
I will also begin looking into whether this approach works with the new GvrView rather than Cardboard, but this may depend on whether Rajawali supports it.
Miracula
October 17, 2016 — 8:15 pm
I’m glad I could help 🙂
hakan
March 18, 2017 — 3:08 pm
Hello, I just started learning. Can you explain to me a bit more descriptively how I should open these files with Android Studio, or how I can use these files?
Miracula
March 18, 2017 — 5:40 pm
If you’re a beginner at developing for Android, I would recommend starting with a less complex project. Alternatively you may start out with the Rajawali VideoTexture sample (check out the whole project and try to run it in Android Studio) and get comfortable with the code. Once you understand what the VideoTextureFragment class is doing, you can revisit my code samples and add them in.