How to use RajawaliVR or Rajawali to play 360 videos

I am struggling to figure out how to use Rajawali to play 360-degree videos. I have tried every solution I could find on the Internet, but none of them worked.

First, I used RajawaliCardboard and made MainActivity extend CardboardActivity. My renderer class, MyRenderer, extends RajawaliCardboardRenderer, and in it I overrode the initScene() function:

    protected void initScene() {
        StreamingTexture mTexture = null;
        if (externalMemoryAvailable()) {
            mVideoPath = Environment.getExternalStorageDirectory().getAbsolutePath() + "/testVideo.mp4";
            try {
                mPlayer = new MediaPlayer();
                mPlayer.setDataSource(mVideoPath);
            } catch (IllegalArgumentException | SecurityException | IllegalStateException | IOException e) {
                e.printStackTrace();
            }
            try {
                mPlayer.prepare();
            } catch (IOException t) {
                t.printStackTrace();
            }
            mTexture = new StreamingTexture("video", mPlayer);
        }
        Sphere sphere = createPhotoSphereWithTexture(mTexture);
        getCurrentScene().addChild(sphere);
        getCurrentCamera().setPosition(Vector3.ZERO);
        getCurrentCamera().setFieldOfView(75);
    }

    private Sphere createPhotoSphereWithTexture(ATexture texture) {
        Material material = new Material();
        material.setColor(0);
        try {
            material.addTexture(texture);
        } catch (ATexture.TextureException e) {
            throw new RuntimeException(e);
        }
        Sphere sphere = new Sphere(50, 64, 32);
        sphere.setScaleX(-1);
        sphere.setMaterial(material);
        return sphere;
    }
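For reference, the activity side of this setup looks roughly like the following. This is a sketch modeled on the RajawaliCardboard example project; method names such as setSurfaceRenderer come from that project and may vary between versions:

    public class MainActivity extends CardboardActivity {
        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            // Wire the Rajawali renderer into the Cardboard view
            // (pattern from the RajawaliCardboard example project).
            RajawaliCardboardView view = new RajawaliCardboardView(this);
            setContentView(view);
            setCardboardView(view);

            MyRenderer renderer = new MyRenderer(this);
            view.setRenderer(renderer);
            view.setSurfaceRenderer(renderer);
        }
    }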

The program runs without any errors, but the screen stays black and no image is shown.
What should I do to fix my program so that it plays the video with Rajawali? Can anyone help?

3 answers

I had success playing a video with Rajawali.

    public class VideoRenderer extends RajawaliCardboardRenderer {

        Context mContext;
        private MediaPlayer mMediaPlayer;
        private StreamingTexture mVideoTexture;

        public VideoRenderer(Context context) {
            super(context);
            mContext = context;
        }

        @Override
        protected void initScene() {
            mMediaPlayer = MediaPlayer.create(getContext(), R.raw.video);
            mMediaPlayer.setLooping(true);

            mVideoTexture = new StreamingTexture("sintelTrailer", mMediaPlayer);
            Material material = new Material();
            material.setColorInfluence(0);
            try {
                material.addTexture(mVideoTexture);
            } catch (ATexture.TextureException e) {
                e.printStackTrace();
            }

            Sphere sphere = new Sphere(50, 64, 32);
            sphere.setScaleX(-1);
            sphere.setMaterial(material);

            getCurrentScene().addChild(sphere);
            getCurrentCamera().setPosition(Vector3.ZERO);
            getCurrentCamera().setFieldOfView(75);

            mMediaPlayer.start();
        }

        @Override
        protected void onRender(long ellapsedRealtime, double deltaTime) {
            super.onRender(ellapsedRealtime, deltaTime);
            mVideoTexture.update();
        }

        @Override
        public void onPause() {
            super.onPause();
            if (mMediaPlayer != null)
                mMediaPlayer.pause();
        }

        @Override
        public void onResume() {
            super.onResume();
            if (mMediaPlayer != null)
                mMediaPlayer.start();
        }

        @Override
        public void onRenderSurfaceDestroyed(SurfaceTexture surfaceTexture) {
            super.onRenderSurfaceDestroyed(surfaceTexture);
            mMediaPlayer.stop();
            mMediaPlayer.release();
        }

        public void nextVideo(String nextVideoPath) {
            try {
                mMediaPlayer.stop();
                mMediaPlayer.reset();
                mMediaPlayer.setDataSource(nextVideoPath);
                mMediaPlayer.prepare();
                mMediaPlayer.start();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
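If, as in the question, the video lives on external storage rather than in res/raw, the same renderer can be pointed at a file path instead. A minimal sketch, reusing the testVideo.mp4 path from the question and assuming READ_EXTERNAL_STORAGE has been granted:

    // Sketch: replaces the MediaPlayer.create(...) call in initScene().
    // Assumes /testVideo.mp4 on external storage, as in the question.
    mMediaPlayer = new MediaPlayer();
    try {
        mMediaPlayer.setDataSource(
                Environment.getExternalStorageDirectory().getAbsolutePath() + "/testVideo.mp4");
        mMediaPlayer.prepare(); // blocking, but usually acceptable for a local file
    } catch (IOException e) {
        e.printStackTrace();
    }
    mMediaPlayer.setLooping(true);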

I think your main mistake is calling MediaPlayer.prepare() instead of MediaPlayer.prepareAsync().
You must consider the various states a MediaPlayer goes through when playing a video; see the MediaPlayer state diagram in the Android documentation. You should only call MediaPlayer.start() once the player has finished preparing and the video is ready to play.
I am working on the same thing (a 360 video player) with Rajawali, and so far I have managed to play videos in normal gyro and touch mode, but I am having a lot of trouble getting it to work with the Google Cardboard integration, so at the moment I am trying to write my own side-by-side renderer.

If my comments are not enough, here is the code sample I am currently using to play a video as a streaming texture on a sphere. It is part of an overridden initScene() method in a class that extends RajawaliRenderer:

    // create a 100 segment sphere
    earthSphere = new Sphere(1, 100, 100);

    // try to set the media player data source
    mMediaPlayer = new MediaPlayer();
    try {
        mMediaPlayer.setDataSource(context,
                Uri.parse("android.resource://" + context.getPackageName() + "/" + R.raw.pyrex));
    } catch (IOException ex) {
        Log.e("ERROR", "couldn't attach data source to the media player");
    }
    mMediaPlayer.setLooping(true); // enable video looping

    video = new StreamingTexture("pyrex", mMediaPlayer); // create video texture

    mMediaPlayer.prepareAsync(); // prepare the player (asynchronous)
    mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); // start the player only when it is prepared
        }
    });

    // add texture to a new material
    Material material = new Material();
    material.setColorInfluence(0f);
    try {
        material.addTexture(video);
    } catch (ATexture.TextureException ex) {
        Log.e("ERROR", "texture error when adding video to material");
    }

    // set the material to the sphere
    earthSphere.setMaterial(material);
    earthSphere.setPosition(0, 0, 0);

    // add the sphere to the rendering scene
    getCurrentScene().addChild(earthSphere);
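One detail not shown in the snippet above: as the other answers do, the streaming texture must also be updated every frame, or the sphere stays black. A minimal sketch of the corresponding override in the same renderer class (video is the StreamingTexture field from the snippet):

    @Override
    protected void onRender(long elapsedRealtime, double deltaTime) {
        super.onRender(elapsedRealtime, deltaTime);
        if (video != null) {
            video.update(); // copy the latest decoded video frame into the GL texture
        }
    }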

Since you want to play 360 videos, you will need an orientation tracker. Here is an example for a Cardboard activity.

    public class CardboardRendererExample extends Renderer implements CardboardView.StereoRenderer {

        public static final int FIELD_OF_VIEW = 90;
        public static final float PLANE_WIDTH = 64.0f;
        public static final float PLANE_HEIGHT = 36.0f;
        public static final float PLANE_DISTANCE = -64.0f;

        private final MediaPlayer mMediaPlayer;
        protected StreamingTexture mStreamingTexture;

        protected Quaternion mOrientation = Quaternion.getIdentity();
        protected Quaternion mEyeOrientation = Quaternion.getIdentity();
        protected float[] mHeadView = new float[16];
        private Matrix4 mEyeMatrix = new Matrix4();
        private Vector3 mEyePosition = new Vector3();
        private Matrix4 mHeadViewMatrix4 = new Matrix4();

        public CardboardRendererExample(Context context, MediaPlayer mediaPlayer) {
            super(context);
            mMediaPlayer = mediaPlayer;
        }

        @Override
        protected void initScene() {
            getCurrentCamera().setPosition(Vector3.ZERO);
            getCurrentCamera().setFieldOfView(FIELD_OF_VIEW);

            mStreamingTexture = new StreamingTexture("give_it_some_name", mMediaPlayer);
            mStreamingTexture.shouldRecycle(true);
            setSceneCachingEnabled(true);

            final Plane projectionScreen = new Plane(PLANE_WIDTH, PLANE_HEIGHT, 64, 64);
            final Material material = new Material();
            material.setColor(0);
            material.setColorInfluence(0f);
            try {
                material.addTexture(mStreamingTexture);
            } catch (ATexture.TextureException e) {
                e.printStackTrace();
                throw new RuntimeException(e);
            }

            projectionScreen.setDoubleSided(true);
            projectionScreen.setMaterial(material);
            projectionScreen.setTransparent(true);
            projectionScreen.setPosition(0, 0, PLANE_DISTANCE);
            getCurrentScene().addChild(projectionScreen);
        }

        @Override
        public void onNewFrame(HeadTransform headTransform) {
            headTransform.getHeadView(mHeadView, 0);
            mHeadViewMatrix4.setAll(mHeadView).inverse();
            mOrientation.fromMatrix(mHeadViewMatrix4);
        }

        @Override
        public void onDrawEye(Eye eye) {
            getCurrentCamera().updatePerspective(
                    eye.getFov().getLeft(),
                    eye.getFov().getRight(),
                    eye.getFov().getBottom(),
                    eye.getFov().getTop());
            mEyeMatrix.setAll(eye.getEyeView());
            mEyeOrientation.fromMatrix(mEyeMatrix);
            getCurrentCamera().setOrientation(mEyeOrientation);
            mEyePosition = mEyeMatrix.getTranslation(mEyePosition).inverse();
            getCurrentCamera().setPosition(mEyePosition);
            super.onRenderFrame(null);
        }

        @Override
        public void onFinishFrame(Viewport viewport) {
        }

        @Override
        public void onSurfaceChanged(int width, int height) {
            super.onRenderSurfaceSizeChanged(null, width, height);
        }

        @Override
        public void onSurfaceCreated(EGLConfig eglConfig) {
            super.onRenderSurfaceCreated(eglConfig, null, -1, -1);
        }

        @Override
        public void onRenderSurfaceCreated(EGLConfig config, GL10 gl, int width, int height) {
            super.onRenderSurfaceCreated(config, gl, width, height);
        }

        @Override
        public void onRendererShutdown() {
        }

        @Override
        protected void onRender(long elapsedRealTime, double deltaTime) {
            super.onRender(elapsedRealTime, deltaTime);
            if (mStreamingTexture != null) {
                mStreamingTexture.update();
            }
        }

        @Override
        public void onOffsetsChanged(float xOffset, float yOffset, float xOffsetStep, float yOffsetStep,
                int xPixelOffset, int yPixelOffset) {
        }

        @Override
        public void onTouchEvent(MotionEvent event) {
        }
    }
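Since this renderer implements CardboardView.StereoRenderer, wiring it up in a CardboardActivity could look roughly like this. A sketch only: the activity name and the R.raw.video resource are placeholders, not from the original answer:

    public class VideoActivity extends CardboardActivity {
        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            // Create the Cardboard view, hand it the stereo renderer,
            // and register it with the activity.
            CardboardView cardboardView = new CardboardView(this);
            setContentView(cardboardView);
            setCardboardView(cardboardView);

            MediaPlayer mediaPlayer = MediaPlayer.create(this, R.raw.video); // placeholder resource
            cardboardView.setRenderer(new CardboardRendererExample(this, mediaPlayer));

            // The renderer above never starts the player itself; in practice you
            // may prefer to start it once the streaming texture is ready.
            mediaPlayer.start();
        }
    }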

Alternatively, you can implement your own tracker based, for example, on

 com.google.vrtoolkit.cardboard.sensors.HeadTracker 
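A minimal sketch of that approach, feeding HeadTracker's head matrix into the Rajawali camera each frame. The helper class HeadTrackerCamera and its field names are my own illustration, not part of the original answer; the matrix inversion mirrors what onNewFrame() does in the renderer above:

    public class HeadTrackerCamera {
        private final HeadTracker mHeadTracker;
        private final float[] mHeadView = new float[16];
        private final Matrix4 mHeadViewMatrix4 = new Matrix4();
        private final Quaternion mOrientation = new Quaternion();

        public HeadTrackerCamera(Context context) {
            mHeadTracker = HeadTracker.createFromContext(context);
        }

        public void onResume() { mHeadTracker.startTracking(); }
        public void onPause()  { mHeadTracker.stopTracking(); }

        // Call once per frame, e.g. from the renderer's onRender(),
        // to copy the latest head pose onto a Rajawali camera.
        public void applyTo(Camera camera) {
            mHeadTracker.getLastHeadView(mHeadView, 0);
            mHeadViewMatrix4.setAll(mHeadView).inverse();
            mOrientation.fromMatrix(mHeadViewMatrix4);
            camera.setOrientation(mOrientation);
        }
    }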

Of course, you can get rid of all these member fields, but preallocating them avoids per-frame allocations and should make life easier for the GC.

