
[Android] My attempt at a VR FPV app; it currently lacks a yaw control method

Author: SylvanWind 2016-4-10 08:30
26 replies · 3757 views


I wrote an app with the Android SDK. It has lens correction, a simple HUD (battery bar on the left), an output size adjustable by touch, and head tracking on the pitch axis. I only have a Phantom 3 Advanced, so yaw and roll can't be controlled through the gimbal.
Since yaw can be controlled through the flight controller, I'm trying to send the head's yaw attitude to it, but the flight controller turns out to accept only a complete set of yaw, pitch, roll, and throttle. I can handle yaw, but what about the other three channels I don't need? I can't just set them to zero, right?
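The lens correction mentioned above is usually a simple radial barrel distortion applied to the texture coordinates before sampling. A minimal pure-Java sketch of that model follows; the coefficients `K1`/`K2` are assumed example values, not the ones this app actually uses:

```java
public class LensWarp {
    // Radial distortion: r' = r * (1 + k1*r^2 + k2*r^4), applied to
    // coordinates centered on each eye's optical axis. K1/K2 are assumed
    // example coefficients, not values from the original app.
    static final float K1 = 0.22f, K2 = 0.24f;

    // Map an undistorted coordinate (x, y) to its barrel-distorted position.
    static float[] distort(float x, float y) {
        float r2 = x * x + y * y;           // squared distance from center
        float f = 1f + K1 * r2 + K2 * r2 * r2;
        return new float[] { x * f, y * f };
    }
}
```

In practice the same formula would live in the fragment shader (warping `vTextureCoord` before the `texture2D` lookup), but the math is identical.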

Replies
cinema2go
Try getting last (updated) values for all 4 parameters, edit and set all of them again.
2016-4-10 12:58
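In plain Java, this suggestion looks like the sketch below: keep the last known values for all four channels and replace only yaw before resending. `ControlData` and `onHeadYaw` are hypothetical stand-ins for the SDK's Virtual Stick data class and the app's head-tracking callback; real field names, units, and valid ranges differ per SDK version.

```java
public class VirtualStickSketch {
    // Hypothetical stand-in for the SDK's flight-control data holder
    // (in the 2016-era DJI Mobile SDK this role is played by the
    // Virtual Stick flight-control data class).
    static class ControlData {
        final float pitch, roll, yaw, throttle;
        ControlData(float pitch, float roll, float yaw, float throttle) {
            this.pitch = pitch; this.roll = roll;
            this.yaw = yaw; this.throttle = throttle;
        }
        // Read-modify-write: copy the last known values, replace only yaw.
        ControlData withYaw(float newYaw) {
            return new ControlData(pitch, roll, newYaw, throttle);
        }
    }

    static ControlData last = new ControlData(0f, 0f, 0f, 0f);

    // Called whenever head tracking produces a new yaw; the other three
    // channels are resent unchanged, so the flight controller still
    // receives a complete (yaw, pitch, roll, throttle) packet.
    static ControlData onHeadYaw(float headYawDegrees) {
        last = last.withYaw(headYawDegrees);
        // flightController.sendVirtualStickFlightControlData(last, callback);
        return last;
    }
}
```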
SylvanWind
cinema2go Posted at 2016-4-10 12:58
Try getting last (updated) values for all 4 parameters, edit and set all of them again.

well, I tried... everything works fine, except that after enabling Virtual Stick, the virtual stick works while the RC controller doesn't. That's... horrifying while testing the stuff in my house; luckily nothing got damaged...
2016-4-10 13:13
cinema2go
SylvanWind Posted at 2016-4-10 13:13
well, I tried... everything works fine, except that after enabling Virtual Stick, the virtual stick ...

I think getting/setting the same parameters shouldn't affect other things. It looks like a bug somewhere.
2016-4-10 19:33
SylvanWind
cinema2go Posted at 2016-4-10 19:33
I think getting/setting the same parameters shouldn't affect other things. It looks like a bug somewhere.

not that... to get Virtual Stick to work I have to enable it with a function call in flight mode F; after that my RC controller loses control of the aircraft. So DJI doesn't allow virtual stick and "hardware stick" functions together, I think.
2016-4-10 20:38
cinema2go
SylvanWind Posted at 2016-4-10 20:38
not that... to get Virtual Stick to work I have to enable it with a function call in flight mode F; after ...

No idea, I'm not experienced with Virtual Stick. Sorry.
2016-4-10 20:54
johncarlo_franc
Hello, can you help me with how to display the video feed in VR Cardboard on Android? I'm trying to pass a TextureView to the Google Cardboard SDK but I get a light-only display.
2016-4-11 14:59
SylvanWind
johncarlo_franc Posted at 2016-4-11 14:59
Hello, can you help me with how to display the video feed in VR Cardboard on Android? I'm trying to pass ...

It's not that hard. The DJICodecManager will decode the H.264 frames onto a SurfaceTexture you pass to it. Just attach it to a GLSurfaceView and call its updateTexImage() method in onDrawFrame; the texture will then be available as a samplerExternalOES in the fragment shader, and you can render it as usual.
2016-4-11 15:09
DJI SDK Support
The system must tell the drone who has the right to control it, and by design only one control source is allowed at a time.
2016-4-11 15:12
SylvanWind
DJI SDK Support Posted at 2016-4-11 15:12
The system must tell the drone who has the right to control it, and by design only one control ...

That makes sense, but I really hope DJI would split the control rights of the different channels (yaw, pitch, roll, throttle) across devices. The RC controller could handle the pitch and roll channels, with yaw controlled by my phone's IMU.
2016-4-11 15:21
johncarlo_franc
I get this display (see the attached screenshot).

2016-4-11 15:26
SylvanWind
Last edited by SylvanWind on 2016-4-11 17:17

Looks like you didn't do it right. I have no knowledge of the Cardboard SDK, but just rendering the SurfaceTexture in a GL shader should be simple enough, I think. A SurfaceTexture can be captured as a samplerExternalOES in the shader program after its updateTexImage() method is called.
    private static final String FRAGMENT_SHADER =
                    "#extension GL_OES_EGL_image_external : require\n" +
                    "precision mediump float;\n" +     
                    "varying vec2 vTextureCoord;\n" +
                    "uniform samplerExternalOES sTexture;\n" +

                    "void main()\n" +
                    "{\n" +
                    "  gl_FragColor = texture2D(sTexture, vTextureCoord) ;\n" +
                    "}";


Here is my fragment shader for the first rendering pass, which renders the stream frame into an FBO for post-processing:

public void onDrawFrame(GL10 gl) {
            //checkGlError("onDrawFrame start");
            st.updateTexImage();

            .....
}

This should be called in the GLSurfaceView.Renderer; st is the SurfaceTexture.
2016-4-11 17:05
SylvanWind
Last edited by SylvanWind on 2016-4-12 13:57

I found a trick that may allow the RC controller to still be used while Virtual Stick is enabled: receive the RemoteController state of the left and right sticks and use them as the parameters for Virtual Stick. I will give it a try later.
2016-4-12 10:47
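As pure Java, the trick amounts to rescaling each hardware-stick reading into a Virtual Stick parameter inside the RemoteController state callback. The raw range of ±660 and the per-channel limits below are assumptions, not confirmed SDK values; check the documented stick and control ranges of your SDK version.

```java
public class StickForwarder {
    static final float RAW_MAX = 660f;  // assumed raw hardware-stick range

    // Scale a raw stick reading into [-limit, +limit], clamping outliers.
    static float scale(int raw, float limit) {
        float v = Math.max(-RAW_MAX, Math.min(RAW_MAX, raw));
        return v / RAW_MAX * limit;
    }

    // In the RC state callback: build a complete control packet from the
    // four stick axes (mode-2 layout assumed). Returned as
    // { pitch, roll, yaw, throttle }; all limits are assumed examples.
    static float[] toControlData(int leftH, int leftV, int rightH, int rightV) {
        float yaw      = scale(leftH, 180f);  // deg/s (assumed limit)
        float throttle = scale(leftV, 4f);    // m/s   (assumed limit)
        float roll     = scale(rightH, 15f);  // m/s   (assumed limit)
        float pitch    = scale(rightV, 15f);  // m/s   (assumed limit)
        return new float[] { pitch, roll, yaw, throttle };
    }
}
```

The resulting packet would then be sent through the Virtual Stick send call each time the RC callback fires, which is what keeps the hardware sticks effective.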
DJI SDK Support
SylvanWind Posted at 2016-4-12 10:47
I found a trick that may allow the RC controller to still be used while Virtual Stick is enabled: ...

Is that available now? If there is a bug involved, please don't share it on the forum.
2016-4-12 14:33
johncarlo_franc
SylvanWind Posted at 2016-4-11 17:05
Looks like you didn't do it right. I have no knowledge of the Cardboard SDK, but just rendering the ...

Yes, I'm doing the same. Here is my full class; please help me, I'm stuck on this function.
I will pay if you like.
package phantomman.philippineprogrammer.com.phantomman.activities.cardboard;

import android.graphics.SurfaceTexture;
import android.graphics.SurfaceTexture.OnFrameAvailableListener;
import android.hardware.Camera;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.Matrix;
import android.os.Bundle;
import android.util.Log;

import com.google.vrtoolkit.cardboard.CardboardActivity;
import com.google.vrtoolkit.cardboard.CardboardView;
import com.google.vrtoolkit.cardboard.Eye;
import com.google.vrtoolkit.cardboard.HeadTransform;
import com.google.vrtoolkit.cardboard.Viewport;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import dji.sdk.Codec.DJICodecManager;
import phantomman.philippineprogrammer.com.phantomman.R;
import phantomman.philippineprogrammer.com.phantomman.camera.DjiCameraView;
import phantomman.philippineprogrammer.com.phantomman.cardboard.CardboardHUD;

import static android.opengl.Matrix.scaleM;

public class CardBoardCamera extends CardboardActivity implements CardboardView.StereoRenderer, OnFrameAvailableListener {
    private static final String TAG = "MainActivity";
    private static final int GL_TEXTURE_EXTERNAL_OES = 0x8D65;
    private boolean updateSurface = false;
//    private Camera camera;
    private DJICodecManager codecManager = null;

    private final String vertexShaderCode =
            "attribute vec4 position;" +
            "attribute vec2 inputTextureCoordinate;" +
            "varying vec2 textureCoordinate;" +
            "void main()" +
            "{" +
            "gl_Position = position;" +
            "textureCoordinate = inputTextureCoordinate;" +
            "}";

    private final String fragmentShaderCode =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "varying vec2 vTextureCoord;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "void main()\n" +
            "{\n" +
            "  gl_FragColor = texture2D(sTexture, vTextureCoord) ;\n" +
            "}";
//            "#extension GL_OES_EGL_image_external : require\n" +
//            "precision mediump float;" +
//            "varying vec2 textureCoordinate;\n" +
//            "uniform samplerExternalOES s_texture;\n" +
//            "void main(void) {" +
//            "  gl_FragColor = texture2D( s_texture, textureCoordinate );\n" +
//            //"  gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n" +
//            "}";

    private FloatBuffer vertexBuffer, textureVerticesBuffer, vertexBuffer2;
    private ShortBuffer drawListBuffer, buf2;
    private int mProgram;
    private int mPositionHandle, mPositionHandle2;
    private int mColorHandle;
    private int mTextureCoordHandle;

    // number of coordinates per vertex in this array
    static final int COORDS_PER_VERTEX = 2;
    static float squareVertices[] = { // in counterclockwise order:
            -1.0f, -1.0f,   // 0. left - mid
            1.0f, -1.0f,    // 1. right - mid
            -1.0f, 1.0f,    // 2. left - top
            1.0f, 1.0f,     // 3. right - top
    };

    //, 1, 4, 3, 4, 5, 3
//    private short drawOrder[] = {0, 1, 2, 1, 3, 2};//, 4, 5, 0, 5, 0, 1 }; // order to draw vertices
    private short drawOrder[] = {0, 2, 1, 1, 2, 3};  // order to draw vertices
    private short drawOrder2[] = {2, 0, 3, 3, 0, 1}; // order to draw vertices

    static float textureVertices[] = {
            0.0f, 1.0f,  // A. left-bottom
            1.0f, 1.0f,  // B. right-bottom
            0.0f, 0.0f,  // C. left-top
            1.0f, 0.0f   // D. right-top
    };

    private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex

    private ByteBuffer indexBuffer;    // Buffer for index-array

    private int texture;

    private CardboardHUD mOverlayView;

    private CardboardView cardboardView;
    private SurfaceTexture surface;
    private float[] mView;
    private float[] mCamera;

    public void startCamera(int texture) {
        surface = new SurfaceTexture(texture);
        surface.setOnFrameAvailableListener(this);

//        camera = Camera.open();
//
//        try
//        {
//            camera.setPreviewTexture(surface);
//            camera.startPreview();
//        }
//        catch (IOException ioe)
//        {
//            Log.w("MainActivity","CAM LAUNCH FAILED");
//        }
    }

    static private int createTexture() {
        int[] texture = new int[1];

        GLES20.glGenTextures(1, texture, 0);
        GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture[0]);
        GLES20.glTexParameterf(GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameterf(GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);

        return texture[0];
    }

    /**
     * Converts a raw text file, saved as a resource, into an OpenGL ES shader
     * @param type The type of shader we will be creating.
     * @return
     */
    private int loadGLShader(int type, String code) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, code);
        GLES20.glCompileShader(shader);

        // Get the compilation status.
        final int[] compileStatus = new int[1];
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compileStatus, 0);

        // If the compilation failed, delete the shader.
        if (compileStatus[0] == 0) {
            Log.e(TAG, "Error compiling shader: " + GLES20.glGetShaderInfoLog(shader));
            GLES20.glDeleteShader(shader);
            shader = 0;
        }

        if (shader == 0) {
            throw new RuntimeException("Error creating shader.");
        }

        return shader;
    }

    /**
     * Checks if we've had an error inside of OpenGL ES, and if so what that error is.
     * @param func
     */
    private static void checkGLError(String func) {
        int error;
        while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
            Log.e(TAG, func + ": glError " + error);
            throw new RuntimeException(func + ": glError " + error);
        }
    }

    /**
     * Sets the view to our CardboardView and initializes the transformation matrices we will use
     * to render our scene.
     * @param savedInstanceState
     */
    protected DjiCameraView cameraView;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        setContentView(R.layout.activity_cardboard_camera);
        cardboardView = (CardboardView) findViewById(R.id.cardboard_view);
//        cardboardView.setRestoreGLStateEnabled(false);
        cardboardView.setRenderer(this);
        setCardboardView(cardboardView);

//        mModelCube = new float[16];
        mCamera = new float[16];
        mView = new float[16];

        mOverlayView = (CardboardHUD) findViewById(R.id.hud);
        mOverlayView.show3DToast("Pull the magnet when you find an object.");
    }

    @Override
    public void onRendererShutdown() {
        Log.i(TAG, "onRendererShutdown");
    }

    @Override
    public void onSurfaceChanged(final int width, final int height) {
        Log.i(TAG, "onSurfaceChanged");
        this.runOnUiThread(new Runnable() {
            public void run() {
                if (codecManager == null) {
                    Log.e(TAG, "codecManager is null");
                    codecManager = new DJICodecManager(getApplicationContext(), surface, width, height);
                }
            }
        });
    }

    /**
     * Creates the buffers we use to store information about the 3D world. OpenGL doesn't use Java
     * arrays, but rather needs data in a format it can understand. Hence we use ByteBuffers.
     * @param config The EGL configuration used when creating the surface.
     */
    @Override
    public void onSurfaceCreated(EGLConfig config) {
        Log.i(TAG, "onSurfaceCreated");
        GLES20.glClearColor(0.1f, 0.1f, 0.1f, 0.5f); // Dark background so text shows up well

        ByteBuffer bb = ByteBuffer.allocateDirect(squareVertices.length * 4);
        bb.order(ByteOrder.nativeOrder());
        vertexBuffer = bb.asFloatBuffer();
        vertexBuffer.put(squareVertices);
        vertexBuffer.position(0);

        ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        drawListBuffer = dlb.asShortBuffer();
        drawListBuffer.put(drawOrder);
        drawListBuffer.position(0);

        ByteBuffer bb2 = ByteBuffer.allocateDirect(textureVertices.length * 4);
        bb2.order(ByteOrder.nativeOrder());
        textureVerticesBuffer = bb2.asFloatBuffer();
        textureVerticesBuffer.put(textureVertices);
        textureVerticesBuffer.position(0);

        int vertexShader = loadGLShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
        int fragmentShader = loadGLShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);

        mProgram = GLES20.glCreateProgram();             // create empty OpenGL ES Program
        GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
        GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
        GLES20.glLinkProgram(mProgram);

        texture = createTexture();
//        startCamera(texture);
        surface = new SurfaceTexture(texture);
        surface.setOnFrameAvailableListener(this);
        synchronized (this) {
            updateSurface = false;
        }
    }

    @Override
    public void onNewFrame(HeadTransform headTransform) {
        synchronized (this) {
            if (updateSurface) {
                surface.updateTexImage();
                float[] mtx = new float[16];
                GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
                surface.getTransformMatrix(mtx);
                updateSurface = false;
            }
        }
    }

    @Override
    public void onFrameAvailable(SurfaceTexture arg0) {
        this.cardboardView.requestRender();
        updateSurface = true;
    }

    /**
     * Draws a frame for an eye. The transformation for that eye (from the camera) is passed in as
     * a parameter.
     * @param transform The transformations to apply to render this eye.
     */
    @Override
    public void onDrawEye(Eye transform) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        GLES20.glUseProgram(mProgram);
        GLES20.glActiveTexture(GL_TEXTURE_EXTERNAL_OES);
//        GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture);
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "position");
        GLES20.glEnableVertexAttribArray(mPositionHandle);
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT,
                false, vertexStride, vertexBuffer);
        mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");
        GLES20.glEnableVertexAttribArray(mTextureCoordHandle);
        GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT,
                false, vertexStride, textureVerticesBuffer);
        mColorHandle = GLES20.glGetAttribLocation(mProgram, "s_texture");
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length,
                GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
        // Disable vertex array
        GLES20.glDisableVertexAttribArray(mPositionHandle);
        GLES20.glDisableVertexAttribArray(mTextureCoordHandle);
        Matrix.multiplyMM(mView, 0, transform.getEyeView(), 0, mCamera, 0);
        float mViews[] = {2};
//        scaleM(mViews, 0, mCamera, 0, 0.2f, 0.2f, 0.2f);
    }

    @Override
    public void onFinishFrame(Viewport viewport) {
    }

    /**
     * Increment the score, hide the object, and give feedback if the user pulls the magnet while
     * looking at the object. Otherwise, remind the user what to do.
     */
    @Override
    public void onCardboardTrigger() {
    }
}
2016-4-12 16:49
yunqi
Can you tell me what VR device you are using? Oculus or Gear VR?
2016-4-12 17:42
SylvanWind
Last edited by SylvanWind on 2016-4-12 18:11
johncarlo_franc Posted at 2016-4-12 16:49
Yes, I'm doing the same. Here is my full class; please help me, I'm stuck on this function.
I will pay if ...

I haven't tried your code, but there are some issues.
"vTextureCoord" in my fragment shader is the varying for my texture coordinate; it has to match the identifier in your vertex shader, which in your code is "textureCoordinate".

The DJICodecManager is a decoder object. You must send the H.264 stream bytes to its sendDataToDecoder(byte[] buf, int size) method; it will then output the decoded frames onto the SurfaceTexture. You can take DJI's FPV sample for reference; there you will see an onReceivedVideoCallback that does this. https://github.com/DJI-Mobile-SDK/Android-FPVDemo

I think you can directly import this sample activity into your project and replace its content view with your Cardboard view. The bridge from DJI's activity to your view is the SurfaceTexture, so initialize the SurfaceTexture in your view, get its instance, pass it to the DJI activity, and construct the DJICodecManager with that SurfaceTexture.

Hope this helps.
Oh, one last thing about the SurfaceTexture: st.setDefaultBufferSize(mwidth, mheight). You should call this before constructing the codec manager.
2016-4-12 17:58
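The initialization order is the key point here: create the SurfaceTexture, set its default buffer size, and only then construct the decoder. A comment-level sketch; the Android/DJI calls are shown as comments, and the only executable piece is a size helper (halving the width for a side-by-side stereo layout is an assumption based on a later post in this thread):

```java
public class DecoderInit {
    // Initialization order that avoids the "1 stretched pixel" symptom:
    //   1. create the GL texture and the SurfaceTexture around it
    //   2. st.setDefaultBufferSize(width, height)   <-- BEFORE step 3
    //   3. new DJICodecManager(context, st, width, height)
    // Helper for step 2/3: pick the decode buffer size. Halving the width
    // for a side-by-side stereo view is an assumed convention, not an SDK
    // requirement. Returns { width, height }.
    static int[] bufferSize(int streamW, int streamH, boolean stereoHalfWidth) {
        return new int[] { stereoHalfWidth ? streamW / 2 : streamW, streamH };
    }
}
```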
SylvanWind
yunqi Posted at 2016-4-12 17:42
Can you tell me what VR device you are using? Oculus or Gear VR?

a 100-yuan Baofeng Mojing (暴风魔镜)
2016-4-12 18:08
SylvanWind
Last edited by SylvanWind on 2016-4-12 23:47
DJI SDK Support Posted at 2016-4-12 14:33
Is that available now? If there is a bug involved, please don't share it on the forum.

I tried it and it worked. In my opinion this can hardly be a bug, since I'm just grabbing the RC controller's values from the callback and sending them as a Virtual Stick command. The RC controller won't control the aircraft if I don't send them.
2016-4-12 21:29
johncarlo_franc
SylvanWind Posted at 2016-4-12 17:58
I haven't tried your code, but there are some issues.
"vTextureCoord" in my fragment shader is the varying ...

Can you show me the full snippet of your CardboardActivity class?
If it's OK with you, thank you very much.
2016-4-14 13:13
SylvanWind
Last edited by SylvanWind on 2016-4-14 22:37
johncarlo_franc Posted at 2016-4-14 13:13
Can you show me the full snippet of your CardboardActivity class?
If it's OK with you, thank you ...

I really don't know much about the Cardboard SDK. My stereo view was done with a GLSurfaceView, and its main structure contains nothing but drawing the texture twice. I have already mentioned most of the key points before. Maybe you should try the DJI FPVDemo first and then gradually replace the TextureView with your CardboardView; only this way will you see which step is wrong. I can post part of my GLSurfaceView's code, but it really doesn't do anything special.

        public void onSurfaceCreated(GL10 gl, EGLConfig config) {
            Matrix.setIdentityM(mSTMatrix, 0);

            lens.init();
            backsurface.init();
            bar.init();

            st = new SurfaceTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES);
            st.setOnFrameAvailableListener(MyGLSurfaceView.this);
            if(mnotifier!=null)
                mnotifier.onSurfaceCreated(st);

            GLES20.glDisable(GLES20.GL_DEPTH_TEST);
        }
I just create the SurfaceTexture in the GL context and send it to the decoder via a callback.

        public void onDrawFrame(GL10 gl) {
            //checkGlError("onDrawFrame start");
            st.updateTexImage();     //after this, samplerExternalOES in fragment shader get updated
            st.getTransformMatrix(mSTMatrix);

            GLES20.glViewport(0, 0, screenWidth / 2, screenHeight);
            rtthelper.renderPrepare();
            GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
            backsurface.drawContent(mSTMatrix);
            bar.drawContent(mBarMVPMatrix, mBarColor);
            rtthelper.finish();

            // (optional) clear to green so we can see if we're failing to set pixels
            GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, rtthelper.getTex());
            lens.drawPrepare(ratio, 0);

            //L screen
            GLES20.glViewport(0, 0, screenWidth / 2, screenHeight);
            lens.drawContent(mLiMVPMatrix, dx);
            //R screen
            GLES20.glViewport(screenWidth / 2, 0, screenWidth / 2, screenHeight);
            lens.drawContent(mRiMVPMatrix, -dx);
            lens.drawFinish();
        }
Just keep in mind to update the SurfaceTexture before the real draw. For the DJI SDK part I just follow the FPVDemo and initialize the relevant classes:

mCodecManager = new DJICodecManager(VRActivity.this, msurfaceTexture, mwidth/2, mheight);
...
mReceivedVideoDataCallBack = new CameraReceivedVideoDataCallback() {
            @Override
            public void onResult(byte[] videoBuffer, int size) {
                //Log.e("test", "~~~~data");
                if(mCodecManager != null){
                    // Send the raw H264 video data to codec manager for decoding
                    mCodecManager.sendDataToDecoder(videoBuffer, size);
                }else {
                    Log.e(TAG, "mCodecManager is null");
                }
            }
        };

...
mCamera = mProduct.getCamera();
mCamera.setDJICameraReceivedVideoDataCallback(mReceivedVideoDataCallBack);
2016-4-14 21:56
nochoice
johncarlo_franc Posted at 2016-4-12 16:49
yes i'm doing same, here my full class please help me, i'm stock for this function.
i will pay if  ...

Have you solved this problem? I'm also stuck on it. Waiting for your reply!
2016-5-4 11:10
johncarlo_franc
nochoice Posted at 2016-5-4 11:10
Have you solved this problem? I'm also stuck on it. Waiting for your reply!

Not solved for now; I've stopped developing the drone app for the moment, I have another project.
Can I get your email or Skype? Let's talk about developing this function.
2016-5-6 09:58
SylvanWind
johncarlo_franc Posted at 2016-5-6 09:58
Not solved for now; I've stopped developing the drone app for the moment, I have another project.
Can I get your email or ...

What's your problem exactly? His problem was that he got only 1 pixel of output stretched across the full screen; the screen color would react to the drone's flashing light and other things, but nothing else. This can be solved by adding a setDefaultBufferSize call on the SurfaceTexture before initializing the DJICodecManager.
2016-5-9 21:20
johncarlo_franc
SylvanWind Posted at 2016-5-9 21:20
What's your problem exactly? His problem was that he got only 1 pixel of output stretched across ...

Yes, I'm doing the same. I think my problem is the Phantom version: I'm using a Phantom 3 Standard over a WiFi connection, and you're using a Phantom 3 Advanced with a USB connection to Lightbridge.
2016-5-10 13:39
SylvanWind
johncarlo_franc Posted at 2016-5-10 13:39
Yes, I'm doing the same. I think my problem is the Phantom version: I'm using a Phantom 3 Standard ...

No, P3A or P3S is not the problem. If your app behaves the same way (only 1 pixel of output stretched to full screen, only 1 color), give the SurfaceTexture a setDefaultBufferSize(width, height) call during initialization.
2016-5-10 14:33
_微风小杨_
For anyone interested in the app, I opened a new thread and provided a download link for it:

http://forum.dev.dji.com/forum.php?mod=viewthread&tid=32520
2016-7-26 22:32