Using an OpenGL FBO to Render Camera Data

1: Overview

When working with OpenGL, a common requirement is compositing multiple layers, and one of the usual ways to implement this is with an FBO (framebuffer object). Put simply, an FBO provides an off-screen render target that lives in GPU memory; to operate on it, the program issues OpenGL commands to the GPU. In practice we create several GPU programs and, driven by the business logic, perform multiple layer operations, with each program rendering its result into the texture attached to the FBO. After several such passes, all of the layers have been composited onto that one texture.
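As a minimal sketch of that pattern (Android GLES30; the FboSketch class and field names here are illustrative, and CameraBaseFilter.initFrameBuffer() in section 3 wraps the same calls): generate a framebuffer, attach an RGBA texture as its color attachment, and bind the framebuffer while drawing so the results land in that texture instead of on the screen.

import android.opengl.GLES30;

/** Illustrative FBO wrapper; must be used on the GL thread with a current EGL context. */
final class FboSketch {
    private final int[] fbo = new int[1];
    private final int[] fboTexture = new int[1];

    /** Create an FBO whose color attachment is an RGBA texture of the given size. */
    void create(int width, int height) {
        GLES30.glGenFramebuffers(1, fbo, 0);
        GLES30.glGenTextures(1, fboTexture, 0);
        GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, fboTexture[0]);
        GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_LINEAR);
        GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
        //allocate the texture storage that will back the FBO
        GLES30.glTexImage2D(GLES30.GL_TEXTURE_2D, 0, GLES30.GL_RGBA, width, height,
                0, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, null);
        //attach the texture to the framebuffer as its color attachment
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, fbo[0]);
        GLES30.glFramebufferTexture2D(GLES30.GL_FRAMEBUFFER, GLES30.GL_COLOR_ATTACHMENT0,
                GLES30.GL_TEXTURE_2D, fboTexture[0], 0);
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
        GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, 0);
    }

    /** While bound, every program's draw calls render into fboTexture[0]. */
    void bind()   { GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, fbo[0]); }

    /** Unbind to return to the default (on-screen) framebuffer. */
    void unbind() { GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0); }

    int textureId() { return fboTexture[0]; }
}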

2: Use Case

    Requirement:

        We need to render the camera preview stream and decoded video data to the preview surface at the same time.

    Analysis:

        Implementing this involves two technical points that need attention:

        1: Decoded video frames are fed into the OpenGL environment, and camera frames arrive in the same environment at the camera's frame rate. Since the two rates differ and the frames do not arrive at the same moment, we need a strategy for reconciling the two refresh rates.

        2: We need to use an FBO to render the camera data and the video data onto the same texture.

    Implementation plan:

        1: Whether the camera's rate or the video's rate drives rendering depends on the specific use case. Here we drive rendering from the camera: when a camera frame becomes available we start a draw pass, first drawing the camera frame, then the video. If a new video frame has arrived we update the video texture; otherwise we simply redraw the previous video frame and finish the pass. The effective output frame rate is therefore the camera's frame rate, so in this mode the decoder's playback rate should be adjusted to match the camera; otherwise, even if the video's frame rate is higher, not all of its frames will be shown.

        2: To draw the two frames onto the same canvas as separate layers with the correct occlusion, we attach a texture to an FBO and draw in two passes: first the camera data, then the video data. The camera data forms the bottom layer and the video data sits on top (a condensed per-frame sketch follows the notes below).

    PS1: With the FBO approach we can stack an unlimited number of layers, i.e. draw onto one canvas over and over, because each layer gets its own pass. Within a single OpenGL program, however, we can only stack about 16, and the exact limit differs between phones and GPUs; it can be queried with glGetIntegerv, as the renderer below does for GL_MAX_TEXTURE_IMAGE_UNITS.

    PS2: For the second point, besides the FBO approach we could also use a single GPU program with multiple textures, setting a different matrix per texture to control the position and orientation of the image and the video on screen. This article uses option 1, the FBO approach.
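To make the camera-driven, two-pass layering concrete before the full code, here is a condensed sketch written as if it were a helper inside the CameraFboRender class of section 3, using that class's fields and filter methods (it mirrors what onDrawFrame() below actually does):

    // Condensed per-frame flow (sketch; mirrors CameraFboRender.onDrawFrame() in section 3).
    private void drawOneCameraDrivenFrame() {
        // Pass 1: camera frame -> FBO-attached texture (bottom layer).
        mSurfaceTexture.updateTexImage();
        mCameraBaseFilter.bindFrameBuffer();
        int fboTextureId = mCameraBaseFilter.onDrawToFramebuffer(cameraTextureId);
        mCameraBaseFilter.unBindFrameBuffer();

        // Pass 2: video frame -> the same texture (top layer). Only fetch a new decoded
        // frame if one arrived since the last pass; otherwise the previous frame is redrawn.
        if (isUpdateVideo) {
            mNeedUpdateSurfaceTexture.updateTexImage();
        }
        mCameraBaseFilter.bindFrameBufferAndPoi(0, 0);
        mPlayVideoNormalFilter.onDrawFrameFromStartPos(videoTextureId, 0, 0);
        mCameraBaseFilter.unBindFrameBuffer();

        // Pass 3: composited FBO texture -> default framebuffer (the screen).
        mNormalFiler.onDrawFrameNormalTexture(fboTextureId);
    }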

3: Implementation Details

The implementation breaks down into the following steps:

1: Create a layout containing a GLSurfaceView and assign it a renderer.

2: Create the renderer and its filters. There are three filters: the first receives the camera data and creates the FBO, the second draws the decoded video data, and the third draws the texture attached to the FBO to the screen.

3: Expose interfaces to the caller for changing the position and shape of the upper-layer texture, i.e. setters for translation, rotation and scale.

Step 1 is only standard setup (a minimal sketch is included below for reference), so we go straight to steps 2 and 3:
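A minimal sketch of step 1, assuming a hypothetical MainActivity hosting the GLSurfaceView; RENDERMODE_WHEN_DIRTY fits this design because the renderer below calls requestRender() itself from the camera's onFrameAvailable callback.

import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;

public class MainActivity extends Activity {

    private GLSurfaceView mGlSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mGlSurfaceView = new GLSurfaceView(this);
        //the filters below use GLES30 calls, so request an ES 3.0 context
        mGlSurfaceView.setEGLContextClientVersion(3);
        mGlSurfaceView.setRenderer(
                new CameraFboRender(this, new Handler(Looper.getMainLooper()), mGlSurfaceView));
        //only render when requestRender() is called; the renderer requests a frame
        //from the camera's onFrameAvailable callback
        mGlSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
        setContentView(mGlSurfaceView);
    }
}

The renderer for steps 2 and 3 follows.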

/**
 * This renderer takes the captured camera data, renders it into an FBO first,
 * and finally renders the FBO texture to the screen.
 */
public class CameraFboRender implements GLSurfaceView.Renderer {

    private FloatBuffer mVertexBuffer = null;
    private FloatBuffer mTextureBuffer = null;

    private CameraBaseFilter mCameraBaseFilter = null;


    private String TAG = "CameraFboRender";
    Context mContext;
    Handler mHandler;
    private int preTextureId = -1;

    SurfaceTexture mSurfaceTexture;
    GLSurfaceView mGlSurfaceView;
    Camera2dTextureNormalFilter mNormalFiler;

    int cameraTextureId = -1;
    public CameraFboRender(Context context, Handler handler,GLSurfaceView glSurfaceView) {
        mContext = context ;
        mHandler = handler;
        mGlSurfaceView = glSurfaceView;
        createVertexArray();
    }

    public void setScale(){
        currentViewWidth += 50;
        currentViewHeight += 50;
        computerScaleRatio(currentViewWidth,currentViewHeight);
        mPlayVideoNormalFilter.updateMtx();
    }

    float currentXTranslate = 0;
    float currentYTranslate = 0;
    public void setTranslate(){
        currentXTranslate += 0.1f;
        currentYTranslate += 0.1f;

        computerTranslate(currentXTranslate,currentYTranslate);
        mPlayVideoNormalFilter.updateMtx();
    }

    float currentRotate = 0.0f;
    public void setRotate(){
        currentRotate+= 30.0f;
        computerRotate(currentRotate,1);
        mPlayVideoNormalFilter.updateMtx();
    }



    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Log.i(TAG,"onSurfaceCreated");

        GLES30.glDisable(GL10.GL_DITHER);
        GLES30.glClearColor(1.0f, 0.0f, 0.0f, 0.0f);
        GLES30.glEnable(GL10.GL_CULL_FACE);
        GLES30.glEnable(GL10.GL_DEPTH_TEST);

        int[] maxTextureUnits = new int[1];
        GLES30.glGetIntegerv(GLES30.GL_MAX_TEXTURE_IMAGE_UNITS, maxTextureUnits, 0);
        Log.i(TAG,"onSurfaceCreated GL_MAX_TEXTURE_IMAGE_UNITS:" + maxTextureUnits[0]);
        //1: create the textures and set the frame-available callback
        initTextureEnv();

        //2: create the camera base filter
        mCameraBaseFilter = new CameraBaseFilter(mContext);
        mCameraBaseFilter.init();

        Log.i(TAG,"onSurfaceCreated CameraBaseFilter init end");


        // filter that draws the composited FBO texture to the screen
        mNormalFiler =  new Camera2dTextureNormalFilter(mContext);
        mNormalFiler.init();
        Log.i(TAG,"onSurfaceCreated mNormalFiler init end");

        //video playback filter
        mPlayVideoNormalFilter = new PlayVideoByFboFilter(mContext);
        mPlayVideoNormalFilter.init();

        Log.i(TAG,"onSurfaceCreated mPlayVideoNormalFilter init end");



        //3: start the camera
        initCamera();

        //4: start the decoder
        initCodec();

    }

    private int count = 0;
    private void initTextureEnv(){
        cameraTextureId = GLesUtils.createCameraTexture();
        Log.i(TAG,"createCameraTexture" + cameraTextureId);
        mSurfaceTexture = new SurfaceTexture(cameraTextureId);
        mSurfaceTexture.setDefaultBufferSize(1920,1080);
        mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                if(count % 300 == 0){
                    Log.i(TAG,"camera onFrameAvailable:" +count);
                }
                count++;
                mGlSurfaceView.requestRender();
            }
        });

        videoTextureId = GLesUtils.createCameraTexture();
        Log.i(TAG,"create video Texture" + videoTextureId);
    }


    protected int mSurfaceWidth, mSurfaceHeight;//size of the on-screen view used when rendering to the screen

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.i(TAG,"onSurfaceChanged" + "-->width:" +width + ",height:" +height);

        //0: record the on-screen size
        mSurfaceWidth = width;
        mSurfaceHeight = height;

        //1: start the preview
        cameraManager.startPreview(mSurfaceTexture);

        //2: create the FBO at the preview frame size
        mCameraBaseFilter.initFrameBuffer(mImageWidth, mImageHeight);

        //3: propagate the input/output sizes to the filters
        onFilterChanged();

        Log.i(TAG,"onSurfaceChanged ,startPreview end" );

    }
    private final float[] mMatrix = new float[16];
    private final float[] mVideoMatrix = new float[16];

    @Override
    public void onDrawFrame(GL10 gl) {

        GLES20.glEnable(GLES20.GL_BLEND);
        GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);

        //set the clear color before clearing, otherwise the first frame is cleared with the color set in onSurfaceCreated
        GLES30.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        GLES30.glClear(GLES30.GL_COLOR_BUFFER_BIT | GLES30.GL_DEPTH_BUFFER_BIT);

        mSurfaceTexture.updateTexImage();
        mSurfaceTexture.getTransformMatrix(mMatrix);
        mCameraBaseFilter.setTextureTransformMatrix(mMatrix);

        //1: bind mCameraBaseFilter's FBO (this also sets glViewport(0, 0, mInputWidth, mInputHeight))
        mCameraBaseFilter.bindFrameBuffer();

        //1: draw the camera data onto the texture attached to the FBO
        int textureId = mCameraBaseFilter.onDrawToFramebuffer(cameraTextureId);
        mCameraBaseFilter.unBindFrameBuffer();//the camera frame has now been drawn onto the FBO-attached texture (textureId)

        //2: draw the video data onto the same FBO-attached texture
        handleVideo();

        //3: draw the FBO-attached texture to the screen
        mNormalFiler.onDrawFrameNormalTexture(textureId);
        mCameraBaseFilter.unBindFrameBuffer();

    }



    private void handleVideo(){

        //draw the video frame onto the FBO texture; only fetch a new decoded frame if one has arrived
        if(isUpdateVideo){
            mNeedUpdateSurfaceTexture.updateTexImage();
            mNeedUpdateSurfaceTexture.getTransformMatrix(mVideoMatrix);
            mPlayVideoNormalFilter.setTextureTransformMatrix(mVideoMatrix);
        }

        mCameraBaseFilter.bindFrameBufferAndPoi(0,0);
        mPlayVideoNormalFilter.onDrawFrameFromStartPos(videoTextureId,0,0);

        mCameraBaseFilter.unBindFrameBuffer();//the video frame has now been drawn onto the FBO texture

    }

    private int currentViewWidth = 360;
    private int currentViewHeight = 480;

    private  void onFilterChanged(){
        Log.i(TAG,"onFilterChanged onOutputSizeChanged  outputwidth:" +mSurfaceWidth
                + ",outputHeight:" + mSurfaceHeight);
        //tell mCameraBaseFilter the final output size, i.e. what will be visible in the on-screen window;
        //if this filter is only the first stage in a chain, the final output size does not have to be the screen size
        mCameraBaseFilter.onOutputSizeChanged(mSurfaceWidth, mSurfaceHeight);

        Log.i(TAG,"onFilterChanged  initFrameBuffer frameWidth:" +mImageWidth
                + ",frameHeight:" + mImageHeight);
        //tell mCameraBaseFilter the size of its input texture
        mCameraBaseFilter.onInputSizeChanged(mImageWidth,mImageHeight);


        //tell mPlayVideoNormalFilter the width and height of the texture being drawn
        mPlayVideoNormalFilter.onInputSizeChanged(mImageWidth,mImageHeight);
        computerScaleRatio(currentViewWidth,currentViewHeight);
        updateProjection();
        mPlayVideoNormalFilter.updateMtx();
        //tell mNormalFiler the size of its input texture
        mNormalFiler.onInputSizeChanged(mImageWidth,mImageHeight);


        Log.i(TAG,"onFilterChanged END");
    }

    private float xScale = 0f;
    private float yScale = 0f;
    private void computerScaleRatio(int viewWidth, int viewHeight){
        if (xScale == 0 && yScale == 0) {
            xScale = mVideoWidth * 1.0f / mImageWidth;
            yScale = mVideoHeight * 1.0f / mImageHeight;
        } else {
            xScale += 0.01;
            yScale += 0.01;
        }

        Log.i(TAG, "xScale:" + xScale + ",yScale:" + yScale);
        mPlayVideoNormalFilter.setScaleMtx(xScale, yScale);
    }

    private void computerTranslate(float x,float y){
        mPlayVideoNormalFilter.setTranslateMtx(x,y);
    }

    private void computerRotate(float degree,int x){
        mPlayVideoNormalFilter.setRotateMtx(degree,x);
    }


    private final float[] projectionMatrix = new float[16];

    //in this video path the two matrices currently make no visible difference, i.e. their effects are equivalent
    private void updateProjection() {
        //perspective (frustum) projection matrix
        Matrix.frustumM(projectionMatrix, 0,
                - 1, 1, -1, 1,
                1f, 20.0f);

        mPlayVideoNormalFilter.setProjectionMatrix(projectionMatrix);
    }



    //preview frame size reported by the camera once it has been opened successfully
    private int mImageWidth = -1;
    private int mImageHeight = -1;
    CameraManager cameraManager;

    private void initCamera() {
        cameraManager = new CameraManager();
        //open camera 0 (camera 0 is the rear camera)
        if (cameraManager.getCamera() == null)
//            cameraManager.openCamera();
            cameraManager.openFrontCamera();
        Camera.Size size = cameraManager.getPreviewSize();
        Log.i(TAG,"initCamera previewSize : " + size.width + " " + size.height );
        // rotation=90 or rotation=270, we need to exchange width and height
        int orientation = cameraManager.getOrientation();
        Log.i(TAG,"initCamera orientation: " +orientation );
        if (cameraManager.getOrientation() == 90 || cameraManager.getOrientation() == 270) {
            mImageWidth = size.height;
            mImageHeight = size.width;
        } else {
            mImageWidth = size.width;
            mImageHeight = size.height;
        }
        //adjust the camera output video + texture coordinate mapping
        mCameraBaseFilter.onInputSizeChanged(mImageWidth, mImageHeight);
        //        initCamera mImageWidth -->720,mImageHeight-->1280
        Log.d(TAG,"initCamera mImageWidth -->" +mImageWidth
                + ",mImageHeight-->" +mImageHeight);
        Log.i(TAG,"initCamera END");
        //Adjust the rotation of the camera data used in the FBO: the rear camera's raw frames are
        //rotated 90 degrees counter-clockwise relative to what we see on screen, so to get a picture
        //that looks right to the viewer the camera data must be rotated 90 degrees clockwise.
        //The method below performs this adjustment.
        adjustSize(cameraManager.getOrientation(), cameraManager.isFront(), true);
    }

    private PlayVideoController codecController;
    private int videoTextureId = -1;//video texture id
    private SurfaceTexture mNeedUpdateSurfaceTexture = null;//SurfaceTexture delivered by the decoder callback
    private PlayVideoByFboFilter mPlayVideoNormalFilter = null;
    private boolean isUpdateVideo = false;
    private int mVideoWidth,mVideoHeight;
    private void initCodec(){
        codecController = new PlayVideoController(mContext);
        codecController.setIFrameListener(new PlayVideoController.IFrameListener() {
            @Override
            public void onFrameAviable(SurfaceTexture surfaceTexture) {
                isUpdateVideo = true;
                mNeedUpdateSurfaceTexture = surfaceTexture;
            }
        });
        codecController.setSurfaceTexture(videoTextureId);
        codecController.initCodec();
        codecController.startDecode();
        mVideoWidth = codecController.getmCurrentVideoWidth();
        mVideoHeight = codecController.getmCurrentVideoHigh();
        Log.i(TAG,"mVideoWidth:" +mVideoWidth + ",mVideoHeight:" +mVideoHeight );
    }

    public void adjustSize(int rotation, boolean horizontalFlip, boolean verticalFlip) {

        //If the image on the texture does not have to look natural to a human viewer, the four texture
        //coordinates only need to be matched one-to-one with the vertex coordinates.
        //If it does have to look natural, we need to know the orientation of the raw camera data
        //and adjust the camera data accordingly.

        float[] vertexData = TextureRotateUtil.VERTEX;
        // 1: the camera rotation decides how vertex coordinates are matched to texture coordinates
        // 2: front vs. rear camera decides whether to mirror horizontally
        // 3: verticalFlip is passed as true because, without flipping the texture coordinates
        //    vertically, the image would come out upside down: the display's coordinate system and
        //    OpenGL texture coordinates do not match

        float[] textureData = TextureRotateUtil.getRotateTexture(Rotation.fromInt(rotation),
                horizontalFlip, verticalFlip);

        mVertexBuffer.clear();
        mVertexBuffer.put(vertexData).position(0);
        mTextureBuffer.clear();
        mTextureBuffer.put(textureData).position(0);

    }

    void createVertexArray(){
        //1: create the vertex-coordinate and texture-coordinate buffers
        mVertexBuffer = ByteBuffer.allocateDirect(TextureRotateUtil.VERTEX.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        mVertexBuffer.put(TextureRotateUtil.VERTEX).position(0);

        mTextureBuffer = ByteBuffer.allocateDirect(TextureRotateUtil.TEXTURE_ROTATE_0.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        mTextureBuffer.put(TextureRotateUtil.TEXTURE_ROTATE_0).position(0);
        if(mVertexBuffer.hasArray() ){
            Log.d(TAG,"createVertexArray :" + Arrays.toString(mVertexBuffer.array()) );
        }
        if(mTextureBuffer.hasArray()){
            Log.d(TAG,"createVertexArray :" +Arrays.toString(mTextureBuffer.array()) );
        }
    }
}

CameraBaseFilter is the first-stage filter that handles the camera data. It covers two operations: creating and managing the FBO, and rendering the camera data onto the texture attached to that FBO.

//basic camera filter
public class CameraBaseFilter extends BaseFilter {
    private final static  String TAG = "CameraBaseFilter";
    public CameraBaseFilter(Context context) {
        super(GLesUtils.readTextFileFromResource(context, R.raw.base_fliter_normal_vertex),
                GLesUtils.readTextFileFromResource(context, R.raw.base_filter_nomal_oes_fragement));
    }

    private int textureTransformLocation;//uniform handle of the texture transform matrix in the GLSL shader
    protected void onInit() {
        super.onInit();
        textureTransformLocation = GLES30.glGetUniformLocation(getProgramId(), "textureTransform");
        updateVertexArray();
    }

    private void updateVertexArray(){
        mVertexBuffer = ByteBuffer.allocateDirect(TextureRotateUtil.VERTEX.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        mVertexBuffer.put(TextureRotateUtil.VERTEX).position(0);

        mTextureBuffer = ByteBuffer.allocateDirect(TextureRotateUtil.TEXTURE_ROTATE_90.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        mTextureBuffer.put(TextureRotateUtil.getRotateTexture(Rotation.ROTATION_90, false, true))
                .position(0);
    }


    private float[] textureTransformMatrix;
    public void setTextureTransformMatrix(float[] matrix) {
        textureTransformMatrix = matrix;
    }

    @Override
    public int onDrawFrame(int textureId) {
        return onDrawFrame(textureId, mVertexBuffer, mTextureBuffer);
    }

    int count = 0;
    @Override
    public int onDrawFrame(int textureId, FloatBuffer vertexBuffer, FloatBuffer textureBuffer) {
        if (!hasInitialized()) {
            return GLesUtils.NOT_INIT;
        }
//        Log.d(TAG,"getProgramId() :" +getProgramId());
        GLES30.glUseProgram(getProgramId());
        runPendingOnDrawTask();
        if(count == 0){
            Log.d(TAG,"onDrawFrame getProgramId() :" +getProgramId());
            Log.d(TAG,"onDrawFrame textureTransformLocation() :" +
                    Arrays.toString(textureTransformMatrix));
            Log.d(TAG,"onDrawFrame mInputWidth :" +
                    mInputWidth + ",mInputHeight:" + mInputHeight);

        }
        count++;
        //enable the vertex coordinates
        vertexBuffer.position(0);
        GLES30.glVertexAttribPointer(mAttributePosition,
                2, GLES30.GL_FLOAT, false, 0, vertexBuffer);
        GLES30.glEnableVertexAttribArray(mAttributePosition);

        //enable the texture coordinates
        textureBuffer.position(0);
        GLES30.glVertexAttribPointer(mAttributeTextureCoordinate,
                2, GLES30.GL_FLOAT, false, 0, textureBuffer);
        GLES30.glEnableVertexAttribArray(mAttributeTextureCoordinate);

        //upload the texture transform matrix
        GLES30.glUniformMatrix4fv(textureTransformLocation,
                1, false, textureTransformMatrix, 0);


        //bind the texture; here it is the OES texture created for the camera preview
        if (textureId != GLesUtils.NO_TEXTURE) {
            GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
            GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
            GLES30.glUniform1i(mUniformTexture, 0);
        }

        //issue the draw; after drawing, disable the vertex attributes and unbind the texture
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);
        GLES30.glDisableVertexAttribArray(mAttributePosition);
        GLES30.glDisableVertexAttribArray(mAttributeTextureCoordinate);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);

        return GLesUtils.ON_DRAWN;
    }

    @Override
    public void onInputSizeChanged(int width, int height) {
        super.onInputSizeChanged(width, height);
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        destroyFrameBuffer();
    }

    public void bindFrameBuffer(){
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, frameBuffer[0]);

        GLES30.glViewport(0,0,mInputWidth,mInputHeight);
        GLES30.glClear(GLES30.GL_COLOR_BUFFER_BIT
                | GLES30.GL_DEPTH_BUFFER_BIT | GLES30. GL_STENCIL_BUFFER_BIT);
    }
 
    public void bindFrameBufferAndPoi(int xStart,int yStart){
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, frameBuffer[0]);
        GLES30.glViewport(xStart,yStart,mInputWidth,mInputHeight);
    }

    public void unBindFrameBuffer(){
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER,0);
    }

    //draw onto the FBO; binding/unbinding the FBO is NOT handled inside this method
    public int onDrawToFramebuffer(final int textureId){

        GLES30.glUseProgram(getProgramId());

        mVertexBuffer.position(0);
        GLES30.glVertexAttribPointer(mAttributePosition, 2, GLES30.GL_FLOAT, false, 0, mVertexBuffer);
        GLES30.glEnableVertexAttribArray(mAttributePosition);
        mTextureBuffer.position(0);
        GLES30.glVertexAttribPointer(mAttributeTextureCoordinate, 2, GLES30.GL_FLOAT, false, 0, mTextureBuffer);
        GLES30.glEnableVertexAttribArray(mAttributeTextureCoordinate);
        GLES30.glUniformMatrix4fv(textureTransformLocation, 1, false, textureTransformMatrix, 0);

        if (textureId != GLesUtils.NO_TEXTURE) {
            GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
            GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
            GLES30.glUniform1i(mUniformTexture, 0);
        }

        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);

        GLES30.glDisableVertexAttribArray(mAttributePosition);
        GLES30.glDisableVertexAttribArray(mAttributeTextureCoordinate);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
        return frameBufferTexture[0];
    }

    //draw onto the FBO; binding/unbinding the FBO is handled inside this method
    public int onDrawToTexture(int textureId) {
        if (!hasInitialized()) {
            return GLesUtils.NOT_INIT;
        }

        if (frameBuffer == null) {
            return GLesUtils.NO_TEXTURE;
        }

        GLES30.glUseProgram(getProgramId());
        runPendingOnDrawTask();
        GLES30.glViewport(0, 0, frameWidth, frameHeight);
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, frameBuffer[0]);

        mVertexBuffer.position(0);
        GLES30.glVertexAttribPointer(mAttributePosition, 2, GLES30.GL_FLOAT, false, 0, mVertexBuffer);
        GLES30.glEnableVertexAttribArray(mAttributePosition);
        mTextureBuffer.position(0);
        GLES30.glVertexAttribPointer(mAttributeTextureCoordinate, 2, GLES30.GL_FLOAT, false, 0, mTextureBuffer);
        GLES30.glEnableVertexAttribArray(mAttributeTextureCoordinate);
        GLES30.glUniformMatrix4fv(textureTransformLocation, 1, false, textureTransformMatrix, 0);

        if (textureId != GLesUtils.NO_TEXTURE) {
            GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
            GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
            GLES30.glUniform1i(mUniformTexture, 0);
        }

        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);

        GLES30.glDisableVertexAttribArray(mAttributePosition);
        GLES30.glDisableVertexAttribArray(mAttributeTextureCoordinate);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);

        GLES30.glViewport(0, 0, mOutputWidth, mOutputHeight);
        return frameBufferTexture[0];
    }

    //initialize the FBO
    public void initFrameBuffer(int width, int height){
        //if an FBO already exists with a different size, destroy it first
        if (frameBuffer != null && (frameWidth != width || frameHeight != height))
            destroyFrameBuffer();

        //create the FBO
        if (frameBuffer == null) {
            //the parameters are the preview width and height
            frameWidth = width;
            frameHeight = height;

            frameBuffer = new int[1];
            frameBufferTexture = new int[1];
            //generate the framebuffer object
            GLES30.glGenFramebuffers(1, frameBuffer, 0);

            //generate the texture that will be attached to the FBO
            GLES30.glGenTextures(1, frameBufferTexture, 0);
            GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, frameBufferTexture[0]);
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_LINEAR);
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
            GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);

            //allocate storage for the FBO-attached texture
            GLES30.glTexImage2D(GLES30.GL_TEXTURE_2D, 0, GLES30.GL_RGBA, width, height,
                    0, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, null);
            //bind the FBO
            GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, frameBuffer[0]);
            //attach the texture to the FBO as its color attachment
            GLES30.glFramebufferTexture2D(GLES30.GL_FRAMEBUFFER, GLES30.GL_COLOR_ATTACHMENT0,
                    GLES30.GL_TEXTURE_2D, frameBufferTexture[0], 0);


            if (GLES30.glCheckFramebufferStatus(GLES30.GL_FRAMEBUFFER)!= GLES30.GL_FRAMEBUFFER_COMPLETE) {
                Log.e(TAG,"glCheckFramebufferStatus not GL_FRAMEBUFFER_COMPLETE");
                return ;
            }else {
                Log.i(TAG,"glCheckFramebufferStatus  GL_FRAMEBUFFER_COMPLETE");
            }
            GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
            GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, 0);

        }

    }

    private int[] frameBuffer = null;
    private int[] frameBufferTexture = null;
    private int frameWidth = -1;
    private int frameHeight = -1;
    public void destroyFrameBuffer() {
        if (frameBufferTexture != null) {
            GLES30.glDeleteTextures(1, frameBufferTexture, 0);
            frameBufferTexture = null;
        }
        if (frameBuffer != null) {
            GLES30.glDeleteFramebuffers(1, frameBuffer, 0);
            frameBuffer = null;
        }
        frameWidth = -1;
        frameHeight = -1;
    }
}

The third filter's job is to take the camera and video data composited onto the FBO texture in the previous two passes and draw it to the screen:

/**
 * This filter displays the camera output using a regular OpenGL 2D texture.
 * Before using it, the FBO must first be used to convert the camera's OES texture into a plain 2D texture.
 */

public class Camera2dTextureNormalFilter extends  BaseFilter{


    // R.raw.base_camera_filter_normal_texture2d_fragment
    public  static  final String FS = "precision mediump float;\n" +
            "uniform sampler2D inputImageTexture;\n" +
            "varying vec2 textureCoordinate;\n" +
            "void main()\n" +
            "{\n" +
            "    gl_FragColor = texture2D(inputImageTexture, textureCoordinate);\n" +
            "}";

    public Camera2dTextureNormalFilter(Context context) {
        super(NORMAL_VERTEX_SHADER, FS);
    }

    @Override
    public void init() {
        super.init();
    }

    @Override
    public void onDrawFrameNormalTexture(int textureId) {
        onDrawFrameNormalTexture(textureId,mVertexBuffer,mTextureBuffer);
    }

}
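NORMAL_VERTEX_SHADER and the attribute names it uses are defined in BaseFilter, which is not shown here. As an assumption for readers, a plain pass-through vertex shader matching the fragment shader above would look roughly like the following; the attribute names are guesses, only the varying textureCoordinate is fixed by the fragment shader:

    // Assumed shape of NORMAL_VERTEX_SHADER (the real constant lives in BaseFilter, not shown;
    // attribute names are hypothetical). It passes the vertex position through unchanged and
    // forwards the texture coordinate to the fragment shader's "textureCoordinate" varying.
    public static final String ASSUMED_NORMAL_VERTEX_SHADER =
            "attribute vec4 position;\n" +
            "attribute vec4 inputTextureCoordinate;\n" +
            "varying vec2 textureCoordinate;\n" +
            "void main()\n" +
            "{\n" +
            "    gl_Position = position;\n" +
            "    textureCoordinate = inputTextureCoordinate.xy;\n" +
            "}";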