Android OpenGL ES 2.0: Model, View, Projection, and Viewport

For many beginners the view and projection transforms are very hard to understand, yet they are extremely important: without a clear grasp of them you simply cannot position a 3D object in space or control the viewing angle (a different viewing angle gives a different result). After reading countless posts, I found an excellent blog article:

http://blog.csdn.net/kesalin/article/details/7168967

To keep this blog as original as possible I will not repost it here, so readers should definitely read the article at the link above; it explains everything with diagrams. Work through it and verify it with your own test program and it will all become clear. The author writes for Apple apps, but that does not matter: the theory is the same.

Here is a rough summary:

Concept 1:

a> Viewport transform: below, in the context of the program, is the definition of a set of vertex coordinates. If you look at typical samples you will notice that the x, y and z values are usually set with a magnitude of 1:

private float vertexs[]={
        0.0f,0.0f,0.0f,
        1.0f,0.0f,0.0f,
        0.0f,1.0f,0.0f
};

The program then displays a 3D image on the mobile device's screen, so how does this value of 1 get mapped onto the screen? This 1 is neither a pixel count nor a ratio (say 1:500, where 1 would stand for 500 pixels), yet once the program runs a 3D shape appears. This is exactly what the blog above calls the viewport transform: the step from Normalized Device Space to Window Space.

The coordinates set in the code above are "coordinates" in Normalized Device Space; to show them on a mobile device's screen a conversion is needed. The conversion rule (the standard viewport mapping) is:

xw = (xnd + 1) * width / 2 + x
yw = (ynd + 1) * height / 2 + y

The parameters x, y, width and height in this formula are the ones set via:

glViewport(x, y, width, height);

(xw, yw) are the window (screen) coordinates;

(xnd, ynd) is the point after projection and normalization (a point in the Normalized Device Space shown in the linked post's figure).
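Just to illustrate the mapping (this helper is made up for this post and is not part of the demo project), in Java it looks like this:

// Maps a point from Normalized Device Space ([-1, 1] on each axis) into window
// coordinates, given the values passed to glViewport(x, y, width, height).
public static float[] ndcToWindow(float xnd, float ynd,
                                  int x, int y, int width, int height) {
    final float xw = (xnd + 1.0f) * width  / 2.0f + x;
    final float yw = (ynd + 1.0f) * height / 2.0f + y;
    return new float[]{ xw, yw };
}

On a 1080 x 1920 viewport with x = y = 0, the vertex (1.0, 0.0) from the array above maps to the window point (1080, 960).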


Concept 2:

b> Model-view transform: there are two cases.

1> Transform the position and rotation of the 3D object in space while the observer (often called the Camera) stays fixed;

2> Keep the position and rotation of the 3D object fixed while the observer moves.

So when you want to look at a 3D object from a different angle, you can either transform the object's position/orientation or transform the observer's position/orientation.

To transform the 3D object, apply matrix translations, rotations, scalings, and so on;

To change the observer's viewpoint, use:

gluLookAt(eyex, eyey, eyez, centerx, centery, centerz, upx, upy, upz);


eye is the position of the camera/viewer, center is the point the camera or eye is focused on (together with eye it determines the viewing direction), and up is the eye's upward direction; note that up only specifies a direction, its magnitude is irrelevant. Calling this function sets up the scene being observed, and the objects in that scene are then processed by OpenGL. By default the eye sits at the origin, pointing down the negative Z axis (into the screen), with the up direction along the positive Y axis.
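There is no gluLookAt in OpenGL ES 2.0 on Android; the equivalent is android.opengl.Matrix.setLookAtM, which the renderer later in this post uses. As a small sketch, the default camera described above would be built like this:

// Android (OpenGL ES 2.0) equivalent of the default camera described above.
float[] viewMatrix = new float[16];
Matrix.setLookAtM(viewMatrix, 0,
        0.0f, 0.0f, 0.0f,     // eye: at the origin
        0.0f, 0.0f, -1.0f,    // center: a point straight down the negative Z axis
        0.0f, 1.0f, 0.0f);    // up: positive Y (only the direction matters)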


Concept 3:

c> Projection transform: the purpose of the projection transform is to determine how objects in 3D space are projected onto a 2D plane to form a 2D image; those 2D images are then rendered to the screen after the viewport transform.

There are again two cases:

1> Orthographic projection;

2> Perspective projection;


1> Orthographic projection: you can think of it as a special form of perspective projection in which the near and far clipping planes are identical except for their Z position, so objects always keep the same size and do not shrink with distance.

The diagram in the linked post illustrates this well, but one more element would make it even clearer: the 3D object itself. For the object to be visible to the observer, it generally has to be placed inside that box-shaped volume (of course, with certain near/far settings the object can end up "outside the box"). In other words, to see the 3D object you first have to put it between the two clipping planes (between the two shaded planes in that figure), and if necessary also scale it so that it can be viewed comfortably through the near plane.

Setting up an orthographic projection:

glOrtho(left, right, bottom, top, zNear, zFar);
left, right, bottom and top define the size of the near clipping plane, while zNear and zFar define the distances from the Camera/Viewer to the near and far clipping planes (note that both distances are positive).
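On Android the counterpart of glOrtho is Matrix.orthoM. A minimal sketch, assuming it is called from onSurfaceChanged with the surface's width and height (mirroring the frustumM setup used later in this post):

private void setOrthoProjection(int width, int height) {
    final float ratio = (float) width / height;
    // Near clipping plane spans [-ratio, ratio] x [-1, 1];
    // the near and far planes are 1 and 10 units in front of the camera.
    Matrix.orthoM(mProjectionMatrix, 0, -ratio, ratio, -1.0f, 1.0f, 1.0f, 10.0f);
}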


2> Perspective projection: depending on which library you use, there are two forms:

<I> The form provided by OpenGL (ES):

glFrustum(left, right, bottom, top, zNear, zFar);

left, right, bottom and top define the size of the near clipping plane, while zNear and zFar define the distances from the Camera/Viewer to the near and far clipping planes (note that both distances are positive). Together these six parameters define a volume bounded by six clipping planes, usually called the view frustum or viewing volume. Only objects inside this frustum are visible; objects outside it are effectively out of view, are clipped away, and are not rendered by OpenGL.

This form of the model is set up by calling glFrustum.


<II> The form used by the GLUT helper library:

Note that this model and the one above look the same; only the labeled parameters differ.

gluPerspective(fovy, aspect, zNear, zFar);

fovy defines the camera's field-of-view angle in the y direction (between 0 and 180 degrees), aspect defines the width-to-height ratio of the near clipping plane (aspect = w/h), and zNear and zFar define the distances from the Camera/Viewer to the near and far clipping planes (both positive). These four parameters likewise define a view frustum.

In OpenGL ES 2.0 we have to implement this function ourselves. Using the trigonometric relation tan(fovy/2) = (h/2) / zNear we can compute h, then w = h * aspect, which gives the six parameters left, right, top, bottom, zNear, zFar to plug into the frustum formula introduced above.
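A minimal sketch of that derivation on Android, using android.opengl.Matrix.frustumM (the helper name perspectiveFrustumM is made up for this example; newer Android versions also provide Matrix.perspectiveM, which does the same job):

// Hypothetical helper: builds a perspective projection from fovy/aspect,
// using tan(fovy/2) = (h/2) / zNear and w = h * aspect.
public static void perspectiveFrustumM(float[] m, float fovyDegrees, float aspect,
                                       float zNear, float zFar) {
    final float h = 2.0f * zNear * (float) Math.tan(Math.toRadians(fovyDegrees) / 2.0);
    final float w = h * aspect;
    Matrix.frustumM(m, 0, -w / 2.0f, w / 2.0f, -h / 2.0f, h / 2.0f, zNear, zFar);
}

For example, perspectiveFrustumM(mProjectionMatrix, 45.0f, (float) width / height, 1.0f, 10.0f) gives a 45-degree vertical field of view over the same near/far range as the demo below.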

(The original post adds two supplementary diagrams here; they are not reproduced in this copy.)



Conclusion / note:

When writing OpenGL code, the order from first to last is: set the viewport (viewport transform), set the projection transform, set the view transform, set the model transform, and then describe the object in its local coordinate space. In the explanation above, for ease of understanding, the order was the other way around: an object starts in its local coordinate space, is transformed into world space, then into camera (view) space, then into projection space. Since the model transform covers going from local space to world space, we understand the 3D transforms in one order but write the code in the reverse order; if you think of it as left-multiplying matrices, it is easy to see why the order is reversed.
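In symbols (OpenGL uses column vectors, so each new transform is multiplied on the left), a vertex in local space ends up in clip space as:

v_clip = P * V * M * v_local

Reading right to left gives the conceptual order model -> view -> projection, which is why the matrices appear in the opposite order when the code builds the combined MVP matrix.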


Now the key of keys: how do we carry the model above over into a program, and where does it show up in the code?

We usually define:

private float[] mMVPMatrix = new float[16];
private float[] mViewMatrix = new float[16];
private float[] mProjectionMatrix = new float[16];

<1> mViewMatrix stores a 4x4 matrix describing the position of the observer's eye (also called the Camera):

Matrix.setLookAtM(mViewMatrix, 0, eyeX, eyeY, eyeZ, lookX, lookY, lookZ, upX, upY, upZ);

This method's purpose was introduced earlier; it writes the configured matrix into mViewMatrix and returns it there. This gives us the eye's base position in space: any later camera adjustment is multiplied against this base matrix to obtain the final camera position.

<2> mProjectionMatrix stores the perspective projection matrix: we build a perspective model with the call below and save it into this matrix as another base matrix.

Matrix.frustumM(mProjectionMatrix, 0, left, right, bottom, top, near, far);

After the perspective projection is set up, the result is returned in mProjectionMatrix.

<3> mMVPMatrix combines the matrices built above into one: a model in which the observer and the viewing range/angle are fixed. It is obtained by multiplying the projection matrix, the view matrix and the model matrix together:

Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, 0.0f, 0.0f, -5.0f);

Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
The coordinates used later when drawing (for example, the triangle's vertex coordinates) are given in the program as "absolute" or "theoretical" coordinates. Multiplying those coordinates by mMVPMatrix is what maps the triangle into the viewing model so that it shows up in it (in the vertex shader shown later this is exactly gl_Position = u_MVPMatrix * a_Position). mModelMatrix above defines the object's (here, the triangle's) initial placement; the triangle's coordinates are positioned relative to it.

So with these three steps the perspective viewing model is established. Multiplying the object's coordinates by mMVPMatrix then places the object within that model for display (of course this only provides the frame of reference: the object may still fall outside the viewing volume, in which case it will not be visible).


Based on this we can put together a small Android demo to try it out (Android Studio project []):

The code is as follows:

package org.pumpkin.pumpkintutor2gsls;

import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;

import org.pumpkin.pumpkintutor2gsls.tutor2.cube.CubeSurfaceView;
import org.pumpkin.pumpkintutor2gsls.tutor2.triangle.TriangleSurfaceView;
import org.pumpkin.pumpkintutor2gsls.tutor2.triangle1.TriangleSurfaceView1;

public class PumpKinMainActivity extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(new TriangleSurfaceView1(this)/*new CubeSurfaceView(this)*//*new TriangleSurfaceView(this)*/);
    }
}

package org.pumpkin.pumpkintutor2gsls.tutor2.triangle1;

import android.content.Context;
import android.opengl.GLSurfaceView;

import org.pumpkin.pumpkintutor2gsls.tutor2.triangle.TriangleRenderer;

/**
 * Project name : PumpKinTutor2Gsls
 * Created by zhibao.liu on 2016/5/18.
 * Time : 11:18
 * Email [email protected]
 * Action : durian
 */
public class TriangleSurfaceView1 extends GLSurfaceView {

    public TriangleSurfaceView1(Context context) {
        super(context);

        this.setEGLContextClientVersion(2);

        // Request an RGBA8888 surface with a 16-bit depth buffer; this works around
        // the "No Config chosen" error seen on some devices/emulators.
        super.setEGLConfigChooser(8 , 8, 8, 8, 16, 0);

        this.setRenderer(new TriangleRenderer1(context));
        // Render the view only when there is a change in the drawing data
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);

    }

}

package org.pumpkin.pumpkintutor2gsls.tutor2.triangle1;

import android.content.Context;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import android.os.SystemClock;

import org.pumpkin.pumpkintutor2gsls.tutor2.coord.Coord;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

/**
 * Project name : PumpKinTutor2Gsls
 * Created by zhibao.liu on 2016/5/18.
 * Time : 11:17
 * Email [email protected]
 * Action : durian
 */
public class TriangleRenderer1 implements GLSurfaceView.Renderer {

    private float[] mMVPMatrix = new float[16];
    private float[] mViewMatrix = new float[16];
    private float[] mModelMatrix = new float[16];
    private float[] mProjectionMatrix = new float[16];

    private Context mContext;
    private Triangle1 triangle1;
    private Coord coord;

    public TriangleRenderer1(Context context) {
        mContext = context;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {

        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);

        GLES20.glEnable(GLES20.GL_CULL_FACE);
        GLES20.glEnable(GLES20.GL_DEPTH_TEST);

        triangle1 = new Triangle1(mContext);
        triangle1.loadTexture();

        coord = new Coord(mContext);

        // Position the eye at the origin.
        final float eyeX = 0.0f;
        final float eyeY = 0.0f;
        final float eyeZ = 0.0f;

        // We are looking toward the distance
        final float lookX = 0.0f;
        final float lookY = 0.0f;
        final float lookZ = -1.0f;

        // Set our up vector. This is where our head would be pointing were we holding the camera.
        final float upX = 0.0f;
        final float upY = 1.0f;
        final float upZ = 0.0f;

        // Set the view matrix. This matrix can be said to represent the camera position.
        // NOTE: In OpenGL 1, a ModelView matrix is used, which is a combination of a model and
        // view matrix. In OpenGL 2, we can keep track of these matrices separately if we choose.
        Matrix.setLookAtM(mViewMatrix, 0, eyeX, eyeY, eyeZ, lookX, lookY, lookZ, upX, upY, upZ);

    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {

        GLES20.glViewport(0, 0, width, height);

        final float ratio = (float) width / height;
        final float left = -ratio;
        final float right = ratio;
        final float bottom = -1.0f;
        final float top = 1.0f;
        final float near = 1.0f;
        final float far = 10.0f;

        Matrix.frustumM(mProjectionMatrix, 0, left, right, bottom, top, near, far);

    }

    @Override
    public void onDrawFrame(GL10 gl) {

        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

        Matrix.setIdentityM(mModelMatrix, 0);
        Matrix.translateM(mModelMatrix, 0, 0.0f, 0.0f, -5.0f);

        Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
        Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);

        triangle1.draw(mMVPMatrix);
        coord.draw(mMVPMatrix);

    }

}

In the renderer above, adjust the parameters of setLookAtM and of frustumM, re-run, and you will see the viewing angle change.
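For example (a made-up tweak, just for illustration), pulling the eye back along the +Z axis makes the triangle appear smaller, because it is now further from the camera:

// Move the eye from the origin back to z = +2; the triangle, translated to z = -5
// by the model matrix, is now about 7 units away instead of 5 and looks smaller.
Matrix.setLookAtM(mViewMatrix, 0,
        0.0f, 0.0f, 2.0f,     // eye
        0.0f, 0.0f, -1.0f,    // look-at point
        0.0f, 1.0f, 0.0f);    // up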



package org.pumpkin.pumpkintutor2gsls.tutor2.triangle1;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLES20;
import android.opengl.GLUtils;

import org.pumpkin.pumpkintutor2gsls.R;
import org.pumpkin.pumpkintutor2gsls.shader.PumpKinShader;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

/**
 * Project name : PumpKinTutor2Gsls
 * Created by zhibao.liu on 2016/5/18.
 * Time : 11:17
 * Email [email protected]
 * Action : durian
 */
public class Triangle1 {

    private FloatBuffer vertexsBuffer;
    private FloatBuffer colorsBuffer;
    private FloatBuffer texturesBuffer;

    private int mMVPMatrixHandle;
    private int mPositionHandle;
    private int mColorHandle;
    private int mTextureCoordsHandle;
    private int mProgram;

    private Context mContext;
    private Bitmap bitmap;

    private int[] textures=new int[1];

    private float vertexs[]={
            0.0f,0.0f,0.0f,
            2.0f,0.0f,0.0f,
            0.0f,2.0f,0.0f
    };

    private float colors[]={
            1.0f,0.0f,0.0f,1.0f,
            0.0f,1.0f,0.0f,1.0f,
            0.0f,0.0f,1.0f,1.0f
    };

    private float textureCoords[]={
            /*0,0,
            1,0,
            0,1*/
            0,1,
            1,0,
            0,0
    };

    public Triangle1(Context context){

        mContext=context;

        ByteBuffer vbb=ByteBuffer.allocateDirect(vertexs.length*4);
        vbb.order(ByteOrder.nativeOrder());
        vertexsBuffer=vbb.asFloatBuffer();
        vertexsBuffer.put(vertexs);
        vertexsBuffer.position(0);

        ByteBuffer cbb=ByteBuffer.allocateDirect(colors.length*4);
        cbb.order(ByteOrder.nativeOrder());
        colorsBuffer=cbb.asFloatBuffer();
        colorsBuffer.put(colors);
        colorsBuffer.position(0);

        ByteBuffer tbb=ByteBuffer.allocateDirect(textureCoords.length*4);
        tbb.order(ByteOrder.nativeOrder());
        texturesBuffer=tbb.asFloatBuffer();
        texturesBuffer.put(textureCoords);
        texturesBuffer.position(0);

        String vshaderCode= PumpKinShader.loadGsls(mContext,0);
        String fshaderCode=PumpKinShader.loadGsls(mContext,1);

        int mvShaderHandle= GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
        if(mvShaderHandle!=0) {
            GLES20.glShaderSource(mvShaderHandle, vshaderCode);
            GLES20.glCompileShader(mvShaderHandle);

            int[] status=new int[1];
            GLES20.glGetShaderiv(mvShaderHandle,GLES20.GL_COMPILE_STATUS,status,0);
            if(status[0]==0){
                GLES20.glDeleteShader(mvShaderHandle);
                mvShaderHandle=0;
            }

        }

        if(mvShaderHandle==0){
            throw new RuntimeException("failed to create vertex shader !");
        }

        int mfShaderHandle=GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
        if(mfShaderHandle!=0){
            GLES20.glShaderSource(mfShaderHandle,fshaderCode);
            GLES20.glCompileShader(mfShaderHandle);

            int[] status=new int[1];
            GLES20.glGetShaderiv(mfShaderHandle,GLES20.GL_COMPILE_STATUS,status,0);
            if(status[0]==0){
                GLES20.glDeleteShader(mfShaderHandle);
                mfShaderHandle=0;
            }

        }

        if(mfShaderHandle==0){
            throw new RuntimeException("failed to create fragment shader !");
        }

        mProgram=GLES20.glCreateProgram();
        if(mProgram!=0){

            GLES20.glAttachShader(mProgram,mvShaderHandle);
            GLES20.glAttachShader(mProgram,mfShaderHandle);

            GLES20.glLinkProgram(mProgram);

            int[] linkstatus=new int[1];
            GLES20.glGetProgramiv(mProgram,GLES20.GL_LINK_STATUS,linkstatus,0);
            if(linkstatus[0]==0){

                GLES20.glDeleteProgram(mProgram);
                mProgram=0;

            }

        }

        if(mProgram==0){
            throw new RuntimeException("failed to create program !");
        }

    }

    public void draw(float[] mvpmatrix){

        GLES20.glUseProgram(mProgram);

        mPositionHandle=GLES20.glGetAttribLocation(mProgram,"a_Position");
        GLES20.glVertexAttribPointer(mPositionHandle,3,GLES20.GL_FLOAT,false,0,vertexsBuffer);
        GLES20.glEnableVertexAttribArray(mPositionHandle);

        mColorHandle=GLES20.glGetAttribLocation(mProgram,"a_Color");
        GLES20.glVertexAttribPointer(mColorHandle,4,GLES20.GL_FLOAT,false,0,colorsBuffer);
        GLES20.glEnableVertexAttribArray(mColorHandle);

        mTextureCoordsHandle=GLES20.glGetAttribLocation(mProgram,"a_inputTextureCoordinate");
        GLES20.glVertexAttribPointer(mTextureCoordsHandle,2,GLES20.GL_FLOAT,false,0,texturesBuffer);
        GLES20.glEnableVertexAttribArray(mTextureCoordsHandle);

        // Bind the texture to unit 0 and point the sampler at it. Without this,
        // the texture bound once in loadTexture() is unbound again at the end of
        // the previous draw() call, so later frames would sample nothing.
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D,textures[0]);
        GLES20.glUniform1i(GLES20.glGetUniformLocation(mProgram,"inputImageTexture"),0);

        mMVPMatrixHandle=GLES20.glGetUniformLocation(mProgram,"u_MVPMatrix");
        PumpKinShader.checkGLError("glGetUniformLocation");
        GLES20.glUniformMatrix4fv(mMVPMatrixHandle,1,false,mvpmatrix,0);
        PumpKinShader.checkGLError("glUniformMatrix4fv");

        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP,0,3);

        GLES20.glDisableVertexAttribArray(mPositionHandle);
        GLES20.glDisableVertexAttribArray(mColorHandle);
        GLES20.glDisableVertexAttribArray(mTextureCoordsHandle);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D,0);
        GLES20.glDisable(GLES20.GL_BLEND);

    }

    public void loadTexture(){

        GLES20.glGenTextures(1,textures,0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D,textures[0]);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,GLES20.GL_TEXTURE_MAG_FILTER,GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,GLES20.GL_TEXTURE_MIN_FILTER,GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,GLES20.GL_TEXTURE_WRAP_S,GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,GLES20.GL_TEXTURE_WRAP_T,GLES20.GL_CLAMP_TO_EDGE);

        bitmap= BitmapFactory.decodeResource(mContext.getResources(), R.drawable.src);
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D,0,bitmap,0);
        // The pixel data has been uploaded to the GPU, so the bitmap can be recycled.
        bitmap.recycle();

    }

}


Helper class:

package org.pumpkin.pumpkintutor2gsls.shader;

import android.content.Context;
import android.opengl.GLES20;
import android.util.Log;

import java.io.IOException;
import java.io.InputStream;

/**
 * Project name : PumpKinBasicGLSL
 * Created by zhibao.liu on 2016/5/11.
 * Time : 14:26
 * Email [email protected]
 * Action : durian
 */
public class PumpKinShader {

    private final static String TAG="PumpKinShader";

    private static int GLESVersion=20;

    public static void setVersion(int version){

        switch (version){
            case 20:
                GLESVersion=20;
                break;
            case 30:
                GLESVersion=30;
                break;
            default:
                GLESVersion=20;
                break;
        }

    }

    public static String loadGsls(Context context, int type){

        String shadercode="";
        String shaderfilename="";
        switch (type){
            case 0:
                shaderfilename="vshader.glsl";
                break;
            case 1:
                shaderfilename="fshader.glsl";
                break;
            case 2:
                shaderfilename="tvshader.glsl";
                break;
            case 3:
                shaderfilename="tfshader.glsl";
                break;
            case 4:
                shaderfilename="coordvshader.glsl";
                break;
            case 5:
                shaderfilename="coordfshader.glsl";
                break;
        }

        try {
            InputStream is=context.getResources().getAssets().open(shaderfilename);
            int length=is.available();
            byte[] buffer=new byte[length];
            int read = is.read(buffer);
            shadercode=new String(buffer,0,read);
            is.close();
        } catch (IOException e) {
            e.printStackTrace();
        }

        Log.i(TAG,"shadercode : "+shadercode);

        return shadercode;
    }

    public static int loadShader(int type,String shadercode){

        int shader= GLES20.glCreateShader(type);

        GLES20.glShaderSource(shader,shadercode);
        GLES20.glCompileShader(shader);

        return shader;

    }

    public static void checkGLError(String glOperation){
        int error;
        while((error=GLES20.glGetError())!=GLES20.GL_NO_ERROR){
            throw new RuntimeException(glOperation+" : glError "+error);
        }
    }

}


We also add a class that displays the coordinate axes:

package org.pumpkin.pumpkintutor2gsls.tutor2.coord;

import android.content.Context;
import android.opengl.GLES20;

import org.pumpkin.pumpkintutor2gsls.shader.PumpKinShader;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

/**
 * Project name : PumpKinTutor2Gsls
 * Created by zhibao.liu on 2016/5/18.
 * Time : 15:36
 * Email [email protected]
 * Action : durian
 */
public class Coord {

    private FloatBuffer vertexsBuffer;
    private FloatBuffer colorsBuffer;

    private int mPositionHandle;
    private int mColorHandle;
    private int mMVPMatrixHandle;
    private int mProgram;

    private Context mContext;

    private float[] vertexs={
            0,0,0,
            5,0,0,
            0,0,0,
            0,5,0,
            0,0,0,
            0,0,5
    };

    private float[] colors={
        1.0f,0.0f,0.0f,1.0f,
            1.0f,0.0f,0.0f,1.0f,
            0.0f,1.0f,0.0f,1.0f,
            0.0f,1.0f,0.0f,1.0f,
            0.0f,0.0f,1.0f,1.0f,
            0.0f,0.0f,1.0f,1.0f
    };

    public Coord(Context context){

        mContext=context;

        ByteBuffer vbb=ByteBuffer.allocateDirect(vertexs.length*4);
        vbb.order(ByteOrder.nativeOrder());
        vertexsBuffer=vbb.asFloatBuffer();
        vertexsBuffer.put(vertexs);
        vertexsBuffer.position(0);

        ByteBuffer cbb=ByteBuffer.allocateDirect(colors.length*4);
        cbb.order(ByteOrder.nativeOrder());
        colorsBuffer=cbb.asFloatBuffer();
        colorsBuffer.put(colors);
        colorsBuffer.position(0);

        String vshaderCode= PumpKinShader.loadGsls(mContext,4);
        String fshaderCode=PumpKinShader.loadGsls(mContext,5);

        int vshaderHandle=GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
        if(vshaderHandle!=0){

            GLES20.glShaderSource(vshaderHandle,vshaderCode);
            GLES20.glCompileShader(vshaderHandle);

            int[] status=new int[1];
            GLES20.glGetShaderiv(vshaderHandle,GLES20.GL_COMPILE_STATUS,status,0);
            if(status[0]==0){
                GLES20.glDeleteShader(vshaderHandle);
                vshaderHandle=0;
            }

        }

        if(vshaderHandle==0){
            throw new RuntimeException("failed to create vertex shader !");
        }

        int fshaderHandle=GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
        if(fshaderHandle!=0){

            GLES20.glShaderSource(fshaderHandle,fshaderCode);
            GLES20.glCompileShader(fshaderHandle);

            int[] status=new int[1];
            GLES20.glGetShaderiv(fshaderHandle,GLES20.GL_COMPILE_STATUS,status,0);
            if(status[0]==0){
                GLES20.glDeleteShader(fshaderHandle);
                fshaderHandle=0;
            }

        }

        if(fshaderHandle==0){
            throw new RuntimeException("failed to create frag shader !");
        }

        mProgram=GLES20.glCreateProgram();
        if(mProgram!=0){

            GLES20.glAttachShader(mProgram,vshaderHandle);
            GLES20.glAttachShader(mProgram,fshaderHandle);

            GLES20.glLinkProgram(mProgram);

            int[] linkstatus=new int[1];
            GLES20.glGetProgramiv(mProgram,GLES20.GL_LINK_STATUS,linkstatus,0);
            if(linkstatus[0]==0){
                GLES20.glDeleteProgram(mProgram);
                mProgram=0;
            }

        }

        if(mProgram==0){
            throw new RuntimeException("failed to create program !");
        }

    }

    public void draw(float[] mvpMatrix){

        GLES20.glUseProgram(mProgram);

        mPositionHandle=GLES20.glGetAttribLocation(mProgram,"a_Position");
        GLES20.glVertexAttribPointer(mPositionHandle,3,GLES20.GL_FLOAT,false,0,vertexsBuffer);
        GLES20.glEnableVertexAttribArray(mPositionHandle);

        mColorHandle=GLES20.glGetAttribLocation(mProgram,"a_Color");
        GLES20.glVertexAttribPointer(mColorHandle,4,GLES20.GL_FLOAT,false,0,colorsBuffer);
        GLES20.glEnableVertexAttribArray(mColorHandle);

        mMVPMatrixHandle=GLES20.glGetUniformLocation(mProgram,"u_MvpMatrix");
        PumpKinShader.checkGLError("glGetUniformLocation");
        GLES20.glUniformMatrix4fv(mMVPMatrixHandle,1,false,mvpMatrix,0);
        PumpKinShader.checkGLError("glUniformMatrix4fv");

        GLES20.glDrawArrays(GLES20.GL_LINES,0,vertexs.length/3);

        GLES20.glDisableVertexAttribArray(mColorHandle);
        GLES20.glDisableVertexAttribArray(mPositionHandle);

    }

}


Below are the GLSL shader scripts:

vshader.glsl :

uniform mat4 u_MVPMatrix;
uniform vec4 u_Color;
attribute vec4 a_Position;
attribute vec4 a_Color;
attribute vec4 a_inputTextureCoordinate;
varying vec2 textureCoordinate;
varying vec4 v_Color;
void main(){
    gl_Position=u_MVPMatrix*a_Position;
    v_Color=a_Color;
    textureCoordinate=a_inputTextureCoordinate.xy;
}

fshader.glsl :

precision mediump float;
varying vec4 v_Color;
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;
void main(){
    gl_FragColor=v_Color*texture2D(inputImageTexture,textureCoordinate);
}


GLSL scripts for the coordinate axes:

coordvshader.glsl:

uniform mat4 u_MvpMatrix;
attribute vec4 a_Position;
attribute vec4 a_Color;
varying vec4 v_Color;
void main() {

    gl_Position=u_MvpMatrix*a_Position;
    v_Color=a_Color;

}


coordfshader.glsl :

precision mediump float;
varying vec4 v_Color;
void main() {

    gl_FragColor=v_Color;

}


In addition, add an image named src.png under the drawable directory.

Result (the screenshots from the original post are not reproduced here):





Finally: download Nate Robins' tutors-win32.zip package. It contains interactive 3D demos in which you can change the projection and viewing parameters and watch the effect, which takes the theory above one step further. These demos are a real gem!




