Using the Qiniu Cloud Streaming SDK

http://www.jianshu.com/p/e4c43c6551d1


Qiniu Cloud Live Streaming: Quick-Start Development for the Android Publisher


Preface

In my view, documentation billed as a quick start should be friction-free: you should be able to type it in and run it directly. But Qiniu iterates fast, and the documentation has not kept pace with the code, so the quick-start section is out of date: deprecated classes and methods still appear in it, and a newcomer runs into all kinds of errors. To be fair, if you cross-check the release notes and the demo and adjust accordingly, it can still qualify as a quick start, just not the kind I had in mind. This article therefore presents a quick start for the Android publishing side of Qiniu Cloud live streaming, based on SDK version 2.1.0 and following Qiniu's own style.

Quick Start

Development Environment

  • Complete all of the steps in BOOK - I first: set up a business server with the Pili server SDK. The stream information the SDK publishes with comes from the StreamJson returned by that server.
  • Android Studio as the IDE. Official download page
  • Download the official Android SDK. Official download page. PLDroidMediaStreaming requires Android Min API 15 for software encoding and Min API 18 for hardware encoding.
  • Download the latest PLDroidMediaStreaming JAR and SO files. Download page
  • Debug on a physical device; the emulator cannot be used.
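The publish URL that the server's StreamJson resolves to follows the rtmp://<domain>/<hub>/<title> shape. As a rough illustration only (PublishUrlBuilder and its parameters are my own invention, not part of the SDK), assembling such a URL could look like:

```java
// Hypothetical helper, NOT part of the Qiniu SDK: assemble a Pili-style
// RTMP publish URL from the pieces a business server would return.
public class PublishUrlBuilder {
    public static String build(String domain, String hub, String title) {
        // Publish URLs take the form rtmp://<domain>/<hub>/<title>
        return String.format("rtmp://%s/%s/%s", domain, hub, title);
    }
}
```

In practice, take the URL straight out of the StreamJson your server returns rather than concatenating it by hand.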

Create a New Project

  • Create a Project in Android Studio

    new project.png
  • Configure the new project
    • Fill in the Application id
    • Fill in the Company Domain
    • Fill in the Package id
    • Choose the Project location
    • The default values are fine

new project.png
  • Choose Target Android Devices
    This example uses Minimum SDK API 18 (software encoding requires Minimum SDK API 15; hardware encoding requires Minimum SDK API 18)

Target.png
  • Choose Empty Activity

Paste_Image.png
  • Fill in the Main Activity details; it will serve as android.intent.action.MAIN

Paste_Image.png
  • Finish the wizard

Paste_Image.png
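In Gradle terms, the Minimum SDK choice corresponds to roughly this fragment of app/build.gradle (a sketch; the exact block layout depends on your Gradle plugin version):

```gradle
android {
    defaultConfig {
        // Hardware encoding requires API 18; software encoding needs only 15
        minSdkVersion 18
    }
}
```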

Import the SDK

  • Switch the folder pane on the left to the Project view

snipaste20161110_094628.jpg
  • Create a jniLibs directory under app/src/main. Copy the files into the corresponding directories as shown in the figure.

snipaste20161110_094922.jpg
  • Select pldroid-media-streaming-2.1.0.jar under the libs directory, right-click, and choose Add As Library, as shown (I forgot to take this screenshot; when I tried again after removing the compile line from dependencies, the menu entry no longer appeared, so Qiniu's screenshot is used instead)

Paste_Image.png
  • The import is now complete. Double-click build.gradle to check its contents: the files in the libs directory have been imported automatically. The files involved are:
// jar
pldroid-media-streaming-2.1.0.jar

// so
libpldroid_mmprocessing.so
libpldroid_streaming_aac_encoder.so
libpldroid_streaming_core.so
libpldroid_streaming_h264_encoder.so
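For reference, the resulting wiring in app/build.gradle looks roughly like this (a sketch; app/src/main/jniLibs is already Gradle's default location for .so files, so no extra configuration is normally needed for them):

```gradle
dependencies {
    // Added automatically by "Add As Library"
    compile files('libs/pldroid-media-streaming-2.1.0.jar')
}
```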

Create a Basic Streaming Instance

Add the Required Permissions

  • Add the following uses-permission and uses-feature declarations to the AndroidManifest.xml under app/src/main
<manifest>
    ......
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.WAKE_LOCK" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />

    <uses-feature android:name="android.hardware.camera.autofocus" />
    <uses-feature android:glEsVersion="0x00020000" android:required="true" />
    ......
</manifest>

Add the happy-dns Dependency

  • Open build.gradle in the app directory and add two lines to dependencies
dependencies {  
    ......
    compile 'com.qiniu:happy-dns:0.2.+'
    compile 'com.qiniu.pili:pili-android-qos:0.8.+'
}

Implement Your Own Application

public class SimplePlayerApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        StreamingEnv.init(getApplicationContext());
    }
}

Then register it on the application tag of AndroidManifest.xml via

android:name=".SimplePlayerApplication"

Create the Main Screen

Since this is a quick-start demo, the main screen is deliberately simple: just two buttons, one to start streaming and one to watch, and only streaming is covered here. MainActivity does little more than implement the button click handlers; I have also omitted the permission checking and requesting code, as well as the asynchronous request that fetches the stream. The minimal version looks like this:

public class MainActivity extends AppCompatActivity {
    private static final String TAG = "MainActivity";
    private Button btnpili;
    private Button btnplay;
    private boolean mPermissionEnabled = false;
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        this.btnplay = (Button) findViewById(R.id.btn_play);
        this.btnpili = (Button) findViewById(R.id.btn_pili);

        // Start streaming
        btnpili.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                startActivity(new Intent(MainActivity.this, HWCameraStreamingActivity.class));
            }
        });

        // Watch the stream
        btnplay.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {

            }
        });
    }
}
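The permission checks omitted above reduce to deciding which of the required permissions still need to be requested. A minimal sketch of that decision as plain Java (PermissionHelper is an illustrative name of mine; in a real Activity on Android 6.0+ you would pass the result to ActivityCompat.requestPermissions):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

public class PermissionHelper {
    // Runtime permissions the streaming demo needs on Android 6.0+
    static final String[] REQUIRED = {
            "android.permission.CAMERA",
            "android.permission.RECORD_AUDIO",
            "android.permission.WRITE_EXTERNAL_STORAGE"
    };

    // Return the subset of REQUIRED that has not been granted yet
    public static List<String> missing(Set<String> granted) {
        List<String> out = new ArrayList<>();
        for (String p : REQUIRED) {
            if (!granted.contains(p)) {
                out.add(p);
            }
        }
        return out;
    }
}
```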

Create the Main Screen Layout

activity_main.xml:

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:id="@+id/activity_main"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingBottom="@dimen/activity_vertical_margin"
    android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    tools:context="com.jcmels.liba.simpleplayerdemo.MainActivity">

   <Button
       android:id="@+id/btn_pili"
       android:layout_width="match_parent"
       android:layout_height="wrap_content"
       android:text="Start streaming"/>
    <Button
        android:id="@+id/btn_play"
        android:layout_below="@id/btn_pili"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:text="Watch stream"/>
</RelativeLayout>

Create the Streaming Screen (using HW as the example; Qiniu's demo uses SW)

  • Create an Empty Activity named HWCameraStreamingActivity. Its main responsibilities are:
    • Configuring the publish URL
    • Initializing MediaStreamingManager, the streaming SDK's core class
    • Calling streamingManager.resume(); in onResume
    • Calling streamingManager.startStreaming(); once the READY state is received; startStreaming must run on a non-UI thread.
  • The code for the new HWCameraStreamingActivity is as follows:
public class HWCameraStreamingActivity extends Activity implements StreamingStateChangedListener, CameraPreviewFrameView.Listener {
    private MediaStreamingManager streamingManager;
    private StreamingProfile streamingProfile;
    private MicrophoneStreamingSetting mMicrophoneStreamingSetting;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        setContentView(R.layout.activity_hwcamera_streaming);
        AspectFrameLayout afl = (AspectFrameLayout) findViewById(R.id.cameraPreview_afl);
        afl.setShowMode(AspectFrameLayout.SHOW_MODE.REAL);
        CameraPreviewFrameView cameraPreviewFrameView =
                (CameraPreviewFrameView) findViewById(R.id.cameraPreview_surfaceView);
        cameraPreviewFrameView.setListener(this);
        String publishurl = "replace this with your publish URL";
        streamingProfile = new StreamingProfile();

        try {
            streamingProfile.setVideoQuality(StreamingProfile.VIDEO_QUALITY_MEDIUM2)
                    .setAudioQuality(StreamingProfile.AUDIO_QUALITY_MEDIUM2)
//                .setPreferredVideoEncodingSize(960, 544)
                    .setEncodingSizeLevel(StreamingProfile.VIDEO_ENCODING_HEIGHT_480)
                    .setEncoderRCMode(StreamingProfile.EncoderRCModes.BITRATE_PRIORITY)
//                .setAVProfile(avProfile)
                    .setDnsManager(getMyDnsManager())
                    .setAdaptiveBitrateEnable(true)
                    .setFpsControllerEnable(true)
                    .setStreamStatusConfig(new StreamingProfile.StreamStatusConfig(3))
                    .setPublishUrl(publishurl)
//                .setEncodingOrientation(StreamingProfile.ENCODING_ORIENTATION.PORT)
                    .setSendingBufferProfile(new StreamingProfile.SendingBufferProfile(0.2f, 0.8f, 3.0f, 20 * 1000));
            CameraStreamingSetting setting = new CameraStreamingSetting();
            setting.setCameraId(Camera.CameraInfo.CAMERA_FACING_BACK)
                    .setContinuousFocusModeEnabled(true)
                    .setCameraPrvSizeLevel(CameraStreamingSetting.PREVIEW_SIZE_LEVEL.MEDIUM)
                    .setCameraPrvSizeRatio(CameraStreamingSetting.PREVIEW_SIZE_RATIO.RATIO_16_9);

            streamingManager = new MediaStreamingManager(this, afl, cameraPreviewFrameView,
                    AVCodecType.HW_VIDEO_WITH_HW_AUDIO_CODEC); // hw codec  // soft codec
            mMicrophoneStreamingSetting = new MicrophoneStreamingSetting();
            mMicrophoneStreamingSetting.setBluetoothSCOEnabled(false);
            streamingManager.prepare(setting, mMicrophoneStreamingSetting, null, streamingProfile);
            streamingManager.setStreamingStateListener(this);
        } catch (URISyntaxException e) {
            e.printStackTrace();
        }

    }

    @Override
    protected void onResume() {
        super.onResume();
        streamingManager.resume();
    }

    @Override
    protected void onPause() {
        super.onPause();
        // You must invoke pause here.
        streamingManager.pause();
    }

    @Override
    public void onStateChanged(StreamingState streamingState, Object o) {
        switch (streamingState) {
            case PREPARING:
                break;
            case READY:
                // start streaming when READY
                new Thread(new Runnable() {
                    @Override
                    public void run() {
                        if (streamingManager != null) {
                            streamingManager.startStreaming();
                        }
                    }
                }).start();
                break;
            case CONNECTING:
                break;
            case STREAMING:
                // The av packet had been sent.
                break;
            case SHUTDOWN:
                // The streaming had been finished.
                break;
            case IOERROR:
                // Network connect error.
                break;
            case SENDING_BUFFER_EMPTY:
                break;
            case SENDING_BUFFER_FULL:
                break;
            case AUDIO_RECORDING_FAIL:
                // Failed to record audio.
                break;
            case OPEN_CAMERA_FAIL:
                // Failed to open camera.
                break;
            case DISCONNECTED:
                // The socket is broken while streaming
                break;
        }
    }

    private static DnsManager getMyDnsManager() {
        IResolver r0 = new DnspodFree();
        IResolver r1 = AndroidDnsServer.defaultResolver();
        IResolver r2 = null;
        try {
            r2 = new Resolver(InetAddress.getByName("119.29.29.29"));
        } catch (IOException ex) {
            ex.printStackTrace();
        }
        return new DnsManager(NetworkInfo.normal, new IResolver[]{r0, r1, r2});
    }

    @Override
    public boolean onSingleTapUp(MotionEvent e) {
        return false;
    }

    @Override
    public boolean onZoomValueChanged(float factor) {
        return false;
    }
}
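One subtlety in getMyDnsManager above: if constructing the Resolver for 119.29.29.29 throws, r2 stays null and a null element ends up in the IResolver array. A defensive sketch that drops null entries before building the array (ResolverList is a hypothetical helper of mine, shown with plain Object to stay self-contained):

```java
import java.util.ArrayList;
import java.util.List;

public class ResolverList {
    // Filter out null resolvers so a failed construction never puts
    // a null element into the array handed to DnsManager.
    public static Object[] compact(Object... resolvers) {
        List<Object> out = new ArrayList<>();
        for (Object r : resolvers) {
            if (r != null) {
                out.add(r);
            }
        }
        return out.toArray();
    }
}
```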

Create CameraPreviewFrameView

public class CameraPreviewFrameView extends GLSurfaceView {
    private static final String TAG = "CameraPreviewFrameView";

    public interface Listener {
        boolean onSingleTapUp(MotionEvent e);
        boolean onZoomValueChanged(float factor);
    }

    private Listener mListener;
    private ScaleGestureDetector mScaleDetector;
    private GestureDetector mGestureDetector;

    public CameraPreviewFrameView(Context context) {
        super(context);
        initialize(context);
    }

    public CameraPreviewFrameView(Context context, AttributeSet attrs) {
        super(context, attrs);
        initialize(context);
    }

    public void setListener(Listener listener) {
        mListener = listener;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (!mGestureDetector.onTouchEvent(event)) {
            return mScaleDetector.onTouchEvent(event);
        }
        return false;
    }

    private GestureDetector.SimpleOnGestureListener mGestureListener = new GestureDetector.SimpleOnGestureListener() {
        @Override
        public boolean onSingleTapUp(MotionEvent e) {
            if (mListener != null) {
                mListener.onSingleTapUp(e);
            }
            return false;
        }
    };

    private ScaleGestureDetector.SimpleOnScaleGestureListener mScaleListener = new ScaleGestureDetector.SimpleOnScaleGestureListener() {

        private float mScaleFactor = 1.0f;

        @Override
        public boolean onScaleBegin(ScaleGestureDetector detector) {
            return true;
        }

        @Override
        public boolean onScale(ScaleGestureDetector detector) {
            // factor > 1, zoom
            // factor < 1, pinch
            mScaleFactor *= detector.getScaleFactor();

            // Don't let the object get too small or too large.
            mScaleFactor = Math.max(0.01f, Math.min(mScaleFactor, 1.0f));

            return mListener != null && mListener.onZoomValueChanged(mScaleFactor);
        }
    };

    private void initialize(Context context) {
        Log.i(TAG, "initialize");
        mScaleDetector = new ScaleGestureDetector(context, mScaleListener);
        mGestureDetector = new GestureDetector(context, mGestureListener);
    }
}
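The pinch-zoom handling above boils down to accumulating the detector's scale factor and clamping the result. Isolated as plain Java for clarity (ZoomClamp is my own illustrative name):

```java
public class ZoomClamp {
    // Mirror of the logic in onScale: accumulate the pinch factor,
    // then clamp so the zoom value stays within [0.01, 1.0].
    public static float apply(float current, float detectorFactor) {
        float f = current * detectorFactor;
        return Math.max(0.01f, Math.min(f, 1.0f));
    }
}
```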

Create the Streaming Screen Layout

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:id="@+id/content"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:background="@color/background_floating_material_dark"
    tools:context=".HWCameraStreamingActivity" >

    <com.qiniu.pili.droid.streaming.widget.AspectFrameLayout
        android:id="@+id/cameraPreview_afl"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_centerHorizontal="true"
        android:layout_alignParentTop="true">

        <com.jcmels.liba.simpleplayerdemo.CameraPreviewFrameView
            android:id="@+id/cameraPreview_surfaceView"
            android:layout_width="match_parent"
            android:layout_height="match_parent"
            android:layout_gravity="center" />

    </com.qiniu.pili.droid.streaming.widget.AspectFrameLayout>
</RelativeLayout>

Once the app is running, tap Start streaming and publishing begins.

Test Playback

  • Method: fetch the playback URL that corresponds to the stream from the app server and open it in a player.

Afterword

This quick start gets us only the most basic publishing functionality; the more advanced features require further work.

