I. Introduction
The deprecated legacy Camera API has already been covered in earlier posts. Since Android 5.0, Camera2 has been the recommended API, so from here on this series covers Camera2. As usual, we start with SurfaceView.
If you are not yet familiar with the Camera2 classes and interfaces, these introductions may help:
- CameraManager in detail
- CameraDevice in detail
- CameraCharacteristics in detail
- CameraCaptureSession in detail
- CaptureRequest and CaptureResult
Why SurfaceView?
SurfaceView draws on its own dedicated thread, so rendering does not block the main thread, and it uses double buffering internally for smoother output. Compared with TextureView, it uses less memory and renders with lower latency, but it does not support animations or screenshots.
II. Camera Development Steps
We separate the camera logic from the view: all camera operations are handled by a Camera2Proxy class, and the view holds a Camera2Proxy instance. This also makes Camera2Proxy reusable.
Note: to keep each step readable, the sample code for all the steps below is given in full at the end.
1. Opening the camera
Open the camera with CameraManager's openCamera() method, passing the cameraId of the camera to open, and receive the CameraDevice object in the CameraDevice.StateCallback callback.
Note: the constant CameraCharacteristics.LENS_FACING_FRONT has the value 0, which on most devices is the ID of the back camera, while CameraCharacteristics.LENS_FACING_BACK has the value 1, usually the front camera's ID. The code below reuses these constants as camera IDs, which is why the names look inverted.
2. Camera configuration
In the Camera2 API, the camera's static properties and capabilities are queried through the CameraCharacteristics class, while the configuration for each kind of request (preview, still capture, and so on) is set individually through the CaptureRequest class.
We can set the flash mode, focus mode, exposure compensation, preview format and size, picture format and size, and more.
3. Setting the preview display orientation
Only when the display orientation and the preview size are set correctly will the preview appear without stretching.
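The orientation handling boils down to a bit of arithmetic. Below is a minimal, self-contained sketch of one common form of it (adapted from the orientation-compensation formulas in the Android camera documentation; the class and method names are mine, and the exact formula used later in this article differs slightly): combine the sensor mounting orientation with the display rotation to decide how far the output is rotated, and hence whether the preview width and height must be swapped when choosing a size.

```java
public class OrientationMath {
    // sensorOrientation: CameraCharacteristics.SENSOR_ORIENTATION, typically 90 or 270 on phones
    // deviceRotationDegrees: 0 / 90 / 180 / 270, derived from Display.getRotation()
    static int displayRotation(int sensorOrientation, int deviceRotationDegrees, boolean facingFront) {
        if (facingFront) {
            return (sensorOrientation + deviceRotationDegrees) % 360;
        }
        return (sensorOrientation - deviceRotationDegrees + 360) % 360;
    }

    // when the total rotation is 90 or 270, the camera's width/height are
    // swapped relative to the view, so preview sizes must be compared swapped
    static boolean swapDimensions(int rotation) {
        return rotation == 90 || rotation == 270;
    }

    public static void main(String[] args) {
        // typical phone: back sensor mounted at 90 degrees, device held in portrait
        int rotation = displayRotation(90, 0, false);
        System.out.println(rotation + " swap=" + swapDimensions(rotation));
    }
}
```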
4. Starting and stopping the preview
The preview is driven by repeatedly sending the preview request via CameraCaptureSession's setRepeatingRequest(); the stopRepeating() method stops sending it.
5. Releasing the camera
The camera is an expensive system resource; always release it when you are done.
6. Tap to focus
In short: map the point the user touched on the view into the corresponding point in the camera's coordinate system, then set the focus area through the CaptureRequest.CONTROL_AF_REGIONS field on the CaptureRequest.Builder.
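The last step of that mapping can be sketched in isolation. Below is a minimal, testable version (class and method names are mine): given a tap point already converted into crop-region coordinates, build a small square metering area around it, clamped to the crop region. The full code later wraps such a rect in a MeteringRectangle for CONTROL_AF_REGIONS.

```java
public class TapArea {
    static int clamp(int x, int min, int max) {
        return Math.max(min, Math.min(max, x));
    }

    // returns {left, top, right, bottom}; ratio is the tap area's size
    // relative to the crop region (e.g. 0.1 for 10%)
    static int[] tapRect(double x, double y, int cropWidth, int cropHeight, double ratio) {
        int left = clamp((int) (x - ratio / 2 * cropWidth), 0, cropWidth);
        int right = clamp((int) (x + ratio / 2 * cropWidth), 0, cropWidth);
        int top = clamp((int) (y - ratio / 2 * cropHeight), 0, cropHeight);
        int bottom = clamp((int) (y + ratio / 2 * cropHeight), 0, cropHeight);
        return new int[]{left, top, right, bottom};
    }

    public static void main(String[] args) {
        // tap in the center of a 4000x3000 crop region
        int[] r = tapRect(2000, 1500, 4000, 3000, 0.1);
        System.out.println(r[0] + "," + r[1] + "," + r[2] + "," + r[3]); // 1800,1350,2200,1650
    }
}
```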
7. Pinch to zoom
From the view's touch events we compute the distance between the two fingers, and set the zoom through the CaptureRequest.SCALER_CROP_REGION field on the CaptureRequest.Builder.
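Digital zoom in Camera2 is expressed as a crop of the sensor's active array: for a zoom factor z, keep a centered window 1/z the size of the active array and pass it as SCALER_CROP_REGION. A minimal sketch of that arithmetic (this is not the article's exact step-based implementation; the names are mine):

```java
public class ZoomCrop {
    // returns {left, top, right, bottom} of the crop window inside the
    // sensor's active array, for the given zoom factor (1.0 = no zoom)
    static int[] cropForZoom(int activeWidth, int activeHeight, float zoom) {
        int cropW = (int) (activeWidth / zoom);
        int cropH = (int) (activeHeight / zoom);
        int left = (activeWidth - cropW) / 2;
        int top = (activeHeight - cropH) / 2;
        return new int[]{left, top, left + cropW, top + cropH};
    }

    public static void main(String[] args) {
        int[] r = cropForZoom(4000, 3000, 2f); // 2x zoom
        System.out.println(r[0] + "," + r[1] + "," + r[2] + "," + r[3]); // 1000,750,3000,2250
    }
}
```

The zoom factor should be capped at CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM, and the same crop rect should be reused for still captures so the photo matches the preview.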
8. Taking a picture
Create an ImageReader as the output target for still captures, build a still-capture CaptureRequest, and send it as a single request with CameraCaptureSession's capture() method.
Note the contrast with the preview, which sends a repeating request via setRepeatingRequest().
9. The Camera2Proxy class
The code below also uses an OrientationEventListener, which has not been introduced before: it reports the current device orientation from the sensors, and is used when taking a picture to set the image's rotation; more on that below.
package com.afei.camerademo.camera;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.app.Activity;
import android.content.Context;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.MeteringRectangle;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.ImageReader;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.util.Log;
import android.util.Size;
import android.view.OrientationEventListener;
import android.view.Surface;
import android.view.SurfaceHolder;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
public class Camera2Proxy {
private static final String TAG = "Camera2Proxy";
private Activity mActivity;
private int mCameraId = CameraCharacteristics.LENS_FACING_FRONT; // ID of the camera to open (value 0, the back camera on most devices)
private Size mPreviewSize; // preview size
private CameraManager mCameraManager; // camera manager
private CameraCharacteristics mCameraCharacteristics; // camera properties
private CameraDevice mCameraDevice; // camera device
private CameraCaptureSession mCaptureSession;
private CaptureRequest.Builder mPreviewRequestBuilder; // builder for the preview request
private CaptureRequest mPreviewRequest;
private Handler mBackgroundHandler;
private HandlerThread mBackgroundThread;
private ImageReader mImageReader;
private Surface mPreviewSurface;
private OrientationEventListener mOrientationEventListener;
private int mDisplayRotate = 0;
private int mDeviceOrientation = 0; // device orientation, reported by the orientation sensor
private int mZoom = 1; // zoom level
/**
 * Callback for opening the camera
 */
private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera) {
Log.d(TAG, "onOpened");
mCameraDevice = camera;
initPreviewRequest();
}
@Override
public void onDisconnected(@NonNull CameraDevice camera) {
Log.d(TAG, "onDisconnected");
releaseCamera();
}
@Override
public void onError(@NonNull CameraDevice camera, int error) {
Log.e(TAG, "Camera Open failed, error: " + error);
releaseCamera();
}
};
@TargetApi(Build.VERSION_CODES.M)
public Camera2Proxy(Activity activity) {
mActivity = activity;
mCameraManager = (CameraManager) mActivity.getSystemService(Context.CAMERA_SERVICE);
mOrientationEventListener = new OrientationEventListener(mActivity) {
@Override
public void onOrientationChanged(int orientation) {
mDeviceOrientation = orientation;
}
};
}
@SuppressLint("MissingPermission")
public void openCamera(int width, int height) {
Log.v(TAG, "openCamera");
startBackgroundThread(); // paired with stopBackgroundThread() in releaseCamera()
mOrientationEventListener.enable();
try {
mCameraCharacteristics = mCameraManager.getCameraCharacteristics(Integer.toString(mCameraId));
StreamConfigurationMap map = mCameraCharacteristics.get(CameraCharacteristics
.SCALER_STREAM_CONFIGURATION_MAP);
// picture size: pick the largest supported JPEG size
Size largest = Collections.max(Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)), new
CompareSizesByArea());
Log.d(TAG, "picture size: " + largest.getWidth() + "*" + largest.getHeight());
mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(), ImageFormat.JPEG, 2);
// preview size: matching the picture's aspect ratio above, pick a size close to the view's dimensions
mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class), width, height, largest);
Log.d(TAG, "preview size: " + mPreviewSize.getWidth() + "*" + mPreviewSize.getHeight());
// open the camera
mCameraManager.openCamera(Integer.toString(mCameraId), mStateCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
public void releaseCamera() {
Log.v(TAG, "releaseCamera");
if (null != mCaptureSession) {
mCaptureSession.close();
mCaptureSession = null;
}
if (mCameraDevice != null) {
mCameraDevice.close();
mCameraDevice = null;
}
if (mImageReader != null) {
mImageReader.close();
mImageReader = null;
}
mOrientationEventListener.disable();
stopBackgroundThread(); // paired with startBackgroundThread() in openCamera()
}
public void setImageAvailableListener(ImageReader.OnImageAvailableListener onImageAvailableListener) {
if (mImageReader == null) {
Log.w(TAG, "setImageAvailableListener: mImageReader is null");
return;
}
mImageReader.setOnImageAvailableListener(onImageAvailableListener, null);
}
public void setPreviewSurface(SurfaceHolder holder) {
mPreviewSurface = holder.getSurface();
}
public void setPreviewSurface(SurfaceTexture surfaceTexture) {
surfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
mPreviewSurface = new Surface(surfaceTexture);
}
private void initPreviewRequest() {
try {
mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(mPreviewSurface); // set the Surface the preview is rendered to
mCameraDevice.createCaptureSession(Arrays.asList(mPreviewSurface, mImageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
mCaptureSession = session;
// continuous auto focus
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest
.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// auto exposure with auto flash
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest
.CONTROL_AE_MODE_ON_AUTO_FLASH);
// start the preview as soon as configuration is done
mPreviewRequest = mPreviewRequestBuilder.build();
startPreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
Log.e(TAG, "ConfigureFailed. session: mCaptureSession");
}
}, mBackgroundHandler); // a null handler would mean the current thread's Looper; here we use the background thread
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
public void startPreview() {
if (mCaptureSession == null || mPreviewRequestBuilder == null) {
Log.w(TAG, "startPreview: mCaptureSession or mPreviewRequestBuilder is null");
return;
}
try {
// start the preview, i.e. keep sending the repeating preview request
mCaptureSession.setRepeatingRequest(mPreviewRequest, null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
public void stopPreview() {
if (mCaptureSession == null || mPreviewRequestBuilder == null) {
Log.w(TAG, "stopPreview: mCaptureSession or mPreviewRequestBuilder is null");
return;
}
try {
mCaptureSession.stopRepeating();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
public void captureStillPicture() {
try {
CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice
.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getJpegOrientation(mDeviceOrientation));
// if the preview is zoomed in, the still capture should keep the same zoom
Rect zoomRect = mPreviewRequestBuilder.get(CaptureRequest.SCALER_CROP_REGION);
if (zoomRect != null) {
captureBuilder.set(CaptureRequest.SCALER_CROP_REGION, zoomRect);
}
mCaptureSession.stopRepeating();
mCaptureSession.abortCaptures();
final long time = System.currentTimeMillis();
mCaptureSession.capture(captureBuilder.build(), new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
Log.w(TAG, "onCaptureCompleted, time: " + (System.currentTimeMillis() - time));
try {
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata
.CONTROL_AF_TRIGGER_CANCEL);
mCaptureSession.capture(mPreviewRequestBuilder.build(), null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
startPreview();
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private int getJpegOrientation(int deviceOrientation) {
if (deviceOrientation == android.view.OrientationEventListener.ORIENTATION_UNKNOWN) return 0;
int sensorOrientation = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
// Round device orientation to a multiple of 90
deviceOrientation = (deviceOrientation + 45) / 90 * 90;
// Reverse device orientation for front-facing cameras
boolean facingFront = mCameraCharacteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics
.LENS_FACING_FRONT;
if (facingFront) deviceOrientation = -deviceOrientation;
// Calculate desired JPEG orientation relative to camera orientation to make
// the image upright relative to the device orientation
int jpegOrientation = (sensorOrientation + deviceOrientation + 360) % 360;
return jpegOrientation;
}
public boolean isFrontCamera() {
return mCameraId == CameraCharacteristics.LENS_FACING_BACK; // LENS_FACING_BACK has value 1, used here as the front camera's ID
}
public Size getPreviewSize() {
return mPreviewSize;
}
public void switchCamera(int width, int height) {
mCameraId ^= 1;
releaseCamera();
openCamera(width, height);
}
private Size chooseOptimalSize(Size[] sizes, int viewWidth, int viewHeight, Size pictureSize) {
int totalRotation = getRotation();
boolean swapRotation = totalRotation == 90 || totalRotation == 270;
int width = swapRotation ? viewHeight : viewWidth;
int height = swapRotation ? viewWidth : viewHeight;
return getSuitableSize(sizes, width, height, pictureSize);
}
private int getRotation() {
int displayRotation = mActivity.getWindowManager().getDefaultDisplay().getRotation();
switch (displayRotation) {
case Surface.ROTATION_0:
displayRotation = 90;
break;
case Surface.ROTATION_90:
displayRotation = 0;
break;
case Surface.ROTATION_180:
displayRotation = 270;
break;
case Surface.ROTATION_270:
displayRotation = 180;
break;
}
int sensorOrientation = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
mDisplayRotate = (displayRotation + sensorOrientation + 270) % 360;
return mDisplayRotate;
}
private Size getSuitableSize(Size[] sizes, int width, int height, Size pictureSize) {
int minDelta = Integer.MAX_VALUE; // smallest difference so far; start large so it is replaced on the first match
int index = 0; // index of the size with the smallest difference
float aspectRatio = pictureSize.getHeight() * 1.0f / pictureSize.getWidth();
Log.d(TAG, "getSuitableSize. aspectRatio: " + aspectRatio);
for (int i = 0; i < sizes.length; i++) {
Size size = sizes[i];
// the aspect ratio must match exactly (a small tolerance would make this float comparison more robust)
if (size.getWidth() * aspectRatio == size.getHeight()) {
int delta = Math.abs(width - size.getWidth());
if (delta == 0) {
return size;
}
if (minDelta > delta) {
minDelta = delta;
index = i;
}
}
}
return sizes[index];
}
public void handleZoom(boolean isZoomIn) {
if (mCameraDevice == null || mCameraCharacteristics == null || mPreviewRequestBuilder == null) {
return;
}
int maxZoom = mCameraCharacteristics.get(CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM).intValue()
* 10;
Rect rect = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
if (isZoomIn && mZoom < maxZoom) {
mZoom++;
} else if (mZoom > 1) {
mZoom--;
}
int minW = rect.width() / maxZoom;
int minH = rect.height() / maxZoom;
int difW = rect.width() - minW;
int difH = rect.height() - minH;
int cropW = difW * mZoom / 100;
int cropH = difH * mZoom / 100;
cropW -= cropW & 3;
cropH -= cropH & 3;
Rect zoomRect = new Rect(cropW, cropH, rect.width() - cropW, rect.height() - cropH);
mPreviewRequestBuilder.set(CaptureRequest.SCALER_CROP_REGION, zoomRect);
mPreviewRequest = mPreviewRequestBuilder.build();
startPreview(); // the new request must be re-sent for the zoom to take effect
}
public void focusOnPoint(double x, double y, int width, int height) {
if (mCameraDevice == null || mPreviewRequestBuilder == null) {
return;
}
// 1. take the coordinates relative to the view
int previewWidth = mPreviewSize.getWidth();
int previewHeight = mPreviewSize.getHeight();
if (mDisplayRotate == 90 || mDisplayRotate == 270) {
previewWidth = mPreviewSize.getHeight();
previewHeight = mPreviewSize.getWidth();
}
// 2. compute how much the camera image is scaled relative to the view, and its offset
double tmp;
double imgScale;
double verticalOffset = 0;
double horizontalOffset = 0;
if (previewHeight * width > previewWidth * height) {
imgScale = width * 1.0 / previewWidth;
verticalOffset = (previewHeight - height / imgScale) / 2;
} else {
imgScale = height * 1.0 / previewHeight;
horizontalOffset = (previewWidth - width / imgScale) / 2;
}
// 3. convert the tapped point into coordinates on the camera image
x = x / imgScale + horizontalOffset;
y = y / imgScale + verticalOffset;
if (90 == mDisplayRotate) {
tmp = x;
x = y;
y = mPreviewSize.getHeight() - tmp;
} else if (270 == mDisplayRotate) {
tmp = x;
x = mPreviewSize.getWidth() - y;
y = tmp;
}
// 4. compute the scale and offset of the image relative to the crop region
Rect cropRegion = mPreviewRequestBuilder.get(CaptureRequest.SCALER_CROP_REGION);
if (cropRegion == null) {
Log.w(TAG, "can't get crop region");
cropRegion = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
}
int cropWidth = cropRegion.width();
int cropHeight = cropRegion.height();
if (mPreviewSize.getHeight() * cropWidth > mPreviewSize.getWidth() * cropHeight) {
imgScale = cropHeight * 1.0 / mPreviewSize.getHeight();
verticalOffset = 0;
horizontalOffset = (cropWidth - imgScale * mPreviewSize.getWidth()) / 2;
} else {
imgScale = cropWidth * 1.0 / mPreviewSize.getWidth();
horizontalOffset = 0;
verticalOffset = (cropHeight - imgScale * mPreviewSize.getHeight()) / 2;
}
// 5. convert the tap coordinates on the image into coordinates in the crop region
x = x * imgScale + horizontalOffset + cropRegion.left;
y = y * imgScale + verticalOffset + cropRegion.top;
double tapAreaRatio = 0.1;
Rect rect = new Rect();
rect.left = clamp((int) (x - tapAreaRatio / 2 * cropRegion.width()), 0, cropRegion.width());
rect.right = clamp((int) (x + tapAreaRatio / 2 * cropRegion.width()), 0, cropRegion.width());
rect.top = clamp((int) (y - tapAreaRatio / 2 * cropRegion.height()), 0, cropRegion.height());
rect.bottom = clamp((int) (y + tapAreaRatio / 2 * cropRegion.height()), 0, cropRegion.height());
// 6. use the rect computed above as the AF / AE metering region
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[]{new MeteringRectangle
(rect, 1000)});
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_REGIONS, new MeteringRectangle[]{new MeteringRectangle
(rect, 1000)});
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CameraMetadata
.CONTROL_AE_PRECAPTURE_TRIGGER_START);
try {
// 7. send the focus request configured above and listen for the result
mCaptureSession.capture(mPreviewRequestBuilder.build(), mAfCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private final CameraCaptureSession.CaptureCallback mAfCaptureCallback = new CameraCaptureSession.CaptureCallback() {
private void process(CaptureResult result) {
Integer state = result.get(CaptureResult.CONTROL_AF_STATE);
if (null == state) {
return;
}
if (state == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED || state == CaptureResult
.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED) {
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest
.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF); // make sure the flash is off
startPreview();
}
}
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull CaptureResult partialResult) {
process(partialResult);
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
process(result);
}
};
private void startBackgroundThread() {
if (mBackgroundThread == null || mBackgroundHandler == null) {
mBackgroundThread = new HandlerThread("CameraBackground");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}
}
private void stopBackgroundThread() {
mBackgroundThread.quitSafely();
try {
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}
}
private int clamp(int x, int min, int max) {
if (x > max) return max;
if (x < min) return min;
return x;
}
/**
* Compares two {@code Size}s based on their areas.
*/
static class CompareSizesByArea implements Comparator<Size> {
@Override
public int compare(Size lhs, Size rhs) {
// We cast here to ensure the multiplications won't overflow
return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
(long) rhs.getWidth() * rhs.getHeight());
}
}
}
III. Camera2SurfaceView
With the camera operations covered, we can now build the view side.
Requirements:
- Camera2SurfaceView extends SurfaceView.
- We override onMeasure so that the view's width and height match the camera preview size; this prevents the stretched-picture effect.
- Camera2SurfaceView opens and closes the camera itself; fortunately, the Camera2Proxy above makes this easy.
- We override onTouchEvent to implement tap-to-focus and pinch-to-zoom.
Implementation:
The camera is opened and released in the SurfaceHolder.Callback methods; beyond that, we override onMeasure and onTouchEvent as described above.
package com.afei.camerademo.surfaceview;
import android.app.Activity;
import android.content.Context;
import android.util.AttributeSet;
import android.util.Log;
import android.view.MotionEvent;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import com.afei.camerademo.camera.Camera2Proxy;
public class Camera2SurfaceView extends SurfaceView {
private static final String TAG = "Camera2SurfaceView";
private Camera2Proxy mCameraProxy;
private int mRatioWidth = 0;
private int mRatioHeight = 0;
private float mOldDistance;
public Camera2SurfaceView(Context context) {
this(context, null);
}
public Camera2SurfaceView(Context context, AttributeSet attrs) {
this(context, attrs, 0);
}
public Camera2SurfaceView(Context context, AttributeSet attrs, int defStyleAttr) {
this(context, attrs, defStyleAttr, 0);
}
public Camera2SurfaceView(Context context, AttributeSet attrs, int defStyleAttr, int defStyleRes) {
super(context, attrs, defStyleAttr, defStyleRes);
init(context);
}
private void init(Context context) {
getHolder().addCallback(mSurfaceHolderCallback);
mCameraProxy = new Camera2Proxy((Activity) context);
}
private final SurfaceHolder.Callback mSurfaceHolderCallback = new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder holder) {
mCameraProxy.setPreviewSurface(holder);
mCameraProxy.openCamera(getWidth(), getHeight());
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
Log.d(TAG, "surfaceChanged: width: " + width + ", height: " + height);
int previewWidth = mCameraProxy.getPreviewSize().getWidth();
int previewHeight = mCameraProxy.getPreviewSize().getHeight();
if (width > height) {
setAspectRatio(previewWidth, previewHeight);
} else {
setAspectRatio(previewHeight, previewWidth);
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
mCameraProxy.releaseCamera();
}
};
public void setAspectRatio(int width, int height) {
if (width < 0 || height < 0) {
throw new IllegalArgumentException("Size cannot be negative.");
}
mRatioWidth = width;
mRatioHeight = height;
requestLayout();
}
public Camera2Proxy getCameraProxy() {
return mCameraProxy;
}
@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
super.onMeasure(widthMeasureSpec, heightMeasureSpec);
int width = MeasureSpec.getSize(widthMeasureSpec);
int height = MeasureSpec.getSize(heightMeasureSpec);
if (0 == mRatioWidth || 0 == mRatioHeight) {
setMeasuredDimension(width, height);
} else {
if (width < height * mRatioWidth / mRatioHeight) {
setMeasuredDimension(width, width * mRatioHeight / mRatioWidth);
} else {
setMeasuredDimension(height * mRatioWidth / mRatioHeight, height);
}
}
}
@Override
public boolean onTouchEvent(MotionEvent event) {
if (event.getPointerCount() == 1) {
mCameraProxy.focusOnPoint(event.getX(), event.getY(), getWidth(), getHeight());
return true;
}
switch (event.getAction() & MotionEvent.ACTION_MASK) {
case MotionEvent.ACTION_POINTER_DOWN:
mOldDistance = getFingerSpacing(event);
break;
case MotionEvent.ACTION_MOVE:
float newDistance = getFingerSpacing(event);
if (newDistance > mOldDistance) {
mCameraProxy.handleZoom(true);
} else if (newDistance < mOldDistance) {
mCameraProxy.handleZoom(false);
}
mOldDistance = newDistance;
break;
default:
break;
}
return super.onTouchEvent(event);
}
private static float getFingerSpacing(MotionEvent event) {
float x = event.getX(0) - event.getX(1);
float y = event.getY(0) - event.getY(1);
return (float) Math.sqrt(x * x + y * y);
}
}
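The aspect-ratio fitting done in onMeasure above can be isolated into a small pure function (the class and method names here are mine): given the measured width and height and the desired ratio, shrink one dimension so that the result has exactly the ratio ratioW : ratioH.

```java
public class AspectFit {
    // returns {measuredWidth, measuredHeight} after fitting to ratioW:ratioH;
    // a zero ratio means "no preferred ratio yet", so keep the given size
    static int[] fit(int width, int height, int ratioW, int ratioH) {
        if (ratioW == 0 || ratioH == 0) {
            return new int[]{width, height};
        }
        if (width < height * ratioW / ratioH) {
            // width is the limiting dimension: derive height from it
            return new int[]{width, width * ratioH / ratioW};
        }
        // height is the limiting dimension: derive width from it
        return new int[]{height * ratioW / ratioH, height};
    }

    public static void main(String[] args) {
        int[] d = fit(1080, 2160, 3, 4); // portrait view, 3:4 preview
        System.out.println(d[0] + "x" + d[1]); // 1080x1440
    }
}
```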
IV. SurfaceCamera2Activity
All that remains is to use the finished Camera2SurfaceView in an Activity or Fragment.
Note that before using the camera, the relevant permissions must be declared, and requested at runtime.
1. AndroidManifest.xml
The camera-related permissions are shown below. The runtime-permission code is fairly long, so it is not covered in detail here; if you are unsure, see this post: Requesting Android permissions at runtime
<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera"/>
<uses-feature android:name="android.hardware.camera.autofocus"/>
2. Taking pictures
Note that the front camera produces a left-right mirrored image, so for the front camera we need to flip the picture horizontally.
The complete SurfaceCamera2Activity code:
package com.afei.camerademo.surfaceview;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.media.Image;
import android.media.ImageReader;
import android.os.AsyncTask;
import android.os.Bundle;
import android.provider.MediaStore;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import android.widget.ImageView;
import com.afei.camerademo.ImageUtils;
import com.afei.camerademo.R;
import com.afei.camerademo.camera.Camera2Proxy;
import java.nio.ByteBuffer;
public class SurfaceCamera2Activity extends AppCompatActivity implements View.OnClickListener {
private static final String TAG = "SurfaceCamera2Activity";
private ImageView mCloseIv;
private ImageView mSwitchCameraIv;
private ImageView mTakePictureIv;
private ImageView mPictureIv;
private Camera2SurfaceView mCameraView;
private Camera2Proxy mCameraProxy;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_surface_camera2);
initView();
}
private void initView() {
mCloseIv = findViewById(R.id.toolbar_close_iv);
mCloseIv.setOnClickListener(this);
mSwitchCameraIv = findViewById(R.id.toolbar_switch_iv);
mSwitchCameraIv.setOnClickListener(this);
mTakePictureIv = findViewById(R.id.take_picture_iv);
mTakePictureIv.setOnClickListener(this);
mPictureIv = findViewById(R.id.picture_iv);
mPictureIv.setOnClickListener(this);
mPictureIv.setImageBitmap(ImageUtils.getLatestThumbBitmap());
mCameraView = findViewById(R.id.camera_view);
mCameraProxy = mCameraView.getCameraProxy();
}
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.toolbar_close_iv:
finish();
break;
case R.id.toolbar_switch_iv:
mCameraProxy.switchCamera(mCameraView.getWidth(), mCameraView.getHeight());
mCameraProxy.startPreview();
break;
case R.id.take_picture_iv:
mCameraProxy.setImageAvailableListener(mOnImageAvailableListener);
mCameraProxy.captureStillPicture(); // take the picture
break;
case R.id.picture_iv:
Intent intent = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivity(intent);
break;
}
}
private ImageReader.OnImageAvailableListener mOnImageAvailableListener =
new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
new ImageSaveTask().execute(reader.acquireNextImage()); // save the picture
}
};
private class ImageSaveTask extends AsyncTask<Image, Void, Void> {
@Override
protected Void doInBackground(Image... images) {
ByteBuffer buffer = images[0].getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
long time = System.currentTimeMillis();
if (mCameraProxy.isFrontCamera()) {
Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
Log.d(TAG, "BitmapFactory.decodeByteArray time: " + (System.currentTimeMillis() - time));
time = System.currentTimeMillis();
// the front camera's output needs a horizontal flip
Bitmap rotateBitmap = ImageUtils.rotateBitmap(bitmap, 0, true, true);
Log.d(TAG, "rotateBitmap time: " + (System.currentTimeMillis() - time));
time = System.currentTimeMillis();
ImageUtils.saveBitmap(rotateBitmap);
Log.d(TAG, "saveBitmap time: " + (System.currentTimeMillis() - time));
rotateBitmap.recycle();
} else {
ImageUtils.saveImage(bytes);
Log.d(TAG, "saveBitmap time: " + (System.currentTimeMillis() - time));
}
images[0].close();
return null;
}
@Override
protected void onPostExecute(Void aVoid) {
mPictureIv.setImageBitmap(ImageUtils.getLatestThumbBitmap());
}
}
}
The ImageUtils code referenced above:
package com.afei.camerademo;
import android.content.ContentResolver;
import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.Matrix;
import android.os.Environment;
import android.provider.MediaStore;
import android.util.Log;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
public class ImageUtils {
private static final String TAG = "ImageUtils";
private static final String GALLERY_PATH = Environment.getExternalStoragePublicDirectory(Environment
.DIRECTORY_DCIM) + File.separator + "Camera";
private static final SimpleDateFormat DATE_FORMAT = new SimpleDateFormat("yyyyMMdd_HHmmss");
public static Bitmap rotateBitmap(Bitmap source, int degree, boolean flipHorizontal, boolean recycle) {
if (degree == 0 && !flipHorizontal) { // nothing to rotate or mirror
return source;
}
Matrix matrix = new Matrix();
matrix.postRotate(degree);
if (flipHorizontal) {
matrix.postScale(-1, 1); // the front camera's output is mirrored horizontally, so flip it back when requested
}
Bitmap rotateBitmap = Bitmap.createBitmap(source, 0, 0, source.getWidth(), source.getHeight(), matrix, false);
if (recycle) {
source.recycle();
}
return rotateBitmap;
}
public static void saveBitmap(Bitmap bitmap) {
String fileName = DATE_FORMAT.format(new Date(System.currentTimeMillis())) + ".jpg";
File outFile = new File(GALLERY_PATH, fileName);
Log.d(TAG, "saveImage. filepath: " + outFile.getAbsolutePath());
FileOutputStream os = null;
try {
os = new FileOutputStream(outFile);
boolean success = bitmap.compress(Bitmap.CompressFormat.JPEG, 100, os);
if (success) {
insertToDB(outFile.getAbsolutePath());
}
} catch (IOException e) {
e.printStackTrace();
} finally {
if (os != null) {
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
public static void insertToDB(String picturePath) {
ContentValues values = new ContentValues();
ContentResolver resolver = MyApp.getInstance().getContentResolver();
values.put(MediaStore.Images.ImageColumns.DATA, picturePath);
values.put(MediaStore.Images.ImageColumns.TITLE, picturePath.substring(picturePath.lastIndexOf("/") + 1));
values.put(MediaStore.Images.ImageColumns.DATE_TAKEN, System.currentTimeMillis());
values.put(MediaStore.Images.ImageColumns.MIME_TYPE, "image/jpeg");
resolver.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
}
}
V. Project Repository
Any code not shown above can be found at the address below.
Address:
https://github.com/afei-cn/CameraDemo/tree/master/app/src/main/java/com/afei/camerademo/surfaceview
Related:
Custom Camera series: SurfaceView + Camera