I've previously done Android audio recording and editing (trimming and merging, in WAV format). The rough approach was to read the raw audio data from the microphone into a list, run it through some math, and draw it on screen as a real-time recording waveform. I then left that project alone for a while; the feature worked, but it still wasn't complete. What about playback? How do you get the audio data in real time during playback so you can draw it? After mulling it over for a long time, I came across this capability while browsing GitHub. I'm writing it up here as a record for myself, and as a share for anyone who hasn't touched this area before!
OK, let's look at the screenshots first!
The screenshots show the line and circle renderings of the audio FFT data. There is also a bar-graph rendering that isn't shown here; once you finish this post you can download the demo and run it to see the effect for yourself.
Capturing and drawing a track's audio data in real time during playback relies on a class that Android provides: Visualizer. It can capture the audio data of a MediaPlayer session, and it returns two kinds of data: the raw waveform, and the FFT (frequency-domain) data, which I haven't studied in depth. The Android sources contain almost zero documentation for this class, unlike other classes that come with plenty of English comments. Annoying.
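One caveat worth adding here (my note, not from the original post): the Visualizer can only capture audio if the app holds the RECORD_AUDIO permission, and on Android 6.0+ that permission must also be granted at runtime; without it, capture fails.

```xml
<!-- Required in AndroidManifest.xml for Visualizer to capture audio data -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```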
Using it is simple: you only need to attach a capture listener, as shown in full in the link() method of the custom view below.
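The code block the original post had here appears to have been lost. Based on the link() method shown in full later in this post, the capture-listener setup looks roughly like this (player is assumed to be a MediaPlayer that has already been prepared):

```java
// Attach a Visualizer to the MediaPlayer's audio session
Visualizer visualizer = new Visualizer(player.getAudioSessionId());
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]); // use the largest capture size

visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
    @Override
    public void onWaveFormDataCapture(Visualizer v, byte[] bytes, int samplingRate) {
        // Raw waveform bytes: hand them to whatever does the drawing
    }

    @Override
    public void onFftDataCapture(Visualizer v, byte[] bytes, int samplingRate) {
        // FFT bytes: alternating real/imaginary pairs per frequency bin
    }
}, Visualizer.getMaxCaptureRate() / 2, true, true); // capture rate, waveform on, FFT on

visualizer.setEnabled(true); // start capturing; setEnabled(false) and release() when done
```

The VisualizerView below wraps exactly this sequence.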
OK, now let's look at the bar graph.
Screenshot:
The UI looks a bit rough? No rush; once we understand the principle we can polish it bit by bit. Nothing ever comes ready-made!
First, the code for the custom view that does the drawing:
/**
* Copyright 2011, Felix Palmer
*
* Licensed under the MIT license:
* http://creativecommons.org/licenses/MIT/
*/
package com.tian.audio.wave.widget;
import java.util.HashSet;
import java.util.Set;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Bitmap.Config;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Matrix;
import android.graphics.Paint;
import android.graphics.PorterDuff.Mode;
import android.graphics.PorterDuffXfermode;
import android.graphics.Rect;
import android.media.MediaPlayer;
import android.media.audiofx.Visualizer;
import android.util.AttributeSet;
import android.view.View;
import com.tian.audio.wave.dao.AudioData;
import com.tian.audio.wave.dao.FFTData;
import com.tian.audio.wave.renderer.Renderer;
/**
* A class that draws visualizations of data received from a
* {@link Visualizer.OnDataCaptureListener#onWaveFormDataCapture } and
* {@link Visualizer.OnDataCaptureListener#onFftDataCapture }
*/
public class VisualizerView extends View {
private static final String TAG = "VisualizerView";
private byte[] mBytes;
private byte[] mFFTBytes;
private Rect mRect = new Rect();
private Visualizer mVisualizer;
private Set<Renderer> mRenderers;
private Paint mFlashPaint = new Paint();
private Paint mFadePaint = new Paint();
public VisualizerView(Context context, AttributeSet attrs, int defStyle){
super(context, attrs, defStyle); // pass defStyle through (the original dropped it)
init();
}
public VisualizerView(Context context, AttributeSet attrs)
{
this(context, attrs, 0);
}
public VisualizerView(Context context)
{
this(context, null, 0);
}
private void init() {
mBytes = null;
mFFTBytes = null;
mFlashPaint.setColor(Color.argb(122, 255, 255, 255));
mFadePaint.setColor(Color.argb(238, 255, 255, 255)); // Adjust alpha to change how quickly the image fades
mFadePaint.setXfermode(new PorterDuffXfermode(Mode.MULTIPLY));
mRenderers = new HashSet<Renderer>();
}
/**
* Links the visualizer to a player
* @param player - MediaPlayer instance to link to
*/
public void link(MediaPlayer player){
if(player == null)
{
throw new NullPointerException("Cannot link to null MediaPlayer");
}
// Create the Visualizer object and attach it to our media player.
mVisualizer = new Visualizer(player.getAudioSessionId());
mVisualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
// Pass through Visualizer data to VisualizerView
Visualizer.OnDataCaptureListener captureListener = new Visualizer.OnDataCaptureListener(){
// Capture the waveform data
@Override
public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes,
int samplingRate){
updateVisualizer(bytes);
}
// Capture the FFT data
@Override
public void onFftDataCapture(Visualizer visualizer, byte[] bytes,
int samplingRate){
updateVisualizerFFT(bytes);
}
};
mVisualizer.setDataCaptureListener(captureListener,
Visualizer.getMaxCaptureRate() / 2, true, true);
// Enable the Visualizer now, and disable it when we're done with the stream
mVisualizer.setEnabled(true);
player.setOnCompletionListener(new MediaPlayer.OnCompletionListener(){
@Override
public void onCompletion(MediaPlayer mediaPlayer){
mVisualizer.setEnabled(false);
}
});
}
public void addRenderer(Renderer renderer){
if(renderer != null){
mRenderers.add(renderer);
}
}
public void clearRenderers()
{
mRenderers.clear();
}
/**
* Call to release the resources used by VisualizerView. Like with the
* MediaPlayer it is good practice to call this method
*/
public void release()
{
mVisualizer.release();
}
/**
* Pass data to the visualizer. Typically this will be obtained from the
* Android Visualizer.OnDataCaptureListener call back. See
* {@link Visualizer.OnDataCaptureListener#onWaveFormDataCapture }
* @param bytes
*/
public void updateVisualizer(byte[] bytes) {
mBytes = bytes;
invalidate();
}
/**
* Pass FFT data to the visualizer. Typically this will be obtained from the
* Android Visualizer.OnDataCaptureListener call back. See
* {@link Visualizer.OnDataCaptureListener#onFftDataCapture }
* @param bytes
*/
public void updateVisualizerFFT(byte[] bytes) {
mFFTBytes = bytes;
invalidate();
}
boolean mFlash = false;
/**
* Call this to make the visualizer flash. Useful for flashing at the start
* of a song/loop etc...
*/
public void flash() {
mFlash = true;
invalidate();
}
Bitmap mCanvasBitmap;
Canvas mCanvas;
@Override
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
// Create canvas once we're ready to draw
mRect.set(0, 0, getWidth(), getHeight());
if(mCanvasBitmap == null){
mCanvasBitmap = Bitmap.createBitmap(canvas.getWidth(), canvas.getHeight(), Config.ARGB_8888);
}
if(mCanvas == null){
mCanvas = new Canvas(mCanvasBitmap);
}
if (mBytes != null) {
// Render all audio renderers
AudioData audioData = new AudioData(mBytes);
for(Renderer r : mRenderers){
r.render(mCanvas, audioData, mRect);
}
}
if (mFFTBytes != null) {
// Render all FFT renderers
FFTData fftData = new FFTData(mFFTBytes);
for(Renderer r : mRenderers){
r.render(mCanvas, fftData, mRect);
}
}
// Fade effect: darken the retained frame a little on every pass, producing the trailing shadow
mCanvas.drawPaint(mFadePaint);
if(mFlash){
mFlash = false;
mCanvas.drawPaint(mFlashPaint);
}
canvas.drawBitmap(mCanvasBitmap, new Matrix(), null);
}
}
This class is quite simple: it is a thin wrapper over the different renderings, and the actual drawing naturally happens in the body of onDraw. Whenever audio data arrives, the bytes the Visualizer captured (waveform or FFT) are stored and invalidate() is called to trigger a redraw. So we can skip the rest and go straight to onDraw, which also shows how the fading shadow effect is produced.
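As for the fade: mFadePaint draws white at alpha 238 with a MULTIPLY transfer mode over the whole offscreen bitmap, so every retained pixel is scaled by roughly 238/255 per frame and old frames decay exponentially. A quick back-of-the-envelope sketch of that decay (plain Java, my own illustration, not part of the demo):

```java
public class FadeDecay {
    // Each onDraw pass multiplies retained pixels by alpha/255
    public static int framesToFadeBelow(double threshold, int alpha) {
        double perFrame = alpha / 255.0;   // e.g. 238/255, about 0.933
        int frames = 0;
        double brightness = 1.0;
        while (brightness >= threshold) {
            brightness *= perFrame;        // one MULTIPLY pass
            frames++;
        }
        return frames;
    }

    public static void main(String[] args) {
        // With alpha 238, a trail takes about 67 frames to drop below 1% brightness
        System.out.println(framesToFadeBelow(0.01, 238));
        // A lower alpha fades much faster
        System.out.println(framesToFadeBelow(0.01, 128));
    }
}
```

Tuning the alpha of mFadePaint therefore directly controls how long the trails linger.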
The extracted base class:
package com.tian.audio.wave.renderer;
import android.graphics.Canvas;
import android.graphics.Rect;
import com.tian.audio.wave.dao.AudioData;
import com.tian.audio.wave.dao.FFTData;
abstract public class Renderer{
// Have these as members, so we don't have to re-create them each time
protected float[] mPoints;
protected float[] mFFTPoints;
public Renderer()
{
}
// As the display of raw/FFT audio will usually look different, subclasses
// will typically only implement one of the below methods
/**
* Implement this method to render the audio data onto the canvas
* @param canvas - Canvas to draw on
* @param data - Data to render
* @param rect - Rect to render into
*/
abstract public void onRender(Canvas canvas, AudioData data, Rect rect);
/**
* Implement this method to render the FFT audio data onto the canvas
* @param canvas - Canvas to draw on
* @param data - Data to render
* @param rect - Rect to render into
*/
abstract public void onRender(Canvas canvas, FFTData data, Rect rect);
// These methods should actually be called for rendering
/**
* Render the audio data onto the canvas
* @param canvas - Canvas to draw on
* @param data - Data to render
* @param rect - Rect to render into
*/
final public void render(Canvas canvas, AudioData data, Rect rect)
{
if (mPoints == null || mPoints.length < data.bytes.length * 4) {
mPoints = new float[data.bytes.length * 4];
}
onRender(canvas, data, rect);
}
/**
* Render the FFT data onto the canvas
* @param canvas - Canvas to draw on
* @param data - Data to render
* @param rect - Rect to render into
*/
final public void render(Canvas canvas, FFTData data, Rect rect)
{
if (mFFTPoints == null || mFFTPoints.length < data.bytes.length * 4) {
mFFTPoints = new float[data.bytes.length * 4];
}
onRender(canvas, data, rect);
}
}
The class that draws the data:
/**
* Copyright 2011, Felix Palmer
*
* Licensed under the MIT license:
* http://creativecommons.org/licenses/MIT/
*/
package com.tian.audio.wave.renderer;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Rect;
import com.tian.audio.wave.dao.AudioData;
import com.tian.audio.wave.dao.FFTData;
/**
* Drives the Paint to draw each bar of the graph
*/
public class BarGraphRenderer extends Renderer{
private int mDivisions;
private Paint mPaint;
private boolean mTop;
/**
* Renders the FFT data as a series of lines, in histogram form
* @param divisions - must be a power of 2. Controls how many lines to draw
* @param paint - Paint to draw lines with
* @param top - whether to draw the lines at the top of the canvas, or the bottom
*/
public BarGraphRenderer(int divisions,
Paint paint,
boolean top){
super();
mDivisions = divisions;
mPaint = paint;
mTop = top;
}
@Override
public void onRender(Canvas canvas, AudioData data, Rect rect){
// Do nothing, we only display FFT data
}
@Override
public void onRender(Canvas canvas, FFTData data, Rect rect){
for (int i = 0; i < data.bytes.length / mDivisions; i++) {
mFFTPoints[i * 4] = i * 4 * mDivisions;
mFFTPoints[i * 4 + 2] = i * 4 * mDivisions;
byte rfk = data.bytes[mDivisions * i]; // real part of bin i (sampled every mDivisions bytes)
byte ifk = data.bytes[mDivisions * i + 1]; // imaginary part of the same bin
float magnitude = (rfk * rfk + ifk * ifk);
int dbValue = (int) (10 * Math.log10(magnitude));
if(mTop){
mFFTPoints[i * 4 + 1] = 0;
mFFTPoints[i * 4 + 3] = (dbValue * 2 - 10);
}else{
mFFTPoints[i * 4 + 1] = rect.height();
mFFTPoints[i * 4 + 3] = rect.height() - (dbValue * 2 - 10);
}
}
canvas.drawLines(mFFTPoints, mPaint);
}
}
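The bar height comes from treating each sampled pair of FFT bytes as a complex bin: squared magnitude r² + i², converted to a dB-like value with 10·log10. A standalone sketch of that computation (my own helper, not part of the demo; note that a zero-magnitude bin makes log10 blow up, which the renderer above does not guard against):

```java
public class FftBar {
    // Same math as BarGraphRenderer.onRender: dB-like value for one FFT bin
    public static int dbValue(byte rfk, byte ifk) {
        float magnitude = rfk * rfk + ifk * ifk; // squared magnitude of the complex bin
        if (magnitude == 0) {
            return 0; // guard (my addition): log10(0) is -Infinity
        }
        return (int) (10 * Math.log10(magnitude));
    }

    public static void main(String[] args) {
        // real=10, imag=0 -> magnitude 100 -> 10*log10(100) = 20
        System.out.println(dbValue((byte) 10, (byte) 0));
        // real=3, imag=4 -> magnitude 25 -> 10*log10(25), truncated to 13
        System.out.println(dbValue((byte) 3, (byte) 4));
    }
}
```

In the renderer this dbValue is then scaled (dbValue * 2 - 10) to get the bar's pixel length.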
The layout file:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:background="@drawable/bg"
android:orientation="vertical" >
<FrameLayout
android:layout_width="fill_parent"
android:layout_height="0dp"
android:layout_margin="10dp"
android:layout_weight="1"
android:background="#000" >
<com.tian.audio.wave.widget.VisualizerView
android:id="@+id/visualizerView"
android:layout_width="fill_parent"
android:layout_height="fill_parent" >
</com.tian.audio.wave.widget.VisualizerView>
</FrameLayout>
<LinearLayout
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:layout_weight="0" >
<Button
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:layout_weight="0.25"
android:onClick="barPressed"
android:text="Bar" >
</Button>
<Button
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:layout_weight="0.25"
android:onClick="circlePressed"
android:text="Circle" >
</Button>
<Button
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:layout_weight="0.25"
android:onClick="circleBarPressed"
android:text="Circle Bar" >
</Button>
<Button
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:layout_weight="0.25"
android:onClick="linePressed"
android:text="Line" >
</Button>
<Button
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:layout_weight="0.25"
android:onClick="clearPressed"
android:text="Clear" >
</Button>
</LinearLayout>
<LinearLayout
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:layout_weight="0" >
<Button
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:layout_weight="0.5"
android:onClick="startPressed"
android:text="Start" >
</Button>
<Button
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:layout_weight="0.5"
android:onClick="stopPressed"
android:text="Stop" >
</Button>
</LinearLayout>
</LinearLayout>
And here's how it's used:
// Methods for adding renderers to visualizer
private void addBarGraphRenderers(){
// Bars along the bottom of the view
Paint paint = new Paint();
paint.setStrokeWidth(50f);
paint.setAntiAlias(true);
paint.setColor(Color.argb(200, 56, 138, 252));
BarGraphRenderer barGraphRendererBottom = new BarGraphRenderer(16, paint, false);
mVisualizerView.addRenderer(barGraphRendererBottom);
// Bars along the top of the view
Paint paint2 = new Paint();
paint2.setStrokeWidth(12f);
paint2.setAntiAlias(true);
paint2.setColor(Color.argb(200, 181, 111, 233));
BarGraphRenderer barGraphRendererTop = new BarGraphRenderer(4, paint2, true);
mVisualizerView.addRenderer(barGraphRendererTop);
}
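To tie it together, the button handlers declared in the layout (startPressed, barPressed, stopPressed, and so on) would wire up roughly like this. This is a hedged sketch of mine, not code from the demo: the field names mPlayer and mVisualizerView and the resource R.raw.test are my assumptions.

```java
// Inside the Activity; assumes these fields exist:
//   private MediaPlayer mPlayer;
//   private VisualizerView mVisualizerView; // from findViewById(R.id.visualizerView)
public void startPressed(View view) {
    if (mPlayer == null) {
        mPlayer = MediaPlayer.create(this, R.raw.test); // R.raw.test is a placeholder track
        mVisualizerView.link(mPlayer);                  // attaches the Visualizer
    }
    mPlayer.start();
}

public void barPressed(View view) {
    mVisualizerView.clearRenderers();
    addBarGraphRenderers(); // the method shown above
}

public void stopPressed(View view) {
    if (mPlayer != null) {
        mPlayer.stop();
    }
}
```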
OK! Before you know how it's done, this feature looks hard; once you know, it's simple! Haha.
GitHub repo (if you download it, please give it a star as encouragement for the author. Thanks!):
https://github.com/T-chuangxin/AudioWaveShow
Improve a little every day, and time will make you a giant. Keep at it!