What problem do Flutter plugins solve
Flutter is a cross-platform framework: one codebase runs on both Android and iOS. When a feature needs a different implementation on each platform, such as permission requests or third-party API calls, a Flutter plugin comes into play. A plugin consists of an API definition written in Dart, backed by separate Android and iOS implementations, so the feature can be invoked in a uniform way.
How the communication works
Messages are passed between the client (UI) and the host (platform) over platform channels, as shown in the diagram below:
On the Flutter side, a MethodChannel (API) lets you send messages that correspond to method calls. On the platform side, MethodChannel on Android (API) and FlutterMethodChannel on iOS (API) receive those method calls and send back results. Platform plugins can be developed with very little boilerplate code.
Supported data types
Since the two sides have to communicate, two questions immediately come to mind:
What data types can a MethodChannel carry?
How do Dart types map to Android and iOS types?
The answers are summarized below:
| Dart | Android | iOS |
|---|---|---|
| null | null | nil (NSNull when nested) |
| bool | java.lang.Boolean | NSNumber numberWithBool: |
| int | java.lang.Integer | NSNumber numberWithInt: |
| int, if 32 bits not enough | java.lang.Long | NSNumber numberWithLong: |
| double | java.lang.Double | NSNumber numberWithDouble: |
| String | java.lang.String | NSString |
| Uint8List | byte[] | FlutterStandardTypedData typedDataWithBytes: |
| Int32List | int[] | FlutterStandardTypedData typedDataWithInt32: |
| Int64List | long[] | FlutterStandardTypedData typedDataWithInt64: |
| Float64List | double[] | FlutterStandardTypedData typedDataWithFloat64: |
| List | java.util.ArrayList | NSArray |
| Map | java.util.HashMap | NSDictionary |
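To make the mapping concrete, here is a hedged, plain-Java sketch (no Flutter dependency; `CodecMappingDemo` and its values are invented for illustration) of what the Android side receives after decoding: a Dart Map arrives as java.util.HashMap, a small Dart int as java.lang.Integer, a large one as java.lang.Long, and every decoded value is typed Object, so handlers cast:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CodecMappingDemo {
    // Simulates what the standard codec hands the Android side:
    // every decoded value arrives typed as Object.
    static Object decodeLikeDart() {
        Map<String, Object> args = new HashMap<>();      // Dart Map -> java.util.HashMap
        args.put("sn", "SN-001");                        // Dart String -> java.lang.String
        args.put("retries", 3);                          // Dart int (fits 32 bits) -> java.lang.Integer
        args.put("timestamp", 1700000000000L);           // Dart int (needs 64 bits) -> java.lang.Long
        args.put("volume", 0.8);                         // Dart double -> java.lang.Double
        List<Object> tags = new ArrayList<>();           // Dart List -> java.util.ArrayList
        tags.add("asr");
        args.put("tags", tags);
        return args;
    }

    public static void main(String[] args0) {
        @SuppressWarnings("unchecked")
        Map<String, Object> args = (Map<String, Object>) decodeLikeDart();
        String sn = (String) args.get("sn");             // mirrors call.argument("sn") on Android
        int retries = (Integer) args.get("retries");
        System.out.println(sn + ":" + retries);
    }
}
```

The cast-from-Object pattern is exactly what `call.argument(...)` hides for you on the real Android side.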
Step 1: Create the plugin project
File -> New -> New Flutter Project, choose Flutter Plugin, give it a name, and click Next through the wizard.
This creates a plugin named voice_plugin.
Opening the plugin in project view, the project structure looks like this:
Three files deserve particular attention:
lib/voice_plugin.dart — the API definition
android/src/main/java/demo/roobo/com/voice_plugin/VoicePlugin.java — the Android implementation
ios/Classes/VoicePlugin.m — the iOS implementation
How MethodChannel interaction works
All interaction goes through a MethodChannel, which is responsible for communication between Dart and native code. voice_plugin is the name of the MethodChannel: Flutter uses this concrete name to locate the matching MethodChannel on each platform, which is what makes Flutter-to-platform interaction possible. Accordingly, we must register a MethodChannel with the same name, voice_plugin, on each platform.
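This name-based lookup can be illustrated with a hedged, plain-Java sketch (ChannelRegistry and Handler are invented names for illustration, not Flutter APIs): both sides must agree on the exact string "voice_plugin", otherwise the call has nowhere to go:

```java
import java.util.HashMap;
import java.util.Map;

public class ChannelRegistry {
    // Invented stand-in for the messenger's channel table.
    interface Handler { String onCall(String method); }

    private final Map<String, Handler> channels = new HashMap<>();

    // Native side: new MethodChannel(messenger, "voice_plugin") + setMethodCallHandler(...)
    public void register(String name, Handler handler) {
        channels.put(name, handler);
    }

    // Dart side: MethodChannel('voice_plugin').invokeMethod(...)
    public String invoke(String name, String method) {
        Handler h = channels.get(name);
        if (h == null) {
            // Mirrors the MissingPluginException Flutter raises when no channel matches.
            throw new IllegalStateException("No implementation found on channel " + name);
        }
        return h.onCall(method);
    }
}
```

Registering under "voice_plugin" and invoking with the same string succeeds, while any other name fails, which is why both sides must hard-code identical channel names.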
lib/voice_plugin.dart
```dart
class VoicePlugin {
  static const MethodChannel _channel = const MethodChannel('voice_plugin');

  static Future<String> get platformVersion async {
    final String version = await _channel.invokeMethod('getPlatformVersion');
    return version;
  }
}
```
Android registers a channel named voice_plugin:
```java
public class VoicePlugin implements MethodCallHandler {
  private static final String TAG = "VoicePlugin";

  /** Plugin registration. */
  public static void registerWith(Registrar registrar) {
    final MethodChannel channel = new MethodChannel(registrar.messenger(), "voice_plugin");
    channel.setMethodCallHandler(new VoicePlugin());
    Log.e(TAG, "init: mContext");
  }

  @Override
  public void onMethodCall(MethodCall call, Result result) {
    switch (call.method) {
      case "getPlatformVersion":
        result.success("Android " + android.os.Build.VERSION.RELEASE);
        break;
      default:
        result.notImplemented();
        break;
    }
  }
}
```
iOS registers a channel with the same name, voice_plugin:

```swift
public class SwiftVoicePlugin: NSObject, FlutterPlugin {
  public static func register(with registrar: FlutterPluginRegistrar) {
    let channel = FlutterMethodChannel(name: "voice_plugin", binaryMessenger: registrar.messenger())
    let instance = SwiftVoicePlugin()
    registrar.addMethodCallDelegate(instance, channel: channel)
  }

  public func handle(_ call: FlutterMethodCall, result: @escaping FlutterResult) {
    result("iOS " + UIDevice.current.systemVersion)
  }
}
```

The Objective-C entry point simply forwards to the Swift class:

```objectivec
@implementation VoicePlugin
+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar>*)registrar {
  [SwiftVoicePlugin registerWithRegistrar:registrar];
}
@end
```
With this, the bridge between Flutter and native code is complete.
Step 2: Write the API and the platform implementations
When you start writing the platform-specific implementation you will notice there is no code completion, and package dependencies cannot even be resolved. The IDE will prompt you that, to edit platform-specific code, you need to open the platform project directly; just click the prompt.
Open the android project.
As shown below:
app is the Android sample program under the plugin's example directory; it can be used to test communication between Android and Dart.
voice_plugin is where the plugin API is implemented for the Android platform.
This behaves just like a regular Android project: you can add Gradle dependencies and write Java or Kotlin code.
First, in lib/voice_plugin.dart in the plugin project, define the functionality we need; for example, the methods register, getRead, getASR, and platformVersion.
Then, in android/src/main/java/demo/roobo/com/voice_plugin/VoicePlugin.java, intercept these calls in onMethodCall, read the arguments, implement the corresponding functionality, and return a result.
Pass arguments like this (Dart):

```dart
static Future register(String sn, String publicKey) async {
  Map<String, Object> map = {"sn": sn, "publicKey": publicKey};
  await _channel.invokeMethod('register', map);
}
```

Intercept and handle the arguments like this (Java):

```java
private void registerAndInit(MethodCall call, Result result) {
  String sn = call.argument("sn");
  String publicKey = call.argument("publicKey");
  Log.e("voicePlugin", " sn:" + sn + " pbk: " + publicKey);
  init(mContext, sn, publicKey, result);
}
```

Return data to Flutter like this. The Result object is created by the Flutter framework and handed to you in onMethodCall (public void onMethodCall(MethodCall call, Result result) {}):

```java
result.success(answer.Data.pronunciation + "");
```
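The three steps above (pass a Map, intercept by method name, return through result) can be sketched end to end in plain Java with no Flutter types; RoundTrip and its Result interface are illustrative stand-ins only:

```java
import java.util.HashMap;
import java.util.Map;

public class RoundTrip {
    interface Result { void success(Object value); }  // stand-in for Flutter's Result

    static String lastLog;

    // Mirrors the Android handler: pull named arguments out of the decoded Map.
    static void registerAndInit(Map<String, Object> args, Result result) {
        String sn = (String) args.get("sn");           // like call.argument("sn")
        String publicKey = (String) args.get("publicKey");
        lastLog = "sn:" + sn + " pbk:" + publicKey;
        result.success("succeed");                     // reply back to the Dart side
    }

    public static void main(String[] unused) {
        // Mirrors the Dart side: invokeMethod('register', {"sn": ..., "publicKey": ...})
        Map<String, Object> map = new HashMap<>();
        map.put("sn", "SN-001");
        map.put("publicKey", "PK-XYZ");
        registerAndInit(map, value -> System.out.println("flutter got: " + value));
    }
}
```

The callback passed as Result plays the role the framework plays for real: whatever is handed to success(...) is what the awaiting Dart Future completes with.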
A complete plugin example
Step 2.1: Define the API
```dart
class VoicePlugin {
  static const MethodChannel _channel = const MethodChannel('voice_plugin');

  static Future<String> get platformVersion async {
    final String version = await _channel.invokeMethod('getPlatformVersion');
    return version;
  }

  static Future register(String sn, String publicKey) async {
    Map<String, Object> map = {"sn": sn, "publicKey": publicKey};
    await _channel.invokeMethod('register', map);
  }

  static Future<String> get getASR async {
    final String result = await _channel.invokeMethod('getASR');
    return result;
  }

  static Future<String> getRead(String qa) async {
    Map<String, Object> map = {"qa": qa};
    final String result = await _channel.invokeMethod('getRead', map);
    return result;
  }
}
```
Step 2.2: Implement the Android API
```java
public class VoicePlugin implements MethodCallHandler {
  private static final String TAG = "VoicePlugin";
  private static Context mContext;

  /** Plugin registration. */
  public static void registerWith(Registrar registrar) {
    final MethodChannel channel = new MethodChannel(registrar.messenger(), "voice_plugin");
    channel.setMethodCallHandler(new VoicePlugin());
    Log.e(TAG, "init: mContext");
    mContext = registrar.context();
  }

  @Override
  public void onMethodCall(MethodCall call, Result result) {
    switch (call.method) {
      case "getPlatformVersion":
        result.success("Android " + android.os.Build.VERSION.RELEASE);
        break;
      case "register":
        registerAndInit(call, result);
        break;
      case "getASR":
        onASRClick(result);
        break;
      case "getRead":
        onReadClick(call, result);
        break;
      default:
        result.notImplemented();
        break;
    }
  }

  private void registerAndInit(MethodCall call, Result result) {
    String sn = call.argument("sn");
    String publicKey = call.argument("publicKey");
    Log.e("voicePlugin", " sn:" + sn + " pbk: " + publicKey);
    init(mContext, sn, publicKey, result);
  }

  public void onASRClick(final Result result) {
    // Here we call a third-party SDK to produce the result.
    IRooboRecognizeEngine engine = VUISDK.getInstance().getRecognizeEngine();
    engine.stopListening();
    engine.setParameter(SDKConstant.Parameter.PARAMERER_AI_ON_OFF, "off");
    engine.setCloudResultListener(new OnCloudResultListener() {
      @Override
      public void onResult(int i, String s) {
        Log.d(TAG, "onASRResult() called with: i = [" + i + "], s = [" + s + "]");
        Gson gson = new Gson();
        VoiceAnswer voiceAnswer = gson.fromJson(s, VoiceAnswer.class);
        // Reply to Flutter.
        result.success(voiceAnswer.text);
        IRooboRecognizeEngine engine = VUISDK.getInstance().getRecognizeEngine();
        engine.stopListening();
      }

      @Override
      public void onError(SDKError sdkError) {
        Log.d(TAG, "onError() called with: sdkError = [" + sdkError + "]");
      }
    });
    engine.setOnVadListener(new OnVADListener() {
      @Override
      public void onBeginOfSpeech() {
        Log.d(TAG, "onBeginOfSpeech() called");
      }

      @Override
      public void onEndOfSpeech() {
        Log.d(TAG, "onEndOfSpeech() called");
      }

      @Override
      public void onCancelOfSpeech() {
        Log.d(TAG, "onCancelOfSpeech() called");
      }

      @Override
      public void onAudioData(byte[] bytes) {
      }
    });
    engine.startListening();
  }

  public void onReadClick(MethodCall call, final Result result) {
    String qa = call.argument("qa");
    IRooboRecognizeEngine engine = VUISDK.getInstance().getRecognizeEngine();
    engine.stopListening();
    engine.setParameter(SDKConstant.Parameter.PARAMERER_ORAL_TESTING, "on");
    engine.setParameter(SDKConstant.Parameter.PARAMERER_AI_CONTEXTS, "");
    engine.setCloudResultListener(new OnCloudResultListener() {
      @Override
      public void onResult(int i, String s) {
        Log.d(TAG, "onReadResult() called with: i = [" + i + "], s = [" + s + "]");
        Gson gson = new Gson();
        FollowAnswer answer = gson.fromJson(s, FollowAnswer.class);
        result.success(answer.Data.pronunciation + "");
        IRooboRecognizeEngine engine = VUISDK.getInstance().getRecognizeEngine();
        engine.stopListening();
      }

      @Override
      public void onError(SDKError sdkError) {
        Log.d(TAG, "onError() called with: sdkError = [" + sdkError + "]");
      }
    });
    engine.setOnVadListener(new OnVADListener() {
      @Override
      public void onBeginOfSpeech() {
        Log.d(TAG, "onBeginOfSpeech() called");
      }

      @Override
      public void onEndOfSpeech() {
        Log.d(TAG, "onEndOfSpeech() called");
      }

      @Override
      public void onCancelOfSpeech() {
        Log.d(TAG, "onCancelOfSpeech() called");
      }

      @Override
      public void onAudioData(byte[] bytes) {
      }
    });
    engine.setParameter(
        SDKConstant.Parameter.PARAMERER_ORAL_TESTING_EX,
        JsonUtil.toJsonString(new RequestReadScore(1, qa)));
    engine.startListening();
  }

  private static void init(Context context, String sn, String publicKey, final Result result) {
    if (VUISDK.getInstance().hasInited()) {
      Log.e(TAG, "init: has inited");
      result.success("succeed");
      return;
    }
    SDKRequiredInfo info = new SDKRequiredInfo();
    info.setDeviceID(sn);
    info.setAgentID("xxxxxxxxxxxxxxxxxx");
    info.setPublicKey(publicKey);
    info.setAgentToken("xxxxxxxxxxxxxxxxxxxxxxxxx");
    ISDKSettings settings = VUISDK.getInstance().getSettings();
    settings.setTokenType(AIParamInfo.TOKEN_MODE_INNER);
    settings.setEnv(SDKConstant.EnvType.ENV_TEST);
    settings.setLogLevel(SDKConstant.LogLevelType.LOG_LEVEL_VERBOSE);
    VUISDK.getInstance().init(context, info, new ISDKInitListener() {
      @Override
      public void onSuccess() {
        Log.d(TAG, "onSuccess() called");
        IRooboRecognizeEngine engine = VUISDK.getInstance().getRecognizeEngine();
        engine.setEngineEnable(SDKConstant.Engine.ENGINE_LOCAL_REC, false);
        engine.setEngineEnable(SDKConstant.Engine.ENGINE_VAD, false);
        engine.setEngineEnable(SDKConstant.Engine.ENGINE_WAKEUP, false);
      }

      @Override
      public void onFailed(SDKError sdkError) {
        Log.d(TAG, "onFailed() called with: sdkError = [" + sdkError + "]");
      }
    });
    result.success("succeed");
  }
}
```
Step 2.3: Implement the iOS API
In the same way as on Android, implement the corresponding code for the iOS platform.
Step 2.4: Call the plugin from Flutter
Add the dependency in pubspec.yaml:
```yaml
dependencies:
  flutter:
    sdk: flutter
  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^0.1.2
  dio: ^3.0.0
  shared_preferences: ^0.5.4+6
  permission_handler: ^4.0.0
  voice_plugin:
    path: voice_plugin
```
```dart
// Import the package.
import 'package:voice_plugin/voice_plugin.dart';

Future<String> getRead(String qa) async {
  return await VoicePlugin.getRead(qa);
}

Future<String> getASR() async {
  return await VoicePlugin.getASR;
}

// Invoke it.
getRead(msg.extra.question).then((str) {
  NetHelper.post(
      action: msg.action,
      session: msg.session,
      answer: str,
      hostId: msg.hostId);
  Navigator.pop(context);
});

getASR().then((str) {
  NetHelper.post(
      action: msg.action,
      session: msg.session,
      answer: str,
      hostId: msg.hostId);
  Navigator.pop(context);
});
```
Calling native code from Flutter and passing data
```dart
static Future register(String sn, String publicKey) async {
  await _channel.invokeMethod('register', {"sn": sn, "publicKey": publicKey});
}
```
We pack the incoming parameters into a Map and pass it to invokeMethod. The first argument of invokeMethod is the method name, register, which we will use again on the native side. The second argument is the data to send to native code. Let's look at the source of invokeMethod:
```dart
@optionalTypeArgs
Future<T> invokeMethod<T>(String method, [ dynamic arguments ]) async {
  assert(method != null);
  final ByteData result = await binaryMessenger.send(
    name,
    codec.encodeMethodCall(MethodCall(method, arguments)),
  );
  if (result == null) {
    throw MissingPluginException('No implementation found for method $method on channel $name');
  }
  final T typedResult = codec.decodeEnvelope(result);
  return typedResult;
}
```
The second parameter is dynamic, so can we pass any type? Syntactically nothing stops us, but in practice it is not allowed: only the types supported by the platform codec can be passed, i.e. exactly the types in the mapping table above. The same rule applies to return values, that is, to data the native side sends back to Flutter. Keep this rule in mind.
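As a hedged illustration of that rule, here is a plain-Java check of what the standard codec's type table allows (isCodecSupported is an invented helper for this sketch; the real check lives inside the codec, which throws while encoding an unknown type):

```java
import java.util.List;
import java.util.Map;

public class CodecTypeCheck {
    // Invented helper mirroring the standard codec's supported-type table.
    static boolean isCodecSupported(Object value) {
        if (value == null) return true;                                   // null / nil
        if (value instanceof Boolean || value instanceof Integer
                || value instanceof Long || value instanceof Double
                || value instanceof String) return true;                  // scalar rows of the table
        if (value instanceof byte[] || value instanceof int[]
                || value instanceof long[] || value instanceof double[]) return true;  // typed-data rows
        if (value instanceof List) {                                      // every element must be supported too
            for (Object e : (List<?>) value) if (!isCodecSupported(e)) return false;
            return true;
        }
        if (value instanceof Map) {                                       // and so must every key and value
            for (Map.Entry<?, ?> e : ((Map<?, ?>) value).entrySet())
                if (!isCodecSupported(e.getKey()) || !isCodecSupported(e.getValue())) return false;
            return true;
        }
        return false;  // e.g. an arbitrary domain object cannot cross the channel
    }
}
```

An arbitrary object, even nested inside an otherwise valid List or Map, fails the check, which is why plugin APIs serialize domain objects (often to JSON strings or Maps) before sending them.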
Receiving the data from Flutter on the platform side
```java
@Override
public void onMethodCall(MethodCall call, Result result) {
  switch (call.method) {
    case "getPlatformVersion":
      result.success("Android " + android.os.Build.VERSION.RELEASE);
      break;
    case "register":
      registerAndInit(call, result);
      break;
    case "getASR":
      onASRClick(result);
      break;
    case "getRead":
      onReadClick(call, result);
      break;
    default:
      result.notImplemented();
      break;
  }
}
```
call.method is the method name; we match on it to dispatch the call. When call.method == "register" is true, we know Flutter wants to invoke register, and we proceed accordingly.
If call.method matches nothing, we report back to Flutter via result that the method is not implemented: result.notImplemented(). When that is called, the Flutter side receives a "method not implemented" exception.
The iOS side is much the same:
```objectivec
- (void)handleMethodCall:(FlutterMethodCall *)call result:(FlutterResult)result {
  if ([@"registerApp" isEqualToString:call.method]) {
    [_fluwxWXApiHandler registerApp:call result:result];
    return;
  }
}
```
If the method does not exist: result(FlutterMethodNotImplemented);
The data is carried in call.arguments, typed java.lang.Object on Android, mapping one-to-one to the Dart types sent. If the data is a Map, we can extract values like this:

```java
String sn = call.argument("sn");
String publicKey = call.argument("publicKey");
```

The same applies on iOS:

```objectivec
NSString *sn = call.arguments[@"sn"];
NSString *publicKey = call.arguments[@"publicKey"];
```
Returning a result from the platform to Flutter
When receiving a call from Flutter we are handed a parameter named result, through which we can send a value back to Flutter in three forms:
success — the call succeeded
error — an error occurred
notImplemented — the corresponding method is not implemented

On Android:

```java
result.success("succeed")
result.error("invalid sn", "are you sure your sn is correct ?", sn)
```

On iOS:

```objectivec
result("succeed");
result([FlutterError errorWithCode:@"invalid sn" message:@"are you sure your sn is correct ? " details:sn]);
```
How the platform calls Flutter
If the plugin's work is heavyweight, or the result comes back asynchronously, the Result object is awkward to use for returning data. Take WeChat sharing: when the share completes we want to pass the result back to Flutter, but WeChat's callbacks are asynchronous.
The principle is the same: in the native code we also hold a MethodChannel:

```kotlin
val channel = MethodChannel(registrar.messenger(), "xxx")
```

```swift
let channel = FlutterMethodChannel(name: "xxx", binaryMessenger: registrar.messenger())
```

Once we have the MethodChannel, we can invoke a method on the Flutter side and pass the data along:
```kotlin
val result = mapOf(
    errStr to response.errStr,
    WechatPluginKeys.TRANSACTION to response.transaction,
    type to response.type,
    errCode to response.errCode,
    openId to response.openId,
    WechatPluginKeys.PLATFORM to WechatPluginKeys.ANDROID
)
channel?.invokeMethod("onShareResponse", result)
```

```objectivec
NSDictionary *result = @{
    description: messageResp.description == nil ? @"" : messageResp.description,
    errStr: messageResp.errStr == nil ? @"" : messageResp.errStr,
    errCode: @(messageResp.errCode),
    type: messageResp.type == nil ? @2 : @(messageResp.type),
    country: messageResp.country == nil ? @"" : messageResp.country,
    lang: messageResp.lang == nil ? @"" : messageResp.lang,
    fluwxKeyPlatform: fluwxKeyIOS
};
[methodChannel invokeMethod:@"onShareResponse" arguments:result];
```
Calling Flutter from native code works the same way as calling native code from Flutter: the MethodChannel invokes a method by name and passes data along.
On the Flutter side, receiving a native call mirrors how native code receives a Flutter call:
```dart
final MethodChannel _channel = const MethodChannel('XXX')
  ..setMethodCallHandler(_handler);

Future<dynamic> _handler(MethodCall methodCall) {
  if ("onShareResponse" == methodCall.method) {
    _responseController.add(
        WeChatResponse(methodCall.arguments, WeChatResponseType.SHARE));
  }
  return Future.value(true);
}
```
The one slightly different part is that on the Flutter side we use a Stream:
```dart
StreamController<WeChatResponse> _responseController =
    new StreamController.broadcast();

Stream<WeChatResponse> get response => _responseController.stream;
```
Using a Stream is optional, of course, but it makes listening for callback data much easier:
```dart
response.listen((data) {
  // do work
});
```
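The broadcast pattern has a direct plain-Java analogue; this hedged sketch (Broadcast is an invented name, not a Flutter or Dart API) shows why a broadcast controller helps: any number of listeners can subscribe, and each platform callback fans out to all of them:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class Broadcast<T> {
    // Analogous to StreamController.broadcast(): many listeners, one add() call.
    private final List<Consumer<T>> listeners = new ArrayList<>();

    public void listen(Consumer<T> listener) {        // like response.listen((data) { ... })
        listeners.add(listener);
    }

    public void add(T event) {                        // like _responseController.add(response)
        for (Consumer<T> l : listeners) l.accept(event);
    }
}
```

A single add() call notifies every registered listener, which is exactly the behavior that makes broadcast streams convenient for share-result callbacks observed from several widgets at once.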