The approach used here is to create an empty Player object whose hierarchy mirrors the main structure of [CameraRig], read the positions of the [CameraRig] controllers and HMD in real time, and copy them onto the Player every frame; the visible models are attached to the Player.
First, three scripts are created:
InteractionTest handles the synchronized-interaction test and is attached to the Player (it is used only for the interaction test and plays no part in syncing the Vive transforms).
MappingVIVETransform reads the Vive component transforms and is attached to the Player.
VIVEInstance is a singleton that exposes the Vive controllers and HMD; it is attached to [CameraRig].
First, the VIVEInstance script:
using UnityEngine;

// Singleton exposing the local Vive HMD and controller transforms.
// Attach to [CameraRig] and assign the three references in the Inspector.
public class VIVEInstance : MonoBehaviour
{
    public static VIVEInstance Instance;

    public Transform LeftHand;
    public Transform RightHand;
    public Transform Head;

    private void Awake()
    {
        Instance = this;
    }
}
The result after assignment: just drag the three transforms into the corresponding fields. [CameraRig] is now fully configured and needs nothing else.
Next, create an empty GameObject named Player, with the following hierarchy:
Under Player, add a Head (I use a Sphere here), then add empty GameObjects LeftHand and RightHand. (The Model children below them exist only to display a particular model; if LeftHand and RightHand already carry their own model components, they are unnecessary.) That completes the Player hierarchy; now add the components:
As shown above: first the Player's own sync components, then three NetworkTransformChild components, one each for the head, left hand, and right hand, since three child objects need to be synced, with the corresponding child transform dragged into each component. The child objects under Player need no further setup.
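If you prefer to wire this up in code rather than by hand in the Inspector, a minimal sketch might look like the following. It assumes UNet's NetworkTransformChild exposes a settable target property and that the children are named exactly Head, LeftHand, and RightHand; adjust to your own hierarchy.

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Editor-time helper mirroring the Inspector setup described above.
// Reset() runs when the component is first added in the Editor.
public class PlayerSyncSetup : MonoBehaviour
{
    private void Reset()
    {
        // The Player itself needs an identity and a root transform sync.
        gameObject.AddComponent<NetworkIdentity>();
        gameObject.AddComponent<NetworkTransform>();

        // One NetworkTransformChild per synced child object.
        foreach (string childName in new[] { "Head", "LeftHand", "RightHand" })
        {
            var child = gameObject.AddComponent<NetworkTransformChild>();
            child.target = transform.Find(childName);
        }
    }
}
```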
Next, attach the MappingVIVETransform script, which must inherit from NetworkBehaviour:
using UnityEngine;
using UnityEngine.Networking;

// Copies the local Vive transforms onto this networked Player every frame.
public class MappingVIVETransform : NetworkBehaviour
{
    public Transform Head;
    public Transform LeftHand;
    public Transform RightHand;

    private void Update()
    {
        Mapping();
    }

    public void Mapping()
    {
        // Only the local player reads its own hardware; the NetworkTransform
        // components replicate the result to the other clients.
        if (isLocalPlayer)
        {
            Head.position = VIVEInstance.Instance.Head.position;
            Head.rotation = VIVEInstance.Instance.Head.rotation;
            LeftHand.position = VIVEInstance.Instance.LeftHand.position;
            LeftHand.rotation = VIVEInstance.Instance.LeftHand.rotation;
            RightHand.position = VIVEInstance.Instance.RightHand.position;
            RightHand.rotation = VIVEInstance.Instance.RightHand.rotation;
        }
    }
}
The transforms are copied every frame in Update, guarded by the isLocalPlayer check so that each client only drives its own Player. Attached to the Player it looks like this:
At this point the synchronization is complete; in a test, the players can see each other. If stutter or latency is severe, remember to adjust the settings on the sync components:
Tune the exact values to your own needs.
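The same tuning can be done from code. This is a sketch under the assumption that UNet's NetworkTransform exposes sendInterval and the two interpolation factors as properties; the values here are starting points, not recommendations.

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Optional: tune the root sync component from code instead of the Inspector.
public class SyncTuning : MonoBehaviour
{
    private void Awake()
    {
        foreach (var nt in GetComponents<NetworkTransform>())
        {
            nt.sendInterval = 0.05f;        // send state 20 times per second
            nt.interpolateMovement = 10f;   // smooth remote position updates
            nt.interpolateRotation = 10f;   // smooth remote rotation updates
        }
    }
}
```

The NetworkTransformChild components on the Player have their own send and interpolation settings and would need to be tuned similarly.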
I also wrote a bullet-firing test script, InteractionTest. Bullet colors are random; a single trigger press fires one shot, and holding the trigger fires continuously:
using UnityEngine;
using UnityEngine.Networking;

// Fires bullets from the left controller: one shot per trigger press,
// continuous fire while the trigger is held.
public class InteractionTest : NetworkBehaviour
{
    public GameObject Bullet;
    public Transform LuncherPos;      // muzzle position
    public float IntervalTime;        // seconds between shots in continuous fire
    private float luncherTimer;

    private SteamVR_TrackedObject leftTrackerObj;
    private SteamVR_TrackedObject rightTrackerObj;
    private SteamVR_Controller.Device leftDevice;
    private SteamVR_Controller.Device rightDevice;

    public override void OnStartClient()
    {
        leftTrackerObj = GameObject.Find("[CameraRig]/Controller (left)").GetComponent<SteamVR_TrackedObject>();
        rightTrackerObj = GameObject.Find("[CameraRig]/Controller (right)").GetComponent<SteamVR_TrackedObject>();
    }

    void Start()
    {
        // Guard against controllers that have not been assigned a device index yet.
        if (leftTrackerObj != null && leftTrackerObj.index != SteamVR_TrackedObject.EIndex.None)
        {
            leftDevice = SteamVR_Controller.Input((int)leftTrackerObj.index);
        }
        if (rightTrackerObj != null && rightTrackerObj.index != SteamVR_TrackedObject.EIndex.None)
        {
            rightDevice = SteamVR_Controller.Input((int)rightTrackerObj.index);
        }
    }

    void FixedUpdate()
    {
        if (isLocalPlayer && leftDevice != null)
        {
            if (leftDevice.GetPressDown(SteamVR_Controller.ButtonMask.Trigger))
            {
                CmdSingleLuncher();
            }
            if (leftDevice.GetPress(SteamVR_Controller.ButtonMask.Trigger))
            {
                ContinuousLuncher();
            }
            if (leftDevice.GetPressUp(SteamVR_Controller.ButtonMask.Trigger))
            {
                luncherTimer = 0;
            }
        }
    }

    // Runs on the server, then tells every client to spawn the bullet locally.
    [Command]
    public void CmdSingleLuncher()
    {
        RpcSingleLuncher();
    }

    [ClientRpc]
    public void RpcSingleLuncher()
    {
        GameObject go = Instantiate(Bullet, LuncherPos.position, Quaternion.identity);
        go.GetComponent<MeshRenderer>().material.color = RandomColor();
        go.GetComponent<Rigidbody>().velocity = LuncherPos.forward * 5f;
        Destroy(go, 2f);
    }

    public void ContinuousLuncher()
    {
        luncherTimer += Time.deltaTime;
        if (luncherTimer >= IntervalTime)
        {
            CmdSingleLuncher();
            luncherTimer = 0;
        }
    }

    private Color RandomColor()
    {
        float r = Random.Range(0, 1f);
        float g = Random.Range(0, 1f);
        float b = Random.Range(0, 1f);
        return new Color(r, g, b);
    }
}
The complete Player hierarchy:
Feel free to join QQ group 4364930 to discuss.