VR Series - Oculus Rift Developer Guide: 1. LibOVR Integration

LibOVR Integration

The Oculus SDK is designed to be as easy to integrate as possible. This guide outlines a basic Oculus integration with a C/C++ game engine or application.

We will discuss initializing LibOVR, HMD (head-mounted display) device enumeration, head tracking, frame timing, and rendering for the Rift.

Many of the code samples below are taken directly from the OculusRoomTiny demo source code (available in Oculus/LibOVR/Samples/OculusRoomTiny). When in doubt about a particular system or feature, OculusRoomTiny and OculusWorldDemo are great places to view sample integration code.

Overview of the SDK

There are three major phases when using the SDK: setup, the game loop, and shutdown.

To add Oculus support to a new application, do the following:

  1. Initialize LibOVR through ovr_Initialize.
  2. Call ovr_Create and check the return value to see if it succeeded. You can periodically poll for the presence of an HMD with ovr_GetHmdDesc(nullptr).
  3. Integrate head tracking into your application's view and movement code. This involves:
    a. Obtaining the predicted headset orientation for the frame through a combination of the GetPredictedDisplayTime and ovr_GetTrackingState calls.
    b. Applying the Rift's orientation and position to the camera view, while combining it with other application controls.
    c. Modifying movement and gameplay to take head orientation into account.
  4. Initialize rendering for the HMD.
    a. Select rendering parameters such as resolution and field of view based on HMD capabilities.
    • See: ovr_GetFovTextureSize and ovr_GetRenderDesc.
    b. Configure rendering by creating D3D/OpenGL-specific swap texture sets to present data to the headset.
    • See: ovr_CreateSwapTextureSetD3D11 and ovr_CreateSwapTextureSetGL.
  5. Modify application frame rendering to integrate HMD support and proper frame timing:
    a. Make sure your engine supports rendering stereo views.
    b. Add frame timing logic to the render loop to obtain correctly predicted eye render poses.
    c. Render each eye's view to intermediate render targets.
    d. Submit the rendered frame to the headset by calling ovr_SubmitFrame.
  6. Customize UI screens to work well inside the headset.
  7. Destroy the created resources during shutdown.
    • See: ovr_DestroySwapTextureSet, ovr_Destroy, and ovr_Shutdown.
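The steps above can be sketched as a minimal program skeleton. This is illustrative only: the signatures follow the 0.8-era LibOVR C API that this guide's function names come from (they differ slightly between SDK releases), and graphics configuration, swap texture sets, and layer submission are elided into comments.

```cpp
// Skeleton of the setup / game loop / shutdown phases, using the
// LibOVR calls named above. Requires the Oculus SDK headers and runtime.
#include <OVR_CAPI.h>

int main() {
    // --- Setup ---
    if (OVR_FAILURE(ovr_Initialize(nullptr)))
        return -1;                       // runtime missing or too old

    ovrHmd hmd;
    ovrGraphicsLuid luid;
    if (OVR_FAILURE(ovr_Create(&hmd, &luid))) {
        ovr_Shutdown();                  // no headset present
        return -1;
    }

    ovrHmdDesc desc = ovr_GetHmdDesc(hmd);
    // ... pick resolution/FOV from desc (ovr_GetFovTextureSize,
    //     ovr_GetRenderDesc) and create swap texture sets
    //     (ovr_CreateSwapTextureSetD3D11 / ovr_CreateSwapTextureSetGL) ...

    // --- Game loop ---
    for (long long frameIndex = 0; frameIndex < 1 /* until quit */; ++frameIndex) {
        // Head pose predicted for the time this frame will be displayed.
        double displayTime = ovr_GetPredictedDisplayTime(hmd, frameIndex);
        ovrTrackingState ts = ovr_GetTrackingState(hmd, displayTime, ovrTrue);

        // ... render both eye views from ts.HeadPose into intermediate
        //     render targets, then hand them to the compositor:
        // ovr_SubmitFrame(hmd, frameIndex, nullptr, layerList, layerCount);
    }

    // --- Shutdown ---
    // ovr_DestroySwapTextureSet(hmd, textureSet);  // for each set created
    ovr_Destroy(hmd);
    ovr_Shutdown();
    return 0;
}
```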

A more complete summary of rendering details is provided in the Rendering Setup Outline section.


