The headset will NOT connect to an Android phone without this adapter:

https://www.dreamworldvision.com/product-page/usb-type-c-adapter-with-integrated-battery

 

(The headset has a Type-C-like connector, but it is NOT Type-C. You risk damaging both the headset and the phone if you force the connection.)

It is a known issue that phones running Android 9.0 have problems accessing the headset's cameras. We are working on a fix.

Unity SDK

The Unity SDK is provided to customers who have purchased the headset. Email us at support@dreamworldvision.com to request the SDK after placing your order.

Installation & Setup

Step 1: If you haven’t downloaded Unity already, you can get started for free here (make sure your Unity version is 2017.1 or above; these instructions are based on Unity 2017.3.1f1):

 https://store.unity.com/products/unity-personal?_ga=2.126419725.1807538334.1525282745-1298024418.1512518254

Here you can also find plenty of resources to help you get started learning Unity.

 

Step 2: Once Unity is installed create a new Unity project or open an existing one and navigate to the menu Assets > Import Package > Custom Package. Select the DreamWorld unity package and import all the files.

Step 3: You will see an error in your console that says `TFSession' could not be found. Are you missing an assembly reference? In your Unity project navigate to Edit > Project Settings > Player and change the Scripting Runtime Version to Experimental (.NET 4.6 Equivalent). This will require Unity to restart.

Step 4: Delete the main camera in your Unity scene and drag and drop the DWCameraRig prefab into your scene. The DWCameraRig prefab can be found under Assets > DreamWorld > Prefabs > DWCameraRig. Set Unity’s “Game View” resolution to 1600x1280, and you’re good to go. Congratulations! You are ready to start creating your augmented reality app.

DWCameraRig

Click on the DWCameraRig and look at your inspector. You will see a list of public variables that can be edited in your project.

Platform: If you’re making an app for a Windows PC, select “PC” in the Platform dropdown menu. If you’re making an app for Android, select the name of the corresponding Android device.

Features 

Head Tracking: 
Enabled – enables three-degree-of-freedom rotational head tracking.
Disabled – the scene is fixed to your view.

 

Active Camera:
IR Hand Gesture – IR camera is enabled and hand gesture can be implemented. 
RGB Camera – RGB camera is enabled and the video stream can be used as a Texture2D in your scene. 

IR And RGB (Android Only) – Both cameras are enabled, allowing for hand gesture and the RGB video stream. This feature currently works only on Android devices.

 

Emulator: 
Check this box if you do not yet have a DreamGlass. Head rotation and hand gesture interaction will be simulated with the mouse on a PC. Left click for an “AirClick,” right click for an “OpenPalm,” hold down the middle mouse button for a “Hold” gesture, and release the button for a “Release” gesture.

 

Developer Tools

The DreamWorld SDK provides code, example scenes and prefabs to allow developers to get started.

Hand Gesture Interaction:
Hand gesture interaction can be added easily by attaching a collider and the HandGestureInteraction.cs script (found in Assets > DreamWorld > DeveloperScripts) to an object in your scene.
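If you prefer to do this from code, the same setup can be sketched as follows. This assumes the class defined in HandGestureInteraction.cs is named HandGestureInteraction; verify the name in the SDK before using it.

```csharp
using UnityEngine;

// Sketch: the Inspector setup above, done from code instead.
// HandGestureInteraction is assumed to be the SDK component class;
// everything else here is plain Unity.
public class MakeInteractable : MonoBehaviour
{
    void Start()
    {
        // A collider is required so the gesture system can hit the object.
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }
        gameObject.AddComponent<HandGestureInteraction>();
    }
}
```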

 

The HandGestureInteraction.cs script provides a list of events that can be used to manipulate objects and call functions in your own scripts. Below is a list of the events that can be called by using this script (see example scene DreamWorld > Examples > Scenes > HandGestureInteraction).

HandGestureInteraction.cs 

Focus Event: the user is looking at the object with the collider. 
Unfocus Event: the user is no longer looking at an object they previously looked at. 
On Click Event: the user has performed an air click hand gesture. 
On Open Palm Event: the user has performed an open palm hand gesture. 
On Hold Event: the user has performed a hold hand gesture. 
On Release Event: the user has released the hold hand gesture. 
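Assuming these are standard Unity events that can be wired up in the Inspector, a receiver script might look like the following sketch. The class and method names here are illustrative, not part of the SDK:

```csharp
using UnityEngine;

// Sketch: a receiver whose public methods can be wired to the
// HandGestureInteraction events in the Inspector. Only the event
// list comes from the SDK; these method names are our own.
public class GestureResponder : MonoBehaviour
{
    public void OnFocused()   { transform.localScale = Vector3.one * 1.1f; } // highlight
    public void OnUnfocused() { transform.localScale = Vector3.one; }        // restore
    public void OnAirClick()  { GetComponent<Renderer>().material.color = Color.red; }
    public void OnOpenPalm()  { gameObject.SetActive(false); }               // dismiss
}
```

Attach the script to the object carrying HandGestureInteraction.cs and drag each method into the matching event slot.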

Smartphone Interaction:

If you’re building to an Android device, use the PhoneController prefab and the PhoneInteraction.cs script to quickly build smartphone interaction. The PhoneController.cs script makes the PhoneController prefab follow the rotation of the phone and enables targeting virtual objects in front of the phone that have the PhoneInteraction.cs script and a collider attached. If drifting occurs, the rotation of the controller can be reset to face in front of the user. By default the phone’s rotation is reset with a double tap; it can also be reset by calling the ResetController() function in PhoneController.cs.
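A sketch of triggering the reset from your own code, assuming the component class in PhoneController.cs is named PhoneController (the keyboard trigger is purely illustrative):

```csharp
using UnityEngine;

// Sketch: re-center the phone controller from a script.
// ResetController() is named in the SDK docs; the PhoneController
// class name and the key binding are assumptions for illustration.
public class ControllerRecenter : MonoBehaviour
{
    public PhoneController phoneController; // assign in the Inspector

    void Update()
    {
        // Example trigger: press R in the Editor to reset the rotation.
        if (Input.GetKeyDown(KeyCode.R))
        {
            phoneController.ResetController();
        }
    }
}
```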


 


CursorScript.cs

Type (Head): The cursor will follow the user’s head movement.


Type (Phone): The cursor will follow the user’s phone rotation (Android only).


Mode (Fixed): The cursor will always stay at a fixed location in front of the user.


Mode (Dynamic): The cursor will snap to the location of any object with a collider that moves in front of it and return to the starting position when no collider is detected. 


Mode (Normal Facing): The cursor will snap to the location of any object with a collider that moves in front of it and will rotate to face the normal direction of that mesh.


Start Dist: The cursor’s starting distance from the user. It’s recommended to make this distance slightly less than the distance between the CameraRig and the other objects in the scene. A large distance between the cursor and the objects can make focusing difficult.


Cursor Move Speed: When in “Dynamic” or “Normal Facing” mode, this is the speed at which the cursor snaps to objects in the scene and returns to its default position.


Object Offset: When in “Dynamic” or “Normal Facing” mode, this is the distance from the cursor to the object in front of it. Some cushion is necessary so the cursor doesn’t intersect with the highlighted object.


Cursor On Go: The cursor shown when a collider is in front of the user. This can be customized by replacing the public variable with any GameObject in the scene.

Cursor Off Go: The cursor shown when no collider is in front of the user. This can be customized by replacing the public variable with any GameObject in the scene.

RGB Camera Texture:

If you’ve set the “Active Camera” on the DWCameraRig to “RGB Video Stream,” you can use the RGBCameraTexture.cs script to apply the RGB camera feed as a Texture2D to any object in your scene. This can be done by dragging and dropping the RGBCameraTexture.cs script onto an object or by dragging and dropping the CameraTexture prefab anywhere in your scene. Any mesh renderer added to the array in the script will have the RGB camera texture applied. Plane meshes created in Unity will automatically match the aspect ratio of the Texture2D when added to the array.


(see example scene DreamWorld > Examples > Scenes > RGBCameraTexture).

PhoneInteraction.cs works similarly to HandGestureInteraction.cs: by adding the script to game objects in the scene, events can be triggered to call functions when gestures are performed. PhoneInteraction.cs can also be used in conjunction with HandGestureInteraction.cs for additional interactivity on Android devices.
 (see example scene DreamWorld > Examples > Scenes > PhoneInteraction).

 

PhoneInteraction.cs

On Target Event: the PhoneController prefab is pointing at a game object in the scene (game object requires a collider). 


Off Target Event: the PhoneController prefab is no longer pointing at a game object in the scene (game object requires a collider). 

Tap Event: the user has performed a single tap on the touchscreen. 

Swipe Up Event: the user has swiped up on the touchscreen. 

Swipe Down Event: the user has swiped down on the touchscreen. 

Swipe Left Event: the user has swiped left on the touchscreen. 

Swipe Right Event: the user has swiped right on the touchscreen. 

On Hold Event: the user has one finger on the phone for a period of time. 

On Release Event: the user has taken their finger off the phone after a period of time. 
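As with the hand gesture events, a receiver script can expose public methods for these events to call. A sketch, with all names below being our own rather than the SDK's:

```csharp
using UnityEngine;

// Sketch: methods intended to be wired to the PhoneInteraction
// swipe events in the Inspector. Names are illustrative only.
public class SwipeResponder : MonoBehaviour
{
    public float step = 15f; // degrees of rotation per swipe

    public void OnSwipeLeft()  { transform.Rotate(0f,  step, 0f); }
    public void OnSwipeRight() { transform.Rotate(0f, -step, 0f); }
    public void OnSwipeUp()    { transform.Translate(0f,  0.1f, 0f); }
    public void OnSwipeDown()  { transform.Translate(0f, -0.1f, 0f); }
}
```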

Virtual Cursor

Using a virtual cursor can help the user orient themselves in the scene and assist with interaction. A cursor can be incorporated by dragging and dropping the CursorPrefab (DreamWorld > Prefabs) into the Unity scene. CursorScript.cs comes with public variables that should be adjusted to make viewing and interacting with the scene feel natural.

(see example scene DreamWorld > Examples > Scenes > HandGestureInteraction).

Six Degree of Freedom tracking with NOLO

NOLO recently had a firmware update; you will need to install the updated NOLO Assistant listed in this section.

 

Six-degree-of-freedom tracking can be achieved by combining the DreamGlass with a NOLO tracking set: https://www.dreamworldvision.com/product-page/6dof-tracking-set-for-dream-glass

 

Below are the steps needed to add 6DOF to your Unity project, starting with the initial setup of attaching the NOLO to the DreamGlass, followed by instructions on adding the DreamGlass and NOLO SDKs to your own Unity project.

Initial Setup:

Step 1: Download and install the NOLO drivers (NOLO Assistant): http://download.nolovr.com/download/noloassistant.html

Step 2: Restart your PC

Step 3: Pair your NOLO controllers and HMD to the Base Station.

https://www.nolovr.com/pairing?treeid=002_1

Step 4: Run the NOLO Assistant. If everything is connected you will see an orange battery indicator for each NOLO device.

Step 5: On the NoloVR_Manager (script) attached to the NoloManager prefab, make sure the VR Camera variable is set to the DWCameraRig. On the DWCameraRig (script) attached to the DWCameraRig, change the Tracking variable to NOLO_6DOF.

Step 6: The last step is to edit a few lines of code in the NoloVR_TrackedDevice script, located at Assets > NoloVR > Scripts > NoloVR_TrackedDevice.cs. Comment out or delete lines 46–55 and add the following lines of code in their place.
transform.localPosition = pose.pos;
transform.localRotation = Quaternion.Euler(pose.rot.eulerAngles.x, pose.rot.eulerAngles.y, pose.rot.eulerAngles.z);

With the NOLO devices connected, hit Play in the Unity Editor and you will see the controllers and HMD being tracked. To view your scene in augmented reality, run a build of your application and wear the DreamGlass. Re-center the controllers and the headset by facing the base station, outstretching your arms, and pressing the power button on each controller twice.

Recording a video


The DWCameraRig (script) attached to the DWCameraRig prefab has public variables that allow you to record and save a video overlay of the AR content in your Unity scene together with the RGB camera feed from the DreamGlass.


To record and save a video, click on the DWCameraRig in your Unity scene and change the Active Camera under Features to RGB_Video Capture. *Note that hand gesture interaction and RGB video to texture are not available while capturing.


The dropdown menu for Video Capture Settings contains three variables:


Capture Command: By default you have three commands you can use to start and stop recording. With the "Space" key or the "Enter" key, press the key once to start recording and again to stop. On Start starts the video when the application begins and saves it to the desired location after exiting the application. You can also select None and start the capture from your own scripts by referencing the DWCameraRig instance: DWCameraRig.Instance.Capture(). Calling this function once starts the video; calling it again stops recording and saves the video.
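For example, a minimal script that drives the capture when Capture Command is set to None might look like this (the key binding is an arbitrary choice):

```csharp
using UnityEngine;

// Sketch: toggle video capture from code when Capture Command is None.
// Per the SDK docs, DWCameraRig.Instance.Capture() starts recording on
// the first call and stops/saves on the second.
public class CaptureToggle : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.V)) // example key; pick your own
        {
            DWCameraRig.Instance.Capture();
        }
    }
}
```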


Resolution: Sets the resolution of the saved video; 720p is the highest quality, 540p is mid-size, and 360p is the smallest.
 

Video Path: You must specify where the video should be saved. Quick tip: on Windows, hold Shift + Right Click on the folder you want to use as the video path, choose “Copy as path,” and paste the location into the Video Path variable. Remove the quotation marks from the pasted file location and you’re all set.

Step 6: Plug in the DreamGlass USB and HDMI cables.

 

*Note that the DreamGlass must be plugged in after the NOLO is connected; otherwise the NOLO will not connect.

Step 7: Point the NOLO controller (tracking ball forward) at the NOLO base station and double-click the power button to reset the scene. Do this for the left controller and then the right controller.

Unity Setup:

Step 1: Download the NOLO Unity SDK: http://download.nolovr.com/download/UnitySDK_V1.0.rar

Step 2: If you have already imported the DreamWorld SDK, go ahead and import the NOLO Unity SDK too. Otherwise, see the “Installation & Setup” section above and install the DreamWorld SDK first.

Step 3: Drag the NoloManager prefab into the scene and make the DWCameraRig prefab a child of the Hmd (camera). Set the local position and rotation values of the DWCameraRig prefab to 0 and change the Tracking variable on the DWCameraRig (script) to NOLO_6DOF.

Step 4: Attach the NOLO HMD to the top center of the DreamGlass.

Attention: If your Unity version is newer than 2018.1, you may see warning and error messages complaining about obsolete libraries and TensorFlow. Ignore these messages; the SDK will still work.

In newer Unity versions, the Scripting Runtime Version may show “.NET 4.x Equivalent”; select it.

When you run your application, a window will pop up that displays your video recording in real time. This can be useful if you’re giving a presentation while wearing the DreamGlass. If you selected a key to start and stop the capture, make sure to click on the Unity application before pressing the key, since the Unity window will be sent to the background. When you press the key again or exit the application, the recorded video is saved to the specified path.

Building your application

Building For PC:

When building your AR app for a Windows PC, in Unity’s Build Settings menu change the Architecture to x86_64 (1). After you’ve created a build, plug in the USB and HDMI cables of the DreamGlass. Make sure your display settings are configured so the DreamGlass is treated as an external monitor (2). When running your build, select a resolution of 1600x1280 and make sure Windowed is unchecked (3).

 

Occlusion:

 

Occlusion can be achieved easily by placing objects with an all-black material in front of the object you want to occlude. In the flashlight example below, the rotation of the phone is used to control the flashlight, while a plane with a black texture and a circular transparent fade creates the occlusion.
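A minimal occluder can also be created from code. This sketch assumes Unity's built-in Unlit/Color shader and simply parents a solid-black quad slightly in front of the object it is attached to:

```csharp
using UnityEngine;

// Sketch: spawn a solid-black quad to act as an occluder.
// On an additive display, pure black emits no light, so the quad
// itself is invisible while hiding virtual content behind it.
public class SimpleOccluder : MonoBehaviour
{
    void Start()
    {
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.transform.SetParent(transform, false);
        // Offset slightly along local Z so the quad sits in front
        // of this object from the camera's point of view.
        quad.transform.localPosition = new Vector3(0f, 0f, -0.1f);

        Material black = new Material(Shader.Find("Unlit/Color"));
        black.color = Color.black;
        quad.GetComponent<Renderer>().material = black;
    }
}
```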

Building For Android:

Building an .apk for an Android device requires the Android SDK for Unity. If you haven’t set it up already, follow these steps to learn how to configure the Android SDK and deploy an .apk to your device: https://unity3d.com/learn/tutorials/topics/mobile-touch/building-your-unity-game-android-device-testing

Once you are ready to build an .apk in Unity, make sure to choose the correct platform and device on the DWCameraRig before building the project.

In the Unity Editor, under Edit > Project Settings > Player, make sure to fill out “Package Name” under Other Settings > Identification, and make sure the “Minimum API Level” is set to Android 5.0 ‘Lollipop’ (API level 21) or higher.


As a final check, confirm once more that the correct device is selected on the DWCameraRig before building the project.