
How To Create A Mobile App With Virtual Reality?

by Mariya Parackal

A virtual reality app is a mobile app that uses the power of virtual reality to achieve its goals. Virtual Reality is no longer confined to the world of science fiction; it is fast becoming science fact. The virtual reality market is expected to grow by leaps and bounds, with forecasts exceeding $100 billion by 2020. Since the future relies on an immersive virtual world, there is no looking back.

1. Requisites for Creating a Virtual Reality App

We need the Oculus 0.8 runtime installed, along with Unity 5.3 or higher, for Virtual Reality app development. The Oculus SDKs, samples, and asset packs are built to help you quickly and easily develop your VR app in your preferred development environment. We also require the following:

• Hardware: Google Cardboard
• Software: OS X

At this time, it is possible to develop on OS X 10.9+ with the Oculus 0.5.0.1 runtime, but as Oculus has paused development for OS X, we recommend Windows for native Virtual Reality functionality in Unity. The components below are compatible.

Windows: Windows 7, 8, 8.1, and Windows 10 are all compatible.
Android: Android OS Lollipop 5.1 or higher.
Graphics card drivers: Please ensure your drivers are up to date. Older drivers may not be supported. Please check the Oculus Configuration Utility to see if there are issues with your driver.

2. Process Flow of Creating your first Virtual Reality Project

Anyone with an interest in Virtual Reality and coding can start with Unity. Unity is quite easy to pick up: you can get your first project up and running in the same sitting in which you install it. Many people feel that the wealth of Unity resources for learning advanced graphics and networking is part of why VR has been so successful; Unity democratized game development so that anyone can program a game with ease. The following is the process flow for creating a virtual reality project.

Step 1: Create a new empty project from the Unity Home Screen which loads when you first launch Unity.

Step 2: Make sure that PC, Mac & Linux Standalone is selected as the platform to use by visiting File > Build Settings from the top menu.

Step 3: Create a new cube (Game Object > 3D Object > Cube) and position it in front of the default Main Camera in your new empty scene using the Translate tool.

Step 4: Save your scene (File > Save Scene).

Step 5: Go to Edit > Project Settings > Player and check the box to enable “Virtual Reality Supported”.

Step 6: Enter Play mode by pressing Play at the top of the interface.

During runtime, this can be toggled using the VRSettings.enabled property, as shown below:
using UnityEngine;
using UnityEngine.VR;

public class ToggleVR : MonoBehaviour
{
    // Example of toggling VRSettings
    private void Update()
    {
        // If V is pressed, toggle VRSettings.enabled
        if (Input.GetKeyDown(KeyCode.V))
        {
            VRSettings.enabled = !VRSettings.enabled;
            Debug.Log("Changed VRSettings.enabled to: " + VRSettings.enabled);
        }
    }
}
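
Attach this script to any GameObject in the scene (an empty manager object works well); pressing V at runtime will then switch the view between VR and the normal display.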

3. Camera Movement in Virtual Reality

We cannot move the VR camera directly in Unity. If you wish to change its position and rotation, you need to ensure that it is parented to another GameObject and apply the changes to the parent's Transform. Unity does not create separate left and right eye cameras.
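
As a minimal sketch of that advice (the rig name and movement speed are illustrative assumptions, not part of the samples), movement is applied to the parent object rather than to the camera itself:

using UnityEngine;

// Attach to the GameObject that parents the VR camera (e.g. an empty
// "CameraRig" object). Head tracking drives the camera's own Transform,
// so we move the rig instead.
public class CameraRigMover : MonoBehaviour
{
    [SerializeField] private float m_Speed = 2f; // Movement speed in units per second (illustrative value)

    private void Update()
    {
        // Read the standard Horizontal/Vertical input axes (WASD/arrow keys by default)
        float h = Input.GetAxis("Horizontal");
        float v = Input.GetAxis("Vertical");

        // Translate the rig; the child VR camera inherits the movement
        // while head tracking continues to control its local pose
        transform.Translate(new Vector3(h, 0f, v) * m_Speed * Time.deltaTime);
    }
}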

If you wish to get the positions of those eye nodes, you must use the InputTracking class.
using UnityEngine;
using UnityEngine.VR;

public class UpdateEyeAnchors : MonoBehaviour
{
    GameObject[] eyes = new GameObject[2];
    string[] eyeAnchorNames = { "LeftEyeAnchor", "RightEyeAnchor" };

    void Update()
    {
        for (int i = 0; i < 2; ++i)
        {
            // If the eye anchor is no longer a child of us, don't use it
            if (eyes[i] != null && eyes[i].transform.parent != transform)
            {
                eyes[i] = null;
            }

            // If we don't have an eye anchor, try to find one or create one
            if (eyes[i] == null)
            {
                Transform t = transform.Find(eyeAnchorNames[i]);
                if (t)
                    eyes[i] = t.gameObject;

                if (eyes[i] == null)
                {
                    eyes[i] = new GameObject(eyeAnchorNames[i]);
                    eyes[i].transform.parent = gameObject.transform;
                }
            }

            // Update the eye transform
            eyes[i].transform.localPosition = InputTracking.GetLocalPosition((VRNode)i);
            eyes[i].transform.localRotation = InputTracking.GetLocalRotation((VRNode)i);
        }
    }
}
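
Attached to the VR camera (or the object that parents it), this script creates LeftEyeAnchor and RightEyeAnchor children and updates them every frame; the (VRNode)i cast maps the loop index to VRNode.LeftEye and VRNode.RightEye.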

4. Render Scale in Virtual Reality

Depending on the complexity of your Virtual Reality scene and the hardware you're running, you may want to alter the render scale. This controls the texel-to-pixel ratio before lens correction, meaning that we trade performance for sharpness.
using UnityEngine;
using System.Collections;
using UnityEngine.VR;

namespace VRStandardAssets.Examples
{
    public class ExampleRenderScale : MonoBehaviour
    {
        [SerializeField] private float m_RenderScale = 1f; // The render scale. Higher numbers = better quality, but cost performance

        void Start()
        {
            VRSettings.renderScale = m_RenderScale;
        }
    }
}
• If we increase the render scale to 1.5, you can see the asset looks much crisper.
• If we reduce the render scale to 0.5, you can see it is now much more pixelated.
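
If you want to compare these settings live, a minimal sketch (the key bindings and step size are illustrative assumptions) could adjust VRSettings.renderScale at runtime:

using UnityEngine;
using UnityEngine.VR;

// Adjusts the render scale at runtime so the quality/performance
// trade-off can be compared live. Key bindings are illustrative.
public class RenderScaleAdjuster : MonoBehaviour
{
    private void Update()
    {
        // Increase sharpness (at a performance cost) with the equals/plus key
        if (Input.GetKeyDown(KeyCode.Equals))
        {
            VRSettings.renderScale = Mathf.Min(VRSettings.renderScale + 0.1f, 2f);
            Debug.Log("renderScale: " + VRSettings.renderScale);
        }
        // Reduce the render scale (faster, but more pixelated) with the minus key
        else if (Input.GetKeyDown(KeyCode.Minus))
        {
            VRSettings.renderScale = Mathf.Max(VRSettings.renderScale - 0.1f, 0.1f);
            Debug.Log("renderScale: " + VRSettings.renderScale);
        }
    }
}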

5. Interactions in Virtual Reality

In Virtual Reality, we often need to activate an object that the user is looking at. The Virtual Reality Samples include a simple, lightweight, extendable system that allows users to interact with objects. There are three main scripts:

● VREyeRaycaster:

If a collider has been hit by the raycast, this script attempts to find a VRInteractiveItem component on the GameObject.
VRInteractiveItem interactible = hit.collider.GetComponent<VRInteractiveItem>(); // Attempt to get the VRInteractiveItem on the hit object
From this, we can determine whether the user is looking at an object or has stopped looking at one. When the user starts or stops looking at an object, we can react, for example by calling a method on it.
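
A minimal sketch of that raycasting logic might look like the following; it assumes VRInteractiveItem exposes Over() and Out() methods that fire the corresponding events, as in the Unity VR Samples:

using UnityEngine;

// Simplified gaze raycaster: casts a ray from the camera each frame and
// notifies the VRInteractiveItem currently being looked at. Assumes
// VRInteractiveItem exposes Over()/Out() methods that raise its events.
public class SimpleEyeRaycaster : MonoBehaviour
{
    [SerializeField] private Transform m_Camera;       // The VR camera transform
    [SerializeField] private float m_RayLength = 500f; // How far to cast the gaze ray

    private VRInteractiveItem m_CurrentInteractible;   // Item currently gazed at

    private void Update()
    {
        Ray ray = new Ray(m_Camera.position, m_Camera.forward);
        RaycastHit hit;
        VRInteractiveItem interactible = null;

        if (Physics.Raycast(ray, out hit, m_RayLength))
            interactible = hit.collider.GetComponent<VRInteractiveItem>(); // May be null

        // Gaze has moved onto a new item: notify it
        if (interactible != null && interactible != m_CurrentInteractible)
            interactible.Over();

        // Gaze has left the previous item: notify it
        if (m_CurrentInteractible != null && m_CurrentInteractible != interactible)
            m_CurrentInteractible.Out();

        m_CurrentInteractible = interactible;
    }
}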

● VRInput:

VRInput is a simple class that determines whether swipes, taps, or double-taps have occurred on the Gear VR, or on the equivalent control setup for PC input when using a DK2.
public event Action<SwipeDirection> OnSwipe; // Called every frame, passing in the swipe (including if there is no swipe).
public event Action OnClick; // Called when Fire1 is released and it's not a double click.
public event Action OnDown; // Called when Fire1 is pressed.
public event Action OnUp; // Called when Fire1 is released.
public event Action OnDoubleClick; // Called when a double click is detected.
public event Action OnCancel; // Called when Cancel is pressed.
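
As a usage sketch, another script can subscribe to these events in OnEnable and unsubscribe in OnDisable; the handler names below are illustrative:

using UnityEngine;

// Example listener reacting to VRInput events. Assumes a VRInput
// component is present in the scene and assigned in the Inspector.
public class InputListener : MonoBehaviour
{
    [SerializeField] private VRInput m_VRInput; // Reference to the scene's VRInput

    private void OnEnable()
    {
        m_VRInput.OnClick += HandleClick;
        m_VRInput.OnDoubleClick += HandleDoubleClick;
    }

    private void OnDisable()
    {
        // Always unsubscribe to avoid dangling handlers
        m_VRInput.OnClick -= HandleClick;
        m_VRInput.OnDoubleClick -= HandleDoubleClick;
    }

    private void HandleClick()
    {
        Debug.Log("Click: Fire1 released, not part of a double click");
    }

    private void HandleDoubleClick()
    {
        Debug.Log("Double click detected");
    }
}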

● VRInteractiveItem:

This is a component you can add to any GameObject that you would like the user to interact with in VR. It requires a collider on the object it is attached to.
public event Action OnOver; // Called when the gaze moves over this object
public event Action OnOut; // Called when the gaze leaves this object
public event Action OnClick; // Called when click input is detected whilst the gaze is over this object.
public event Action OnDoubleClick; // Called when double click input is detected whilst the gaze is over this object.
public event Action OnUp; // Called when Fire1 is released whilst the gaze is over this object.
public event Action OnDown; // Called when Fire1 is pressed whilst the gaze is over this object.
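
For example, a small component can listen to OnOver and OnOut to highlight the object while it is gazed at; the material swap below is an illustrative choice:

using UnityEngine;

// Highlights an object while the user's gaze is over it. Attach alongside
// a VRInteractiveItem and a Collider; the material swap is illustrative.
public class GazeHighlighter : MonoBehaviour
{
    [SerializeField] private VRInteractiveItem m_InteractiveItem; // The item to listen to
    [SerializeField] private Renderer m_Renderer;                 // Renderer whose material is swapped
    [SerializeField] private Material m_NormalMaterial;           // Material when not gazed at
    [SerializeField] private Material m_OverMaterial;             // Material while gazed at

    private void OnEnable()
    {
        m_InteractiveItem.OnOver += HandleOver;
        m_InteractiveItem.OnOut += HandleOut;
    }

    private void OnDisable()
    {
        m_InteractiveItem.OnOver -= HandleOver;
        m_InteractiveItem.OnOut -= HandleOut;
    }

    private void HandleOver()
    {
        m_Renderer.material = m_OverMaterial; // Gaze entered: highlight
    }

    private void HandleOut()
    {
        m_Renderer.material = m_NormalMaterial; // Gaze left: restore
    }
}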

6. Deployment of a Virtual Reality Project

You can easily test your project on the DK2 just by pressing Play in Unity, but you will need to deploy your project for testing outside of Unity, and sometimes for distribution.

When deploying your Virtual Reality project, consider the following:

• Ensure that your scene(s) are included under Scenes In Build.
• Click Build and Run, and Unity will export a standard Windows build with a .exe file and supporting content folders.
• Run the .exe file as usual; if the project was built with VR support enabled, it will attempt to start in VR mode on the DK2 automatically.
• If you have issues running the .exe, you may wish to force VR mode by using the command line argument -vrmode, which is covered in the VR Overview section of the Unity manual.

7. Use Case of Virtual Reality

In this demo, the user has the visual experience of walking through a landscape in which cubes move towards him. A white dot at the center of the screen represents the gaze input pointer. When the gaze pointer targets a cube, the cube bursts and disappears within 3-4 seconds, as sketched below.
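
A hedged sketch of that bursting behaviour could combine a VRInteractiveItem with a timed destroy; the delay value and the optional particle prefab are illustrative assumptions:

using UnityEngine;
using System.Collections;

// Bursts and removes the cube a few seconds after the gaze pointer
// targets it. Assumes a VRInteractiveItem on the same GameObject;
// the delay and burst effect are illustrative.
public class BurstingCube : MonoBehaviour
{
    [SerializeField] private VRInteractiveItem m_InteractiveItem;
    [SerializeField] private GameObject m_BurstEffectPrefab; // Optional particle prefab
    [SerializeField] private float m_Delay = 3.5f;           // Seconds before the cube disappears

    private void OnEnable()
    {
        m_InteractiveItem.OnOver += HandleOver;
    }

    private void OnDisable()
    {
        m_InteractiveItem.OnOver -= HandleOver;
    }

    private void HandleOver()
    {
        StartCoroutine(Burst());
    }

    private IEnumerator Burst()
    {
        yield return new WaitForSeconds(m_Delay);

        // Spawn the burst effect, if one was assigned
        if (m_BurstEffectPrefab != null)
            Instantiate(m_BurstEffectPrefab, transform.position, Quaternion.identity);

        Destroy(gameObject);
    }
}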

If you want, you can download the codebase for this sample app to test your concepts.


Conclusion

In the present scenario, with the technology available to us, Virtual Reality is well prepared to deliver auditory and visual immersion through specialist hardware like the Oculus. Although a Virtual Reality headset sits only a few inches in front of your eyes, it can seem as though you are looking at, and are surrounded by, a real environment. In the future, Virtual Reality will drive changes in OSs, GPUs, drivers, 3D engines, and apps, ultimately enabling highly efficient Virtual Reality performance.

