VR Components in Unreal Engine 4 

 


 
The purpose of this project was to create a set of modular components that can be combined to build complex virtual reality setups with minimal code. The project is written in C++ and uses delegates extensively, both to provide an internal interface for C++ code and to call blueprint events from code, so VR development can be done entirely in blueprints.
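The post does not include the underlying code, but the delegate pattern it describes typically looks like the following minimal sketch; the component and delegate names here are illustrative, not the project's actual API.

// Minimal sketch of a component exposing a dynamic multicast delegate,
// using hypothetical names (UVRGrabComponent, FOnGrabbedSignature).

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "VRGrabComponent.generated.h"

// Dynamic multicast delegates can be bound from both C++ and blueprints.
DECLARE_DYNAMIC_MULTICAST_DELEGATE_OneParam(FOnGrabbedSignature, AActor*, GrabbingActor);

UCLASS(ClassGroup=(VR), meta=(BlueprintSpawnableComponent))
class UVRGrabComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // BlueprintAssignable exposes the delegate as an event node in the
    // blueprint editor, so designers can react to grabs without C++.
    UPROPERTY(BlueprintAssignable, Category = "VR|Interaction")
    FOnGrabbedSignature OnGrabbed;

    void NotifyGrabbed(AActor* GrabbingActor)
    {
        // Fires every bound C++ handler and blueprint event.
        OnGrabbed.Broadcast(GrabbingActor);
    }
};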
 
The example character blueprint below showcases how a basic VR setup can be achieved.
First, a Volume component is defined; this represents the real-world space the player has available for their VR setup.


Next, a Head Mounted Display component is defined. It is tracked to the headset and wraps UE4's camera component. Other components can also be attached to the HMD; in the example, a VR movement component is attached, which allows movement in the direction the player is looking.
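As a rough illustration of the movement component's behaviour, look-directed movement could be implemented along these lines; HeadMountedDisplay and ThumbstickAxisValue are assumed member names, not the project's actual API.

// Illustrative sketch of look-directed movement, assuming the component
// holds a pointer to the HMD scene component and a cached thumbstick value.
void UVRMovementComponent::TickComponent(float DeltaTime, ELevelTick TickType,
                                         FActorComponentTickFunction* ThisTickFunction)
{
    Super::TickComponent(DeltaTime, TickType, ThisTickFunction);

    APawn* Pawn = Cast<APawn>(GetOwner());
    if (!Pawn || !HeadMountedDisplay)
    {
        return;
    }

    // Flatten the headset's forward vector onto the ground plane so that
    // looking up or down does not tilt the movement direction.
    FVector LookDirection = HeadMountedDisplay->GetForwardVector();
    LookDirection.Z = 0.f;
    LookDirection.Normalize();

    Pawn->AddMovementInput(LookDirection, ThumbstickAxisValue);
}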
 

Two motion controllers are attached, each with a collider for interacting with the virtual world. The widget interaction component is managed by the character and automatically switches hands when the player presses one of the triggers.
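The hand switching could be sketched as follows: the character owns a single UWidgetInteractionComponent and re-parents it to whichever controller last pressed its trigger. The function and member names are assumptions.

// Hypothetical sketch of the hand-switching behaviour described above.
void AVRCharacter::HandleTriggerPressed(UMotionControllerComponent* PressedController)
{
    if (WidgetInteraction->GetAttachParent() != PressedController)
    {
        // Move the widget interaction component to the pressing hand.
        WidgetInteraction->AttachToComponent(
            PressedController,
            FAttachmentTransformRules::SnapToTargetNotIncludingScale);
    }

    // Forward the press so UMG widgets receive it as a click.
    WidgetInteraction->PressPointerKey(EKeys::LeftMouseButton);
}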
 

The full component graph is shown below. This is how the provided example works, but the same components can be combined to create a wide array of different VR game types.


Interaction Components

The system features a number of components that let the user interact with the virtual world. They are built to be generic and support a wide range of applications.

Grab Components

The simplest are the grab components, which allow the user to pick up and interact with objects in the scene.
 
When placing a grab component on an object, the developer can either use a mesh component's collider for the grab collision or position a bounding box directly in the editor to specify where the object can be grabbed.
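One way to expose that choice in the editor is sketched below; the enum and property names are hypothetical, not taken from the project.

// In the grab component's header: a designer-facing switch between the
// two collision sources described above.
UENUM(BlueprintType)
enum class EGrabCollisionSource : uint8
{
    MeshCollider,   // reuse the collider of an existing mesh component
    BoundingBox     // use a box positioned by hand in the editor
};

UPROPERTY(EditAnywhere, Category = "VR|Grab")
EGrabCollisionSource CollisionSource = EGrabCollisionSource::MeshCollider;

// Only used when CollisionSource is BoundingBox.
UPROPERTY(EditAnywhere, Category = "VR|Grab")
FBox GrabBounds = FBox(ForceInit);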
 
In the paintgun example, two grab colliders have been added to the object, with bounding boxes positioned using the editor. The firing of paintballs is set up by testing for trigger inputs in C++ and then using UE4 delegates to trigger blueprint events.
 
Components use delegates to trigger blueprint events from C++. This allows for faster in-editor development, without touching C++.
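An illustrative version of that flow is shown below: poll the analog trigger in C++, detect the press edge, and let a blueprint-bound delegate spawn the paintball. TriggerThreshold, bWasPulled and the delegate name are assumptions.

// In the header:
DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnTriggerPulledSignature);

UPROPERTY(BlueprintAssignable, Category = "VR|Interaction")
FOnTriggerPulledSignature OnTriggerPulled;

// In the .cpp, called with the analog trigger value of the holding hand:
void UVRGrabComponent::HandleTriggerAxis(float AxisValue)
{
    const bool bPulled = AxisValue > TriggerThreshold; // e.g. 0.5f
    if (bPulled && !bWasPulled)
    {
        // Rising edge: the blueprint event bound to this delegate fires
        // once per squeeze and spawns the paintball.
        OnTriggerPulled.Broadcast();
    }
    bWasPulled = bPulled;
}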

Button Component 

Button components are physical triggers that can be pressed to activate interactions in the scene.
 
 
In this example the button is linked to a screen that has been set up to change the displayed widget when the button is pressed. This linking is accomplished via a blueprint interface that accepts inputs from interactive elements such as buttons. The interface is ultimately triggered from C++.
 
Interaction can be added to any actor via a single interface.

By setting interactions up this way, gameplay designers can link interactive triggers to gameplay logic directly in the level editor, without needing level blueprints that directly reference assets.
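A hypothetical sketch of such an interface is below; the project's actual interface name and signature are not shown in the post.

#include "CoreMinimal.h"
#include "UObject/Interface.h"
#include "InteractionTarget.generated.h"

UINTERFACE(BlueprintType)
class UInteractionTarget : public UInterface
{
    GENERATED_BODY()
};

class IInteractionTarget
{
    GENERATED_BODY()

public:
    // Implementable in blueprints, so any actor can respond to a trigger
    // without needing a C++ subclass.
    UFUNCTION(BlueprintImplementableEvent, Category = "VR|Interaction")
    void OnTriggered(AActor* Source);
};

// From C++, the button fires the event on its designer-assigned target:
//
//   if (LinkedActor && LinkedActor->GetClass()->ImplementsInterface(
//           UInteractionTarget::StaticClass()))
//   {
//       IInteractionTarget::Execute_OnTriggered(LinkedActor, GetOwner());
//   }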

Linear Component

The linear component is an interactable component constrained to a line. The line's minimum and maximum distances can be set, and events can be triggered at user-defined thresholds.
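The constraint and threshold logic might look something like this sketch; MinDistance, MaxDistance, Thresholds, LastNormalised, OnThresholdCrossed and ConstrainedMesh are assumed member names.

// Illustrative line constraint with threshold events.
void UVRLinearComponent::UpdateConstrainedPosition(const FVector& HandLocation)
{
    // Project the hand position onto the component's line and clamp it
    // between the configured min and max distances.
    const FVector LineOrigin = GetComponentLocation();
    const FVector LineDirection = GetForwardVector();
    float Distance = FVector::DotProduct(HandLocation - LineOrigin, LineDirection);
    Distance = FMath::Clamp(Distance, MinDistance, MaxDistance);

    const float Normalised = (Distance - MinDistance) / (MaxDistance - MinDistance);

    // Fire an event whenever the handle crosses a user-defined threshold.
    for (float Threshold : Thresholds)
    {
        if ((LastNormalised < Threshold) != (Normalised < Threshold))
        {
            OnThresholdCrossed.Broadcast(Normalised);
        }
    }
    LastNormalised = Normalised;

    ConstrainedMesh->SetWorldLocation(LineOrigin + LineDirection * Distance);
}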

 

Hinge Component 

Like the linear component, the hinge component constrains an interactable object; in this case the object is constrained to a pivot point and a plane of rotation. The angle around that pivot point can be used to further constrain the rotation of the interactable object.
The example below shows how a hinge constraint can be used to make a lever; it also uses the blueprint interface to link the hinge to gameplay logic.
 
 
The hinge component does not just constrain rotations to the 0–360 degree range. It correctly tracks multiple full rotations and can therefore be used for valves like the one shown below.
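Tracking rotations beyond 360 degrees amounts to accumulating per-frame angle deltas rather than reading a raw angle. A minimal sketch, assuming the component caches the previous frame's angle:

// FMath::FindDeltaAngleDegrees returns the signed shortest difference
// between two angles, so summing it lets the running total exceed 360.
void UVRHingeComponent::UpdateAccumulatedAngle(float CurrentAngleDegrees)
{
    const float Delta = FMath::FindDeltaAngleDegrees(PreviousAngleDegrees, CurrentAngleDegrees);
    AccumulatedDegrees += Delta; // e.g. three full valve turns -> 1080
    PreviousAngleDegrees = CurrentAngleDegrees;
}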
 
 

Advanced Uses

This bow example uses two of the interactable components: a grab component for the body of the bow and a linear component for the string.
 
Generic interaction components can be used for a wide variety of applications.
 
The linear component is then linked to an animation blueprint that uses the component's normalised distance to drive a blend space that deforms the bow.
 

 
Below is the blueprint logic that drives the animation and spawns the arrow. The normalised distance is again used to determine the speed at which the arrows fly once the string is released.
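In C++ terms, the release step might look like the following sketch; AArrow, ArrowClass, MinArrowSpeed, MaxArrowSpeed, StringAttachPoint and ProjectileMovement are illustrative names.

// Hypothetical version of the release logic: the linear component's
// normalised draw distance scales the arrow's launch speed.
void AVRBow::ReleaseString(float NormalisedDraw)
{
    // Map the normalised draw distance (0..1) onto a speed range.
    const float LaunchSpeed = FMath::Lerp(MinArrowSpeed, MaxArrowSpeed, NormalisedDraw);

    FActorSpawnParameters SpawnParams;
    SpawnParams.SpawnCollisionHandlingOverride =
        ESpawnActorCollisionHandlingMethod::AlwaysSpawn;

    AArrow* Arrow = GetWorld()->SpawnActor<AArrow>(
        ArrowClass,
        StringAttachPoint->GetComponentLocation(),
        GetActorRotation(),
        SpawnParams);

    if (Arrow)
    {
        // Launch the arrow along the bow's aim direction at a speed
        // proportional to how far the string was drawn.
        Arrow->ProjectileMovement->Velocity = GetActorForwardVector() * LaunchSpeed;
    }
}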
 
