January 25, 2017

A New Dimension for UI: Using Unity for Virtual Reality
by William Harney in Platforms

The advent of virtual reality solutions, ranging from gaming to training and simulations, is raising new questions about previously standard industry practices. User interfaces (UI), in particular, require a complete rethinking of function, layout, and implementation. Traditionally, user interfaces have been divided into diegetic (part of the game world), non-diegetic (separate from the game world), spatial, and meta components. Most successful games use a combination of them to provide a balanced experience. In this post, we break down each category, its advantages and disadvantages for virtual reality, and how to implement it in Unity. Meta UI components are rare in general and largely disregarded in VR programming, so they are not considered in this analysis.

Non-Diegetic UI

Historically, non-diegetic user interfaces have been the most common in the gaming industry. Their defining feature is that the UI components exist on a completely different plane from the actual 3D game space. Think of a heads-up display (HUD), likely the most ubiquitous example of a non-diegetic user interface. A health bar, for example, does not exist within the 3D space that the game supposes, nor can in-game characters interact with it. It sits outside both the game’s narrative and its space.

Pros/Cons

This modality offers the user a very clear display of relevant information and allows for quick navigation. The fear, however, is that the distinct separation of the game world from the structures that manipulate it results in a lack of immersion.

Use in Virtual Reality With Unity

For virtual reality, non-diegetic user interfaces can be very difficult to implement successfully. The largest obstacle is that a traditional HUD ends up too close to the user’s face, resulting in highly uncomfortable eye strain. In Unity, the typical way to build a non-diegetic HUD is with the Screen Space – Overlay or Screen Space – Camera render modes. These modes, however, are unsupported in Unity VR due to discomfort-related concerns. A developer can instead fix a panel to the user’s vector of vision, which in effect serves the purpose of a HUD. Once again, though, this can prove awkward: it would be like walking around all day with a phone held directly in front of you. To focus on it, you would need to refocus your view away from the rest of the world, and its presence would be distracting while concentrating on other tasks. In short, stay away from strictly non-diegetic UIs when developing solutions for virtual reality.
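
If you do need a HUD-like panel fixed to the user’s vector of vision, a rough sketch might look like the following: a World Space panel that trails the headset camera at a couple of meters. The class name, distance, and smoothing values here are illustrative assumptions, not a recommended Unity setting.

```csharp
using UnityEngine;

// Minimal sketch: keeps a World Space panel in front of the HMD camera,
// roughly 2 m away, so it behaves like a HUD without using the unsupported
// Screen Space modes. Field names and values are illustrative.
public class FollowHeadHud : MonoBehaviour
{
    [SerializeField] private Transform hmdCamera;   // usually Camera.main.transform
    [SerializeField] private float distance = 2f;   // keep well away from the near plane
    [SerializeField] private float followSpeed = 5f;

    private void LateUpdate()
    {
        // Target position directly along the camera's forward vector.
        Vector3 target = hmdCamera.position + hmdCamera.forward * distance;

        // Smooth the motion slightly so the panel lags the head a little,
        // which tends to feel less "glued" to the face.
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);

        // Face away from the camera so world-space UI text reads correctly.
        transform.rotation = Quaternion.LookRotation(transform.position - hmdCamera.position);
    }
}
```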

Diegetic UI

This model of user interface holistically embeds all of the information typically represented in a HUD into the game’s 3D space. For example, instead of a mini-map in the corner of the screen, the avatar would pull out and look at a map that exists within the game world. The user interface is thus part of the game’s narrative and exists within the game space. From a player perspective, the Dead Space video game franchise is generally regarded as having implemented one of the best diegetic UIs to date.

Pros/Cons

The advantage of this style is the belief that it increases the realism of the gaming experience and thereby results in deeper immersion. The drawback is that it requires developers to find ingenious ways of representing typical information such as health, inventory items, and so on. These, in turn, must be intuitive and effective; otherwise, they will frustrate the user and cause a loss of immersion.

Use in Virtual Reality With Unity

In many ways, the goal of virtual reality is to provide a level of engagement and immersion that mimics real life. With this in mind, diegesis seems like the logical, and even necessary, method of crafting user interfaces. The logic goes: if real life has no menus or speech bubbles, shouldn’t virtual life do without them too? To that end, there are several ways to create more diegetic experiences in Unity. One is to use a Raycast to initiate interaction. Imagine, for example, that in an RPG the user wishes to interact with an NPC. Instead of clicking through a menu, the user could simply look at the NPC for an appropriate amount of time, mirroring how we use eye contact in real life to start a conversation.
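
Below is a minimal sketch of that gaze-based pattern, assuming a dwell-time approach: a ray is cast from the HMD camera each frame and, if it rests on an object tagged "NPC" for long enough, a conversation is triggered. The tag, dwell time, and StartConversation hook are placeholders for whatever your project actually uses.

```csharp
using UnityEngine;

// Minimal sketch of gaze-initiated interaction: cast a ray from the HMD camera
// every frame and, if it rests on an NPC long enough, trigger a conversation.
public class GazeInteraction : MonoBehaviour
{
    [SerializeField] private Camera hmdCamera;
    [SerializeField] private float dwellTime = 2f;     // seconds of sustained "eye contact"
    [SerializeField] private float maxDistance = 10f;

    private GameObject currentTarget;
    private float gazeTimer;

    private void Update()
    {
        Ray gaze = new Ray(hmdCamera.transform.position, hmdCamera.transform.forward);

        if (Physics.Raycast(gaze, out RaycastHit hit, maxDistance) && hit.collider.CompareTag("NPC"))
        {
            if (hit.collider.gameObject == currentTarget)
            {
                gazeTimer += Time.deltaTime;
                if (gazeTimer >= dwellTime)
                {
                    // Hypothetical hook: replace with your own dialogue trigger.
                    hit.collider.SendMessage("StartConversation", SendMessageOptions.DontRequireReceiver);
                    gazeTimer = 0f; // re-arms after another full dwell period
                }
            }
            else
            {
                // Gaze moved to a different NPC; restart the dwell timer.
                currentTarget = hit.collider.gameObject;
                gazeTimer = 0f;
            }
        }
        else
        {
            currentTarget = null;
            gazeTimer = 0f;
        }
    }
}
```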

Spatial UI

A spatial UI lies halfway between the traditional diegetic and non-diegetic models by offering elements that exist within the 3D game space but are not part of the game’s narrative. Perhaps the simplest example is selecting a unit in a real-time strategy game: a circle or symbol appears around the unit to show that it has been selected. In a first-person shooter, a way-marker for an objective is another example of spatial UI. The way-marker exists in the game space, but if you were living inside your character’s head, you wouldn’t see it.

Pros/Cons

In many ways, the advantages and disadvantages of spatial UIs mirror those of diegetic models. The key upside is the clarity it provides: all the relevant information can be tagged to the relevant models. This is offset by the fear that the presence of meta-information could break the immersive dimension of the game.

Use in Virtual Reality With Unity

When it comes to virtual reality, spatial UI is the simplest and most effective option. When programming with Unity, this means selecting World Space as the render mode for the Canvas, which allows UI components to be placed anywhere in the game space. For the best results and the most comfortable experience, place text at a comfortable distance (3-5 meters) from the user and make sure it is clear, large, and readable.
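
As a minimal sketch, the script below switches an existing Canvas to World Space and positions it about 4 meters in front of the player. The size and scale values are assumptions and should be tuned per scene; the same setup can also be done entirely in the Inspector.

```csharp
using UnityEngine;

// Minimal sketch: configure an existing Canvas for World Space rendering and
// place it a few meters in front of the player. Values are illustrative.
public class WorldSpacePanel : MonoBehaviour
{
    [SerializeField] private Canvas canvas;
    [SerializeField] private Transform player;
    [SerializeField] private float distance = 4f;   // within the comfortable 3-5 m band

    private void Start()
    {
        canvas.renderMode = RenderMode.WorldSpace;

        // A World Space canvas is sized in world units, so scale it down while
        // keeping the text large enough to stay readable at a distance.
        RectTransform rect = canvas.GetComponent<RectTransform>();
        rect.sizeDelta = new Vector2(800f, 600f);
        rect.localScale = Vector3.one * 0.005f;      // 800 px wide -> 4 m wide

        // Place the panel in front of the player, facing away so text reads correctly.
        canvas.transform.position = player.position + player.forward * distance;
        canvas.transform.rotation = Quaternion.LookRotation(canvas.transform.position - player.position);
    }
}
```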

To reduce clutter on the screen and keep immersion high, it is often advisable not to permanently tag UI information to a model; it can appear unrealistic and unnecessary. Instead, allow notifications and status updates to flow in and out of the game as organically as possible. For example, rather than always having a health bar floating above a character’s head, have an aura appear around the character or have a health bar flash briefly in the game space nearby. Unity also allows arrows to be implemented to help direct users if they are looking in the wrong direction. The easiest way to add this is GUIArrows, and the Show Angle setting lets you customize which direction should be prioritized.
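
The transient health feedback described above could be sketched roughly as follows, assuming a world-space health bar driven by a CanvasGroup. The component names, timing values, and OnHealthChanged hook are hypothetical and would be wired to your own damage events.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch of "organic" spatial feedback: a world-space health bar that
// appears only for a few seconds after the character's health changes,
// instead of floating above the model permanently.
public class TransientHealthBar : MonoBehaviour
{
    [SerializeField] private CanvasGroup group;     // on the world-space health bar canvas
    [SerializeField] private Slider healthSlider;
    [SerializeField] private float visibleSeconds = 3f;
    [SerializeField] private float fadeSpeed = 2f;

    private float hideAt;

    // Call this from your own damage/heal events (hypothetical hook).
    public void OnHealthChanged(float normalizedHealth)
    {
        healthSlider.value = normalizedHealth;
        group.alpha = 1f;                           // pop in when something happens
        hideAt = Time.time + visibleSeconds;
    }

    private void Update()
    {
        // Fade back out once the bar has been visible long enough.
        if (Time.time > hideAt && group.alpha > 0f)
        {
            group.alpha = Mathf.MoveTowards(group.alpha, 0f, fadeSpeed * Time.deltaTime);
        }
    }
}
```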

A spatial user interface that is subtle but clear is overwhelmingly the simplest and most effective model. If done tastefully, it provides the necessary instruction without shattering the user’s sense of immersion.

Conclusion

The key consideration, whether choosing non-diegetic, diegetic, or spatial components, is to strike a balance between immersion and usability. The greatest strength of virtual reality is that its 360° of 3D space naturally induces a degree of engagement that far surpasses even the most advanced screen-based solutions. The fear for some developers is that this immersion could be broken by clunky interfaces that divorce the user from the actual experience. With this in mind, it’s important to remember that many games featuring non-diegetic or spatial features still boast impressive levels of immersion. MMOs that allow highly customizable HUDs immediately come to mind: they may clutter the screen, but they also let the user feel at home in the experience, which in turn induces immersion.

In short, in our experience at Program-Ace, when designing an interface for virtual reality, pay careful attention to keeping the experience intuitive and comfortable while also, at every opportunity, embedding components into the game space and game narrative.
