Wednesday, November 12, 2014

User Interfaces

If you've kept up with the Twitter feed and the last few update videos, you're probably aware that I'm working on a menu system for use in VR. Right now it looks like it will take the form of a freely distributed prefab that you can use in Unity Free. In the process of making it, I've been thinking quite a bit about menus as a whole and about user interface elements in general: text blocks, data bars, scrolling pages, buttons, icons, the whole GUI package. I'm working from the question of how these things will function for a user in virtual reality, and the ways they'd be affected by VR's influence. I've found a few things out simply through my experiences with Virtual Reality demos, but my time trying to design one has brought out questions I hadn't been asking, so noting them down here for documentation, reference, and examination seems beneficial.

The first thing I set out to do with the menu was make it a 3D world object. I made two planes: one small to serve as the GUI base and one large to serve as a testing area. Problematically, I don't have anything at hand that I think would make a good standard VR interfacing item (by my standards at least), so I worked out a temporary solution: an elongated cube, scaled down, that I'm using as a "fake finger" for development. So as not to leave it floating in mid-air with nothing attached, I placed a simple capsule to represent the player's place in space. Probably would have been better to have done that first.  ;)
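The basic test behind this fake-finger setup is just "is the fingertip close enough to the menu plane to count as a touch?" Here's a minimal sketch of that check in plain Python (the plane normal, positions, and threshold are illustrative assumptions, not values from the project; in Unity this would normally be done with colliders instead):

```python
# Minimal sketch: treat the menu as an infinite plane and report "contact"
# when the fake-finger tip comes within a small distance of it.
# All names and numbers here are illustrative.

def signed_distance(point, plane_point, plane_normal):
    """Signed distance from a 3D point to a plane (normal assumed unit length)."""
    return sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))

def finger_touching(tip, plane_point, plane_normal, threshold=0.01):
    """True when the fingertip is within `threshold` world units of the plane."""
    return abs(signed_distance(tip, plane_point, plane_normal)) <= threshold
```

A fingertip hovering 5 mm above a plane with a 1 cm threshold would register as touching, while one half a meter away would not.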

Going forward, I moved on to the menu object itself, but found that the UI system in Unity 4.5 was rather lackluster, which worried me since I was banking on the newer GUI system coming in Unity 4.6. Thankfully, Oculus fixed their integration with the platform, so I'll be able to work in it more easily, though there are still some things that could use tweaking. For the time being things seem fine, but I'm conflicted on the methodology I want to use for the menu. My initial intent was to make it a form of touch screen, using the canvas surface as a scale map of the UI overlay so that the player interacts with menu items by touching the corresponding points on that surface. However, more and more I'm starting to think that individual, addable modules acting as a pseudo-touch screen, reacting to the user's movements after making contact rather than to the contact itself, may make more sense. It kind of feels like I'm lazing out, so I'll be looking deeper into methods of adding more in-depth controls, but for now I'm trying to figure out what my ideal workflow would be.
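To make the "scale map" approach concrete: the core of it is converting a world-space contact point on the canvas plane into pixel coordinates on the 2D UI overlay. A rough sketch of that mapping, in plain Python (the plane vectors, plane size, and overlay resolution below are made-up values for illustration; Unity's own raycasting and `RectTransform` utilities would do this work in practice):

```python
# Illustrative sketch of the "scale map" idea: project a 3D contact point
# on the menu plane into pixel coordinates on the 2D UI overlay.
# All vectors and sizes here are assumptions, not values from the project.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def contact_to_ui(point, plane_origin, right, up,
                  plane_w, plane_h, ui_w, ui_h):
    """Map a world-space contact point to (x, y) pixels on the overlay.

    `plane_origin` is the lower-left corner of the menu plane; `right` and
    `up` are unit vectors spanning it; `plane_w`/`plane_h` are its
    world-space dimensions; `ui_w`/`ui_h` are the overlay's pixel size.
    """
    rel = [p - o for p, o in zip(point, plane_origin)]
    u = dot(rel, right) / plane_w   # normalized 0..1 across the plane
    v = dot(rel, up) / plane_h      # normalized 0..1 up the plane
    return (u * ui_w, v * ui_h)
```

For example, a touch at the center of a 1 m by 0.5 m plane mapped to an 800 by 400 pixel overlay lands at (400, 200). The pseudo-touch alternative would instead track how `contact_to_ui` results change between frames after first contact, reacting to the drag rather than the initial touch.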

Outside of the actual nitty-gritty, I've been thinking a lot more about "organic" user interfaces for VR. I really enjoy it when a game can create an unobtrusive way to convey important information to the user. Large text boxes, diagrams, and other such methods are inherently unintuitive, so making learning an easy part of the experience will definitely play a factor in how I design things later on, and may ironically make these efforts kind of obsolete. However, I think there is a place for indulging the desire for fancy sci-fi HUDs and holograms, and there are quite simply some things that a 2D plane makes easier than a 3D one.

Hopefully I can get things down and have something more decent to show soon. I think I should have a good basis before the end of the week, so stay tuned for updates.
