myhomevr

A little Unity VR scene representing my real house, made from scratch. It can be run either with Google Cardboard (GoogleVR is already included as part of the project) or with a SteamVR-compatible headset such as the HTC Vive, Oculus Rift, or Windows Mixed Reality headsets (the SteamVR Unity plugin is already included, but SteamVR must be installed on the system). The scene "LivingRoom 3 DoF" can be run with any headset, while the scene "LivingRoom 6 DoF" can only be run with an OpenVR-enabled headset, as it requires motion controllers and 6 DoF tracking.

The vast majority of the assets were taken from the free Simple Home Stuff package, plus a few additional free doors pulled from TurboSquid. Beyond those, there are a few objects I made myself using simple Unity primitives (mainly cubes and cylinders), such as the round table at the left of the room, the moving ceiling fan hanging from the center of the living room, and the general house layout (walls, floor, etc.).

The included virtual character is the one from the MORPH3D Male package, pulled directly from the Unity Asset Store. The different animations were taken from the following Unity Asset Store packages: Raw Mocap for Mecanim, the 3D-BROTHERS Action Pack, and the Animations for RPG Human package. Blinks, mouth movements, and other facial movements were animated manually using the character's blend shapes.

home_1

home_2

A bit more information

  • Animated Virtual Character: the included virtual character is interactive; you can talk with him and answer his questions by nodding or shaking your head. The character is programmed to be rather expressive, including hand and body gestures while talking, controlled gazing depending on whether he is listening or talking, and facial expressions such as blinking and speech mimicking.
  • Materials and Textures: the floor has a material with an associated parquet texture and the corresponding normal map, so as to achieve some roughness in the parquet's wood. The walls, the ceiling, and the objects from the Simple Home Stuff package also have their own materials. All of the materials use the Standard shader to render the objects, though.
  • Scale: I've tried the scale both in Google Cardboard and the HTC Vive, and the room and all the objects seem to be pretty much life-size. Furthermore, to define the camera position to be used in the mobile VR headset, I used the HTC Vive tracking as a helper, hardcoding a real-world camera position corresponding to my height (with the headset donned).
  • Lighting: I added a few point lights: two on the room's ceiling, another on the floor lamp near the two-folded door, one in the main corridor, and one in each of the rooms. Note that some of them are not well perceived when using mobile VR, due to the graphics optimizations Unity applies to run efficiently there.
  • Animation: I animated the ceiling fan in the center of the room. I initially guessed it wouldn't be difficult to animate, as it was just a matter of adding a cylindrical base to which all the blades are attached; the animation would then be achieved by rotating just the cylindrical base (since the blades are child objects of the base, they rotate along with it). However, I couldn't achieve a smooth animation using Unity's Animation System: when the animation was configured to loop, there was a bit of jumpiness between the last keyframe and the first one. The alternative solution was to animate it through a custom script which simply rotates the fan base by a few degrees on each frame, indefinitely (see the sketch after this list).
  • Audio: I've added a sound effect for the spinning ceiling fan, so you can hear it spinning above your head. It's basically a mono sound attached to the fan itself, so Unity applies the HRTF function to spatialize it. Additionally, I added some ambient sound coming from the TV when it's powered on. For this one, I used a VideoClip so I could texture a plane placed over the TV display with a video. Doing so should have also played the video's audio, but that didn't work for me, so I had to download the video's audio track separately and import it into Unity as an audio file. I then added the audio as a new AudioClip that plays together with the video (a sketch of this setup also follows this list). I've also added some sound effects for navigation and for interacting with doors and other elements. Note: there is a bug in the VideoPlayer that causes a spike in frame time (around 400 ms) when it's stopped: https://issuetracker.unity3d.com/issues/videoplayer-dot-stop-causes-a-performance-spike
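
As a reference, here is a minimal sketch of what the fan rotation script could look like (the class name and speed value are illustrative, not the exact ones used in the project):

```csharp
using UnityEngine;

// Attach to the fan's cylindrical base: the blades are child objects,
// so rotating the base rotates the whole fan.
public class FanRotator : MonoBehaviour
{
    public float degreesPerSecond = 180f; // illustrative speed
    public bool spinning = true;

    void Update()
    {
        if (spinning)
        {
            // Frame-rate independent rotation around the local up axis,
            // applied every frame so the loop never shows a seam.
            transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
        }
    }
}
```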
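
Similarly, a minimal sketch of the TV audio workaround, assuming a VideoPlayer texturing the plane while the separately imported AudioClip is played in parallel through an AudioSource (component wiring and names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Attach to the plane placed over the TV display. The video's audio
// track is imported separately as an AudioClip and played alongside.
public class TvScreen : MonoBehaviour
{
    public VideoPlayer videoPlayer;  // renders the VideoClip onto the plane
    public AudioSource audioSource;  // plays the separately imported audio track

    public void TogglePower()
    {
        if (videoPlayer.isPlaying)
        {
            videoPlayer.Stop();  // note the frame-time spike bug linked above
            audioSource.Stop();
        }
        else
        {
            videoPlayer.Play();
            audioSource.Play();
        }
    }
}
```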

User Interaction

This section describes all the features related to user interaction and navigation. I implemented my own navigation and interaction library based on the VR Samples project. The included interaction library supports three different types of interaction, all of which can be configured in the Selection Radial element that is part of the Main Camera's components (a minimal sketch of the radial fill logic appears after this section):

  1. Fully gazed: this type of interaction requires the user to look in the direction of an interactable object (a selection radial appears whenever an interactable object is gazed at), and the selection radial automatically starts filling. When the radial completes, the corresponding action is triggered. This type of interaction is useful for mobile headsets in which the only input is head rotation, such as Google Cardboard. Note: it can also be used with high-end headsets.
  2. Gaze plus 2D UI with selection radial: this type of interaction requires the user to look in the direction of an interactable object (a selection radial appears whenever an interactable object is gazed at) and then click the fire button, at which point the selection radial starts filling. When the radial completes, the corresponding action is triggered. This type of interaction is useful for mobile headsets which include some form of 2D input, such as a button, a touchpad, or a trigger (for example, Samsung Gear VR). Note: it can also be used with high-end headsets.
  3. Gaze plus 2D UI without selection radial: this type of interaction requires the user to look in the direction of an interactable object (a selection radial appears whenever an interactable object is gazed at) and then click or double-click the fire button, at which point the action is triggered. In this case the selection radial does not need to be filled, and the interaction triggers as soon as the click is detected. This type of interaction is useful for mobile headsets which include some form of 2D input, such as a button, a touchpad, or a trigger (for example, Samsung Gear VR). Note: it can also be used with high-end headsets.

Using any of the methods above, a UI tooltip is shown indicating the action the user can trigger when selecting the corresponding object. All the navigation and interaction features are detailed in the following subsections.

When using a high-end VR system with motion controllers (scene "LivingRoom 6 DoF"), an additional interaction method is available, based entirely on natural interaction: the user can simply grab objects with his virtual hands.
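
As a reference, here is the core of the gaze-selection logic in a simplified form (timings and names are illustrative; the actual implementation is based on the VR Samples project):

```csharp
using System;
using UnityEngine;

// Simplified selection radial: while an interactable object is gazed,
// the radial fills up; when full, the selection event fires.
public class SelectionRadialSketch : MonoBehaviour
{
    public float selectionDuration = 2f;  // illustrative fill time
    public event Action OnSelectionComplete;

    float timer;
    bool gazing;

    public void StartGaze() { gazing = true;  timer = 0f; }
    public void StopGaze()  { gazing = false; timer = 0f; }

    void Update()
    {
        if (!gazing) return;
        timer += Time.deltaTime;
        if (timer >= selectionDuration)
        {
            OnSelectionComplete?.Invoke(); // trigger the gazed object's action
            StopGaze();
        }
    }
}
```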

Navigation

3 DOF

For the Cardboard version (scene "LivingRoom 3 DoF"), navigation is implemented using little platforms placed around the scene: after selecting one of them by gazing and waiting for the selection radial to fill, the user immediately moves to that location in the blink of an eye (with a little fade out/in to avoid discomfort; a minimal sketch of this fade-and-teleport step follows below). A moving arrow also appears as the user gazes at a navigation platform, indicating that he can move there. The locations the user can move to are pre-defined, with one or two navigation platforms in each room of the house, so the user is able to move around the whole house by jumping between these predefined spots.
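
A minimal sketch of the fade-and-teleport step, assuming the fade is driven by a simple full-screen overlay (the CanvasGroup overlay and all names here are illustrative):

```csharp
using System.Collections;
using UnityEngine;

// Teleports the camera rig to a navigation platform, fading the
// screen out and back in to avoid discomfort.
public class GazeTeleporter : MonoBehaviour
{
    public Transform cameraRig;      // root object moved on teleport
    public CanvasGroup fadeOverlay;  // full-screen black overlay (illustrative)
    public float fadeDuration = 0.3f;

    public void TeleportTo(Vector3 platformPosition)
    {
        StartCoroutine(FadeAndTeleport(platformPosition));
    }

    IEnumerator FadeAndTeleport(Vector3 target)
    {
        yield return Fade(0f, 1f);    // fade out
        cameraRig.position = target;  // instant move while the screen is black
        yield return Fade(1f, 0f);    // fade back in
    }

    IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            fadeOverlay.alpha = Mathf.Lerp(from, to, t / fadeDuration);
            yield return null;
        }
        fadeOverlay.alpha = to;
    }
}
```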

home_2

6 DOF

For the high-end headsets version (scene "LivingRoom 6 DoF"), navigation is implemented with different types of teleportation. The type of teleportation to be used can be selected through a menu that is triggered by pressing the grip buttons on the controllers. A little green sphere is shown to the left of each method, indicating whether the option is active or not.

home_2

Teleport Points: this option means there are several predefined spots the user can move between, much like the 3 DoF case. Pressing the touchpad on a motion controller shows an arc starting from the controller's position. To teleport to one of the predefined spots, just move the arc over it and release the button.

home_2

Free Teleporting: with this approach the user has more freedom of movement, as he can move anywhere in the scene without restrictions or pre-defined spots (except for unreachable places such as spots behind walls, doors, etc.). The way to teleport is very similar to the previous one: using the Vive controllers, the user presses the touchpad to point to the position he wants to move to, and releases it to trigger the teleportation.

home_2

Hybrid: both methods can be enabled at the same time (a minimal sketch of the teleport-mode handling follows below).
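
A simplified sketch of how the three teleport modes could be handled. This is not the SteamVR plugin API, just an illustration of the mode logic; the "TeleportPoint" and "Floor" tags are hypothetical:

```csharp
using UnityEngine;

// Illustrative teleport-mode switch: predefined points only, free, or hybrid.
public enum TeleportMode { Points, Free, Hybrid }

public class TeleportModeSketch : MonoBehaviour
{
    public TeleportMode mode = TeleportMode.Hybrid;

    // Called when the touchpad is released with a raycast/arc hit.
    public bool IsValidTarget(RaycastHit hit)
    {
        bool onPoint = hit.collider.CompareTag("TeleportPoint"); // predefined spot
        bool onFloor = hit.collider.CompareTag("Floor");          // free-teleport area

        switch (mode)
        {
            case TeleportMode.Points: return onPoint;
            case TeleportMode.Free:   return onFloor;
            default:                  return onPoint || onFloor;  // Hybrid
        }
    }
}
```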

Interaction - Objects

3 DOF

There is a subset of objects in the scene which can be selected/manipulated (mainly objects on tables, the TV, the ceiling fan, etc., but not furniture, walls, and other objects that are less natural to interact with). For the 3 DOF version, selection and manipulation are fully gaze-based: you look at an object and, depending on the interaction method configured, the selection radial starts filling up and a little UI tooltip is shown indicating the action to be triggered. If the radial gets completely filled, the object is selected and, depending on the object, a different action is triggered (if it's a "grabbable" object, it is shown near the user for a few seconds; if it's an interactable-only object, it triggers the associated action, for example turning the TV on/off). The interaction is kind of magical, as the user is selecting objects from a distance using just his gaze.

The following interactions are supported so far (a minimal sketch of the toggle logic follows the list):

  1. When selecting the ceiling fan: if it's turned off, it will start spinning; if it's powered on, it will be turned off.
  2. When selecting the TV: if it's turned off, it will be turned on and start showing a pre-loaded video; if it's powered on, it will be turned off.
  3. When selecting any of the doors, it will be opened or closed accordingly.
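
A minimal sketch of the toggle logic behind these interactions, using a hypothetical Toggleable base that the fan, the TV, and the doors could implement (names are illustrative):

```csharp
using UnityEngine;

// Hypothetical base class: selecting an object toggles its state.
public abstract class Toggleable : MonoBehaviour
{
    protected bool isOn;

    // Called by the selection radial when the fill completes.
    public void OnSelected()
    {
        isOn = !isOn;
        if (isOn) TurnOn(); else TurnOff();
    }

    protected abstract void TurnOn();
    protected abstract void TurnOff();
}

// Example: the ceiling fan starts/stops spinning when toggled,
// reusing the FanRotator sketched earlier.
public class FanToggle : Toggleable
{
    public FanRotator rotator;

    protected override void TurnOn()  { rotator.spinning = true;  }
    protected override void TurnOff() { rotator.spinning = false; }
}
```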

6 DOF

In this case, selection and manipulation are based on the use of motion controllers: to grab an object you just put your virtual hand near the object and press the trigger. The interactions here are fully natural, as the user interacts with the different objects in the same way he would in the real world.

Along these lines, the following interactions are supported so far (a minimal grab-and-throw sketch follows the list):

  1. Books, lamps and other small objects in the scene can be grabbed with the virtual hand by pressing the trigger. When the trigger is released the object grabbed will be released (you can throw it if released while moving the hand with some velocity).
  2. A useful object is the Universal Remote that can be found on the short table. After grabbing it, you can point at different objects and press the touchpad button to interact with them. For example, the TV can be powered on/off, the fan can be powered on/off, and even some of the doors can be opened/closed using it. Now that's a universal remote! Check the video below these lines.
  3. Doors can be opened/closed by grabbing the handle with the controller's trigger and moving the hand back or forth, as you would do to open/close a real door.
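
A simplified sketch of the grab-and-throw behavior. This is not the actual SteamVR interaction code; it just illustrates the usual "track the hand, apply its velocity on release" approach:

```csharp
using UnityEngine;

// Simplified grab logic for one virtual hand: hold an object while
// the trigger is pressed, and throw it with the hand's velocity.
public class HandGrabber : MonoBehaviour
{
    Rigidbody held;
    Vector3 lastPosition;
    Vector3 handVelocity;

    void Update()
    {
        // Estimate hand velocity from frame-to-frame movement.
        handVelocity = (transform.position - lastPosition) / Time.deltaTime;
        lastPosition = transform.position;
    }

    public void OnTriggerPressed(Rigidbody nearbyObject)
    {
        held = nearbyObject;
        held.isKinematic = true;              // follow the hand while held
        held.transform.SetParent(transform);
    }

    public void OnTriggerReleased()
    {
        if (held == null) return;
        held.transform.SetParent(null);
        held.isKinematic = false;
        held.velocity = handVelocity;         // throw with the hand's velocity
        held = null;
    }
}
```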

Interaction 6DOF

Click on the image to open video (will open in the same tab by default)

Interaction - Virtual characters

The interaction with virtual characters is implemented in both the 3 DOF and 6 DOF versions of the application. There is an interactive virtual character in the house's living room, initially playing an idle animation. Once you start talking he will start listening, and as soon as you stop talking he will ask you some questions.

During the whole interaction, the character makes human-like body and facial expressions such as blinking and eye/head gazing, as well as moving his lips while talking (though not synced with the speech). Although fully simulating social human-like behavior is a very complex task, as we humans are very sensitive to some expressions and even process many of them subconsciously, these social expressions make the virtual character behave somewhat similarly to what we would expect from a real human.

virtual character

To answer the questions you can nod (YES) or shake (NO) your head (i.e., quickly move your head up and down, or left and right, respectively; a minimal sketch of this detection is shown below). Check the following demo video to get an idea of how the interaction is carried out, as well as the multiple social expressions performed by the virtual character. The mic input isn't heard in the video, just the virtual character's answers, but the overall interaction can be clearly seen anyway.
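
A minimal sketch of how nod/shake detection can work from head rotation alone (the thresholds and decay rate are illustrative; the actual detection in the project may differ):

```csharp
using UnityEngine;

// Detects a nod (pitch oscillation) or a shake (yaw oscillation) by
// accumulating head rotation deltas; a decay keeps slow movements out.
public class HeadGestureSketch : MonoBehaviour
{
    public Transform head;            // the VR camera
    public float triggerAngle = 25f;  // illustrative threshold (degrees)
    public float decayPerSecond = 20f;

    float pitchAccum, yawAccum;
    Quaternion lastRotation;

    void Start() { lastRotation = head.rotation; }

    void Update()
    {
        // Per-frame angular change around the pitch (X) and yaw (Y) axes.
        Quaternion delta = Quaternion.Inverse(lastRotation) * head.rotation;
        Vector3 e = delta.eulerAngles;
        pitchAccum += Mathf.Abs(Mathf.DeltaAngle(0f, e.x));
        yawAccum   += Mathf.Abs(Mathf.DeltaAngle(0f, e.y));
        lastRotation = head.rotation;

        // Decay so only quick, repeated movements reach the threshold.
        pitchAccum = Mathf.MoveTowards(pitchAccum, 0f, decayPerSecond * Time.deltaTime);
        yawAccum   = Mathf.MoveTowards(yawAccum,   0f, decayPerSecond * Time.deltaTime);

        if (pitchAccum > triggerAngle && pitchAccum > yawAccum) Answer(true);
        else if (yawAccum > triggerAngle) Answer(false);
    }

    void Answer(bool yes)
    {
        pitchAccum = yawAccum = 0f;
        Debug.Log(yes ? "YES (nod)" : "NO (shake)");
    }
}
```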

3DOF

Click on the image to open video (will open in the same tab by default)

How to run it

Run from the Unity Editor

The project can be run directly from the Editor, as it includes a basic mouse-based camera movement script. Hit Play in the Editor and it should be up and running in a few seconds. Just don't forget to put the focus on the Game tab so the mouse cursor is correctly taken as input. It's also recommended to switch the build platform to PC when running from the Editor so it runs with the highest graphics quality (by default the Android build platform is selected).

Build and run with Google Cardboard

  1. Check Android is selected as Build Platform in the Build Settings (File > Build Settings).
  2. Check Google Cardboard is selected in the Virtual Reality Supported list in the Rendering section of Player Settings (Edit > Project Settings > Other Settings > Rendering).
  3. Initially the camera position had to be configured manually, but the application now sets it automatically when it detects it's running on an Android device. This position was obtained by checking the camera transform when using the HTC Vive headset, so it's based on real data. Obviously, the best position depends on the user's height, so ideally the app should ask for it in the mobile VR case, or at least let the user adjust the initial position somehow (at least on the Y axis).
  4. Check that the interaction method to be used is Gaze-based. This can be selected in the Selection Radial script attached to the Main Camera component.
  5. Build and Run.

Build and run with a SteamVR-compatible HMD

  1. Check PC is selected as Build Platform in the Build Settings (File > Build Settings).
  2. Check OpenVR is selected in the Virtual Reality Supported list in the Rendering section of Player Settings (Edit > Project Settings > Other Settings > Rendering).
  3. When building the 3 DOF scene (LivingRoom 3 DoF), check that the interaction method is the desired one (fully gaze-based, gaze-based plus 2D UI with selection radial, or gaze-based plus 2D UI without selection radial). This can be selected in the Selection Radial script attached to the Main Camera. If building the 6 DOF scene, there is nothing to check.
  4. Build and Run. In either case, SteamVR must be running for the application to work.

Demo videos

3 DOF

3DOF

Click on the image to open video (will open in the same tab by default)

6 DOF

6DOF

Click on the image to open video (will open in the same tab by default)
