vivesoftware / viveinpututility-unity

A toolkit that helps with developing and prototyping VR apps.

Home Page: http://u3d.as/uF7

License: Other

C# 99.96% ShaderLab 0.03% Batchfile 0.01%
vive unity vr virtual-reality oculus unity-plugin steamvr steamvr-plugin daydream htc-vive

viveinpututility-unity's Introduction

VIVE Input Utility for Unity

Copyright 2016-2024, HTC Corporation. All rights reserved.

About

The VIVE Input Utility (VIU) is a toolkit for developing VR experiences in Unity, especially with the VIVE/VIVE Pro, but also targeting many platforms from a common code base, including Oculus Rift, Rift S, Go, Quest, Google Daydream, the VIVE Wave SDK (e.g. the standalone VIVE Focus) and additional VR platforms supported by Unity, such as Microsoft's 'Mixed Reality' VR headsets. For the latest release notes, click the (releases) link.

How Do I Use This?

Read this step-by-step guide to using the Vive Input Utility for Unity. Also see [https://developer.vive.com].

Requirements

  • Unity 5.3.6 or newer

Features

  • API to access device input/tracking by role (e.g. LeftHand/RightHand) instead of by device index (see the sketch after this list)
  • Binding system that binds devices to specific roles, helping manage multiple tracking devices
  • UI Pointer (EventSystem compatible)
  • Teleport
  • Object Grabbing/Throwing example
  • Role Binding
  • OpenXR support
  • Cross-platform hand tracking
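
Below is a minimal sketch of the role-based API, using only calls that appear elsewhere on this page (ViveInput.GetPressDown, VivePose.GetPose); treat the pose field names as assumptions to verify against your VIU version.

using HTC.UnityPlugin.Vive;
using UnityEngine;

public class RoleInputExample : MonoBehaviour
{
    void Update()
    {
        // Query input by role rather than by device index.
        if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
        {
            // Read the tracked pose of the same role (field names assumed: pos/rot).
            var pose = VivePose.GetPose(HandRole.RightHand);
            Debug.Log("Right trigger pressed at " + pose.pos);
        }
    }
}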

Supported devices

  • Simulator (Usage)
  • VIVE, VIVE Pro, VIVE Cosmos (any OpenVR compatible device)
  • Oculus Rift & Touch, Rift S, Go, Quest
  • Daydream
  • VIVE Focus, Focus+ (any WaveVR compatible device)

(How to switch device)

Contact Us

Email: [email protected]

Repository: GitHub

Forum: [https://forum.vive.com/forum/72-vive-input-utility/]

Contributing guidelines:

  1. Read and agree to the CLA.
  2. Create an issue for the work you want to contribute.
  3. Fork this project on GitHub.
  4. Create a new branch (based on the master branch) for your work on your fork.
  5. After you finish your work, submit a pull request from your new branch to our develop branch.

viveinpututility-unity's People

Contributors

baratgabor, borishsu, chengnay, dariol, faymont, hardcorepink, jeffbail, kfarr, lawwong, mhama, nagachiang, nearo, shang5150, unoctium1


viveinpututility-unity's Issues

How to deal with known vibration on trackers.

I'm working on building a controller that will experience known heavy vibration, and want to turn off reading input from the tracker for a fixed amount of time (in ms) so the tracker doesn't jump all over the place. A few months ago when I started looking into Vive Input Utility someone told me that function already existed in this library, but I've been unable to locate it.

Does it exist, and what is its name? Thanks.
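
If no such function exists, a minimal workaround sketch along these lines might do (a hypothetical component; ViveRoleProperty, TrackerRole and VivePose.GetPose are taken from this page, the rest is assumption):

using HTC.UnityPlugin.Vive;
using UnityEngine;

// Hypothetical helper: holds the last applied pose for a fixed duration when Freeze() is called.
public class TrackerFreeze : MonoBehaviour
{
    public ViveRoleProperty viveRole = ViveRoleProperty.New(TrackerRole.Tracker1);
    public float freezeSeconds = 0.2f; // how long to ignore tracker input

    private float frozenUntil;

    public void Freeze() { frozenUntil = Time.time + freezeSeconds; }

    void Update()
    {
        if (Time.time < frozenUntil) { return; } // keep the last applied pose

        // Field names pos/rot are assumed from the pose struct used elsewhere in this plugin.
        var pose = VivePose.GetPose(viveRole);
        transform.localPosition = pose.pos;
        transform.localRotation = pose.rot;
    }
}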

Vibrating the controller has no effect

Version of Unity3D : 5.5.2f1 & 2017.1.1f1
Version of ViveInputUtility : 1.7.3

Question :
I want to trigger controller vibration using the API "ViveInput.TriggerHapticPulse", but there is no effect.
I read VRModuleManager and checked that "m_activatedModuleBase" is UnityEngineVRModule; the method "TriggerViveControllerHaptic" is virtual and has no implementation in UnityEngineVRModule.
So I'm reporting this situation; or is there something wrong with my project settings?
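
For reference, a minimal call sketch using the API named above; the two-argument overload also appears in a later issue's test script, and I'm assuming the second argument is the pulse length in microseconds.

using HTC.UnityPlugin.Vive;
using UnityEngine;

public class HapticExample : MonoBehaviour
{
    void Update()
    {
        if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
        {
            // Second argument is the pulse length (assumed to be in microseconds).
            ViveInput.TriggerHapticPulse(HandRole.RightHand, 2000);
        }
    }
}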

OculusVRModule.cs: `Pose' is an ambiguous reference (Unity 2017.2.0b8)

When using Unity 2017.2.0b8 I get the following error:

Assets/HTC.UnityPlugin/VRModule/Modules/OculusVRModule.cs(107,24): error CS0104: `Pose' is an ambiguous reference between `HTC.UnityPlugin.PoseTracker.Pose' and `UnityEngine.Pose'

To correct it I changed Pose into PoseTracker.Pose.
Then I got an error that VRSettings can't be found; I changed it into XRSettings.
Then more Pose errors again, corrected again using PoseTracker.Pose.
Then I got an error that VRDevice can't be found; I changed it into XRDevice.
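
One way to resolve the ambiguity without qualifying every occurrence is a using alias at the top of the affected file (a standard C# fix, not specific to this plugin):

// At the top of OculusVRModule.cs (or any file hitting CS0104):
using Pose = HTC.UnityPlugin.PoseTracker.Pose;
// Every unqualified "Pose" in this file now refers to the plugin's type,
// so it no longer collides with UnityEngine.Pose.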

ViveRole -> BodyRole -> Feet

Hello!

I am trying to use the left foot and the right foot with Vive Trackers, but they are never assigned as feet.
I use the BodyRole on them and assign the left and right foot on their GameObjects.

Is this feature operational, or did I forget something?

Thank you in advance.
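
For anyone debugging the same thing, a minimal check sketch using only types mentioned on this page (ViveRoleProperty, BodyRole, VivePose); whether the binding UI actually assigns the foot roles is the open question here.

using HTC.UnityPlugin.Vive;
using UnityEngine;

public class FootDebug : MonoBehaviour
{
    public ViveRoleProperty leftFoot = ViveRoleProperty.New(BodyRole.LeftFoot);

    void Update()
    {
        var pose = VivePose.GetPose(leftFoot);
        Debug.Log("LeftFoot pose: " + pose.pos); // stays at the origin if the role is never assigned
    }
}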

How can I change FOV

How do I change the value of the FOV?
I have tried changing the zoomFactor value, but that only zooms in and out relative to the default; it does not increase the FOV beyond the default.
I want to change the FOV freely.
Please help me. Thank you.

Method OnPointerClick only triggers when pad is being held down

For our project we have the basic teleport script added and working based on the pad button. We also wrote a simple OnPointerClick script that just sends a message to the console, attached to a mesh.

using UnityEngine;
using UnityEngine.EventSystems;
using HTC.UnityPlugin.Vive;

public class Marker : MonoBehaviour, IPointerClickHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        if (eventData.IsViveButton(ControllerButton.Trigger))
        {
            Debug.Log("worked");
        }
    }
}

The console only logs 'worked' when I'm holding the pad down as well as the trigger. It doesn't fire on the trigger alone, as it should.

How to send a button press event from a non-Vive controller.

I have a Vive controller mounted to a wireless gamepad and need to trigger the teleport function from the gamepad: tying button press X to start, and button release X to teleport. I've looked at Teleportable and ViveInputVirtualButton, but I'm not seeing how I should go about it. I've almost got my game updated to this utility for all my input; I wish I'd known about it months ago.
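
Lacking a built-in mapping, a minimal sketch that forwards a gamepad button to whatever teleport logic you expose as UnityEvents; the button choice and the event wiring are assumptions, not part of VIU.

using UnityEngine;
using UnityEngine.Events;

public class GamepadTeleportTrigger : MonoBehaviour
{
    public UnityEvent onTeleportStart;   // wire to your "start aiming" handler in the Inspector
    public UnityEvent onTeleportRelease; // wire to your "do the teleport" handler

    void Update()
    {
        // Hypothetical mapping: replace JoystickButton0 with the gamepad button you actually use.
        if (Input.GetKeyDown(KeyCode.JoystickButton0)) onTeleportStart.Invoke();
        if (Input.GetKeyUp(KeyCode.JoystickButton0)) onTeleportRelease.Invoke();
    }
}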

Unexpected results coming from IVRModuleDeviceState

I'm creating a function that disables ViveRoleSetter game objects that are not active, on successful bind, using OnIsValidChanged. All ViveRoleSetter game objects are children of the transform. I initially expected to be able to check deviceState.isConnected and, if not connected, disable the game object, but all game objects reported they were connected. I then tried to work around it by checking whether transform.position x, y, and z were zero; inspecting the game objects showed them at zero, but the debug output of their values showed incorrect values, not the zero values they had. It's not behaving at all like I would expect. Am I using it wrong?

    [ContextMenu("toggle children")]
    public void ToggleChildren() {
        foreach (Transform item in transform) {
            ViveRoleSetter VRS = item.GetComponent<ViveRoleSetter>();
            if (VRS) {
                ViveRoleProperty VRProps = ViveRoleProperty.New(VRS.viveRole);
                IVRModuleDeviceState deviceState = VRModule.GetCurrentDeviceState(VRProps.GetDeviceIndex());
                Debug.Log(VRS.viveRole + " " + VRS.transform.position.x + " " + VRS.transform.position.y + " " + VRS.transform.position.z + " " + deviceState.isConnected);
                if (VRS.transform.position.x == 0f && VRS.transform.position.y == 0f && VRS.transform.position.z == 0f)
                {
                    VRS.gameObject.SetActive(false);
                    Debug.Log("Disable: " + VRS.viveRole);            
                }
            }
        }
    }

Need more flexible binding mechanism

I bound four devices (A, B, C, D) to HandRole.RightHand, HandRole.LeftHand, BodyRole.RightFoot and BodyRole.LeftFoot.
Now I want to temporarily replace devices A & B with E & F using the same binding config file, but E & F won't be detected (controllers A & B are turned off, C & D are turned on).
I hope the binding could be less strict; especially when there are no bound devices, it would be more convenient to let me bind other devices onto these roles automatically.

Version 1.8.1: controllers disappearing

I've seen my controllers disappear multiple times while testing release 1.8.1; every time it happens, this is the error spamming the console.

NullReferenceException: Object reference not set to an instance of an object
HTC.UnityPlugin.VRModuleManagement.VRModule.GetShouldActivateModule () (at Assets/HTC.UnityPlugin/VRModule/VRModuleManager.cs:144)
HTC.UnityPlugin.VRModuleManagement.VRModule.Update () (at Assets/HTC.UnityPlugin/VRModule/VRModuleManager.cs:95)

I upgraded to 1.8.0 from the previous version first by deleting the HTC.UnityPlugin folder.
I was not able to reproduce this error with 1.8.0. On 1.8.0 my controllers initially didn't show up until I pressed the Vive button below the touchpad.

I then deleted the HTC.UnityPlugin folder again, imported the 1.8.1 package, and started seeing this behavior. I'm going to downgrade back to 1.8.0 and make sure I cannot reproduce this behaviour with the previous plugin version.

Unity Version: 2017.2.0f3

Scaling problems

When scaling the components, the pose tracking does not work correctly.

How to properly manage multiple tracker and controller presets with role binding

I have these Roles:
RightHand1
RightHand2
LeftHand1
LeftHand2

I have the default Controller (Left), and Controller (Right) children on my camera rig, and I have duplicated Controller (Left) 1, and Controller (Right) 1 on my camera rig. When RightHand1 is bound to the right controller I want it to be the active controller being used for it, etc.

What is the right way in the framework to tie roles to which objects get bound, based on those roles? I didn't find that section of code in the example.

If it doesn't exist, where should I put in hooks for this? It looks like in ApplyBindingConfigToRoleMap() we are aware of all controllers, their serials, and roles, but I'm not sure how to properly bind the children of the camera rig based on assigned roles.

VRTK support

Is it possible to combine your plugin with VRTK? I like their mechanics (levers, doors, climbing etc.) but I prefer your way of getting input (like "ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger)" instead of VRTK event-driven input) and also your simulator is much better.

ToRole<TRole> constantly returns "Invalid"

When using ViveRoleProperty.ToRole<TRole>(), it returns Invalid every time.

I guess that the problem is in line 167:
if (m_roleType != typeof(TRole) || roleInfo.TryGetRoleByName(m_roleValueName, out role))

You pass the m_roleValueName parameter to the method that gets the role from the dictionary. After replacing it with m_roleTypeFullName, everything works fine.

Canvas lags behind tracker position in build

Hi folks,

at the moment, with 1.8.0, an attached canvas lags behind the tracker. I think this was an issue before,
but it was fixed. Now it appears again, but only in the build, not in the editor; in editor play it's ok.
Any suggestions to try?

I have tried LateUpdate, FixedUpdate, Update...

Thank you in advance.

benjamin

Wave SDK Controller does not follow headset

Using Unity 2018.2 and the latest Wave SDK, the controller remains in place even when the user physically walks in space. This occurs in the teleporting sample scene provided. It seems that the controller spawns at the location of the head and then does not move? Not sure if I am missing something, but as it does not work in the sample scene, I'm guessing there is a problem.

Oculus Rift device identification is not consistent

I recently started testing for Rift support; sometimes the Rift controllers are identified as one device, and at other times they get identified as a different Rift device. The issue this creates is that each has a different rotation offset, so the controller angles are incorrect when they are detected as the other device.

Has anyone else seen this behavior, or know of a way to resolve it? It's all that's keeping me from being able to release for Oculus at this point.

(screenshot)

Both of these are the same controller on 2 consecutive startups of the project.

Multiplayer support

Is your plugin intended for multiplayer?
I use simulator mode. I tried to spawn the Camera Rig with Network Identity, Network Transform and Network Transform Child components; the joysticks and trackers are spawned but aren't shown in the other game instance.
Also, how do I tweak the simulator devices so that WASD moves them only on the local player instance?

How to disable grab and scroll on UI elements

With UI elements that have a vertical or horizontal scroll box, when trying to click on items in the scroll box, the only consistent way I've found for many users to click on them is with higher Pose Easer and Pose Stabilizer script values, but that makes the UI feel very sluggish.

The issue stems from the fact that as most users are clicking, their cursor moves a tiny amount. We're using touchpad scrolling on the background for movement and have no need of the cursor grab-and-scroll on the background, though we use it on the slider on the right side of the list.

What is a clean way to fully disable grab-and-scroll, and to send the click event before the full button click happens? I've been digging through the code but haven't found the section that controls when the selection event happens; that should be enough of a workaround for me to resolve the current issue I'm troubleshooting.
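
One generic mitigation, not specific to VIU, is to raise the EventSystem drag threshold so small cursor movements during a click are not treated as a drag; whether VIU's raycaster honors pixelDragThreshold is an assumption worth verifying.

using UnityEngine;
using UnityEngine.EventSystems;

public class RaiseDragThreshold : MonoBehaviour
{
    public int pixelDragThreshold = 30; // Unity's default is usually 10

    void Start()
    {
        // Pointer jitter below this threshold registers as a click, not a drag.
        EventSystem.current.pixelDragThreshold = pixelDragThreshold;
    }
}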

How to use simulator?

I think that this asset is a little bit under-documented... Really.
You added a simulator, but how do I use it? Any examples? Nothing :(

Help please...

Tracker vibration behaves differently than controller vibration using SteamVR

Here is the test scenario:

Load the role binding example.
Create a game object and put 2 copies of the attached script on it.
Set one of them to BodyRole -> Right Hand, and one of them to TrackerRole -> Tracker1.
Bind a Vive controller to Right Hand, and bind a tracker to Tracker 1.
A Hyperblaster is a simple device to test with, but an oscilloscope will show the same results, and a multimeter will verify them as well.

The Vive controller behaves as expected and vibrates at the proper intervals.

The tracker, though, vibrates when it should be still and doesn't vibrate when it should; it also skips about a quarter of the times it should vibrate.

Watching the voltage read between pins 1 and 2, the signal is inverted: it stays high at ~3.7 V and drops low when it should be vibrating. From the Vive tracker documentation it is designed to directly drive a small vibration motor, but the signal it sends does the opposite, keeping the motor vibrating constantly and stopping only when it should vibrate.

I also tested hooking up a 3 V vibration motor directly to the tracker and got the same results.

Let me know if any additional testing is needed.

using System.Collections;
using UnityEngine;
using HTC.UnityPlugin.Vive;

public class VibrationTest : MonoBehaviour
{
    public ViveRoleProperty viveRole;
    public bool pulled = false;
    public float counter = 0;
    public float interval = 0.5f;
    public ushort strength = 2000;
    public bool active = true;

    IEnumerator HeartBeat()
    {
        while (true)
        {
            counter += Time.deltaTime;
            if (counter > interval && active)
            {
                Debug.Log("Vibrate");
                ViveInput.TriggerHapticPulse(viveRole, strength);
                counter = 0;
            }
            yield return null;
        }
    }

    void Start()
    {
        StartCoroutine(HeartBeat());
    }

    void Update()
    {
        if ((ViveInput.GetAxis(viveRole, ControllerAxis.Trigger) == 1) && (!pulled))
        {
            pulled = true;
            active = !active;
            Debug.Log("Trigger Pressed");
        }
        else if ((ViveInput.GetAxis(viveRole, ControllerAxis.Trigger) < 0.5) && pulled)
        {
            pulled = false;
            Debug.Log("Trigger let go");
        }

        if (ViveInput.GetAxis(viveRole, ControllerAxis.PadY) > 0.07f)
        {
            interval = ViveInput.GetAxis(viveRole, ControllerAxis.PadY) * 2;
        }
    }
}

UGUI UI flickering when world space

Hello!

The problem is that the UI flickers on a world-space canvas, but not on all canvases; only if the canvas is moving (in my case, a child object of the VR controller model).

Can you help me solve this problem, please?

Update:
It happens more often when trying to use UI that is really near the UI raycaster (but it should still be properly highlighted, because m_NearDistance is 0).

Now I set m_NearDistance to -0.5 and that helps.

Changing the distance of dragged objects

I really like the way Oculus Dash allows you to pick up a window, drag it around and then move it closer or further away from you so you can hang it in just the right position and size. I'm going to play around with adding support for this in Draggable.cs from the 3DDrag example. Although I'm using Dash as an example I'm not planning on working with windows or desktops from the OS; I'm just going to move around GameObjects which are already in the scene. If anyone has worked on something similar or has suggestions on a better way to go about it please let me know. One initial concern I have is recognizing which controller is doing the dragging; for now I think I'll assume everything is right handed.
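
A minimal sketch of the distance adjustment I have in mind, assuming the dragged object's position is driven from the controller transform; ViveInput.GetAxis and ControllerAxis.PadY are from this page, the rest (the grabbedObject/controller hookup) is hypothetical.

using HTC.UnityPlugin.Vive;
using UnityEngine;

public class DragDistance : MonoBehaviour
{
    public HandRole hand = HandRole.RightHand;
    public Transform grabbedObject; // the object currently being dragged (hypothetical hookup)
    public Transform controller;    // the controller doing the dragging
    public float speed = 1.5f;      // metres per second at full pad deflection

    void Update()
    {
        if (grabbedObject == null) return;

        // Push the object away or pull it closer along the controller's forward axis.
        float padY = ViveInput.GetAxis(hand, ControllerAxis.PadY);
        if (Mathf.Abs(padY) > 0.1f)
        {
            grabbedObject.position += controller.forward * padY * speed * Time.deltaTime;
        }
    }
}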

HTC controllers are spawning at 0,0,0 on version 1.8

I'm using Steam 1.2.3 and Unity 2017.1.
The issue is that every time I start the scene, the controllers spawn at the scene origin.
(screenshot: before)
Only when I press the system button, go to the Steam dashboard, and then press the system button again to go back to my scene do the trackers work again.
(screenshot: after)

HandRole already mapped

I bound two controllers to HandRole right hand & left hand, and connected two unbound controllers. Then my program gets this error message from ViveRoleMap.cs.

"roleValue(HandRole[1]) is already mapped, unmapping first."

Fix it please...

Switch left / right Eye as Display Monitor

Version of Unity3D : 2017.3.0f3
Version of ViveInputUtility : 1.7.3

Question :
Is it possible to mirror the right eye of the HMD to the monitor?
At the moment it's the left. For shooters using the right eye for aiming,
it would be more impressive...
and my client requested it ;-)

thank you in advance

benjamin

Moving the Camera Rig changes the teleport start and end offsets

Attach this script to a Vive controller in the teleport demo, move your character around with it, and try to teleport. The origin is not where it should be once the camera rig has been moved.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using HTC.UnityPlugin.Vive;

public class ViveControllerMovement : MonoBehaviour
{
    public ViveRoleProperty viveRole;
    public GameObject CameraRig;
    public ViveRoleProperty controllerDirection;

    void Update()
    {
        if (ViveInput.GetAxis(viveRole, ControllerAxis.PadY) > 0.5f)
        {
            Vector3 movementVector = Vector3.ProjectOnPlane(VivePose.GetPose(controllerDirection).up, Vector3.up);
            CameraRig.transform.position -= movementVector / 3;
        }

        if (ViveInput.GetAxis(viveRole, ControllerAxis.PadY) < -0.5f)
        {
            Vector3 movementVector = Vector3.ProjectOnPlane(VivePose.GetPose(controllerDirection).up, Vector3.up);
            CameraRig.transform.position += movementVector / 3;
        }

        if (ViveInput.GetAxis(viveRole, ControllerAxis.PadX) > 0.5f)
        {
            Vector3 movementVector = Vector3.ProjectOnPlane(VivePose.GetPose(controllerDirection).right, Vector3.up);
            CameraRig.transform.position += movementVector / 4;
        }

        if (ViveInput.GetAxis(viveRole, ControllerAxis.PadX) < -0.5f)
        {
            Vector3 movementVector = Vector3.ProjectOnPlane(VivePose.GetPose(controllerDirection).right, Vector3.up);
            CameraRig.transform.position -= movementVector / 4;
        }
    }
}

Clickable widgets in ScrollRect

Version of Unity3D : 5.5.2f1
Version of ViveInputUtility : 1.7.3

Question :
I want to make a button list, so I created some buttons in a ScrollRect. When the raycast points at a button and I try to drag the list, it does not work correctly.

How to identify VR Hardware Make when Role binding in BodyRoles.cs

I'm borrowing a Rift from a friend to make some software updates. The first issue I ran into when I started testing is that the controller angle of a Rift controller is vastly different compared to a Vive. I've already created new objects to bind them to, but for Rift controllers I want to bind them to the other objects in BodyRoles.cs at device init. I'm just not sure what variable to check to know whether they should be bound to another device.
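
A hedged sketch of the check I would try first: VRModule.GetCurrentDeviceState and GetDeviceIndex appear elsewhere on this page, while the deviceModel and modelNumber properties are assumptions to verify against IVRModuleDeviceState.

using HTC.UnityPlugin.Vive;
using HTC.UnityPlugin.VRModuleManagement;
using UnityEngine;

public class ControllerMakeCheck : MonoBehaviour
{
    public ViveRoleProperty viveRole = ViveRoleProperty.New(HandRole.RightHand);

    void Start()
    {
        var state = VRModule.GetCurrentDeviceState(viveRole.GetDeviceIndex());
        // deviceModel / modelNumber are assumed fields; check IVRModuleDeviceState for the exact names.
        Debug.Log("Model: " + state.deviceModel + " (" + state.modelNumber + ")");
    }
}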

Document usage with Wave/Focus

I only knew there was support in VIU for the Focus because a mod on the Wave SDK forum told me.

There's nothing in the docs to indicate this and no info on how to handle the unique aspects of the Focus.

The docs could do with expanding in general.

It might even be time to consider a name change - calling this "Vive Input Utility" reminds me of the joke about the Holy Roman Empire: it was neither holy, nor Roman, nor an empire...
