Unity Support about mediapipe HOT 58 CLOSED

google avatar google commented on April 27, 2024 53
Unity Support

from mediapipe.

Comments (58)

LogonDev avatar LogonDev commented on April 27, 2024 19

Some potential use cases:

  • VR Hand tracking (similar to devices like leap motion). Some VR headsets such as the HTC Vive Pro and Valve Index already have RGB cameras built in
  • AR hand tracking
  • Object detection/segmentation using cell phone camera/hololens camera/pc camera

mgyong avatar mgyong commented on April 27, 2024 13

We are looking into this and will update the thread when we have something more definite. We welcome contributions from folks in the thread.

RafaelSPBarbosa avatar RafaelSPBarbosa commented on April 27, 2024 10

I also support this!
We are a Unity VR and AR advergame studio; our clients would lose their minds over this. Initially, we would like to be able to do normal image tracking or ARCore tracking to position objects in the scene; then we would use this to track the hands and enable our users to directly interact with the game. This would be phenomenal!

mgyong avatar mgyong commented on April 27, 2024 9

@nadir500 We have gotten several requests to support MediaPipe in the Unity engine. Can you share your use cases with us? We'd love to work with some teams on contributions for getting MediaPipe into Unity.
@mcclanahoochie @chuoling

KDevS avatar KDevS commented on April 27, 2024 7

Would love to get a Unity port as well. There aren't any open-source options for hand tracking in AR, especially for mobile devices, and this works perfectly on my phone. There are some options like ManoMotion which support hand tracking in 2D, but they are paid and in closed beta. If this could be used with Unity, it would help a lot of developers who are looking to integrate more natural interaction into their augmented reality experiences. The use case for VR is even more obvious.

iBicha avatar iBicha commented on April 27, 2024 6

Maybe this could be in the style of the C API from TFLite?

The string containing the definition of the graph could then be passed from the .NET runtime to the native API with PInvoke calls.

I would say it may even be possible to create custom calculators in C#: the managed methods (GetContract(), Open(), and Process()) could be passed to the C API as function pointers to be invoked from there.

The incentive would be to make it possible to use this alongside arcore-unity-sdk (in a fashion where ARCore passes the camera image to MediaPipe through CameraImage), and maybe ARFoundation as well (which also has an API for retrieving the camera image), so it would take the form of a subsystem. This is where most AR creation is happening, so it would enable a lot of devs to expand on their existing projects.

These are only ideas, as I haven't dived into MediaPipe enough to have a solid opinion.
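As a rough illustration of the PInvoke idea, a tflite-style C API wrapped from .NET could look like this. Every entry point and the library name below are hypothetical; MediaPipe does not ship such a C API today:

```csharp
// Hypothetical C-API surface for running a MediaPipe graph from .NET.
// None of these native entry points exist; they sketch what a tflite-style
// C API plus PInvoke bindings might look like.
using System;
using System.Runtime.InteropServices;

public static class MediaPipeNative
{
    const string Lib = "mediapipe_c";  // assumed native library name

    // Pass the graph definition (a CalculatorGraphConfig text proto) as a string.
    [DllImport(Lib, CharSet = CharSet.Ansi)]
    public static extern IntPtr mp_graph_create(string graphConfigText);

    [DllImport(Lib)]
    public static extern int mp_graph_start(IntPtr graph);

    [DllImport(Lib)]
    public static extern void mp_graph_destroy(IntPtr graph);
}
```

Custom C# calculators would then hand managed delegates for GetContract/Open/Process to the native side as function pointers, e.g. via Marshal.GetFunctionPointerForDelegate.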

seberta avatar seberta commented on April 27, 2024 6

Hi, just wondering why this issue is closed now? Is the Unity plugin available?

b29b avatar b29b commented on April 27, 2024 6

Not sure if it's official or anything, but this one works:
https://gitlab.com/thangnh.sas/mediapipe-unity-hand-tracking
https://www.youtube.com/watch?v=nNL7zOq3fmo

mcclanahoochie avatar mcclanahoochie commented on April 27, 2024 5

Hi
We definitely welcome contributions to MediaPipe, and Unity support would be great to have.
I like reading about the use cases, and @iBicha has some nice ideas on approaches.

I can add a few more ideas, and would say there is a spectrum (as always in programming) of how to solve this (i.e. adding Unity support):

       quick/easier/faster <--> more-involved/harder/tedious 
less-flexible/less-generic <--> more-flexible/more-generic

On the right side, there is the ARCore approach, where the majority of the ARCore API is mapped into C#, including all the C/C++/C# wrappers. This obviously requires a lot of C# code (and C interfaces) to be written, but it provides the greatest flexibility in developer use cases and in how tightly you can integrate into the C# application.

On the left side, there is the option of minimizing the amount of C#/C wrapper code that needs to be written, by writing a native class (similar to hello_world.cc or FrameProcessor.java) to handle all the graph initialization and running. In the simplest case, there could be just a few functions in a custom graph runner exposed to C#: InitAndStartGraph, AddPacket, RetrieveResultPacket, ShutdownGraph. This would be more of a black-box approach, treating running a graph like calling a function.
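On the C# side, that minimal black-box surface could be sketched as follows. The four function names are the ones mentioned above; their signatures and the plugin name are assumptions:

```csharp
// Black-box graph runner: treat running a MediaPipe graph like calling a
// function. The native plugin would implement these; signatures are guesses.
using System;
using System.Runtime.InteropServices;

public static class GraphRunner
{
    const string Lib = "mediapipe_runner";  // assumed native plugin name

    // Parse the graph config, initialize, and start the calculator graph.
    [DllImport(Lib)]
    public static extern IntPtr InitAndStartGraph(string graphConfigText);

    // Feed one input packet (e.g. a camera frame) into a named input stream.
    [DllImport(Lib)]
    public static extern void AddPacket(IntPtr graph, string inputStream,
                                        byte[] data, int size, long timestampUs);

    // Poll a named output stream; returns the number of bytes written to buffer.
    [DllImport(Lib)]
    public static extern int RetrieveResultPacket(IntPtr graph, string outputStream,
                                                  byte[] buffer, int bufferSize);

    // Close input streams and wait for the graph to finish.
    [DllImport(Lib)]
    public static extern void ShutdownGraph(IntPtr graph);
}
```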

I think depending on the application, one approach may be more fitting than the other (considering amount of effort involved), or some hybrid of the two.
Hopefully this discussion can get people going in the right direction for them.

A side note for future reference:
To link the OpenGL context between Unity and MediaPipe (where Unity owns the parent context), you would need to follow something similar to what is done for Android in nativeSetParentGlContext, and hook it up on the Unity plugin side:

static void OnGraphicsDeviceEvent(UnityGfxDeviceEventType eventType) {
  switch (eventType) {
    case kUnityGfxDeviceEventInitialize: {
#if HAS_EGL
      // Capture Unity's current GL context so MediaPipe can share it.
      external_context = eglGetCurrentContext();
#else
      // Other graphics APIs (GLX, EAGL, ...) would be handled here.
#endif
      break;
    }
    default:
      break;
  }
}

ROBYER1 avatar ROBYER1 commented on April 27, 2024 5

@nadir500 We have gotten several requests to support MediaPipe in the Unity engine. Can you share your use cases with us? We'd love to work with some teams on contributions for getting MediaPipe into Unity.
@mcclanahoochie @chuoling

It's been 2 years; I could write an essay of use cases if you need. Many current projects would benefit from this.

martdob avatar martdob commented on April 27, 2024 4

Unity support would be an excellent approach. Especially for a variety of augmented reality apps and solutions, a MediaPipe plugin for Unity would be a great help.

MahmoudAshraf-CIS avatar MahmoudAshraf-CIS commented on April 27, 2024 3

Actually, there is an SDK from Viveport that supports hand tracking in Unity, for both Android and HTC Vive. I didn't try it on Android, but it worked on Vive just fine.

The SDK is still in early access:
https://developer.viveport.com/documents/sdk/en/vivehandtracking_index.html

I don't know the difference between the two implementations, but I guess it would be great if there were some sort of cooperation between the two teams to boost the process.

And for some use cases, check the videos here:
https://developer.vive.com/resources/knowledgebase/vive-hand-tracking-sdk/

TheBricktop avatar TheBricktop commented on April 27, 2024 3

The Aristo API from HTC is currently much inferior to Google's approach, because it relies on the stereo cameras of the Vive Pro; as tested on the original Vive's mono camera it works really badly, and the Android version is very limited and drains the battery easily, limiting its use alongside other computationally heavy tasks like XR.
That's why we look forward to seeing a port of MediaPipe to Unity.

jmartinho avatar jmartinho commented on April 27, 2024 3

The HTC Vive solution is not good enough compared with this MediaPipe hand tracking. I tested both, and even though the Vive solution runs on a desktop, it doesn't compare with MediaPipe running on a slow Android smartphone. MediaPipe hand tracking is stunningly good, and it will be a game changer for UI in the near future.
Hope MediaPipe will support Unity very soon.
A good start is looking at TensorFlowSharp, already implemented in Unity:
Using TensorFlowSharp in Unity (Experimental)
https://github.com/Unity-Technologies/ml-agents/blob/develop/docs/Using-TensorFlow-Sharp-in-Unity.md
Here it is as a Unity package: https://s3.amazonaws.com/unity-ml-agents/0.4/TFSharpPlugin.unitypackage

TheBricktop avatar TheBricktop commented on April 27, 2024 3

I find it kind of weird that this feature request was closed without any answer.
Intel's git does that too.

romaindebraize avatar romaindebraize commented on April 27, 2024 3

I also support this. When will it be available?

nadir500 avatar nadir500 commented on April 27, 2024 3

Support for Unity Barracuda would benefit developers more.

DBrown12 avatar DBrown12 commented on April 27, 2024 2

Like seberta, I'd like to know: has any headway been made on this endeavor?

Thaina avatar Thaina commented on April 27, 2024 2

Why was this issue closed, though? We should have a Unity integration of this library.

ostryhub avatar ostryhub commented on April 27, 2024 1

Hey, the issue is closed, but does anyone know if there are plans to support Unity integration for iOS and Android?

vivi90 avatar vivi90 commented on April 27, 2024 1

Yes, a Unity or Qt integration would be great!
I need this for my studies.

midopooler avatar midopooler commented on April 27, 2024 1

Any conclusion to this thread?
I actually need a human segmentation feature in Unity (for mobile applications).

vivi90 avatar vivi90 commented on April 27, 2024 1

Yes, we really need this stuff for Unity!

ahmadpi avatar ahmadpi commented on April 27, 2024

I have been looking for this solution as well. I fully support this effort!

Ericbing avatar Ericbing commented on April 27, 2024

+other folks

On Wed, Aug 21, 2019 at 6:34 PM Logon13 wrote:

> Some potential use cases:
> • VR Hand tracking (similar to devices like leap motion). Some VR headsets such as the HTC Vive Pro and Valve Index already have RGB cameras built in
> • AR hand tracking
> • Object detection/segmentation using cell phone camera/hololens camera/pc camera

- VR hand tracking for Oculus Quest/Rift S. It is so important for a better user experience with more natural interaction input; finger tracking is definitely at the top of the list. Also, there will be more and more VR HMDs using inside-out tracking, meaning they all have cameras onboard.

lqz avatar lqz commented on April 27, 2024

I hope for this feature too. It is very important for us.

GilbertoBitt avatar GilbertoBitt commented on April 27, 2024

I fully support this. Imagine being able to interact with the game/objects directly using hand tracking.

lukamilosevic avatar lukamilosevic commented on April 27, 2024

A step in the right direction would be headless hand tracking and using UnityPlayer.UnitySendMessage to send the coordinates from Android to Unity Activity. Then just position the hand on the camera viewport according to received coordinates.

First step could be just adding some code to the current "Hand Tracking GPU" app to send data to a server running in Unity Editor during play mode. That itself could become a great development tool. I might try to do this myself but if someone does something like this, please post your results!
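On the Unity side, the receiver for that UnitySendMessage call could look roughly like this. The GameObject name, method name, and payload format are all assumptions made for illustration:

```csharp
// Attached to a GameObject named "HandTracker". The Android side would call:
//   UnityPlayer.UnitySendMessage("HandTracker", "OnHandPosition", "0.42,0.37");
// with normalized x,y coordinates as the payload (format assumed here).
using System.Globalization;
using UnityEngine;

public class HandTracker : MonoBehaviour
{
    public Transform handMarker;   // object to position on the camera viewport

    // Invoked from Java via UnitySendMessage; runs on Unity's main thread.
    void OnHandPosition(string payload)
    {
        string[] parts = payload.Split(',');
        if (parts.Length < 2) return;
        float x = float.Parse(parts[0], CultureInfo.InvariantCulture);
        float y = float.Parse(parts[1], CultureInfo.InvariantCulture);

        // Map normalized image coordinates (origin top-left) to the viewport.
        Vector3 viewport = new Vector3(x, 1f - y, 2f);  // z: distance from camera
        handMarker.position = Camera.main.ViewportToWorldPoint(viewport);
    }
}
```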

TheBricktop avatar TheBricktop commented on April 27, 2024

There are other models for hand tracking, but probably not as performant as Google's version: https://github.com/lmb-freiburg/hand3d

asierras avatar asierras commented on April 27, 2024

I am also looking forward to this. A lot of clients are asking me hand and finger tracking for AR apps on iOS and Android.

Any news about this?

Thank you!

HooliganLabs avatar HooliganLabs commented on April 27, 2024

Also interested in support of this.

boehm-e avatar boehm-e commented on April 27, 2024

Interested too

moemenYmoemen avatar moemenYmoemen commented on April 27, 2024

Interested in Unity support for it

rtrn1337 avatar rtrn1337 commented on April 27, 2024

I am also interested in a Unity implementation!

justinduynguyen avatar justinduynguyen commented on April 27, 2024

Has anyone finished implementing MediaPipe in Unity?

huangshenlong avatar huangshenlong commented on April 27, 2024

Is anyone working on this? I want to do this; does anyone have the skills to help? I'll be streaming my efforts on https://twitch.tv/huangshenlong

huangshenlong avatar huangshenlong commented on April 27, 2024

@lukamilosevic I'm going to try your approach. I'm a Unity noob though so it might take me a while.

You said "A step in the right direction would be headless hand tracking and using UnityPlayer.UnitySendMessage to send the coordinates from Android to Unity Activity. Then just position the hand on the camera viewport according to received coordinates.

First step could be just adding some code to the current "Hand Tracking GPU" app to send data to a server running in Unity Editor during play mode. That itself could become a great development tool. I might try to do this myself but if someone does something like this, please post your results!"

lukamilosevic avatar lukamilosevic commented on April 27, 2024

@lukamilosevic I'm going to try your approach. I'm a Unity noob though so it might take me a while.

You said "A step in the right direction would be headless hand tracking and using UnityPlayer.UnitySendMessage to send the coordinates from Android to Unity Activity. Then just position the hand on the camera viewport according to received coordinates.

First step could be just adding some code to the current "Hand Tracking GPU" app to send data to a server running in Unity Editor during play mode. That itself could become a great development tool. I might try to do this myself but if someone does something like this, please post your results!"

Looking back at what I wrote, it's not a good solution. A better step would be a server running on the phone: the app needs a basic server that serves its clients the hand-tracking data (the same data shown on the screen).

This way, a separate phone can be dedicated to hand tracking. The idea is, instead of showing the data on the screen as the app currently does, to serve it from a server.

Here's what I would do if I had the time and the patience:

  • Get the code, figure out the environment it needs, and build and run it, to make sure I have the environment, the tools, and everything the devs had to get the app built and running.
  • Find the tracking-data variables in the code. Debug with a debugger, log, print, etc. until the variables holding the important data are found.
  • Add code to serialize the data and write it to logcat, then build and run. The point of this step is to make a small change to the code while also extracting the data and logging it from the phone as the app runs. This is the proof of concept that the data can be extracted.
  • Add a server to serve that data, in whatever way works (the simplest TCP server, for example, that responds once when a client connects).
  • Connect to that device from another device on the local network and read the data.
  • Once that works, meaning it's 100% doable, the server should be turned into a proper UDP socket connection where the client just listens for all the hand-tracking data the server sends from the app.
  • Maybe remove the camera view from the app so it's not taking resources.

From here, writing a simple UDP client in Unity should do the trick to get the data.

Also from here, making both run on the same phone means either merging the Unity Android build project with this Android project, or editing this project to run in the background (maybe as a service).

But doing only the first part, where hand tracking is done on a separate phone, basically gives you a hand-tracking product just like any other out there, except everyone has the hardware at home.
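The "simple UDP client in Unity" step above could be sketched like this. The port number and UTF-8 string payload are assumptions, not part of any existing app:

```csharp
// Listens for hand-tracking packets sent by the phone and logs the latest
// payload. Port and payload encoding are assumed for illustration.
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

public class HandDataClient : MonoBehaviour
{
    const int Port = 5005;          // assumed port the phone sends to
    UdpClient client;
    Thread listenThread;
    volatile string latest;         // last payload received from the phone

    void Start()
    {
        client = new UdpClient(Port);
        listenThread = new Thread(Listen) { IsBackground = true };
        listenThread.Start();
    }

    void Listen()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] data = client.Receive(ref remote);   // blocks on worker thread
            latest = Encoding.UTF8.GetString(data);
        }
    }

    void Update()
    {
        if (latest != null)
            Debug.Log("Hand data: " + latest);          // parse/apply landmarks here
    }

    void OnDestroy()
    {
        client?.Close();            // unblocks Receive and ends the thread
    }
}
```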

ROBYER1 avatar ROBYER1 commented on April 27, 2024

We are looking into this and will update the thread when we have something more definite. We welcome contributions from folks in the thread

Is this thread closed for a reason? I am hoping to use this with Unity for easier cross-platform motion tracking, plus object detection/Objectron support.

Currently I am constrained by which devices support ARCore/ARKit, so Instant Motion Tracking would be a fantastic alternative with broader reach.

vivi90 avatar vivi90 commented on April 27, 2024

@midopooler Nothing so far, it seems.

vivi90 avatar vivi90 commented on April 27, 2024

Why this issue was closed though? We should have unity integration of this library

Same opinion.

ostryhub avatar ostryhub commented on April 27, 2024

Why this issue was closed though? We should have unity integration of this library

Same here.

rtrn1337 avatar rtrn1337 commented on April 27, 2024

Maybe this could help? https://github.com/homuler/MediaPipeUnityPlugin I didn't test it, but it's on my list.

nadir500 avatar nadir500 commented on April 27, 2024

The main challenge in Unity currently is making hand tracking with depth perception an option in VR/AR interactions using a mobile camera, instead of tracking devices such as Leap Motion, which has no Android SDK. Of course, it all has to run in real time, since the slightest lag would hurt the experience in some cases.
The projects I make in Unity often involve merging research with VR environments using different interaction tools. Other features like gesture detection would be useful for various AR concepts.

KDevS avatar KDevS commented on April 27, 2024

Unity support for MediaPipe would be really welcome. Anyone who has worked on AR/VR/MR applications would love a stable hand-tracking option without being dependent on expensive and/or walled-off hardware. I have seen some really good work on hand tracking with just a single RGB camera from university teams a few years back, but none of it was opened up to the public, and I won't be surprised if most of it ended up in Facebook's or Microsoft's collection of patents.

OpenCV seems to be the only open option available at the moment, but its hand-tracking options are not polished enough to be used commercially. Options like ManoMotion are way too expensive for individual developers.

FerLuisxd avatar FerLuisxd commented on April 27, 2024

Just an API would be really neat for making really cool 3D applications.

yugosaito4 avatar yugosaito4 commented on April 27, 2024

@rtrn1337 have you tried the plugin?

rtrn1337 avatar rtrn1337 commented on April 27, 2024

@rtrn1337 have you tried the plugin?

Yes, I did. Some features work in the Unity Editor, but I get an error in Xcode when I try to build for a device. I haven't had time to test it more closely yet.

manwithsteelnerves avatar manwithsteelnerves commented on April 27, 2024

@ROBYER1

I was following this thread; it started long ago, when there weren't many devices that supported ARCore. Now there are many.

It would be great if you could share the use-case list!

EigenSpiral avatar EigenSpiral commented on April 27, 2024

Hi,
does anyone know how the hand-tracking performance of MediaPipe and Leap Motion compares?

boehm-e avatar boehm-e commented on April 27, 2024

@EigenSpiral
I've had a Leap Motion for years now.
I find that MediaPipe is better at guessing partially occluded fingers' positions.
I can't compare the performance (in terms of speed), since I run MediaPipe on my phone and Leap Motion on my laptop. Both run smoothly; maybe MediaPipe is even faster.

EigenSpiral avatar EigenSpiral commented on April 27, 2024

@boehm-e Thank you for sharing!
How does it compare on depth perception?

ROBYER1 avatar ROBYER1 commented on April 27, 2024

There isn't really depth with MediaPipe, so Leap or any depth-sensing camera is better there. I would strongly advise you to try the linked sample repo above and compare for yourself in your own project!

manwithsteelnerves avatar manwithsteelnerves commented on April 27, 2024

We recently released Easy ML Kit, which aims to bring Google's ML Kit, MediaPipe, and TensorFlow into Unity. We have released 2 features so far (barcode scanning and object detection), and others are in development.

We have the ARFoundation camera as one of the input sources (along with a live cam and images/textures), which will be quite useful for AR apps.
It would be great if you could share which features you are expecting, so we can prioritize.

manwithsteelnerves avatar manwithsteelnerves commented on April 27, 2024

Sure! Will definitely make a note of it!
We just released Text Recognition, and the current features in development are face detection and pose estimation.
