jimwest / mefamo
License: MIT License
How can I use an Android phone camera for this? I have a Realme X2, which has a 2 MP depth camera; I think it would give more accurate movement than a webcam.
I tested the first release on Windows and I get a black screen with the OBS virtual camera and others.
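One option worth trying (a minimal sketch, not part of MeFaMo): stream the phone's camera over the local network and open that stream with OpenCV. The app and the URL below are assumptions; use whatever address your streaming app reports.

```python
import cv2

# Hypothetical address reported by a phone streaming app (e.g. an MJPEG stream).
PHONE_STREAM_URL = "http://192.168.1.50:8080/video"

cap = cv2.VideoCapture(PHONE_STREAM_URL)  # OpenCV accepts network URLs as sources
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("phone camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

If that works, the same capture object could be fed to the face tracking instead of the default webcam index.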
In the code that computes the stretch_max_right value, there seems to be a copy-paste typo:
stretch_max_right = -0.45 + (0.45 * mouth_smile_right) + (0.36 * mouth_left)
Should mouth_right be used here instead?
It's amazing work!
I found that the Live Link face data doesn't sync or calibrate properly with a MetaHuman-generated character.
Below is the expression captured from the video of a smiling person; mediapipe tracked it quite accurately.
This is the MetaHuman's response to the MeFaMo-transferred data.
Is there any parameter or step I should check or improve?
In your demo video, your smile is quite well-synced.
https://www.reddit.com/r/unrealengine/comments/r8wbe3/my_livelink_facetracking_without_an_apple_device/
Does a freshly exported MetaHuman need the blueprint modifications described in the link you mentioned below?
https://docs.unrealengine.com/4.27/en-US/AnimatingObjects/SkeletalMeshAnimation/FacialRecordingiPhone/
#self._live_link_face.set_blendshape(FaceBlendShape.MouthClose, mouth_open)
in blendshape_calculator.py
If I delete this line, the program runs well. Could you help me understand what this line does?
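For context, a rough sketch of the idea behind ARKit's MouthClose shape (an assumption about intent, not MeFaMo's actual formula): MouthClose describes how closed the lips stay relative to how open the jaw is, so a plausible derivation is "jaw openness minus lip openness", clamped to the blendshape range.

```python
def mouth_close_estimate(jaw_open: float, lip_open: float) -> float:
    """Hypothetical helper: both inputs are assumed normalized to 0..1.
    High value = jaw open but lips still pressed together."""
    return max(0.0, min(1.0, jaw_open - lip_open))
```

Passing the raw mouth_open value straight into MouthClose, as in the commented-out line, would not match that meaning, which may be why removing it helps.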
The mediapipe package doesn't exist for Apple Silicon. Install mediapipe-silicon instead.
This could probably be noted in the README.
Thanks for the LiveLink method you shared. I want to send the body's pose matrices as well, but I don't know the transform format that LiveLink expects for bones.
I generated an ARKit CSV file from blendshapes_calculator and used it to drive a MetaHuman and generate an animation, but the resulting facial animation shakes abnormally. However, when I use Live Link Face to drive the MetaHuman live, the animation plays normally without that shaking. Could anyone help me understand why this abnormal shaking happens? Thank you very much!
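One thing worth trying before importing the CSV (a minimal sketch, an assumption rather than a confirmed fix): smooth the exported blendshape curves with an exponential moving average so frame-to-frame noise doesn't reach the MetaHuman. The file name and column layout below are hypothetical.

```python
import pandas as pd

df = pd.read_csv("arkit_blendshapes.csv")          # hypothetical export file
numeric = df.select_dtypes("number").columns       # the blendshape value columns
df[numeric] = df[numeric].ewm(alpha=0.3).mean()    # lower alpha = stronger smoothing
df.to_csv("arkit_blendshapes_smoothed.csv", index=False)
```

If the smoothed file still shakes, the difference may be in how Unreal interpolates CSV keyframes versus live Live Link samples.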
Hello
I am trying to use this repository on my Ubuntu system with UE 5, and I am not able to find the source for the PyLiveLink Python library. I have tried many things, but I cannot get the UDP connection to LiveLink working.
I am new to UE 5, so I don't know whether it's a UE issue or something else.
Thank You. Cheers.
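A quick way to narrow this down (a minimal sketch; the port is an assumption based on the default Live Link ARKit port, 11111): run a plain UDP listener instead of Unreal and check whether MeFaMo's packets actually arrive on the machine. If they do, the problem is on the UE source/plugin side rather than the Python side.

```python
import socket

# Listen on the port MeFaMo is assumed to send to. Run this while Unreal is
# closed, since only one process can bind the port at a time.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 11111))
print("listening on UDP 11111 ...")
while True:
    data, addr = sock.recvfrom(4096)
    print(f"received {len(data)} bytes from {addr}")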
Hi, I am reading your code and I don't understand the purpose of get_metric_landmarks.
MeFaMo/mefamo/custom/face_geometry.py
Line 2482 in 341813f
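As background, a rough sketch of how get_metric_landmarks is typically used (assumptions: the names and signatures follow the common Python port of MediaPipe's face geometry module; the import path mirrors the file referenced above): it converts the normalized screen-space FaceMesh landmarks into a metric 3D space using a pseudo camera, and also returns the head pose transform.

```python
import numpy as np
from mefamo.custom.face_geometry import PCF, get_metric_landmarks  # assumed names


def to_metric(normalized_landmarks: np.ndarray, width: int, height: int):
    """normalized_landmarks: 3 x 468 array of FaceMesh x, y, z values in [0, 1]."""
    pcf = PCF(near=1, far=10000, frame_height=height,
              frame_width=width, fy=width)  # fy ~ focal length in pixels
    # metric_landmarks: the same points in a metric, camera-centered space;
    # pose_transform_mat: 4x4 head pose (rotation + translation) of the face.
    metric_landmarks, pose_transform_mat = get_metric_landmarks(
        normalized_landmarks.copy(), pcf)
    return metric_landmarks, pose_transform_mat
```

The metric landmarks are what make distance-based blendshape calculations meaningful regardless of how far the face is from the camera.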
I tried to use mediapipe with iClone. My major problem with mediapipe is the jitter: even after applying some filters, I was still unable to reduce the jitter enough to make the result usable. Did you find a way to filter out the jitter?
Here is my attempt with the hand mocap. https://www.youtube.com/watch?v=j6JboJIlpfM
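One commonly used answer to this kind of jitter (a general technique, not something MeFaMo is confirmed to use) is a One Euro filter applied per landmark coordinate or per blendshape value: it smooths heavily at low speeds, where jitter is most visible, while staying responsive during fast motion. A minimal sketch:

```python
import math
import time


class OneEuroFilter:
    """Minimal One Euro filter (Casiez et al.) for one scalar signal."""

    def __init__(self, min_cutoff=1.0, beta=0.02, d_cutoff=1.0):
        self.min_cutoff = min_cutoff   # base smoothing at low speed
        self.beta = beta               # how quickly smoothing relaxes with speed
        self.d_cutoff = d_cutoff
        self.x_prev = None
        self.dx_prev = 0.0
        self.t_prev = None

    @staticmethod
    def _alpha(cutoff, dt):
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def __call__(self, x, t=None):
        t = time.time() if t is None else t
        if self.x_prev is None:
            self.x_prev, self.t_prev = x, t
            return x
        dt = max(t - self.t_prev, 1e-6)
        # Smooth the derivative, then use it to adapt the cutoff frequency.
        dx = (x - self.x_prev) / dt
        a_d = self._alpha(self.d_cutoff, dt)
        dx_hat = a_d * dx + (1 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff, dt)
        x_hat = a * x + (1 - a) * self.x_prev
        self.x_prev, self.dx_prev, self.t_prev = x_hat, dx_hat, t
        return x_hat
```

In practice one filter instance per tracked value works well; tuning min_cutoff down reduces jitter further at the cost of a little lag.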
What would it take to enable iris tracking? Could it be enabled simply, or would it need some deeper implementation?
A second thing that would be very cool is some way of recording data straight from the app to a .csv file or similar.
I noticed that the results vary with different face rotations / head tilts, since the values used are tuned to a neutral, upright face orientation. I think you should first rotate the landmarks into a neutral pose before doing the calculations, so that the results are rotation invariant. Are there plans to add this feature?
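For the iris part, mediapipe itself can already provide the extra landmarks; a minimal sketch separate from MeFaMo, assuming the standard mediapipe Python API: with refine_landmarks enabled, FaceMesh returns 478 points instead of 468, and indices 468-477 cover the two irises (five points each, the first of each group lying near the pupil centre).

```python
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True, max_num_faces=1)
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        iris_points = lm[468:478]                      # only present with refine_landmarks
        print([(p.x, p.y) for p in iris_points[:1]])   # one pupil centre, normalized coords
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```

Mapping those points onto the ARKit eye-look blendshapes would still need extra calculation, so it is more than a one-line switch. Recording the computed values per frame to a .csv, on the other hand, should be straightforward with the standard csv module.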
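A rough sketch of the proposed normalization (an assumption about how it could be done, not existing MeFaMo behaviour): if the head pose transform is available alongside the metric landmarks, applying the inverse of its rotation puts the landmarks back into an upright, forward-facing frame before any distances are measured.

```python
import numpy as np


def rotate_to_neutral(metric_landmarks: np.ndarray,
                      pose_transform_mat: np.ndarray) -> np.ndarray:
    """metric_landmarks: 3 x N points; pose_transform_mat: 4 x 4 head pose."""
    rotation = pose_transform_mat[:3, :3]
    # The transpose of a rotation matrix is its inverse, so this undoes the
    # head rotation and makes subsequent calculations tilt-invariant.
    return rotation.T @ metric_landmarks
```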
Is there a way to set the detected face index to track?
Sometimes there are multiple faces on the image, and it would be useful to select the index (from left to right for example) of the target face.
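A minimal sketch of one way to do this (an assumption, not an existing MeFaMo option): ask FaceMesh for several faces and pick one by its horizontal position in the image, counting from the left.

```python
import numpy as np


def pick_face(multi_face_landmarks, index_from_left: int = 0):
    """multi_face_landmarks: results.multi_face_landmarks from a FaceMesh
    created with max_num_faces > 1."""
    # Sort the detected faces by the mean x of their landmarks (left to right).
    ordered = sorted(multi_face_landmarks,
                     key=lambda face: np.mean([lm.x for lm in face.landmark]))
    return ordered[min(index_from_left, len(ordered) - 1)]
```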
Hi Jim, thanks for creating this repo; combined with pylivelink it opens up a lot of opportunities.
Can you share some details on how you created the standalone binary? Specifically, how did you incorporate libraries such as mediapipe and opencv?
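I don't know how the author built it, but one common approach is PyInstaller with mediapipe's bundled model files collected explicitly, since PyInstaller does not pick up non-Python data files on its own; opencv usually works without extra steps. A sketch, with the entry-point path as an assumption:

```python
import PyInstaller.__main__

PyInstaller.__main__.run([
    "mefamo/mefamo.py",              # assumed entry script
    "--onefile",
    "--collect-data", "mediapipe",   # pulls in the .tflite / .binarypb models
    "--name", "mefamo",
])
```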
Hello.
Super work. Thanks for sharing.
How did you work out each blend shape type and its min and max values?
Thanks!
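The constants look hand-tuned; a typical pattern (an assumption about the method, not the author's confirmed process) is to measure a landmark distance, find empirical min and max values for it, and remap that range linearly into the 0..1 blendshape range.

```python
def remap_clamped(value: float, low: float, high: float) -> float:
    """Hypothetical helper: map `value` from [low, high] to [0, 1], clamped.
    The low/high bounds are found empirically per blend shape."""
    if high == low:
        return 0.0
    t = (value - low) / (high - low)
    return max(0.0, min(1.0, t))
```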
Hi, this is some really amazing work. I am trying to read the Live Link data in UE, but I don't see any source detected there. I used your PyLiveLinkFace lib to verify whether any data was actually being sent, and it is receiving the data from MeFaMo correctly. Do you have any idea what the issue might be? Thanks.
When I start mefamo.exe, the screen remains black and there is no video input from the webcam.
Blackmagic software was the issue.
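For anyone hitting the same black screen (a small sketch; the assumed cause is a virtual camera such as the Blackmagic or OBS device being opened instead of the real webcam): probing the first few OpenCV camera indices shows which one actually delivers frames.

```python
import cv2

for index in range(5):
    cap = cv2.VideoCapture(index)
    ok, _frame = cap.read()
    print(f"camera {index}: {'frames OK' if ok else 'no signal'}")
    cap.release()
```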
Hello! Your work here looks fun and I'd love to try it out!
But I'm having problems getting 'mediapipe' installed on my system.
Which version of Python and of mediapipe do you use?
Cheers!
Hi, I downloaded the EXE file from your GitHub; it works and identifies my face, but I don't know how to get the data into Unreal. Can you help me?
Originally posted by @game-alle in #8 (comment)