slimevr / slimevr-server
Server app for SlimeVR ecosystem
Home Page: https://slimevr.dev
License: Apache License 2.0
After SlimeVR/SlimeVR-OpenVR-Driver#1 is done, we need to implement elbow tracking with references based on the controllers.
Implement a new GUI in JavaFX instead of Swing, with a new design.
Either through tapping the tracker, a controller button press and gesture, or any other potential idea.
This would be a huge QOL upgrade for people who move a lot, who need to reset trackers several times in succession while experimenting with tracker adjustments, or who are unlucky enough to experience a lot of drift. It's worth keeping this issue here as a constant reminder.
Update 1: The tap function for the BNO is too inconsistent; this has to be done with a gesture, a button press, or both.
The knee can only bend along one axis (pitch), and legs don't twist much (in reality they do, see #7, and the ankle can be twisted a bit along yaw too). The ankle and thigh always share the same roll axis.
This can be used to compensate for some of #7, or just to improve the precision of the leg joints. A naive approach of averaging yaw and roll using Euler angles created gimbal lock (c2b4d30), so we either need to figure out the math without Euler angles, or build an even more complex model for it.
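One gimbal-lock-free alternative to the Euler averaging that broke in c2b4d30 is to extract yaw by rotating a forward vector and averaging on the unit circle. A minimal sketch of the idea (class and method names are hypothetical, not the server's actual API):

```java
// Sketch: average the yaw of two trackers without Euler angles. Yaw is
// extracted by rotating a forward vector with the quaternion and projecting
// onto the horizontal plane, so no gimbal lock can occur; the two yaws are
// then averaged on the unit circle to survive the +/-pi wraparound.
public class YawAverage {
    // Rotate vector (0, 0, 1) by quaternion (w, x, y, z) and return yaw
    // in radians, measured around the Y (up) axis.
    public static double yawOf(double w, double x, double y, double z) {
        double fx = 2 * (x * z + w * y);      // rotated forward, X component
        double fz = 1 - 2 * (x * x + y * y);  // rotated forward, Z component
        return Math.atan2(fx, fz);
    }

    // Circular mean of two angles, immune to the +/-pi wraparound.
    public static double averageYaw(double a, double b) {
        return Math.atan2(Math.sin(a) + Math.sin(b),
                          Math.cos(a) + Math.cos(b));
    }
}
```

For example, averaging yaws of 0 and pi/2 gives pi/4, and averaging two angles just either side of +/-pi gives an angle near pi rather than a bogus value near 0.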
When using all available trackers in the server, SteamVR will prompt you to recalibrate your playspace every time you restart the server. If you toggle the trackers off, the old playspace still loads fine.
There was some debate whether it is caused by base-station-tracked headsets, but no confirmation yet.
I'm having trouble getting this to work on Windows 11; the initial handshake is not completing. I do have an AP mesh setup, but I doubt that's related.
Pardon my poor documentation, but packet IDs 5 and 6 are actually already in use by the owoTrack Android app.
This code appears to be using these packet IDs for other purposes: https://github.com/SlimeVR/SlimeVR-Server/blob/5c6d02de30d646bf095fe190a1bc2245e6de7a17/src/main/java/io/eiren/vr/trackers/TrackersUDPServer.java#L235L249
These aren't implemented in the owoTrack server, but the Android application sends these packets anyway. This might be causing some issues.
Packet 5 tells whether or not the magnetometer is enabled with a single char 'y' or 'n': https://github.com/abb128/owoTrackVRSyncMobile/blob/11f6718337aafce2db8daeb62076bb0e9129cee0/app/src/main/java/org/owoTrackVRSync/UDPGyroProviderClient.java#L254L269
Packet 6 is sent when the power button is pressed and the device screen is turned on (the method name is recenter_yaw, but that was just a suggestion): https://github.com/abb128/owoTrackVRSyncMobile/blob/11f6718337aafce2db8daeb62076bb0e9129cee0/app/src/main/java/org/owoTrackVRSync/UDPGyroProviderClient.java#L271L278
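For reference, reserving these two IDs in the UDP handler could look like the sketch below. It assumes the packet begins with a 4-byte big-endian type field, as in the linked owoTrack client code; the class and constant names are hypothetical:

```java
import java.nio.ByteBuffer;

// Sketch of reserving owoTrack's packet IDs 5 and 6 so they aren't reused
// for other purposes. Assumes a 4-byte big-endian packet-type header;
// all names here are illustrative, not the server's actual code.
public class OwoTrackPackets {
    public static final int PACKET_MAG_ENABLED = 5;   // payload: 'y' or 'n'
    public static final int PACKET_BUTTON_PUSHED = 6; // sent on power button

    public static String describe(ByteBuffer packet) {
        int type = packet.getInt();
        switch (type) {
            case PACKET_MAG_ENABLED:
                boolean magOn = packet.get() == 'y';
                return "magnetometer " + (magOn ? "enabled" : "disabled");
            case PACKET_BUTTON_PUSHED:
                return "power button pressed";
            default:
                return "unknown packet " + type;
        }
    }
}
```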
Currently the pelvis in the skeleton model tilts with the waist, which is not strictly correct. People can tilt left and right while standing with both legs straight, but in the current model one leg would rise up. We need to create a more complicated model of the pelvis using the waist and both legs' angles.
A naive average between the two legs should probably work, but needs testing. There might be better models.
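The naive average could be sketched with a normalized quaternion lerp (nlerp), which avoids Euler angles entirely; quaternions are arrays [w, x, y, z], and all names here are illustrative:

```java
// Sketch of a naive pelvis model: the pelvis rotation is the midpoint of
// the two upper leg rotations. A waist blend could be layered on top the
// same way. Quaternions are [w, x, y, z] arrays; names are hypothetical.
public class PelvisBlend {
    // Normalized linear interpolation between two unit quaternions.
    static double[] nlerp(double[] a, double[] b, double t) {
        double dot = a[0]*b[0] + a[1]*b[1] + a[2]*b[2] + a[3]*b[3];
        double s = dot < 0 ? -1 : 1; // flip sign to take the shortest path
        double[] r = new double[4];
        for (int i = 0; i < 4; i++) r[i] = a[i] * (1 - t) + s * b[i] * t;
        double n = Math.sqrt(r[0]*r[0] + r[1]*r[1] + r[2]*r[2] + r[3]*r[3]);
        for (int i = 0; i < 4; i++) r[i] /= n;
        return r;
    }

    // Naive pelvis: midpoint of the two leg rotations.
    public static double[] pelvisFromLegs(double[] leftLeg, double[] rightLeg) {
        return nlerp(leftLeg, rightLeg, 0.5);
    }
}
```

For two rotations about the same axis, the nlerp midpoint lands exactly halfway (e.g. 0° and 90° about Y give 45°), which is the behavior the naive average wants.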
Add support for the Open Sound Control (OSC) protocol in the trackers server, to receive and send tracking data and support multiple programs like Unity and others.
Remove the FORWARD/LEFT/RIGHT/BACK dropdown and make the tracker's mounting rotation based on the headset when performing a reset.
The trackers could then automatically know where forward is and have automatic, more accurate, mounting rotation values than 90, 180, 0 or -90.
This would thus allow for the trackers to be placed at any rotation around a body part.
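A rough sketch of the idea, assuming yaw angles in radians: during reset the user is presumed to face the same way as the headset, so whatever yaw difference remains between the headset and the tracker is the tracker's mounting rotation, as a continuous value rather than a 90° step (names hypothetical):

```java
// Sketch: derive a continuous mounting yaw from the headset at reset time,
// replacing the FORWARD/LEFT/RIGHT/BACK dropdown. Names are hypothetical.
public class AutoMounting {
    // During reset the user faces the same way as the headset, so the
    // leftover yaw between headset and tracker is the mounting rotation.
    public static double mountingYaw(double headsetYaw, double trackerYaw) {
        double d = headsetYaw - trackerYaw;
        // Wrap to (-pi, pi] so the result is the smallest signed rotation.
        while (d <= -Math.PI) d += 2 * Math.PI;
        while (d > Math.PI) d -= 2 * Math.PI;
        return d;
    }
}
```

This works for a tracker strapped at any angle around a body part, since nothing snaps the result to 0, 90, 180, or -90.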
We have empirically concluded that the current reset and adjustment algorithm works fine, but it doesn't pass unit tests. We need to either rewrite the tests from scratch so they rigorously prove it works, or adjust the tests to pass for the current reset algorithm.
Correct pitch and roll when calibrating sensor offsets, to allow imprecise mounting of sensors, including chest sensor mounting.
Feet can't be rotated too far from the ankles, so in cases where the feet use more drift-prone IMUs it may be helpful to auto-calibrate them when they drift from the ankles beyond a set limit. This would also let users fast-reset feet by spinning them around: reset them to ankle + limit when the limit is reached.
This may be useful for other bones in the future, like ankles and thighs, but the main goal is to let users use worse IMUs on the feet.
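The drift limit described above could be sketched as a yaw clamp relative to the ankle (angles in radians; names hypothetical):

```java
// Sketch: clamp the foot's yaw to within +/- limit of the ankle's yaw.
// When the foot drifts past the limit it is pinned to the boundary, which
// is also what makes "spin the foot to reset it" work. Names hypothetical.
public class FootYawClamp {
    public static double clampFootYaw(double footYaw, double ankleYaw,
                                      double limit) {
        double d = footYaw - ankleYaw;
        // Wrap the difference to (-pi, pi] before clamping.
        while (d <= -Math.PI) d += 2 * Math.PI;
        while (d > Math.PI) d -= 2 * Math.PI;
        if (d > limit) d = limit;
        else if (d < -limit) d = -limit;
        return ankleYaw + d;
    }
}
```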
With AutoBone, the skeleton now repeats twice. Originally, the HumanSkeleton class and its ancestors were meant to be an abstract skeleton, but they include too much logic that depends on the server and can't be freely used by other systems. We need to make one skeleton class that doesn't rely on any outside system and can resolve the skeleton with many different settings, inputs, and outputs.
Right now it works if you don't have waist or chest trackers, but it doesn't rotate at all. The yaw of the waist and chest should probably match the yaw of the headset when the chest and waist trackers are missing. This is bad either way, but it would make it easier to test some things without a full SlimeVR set.
Legs with missing trackers should also match their yaw to the previous tracker in the chain.
I'm using a visual capture system. Because the data is transmitted by the same device, the IP is the same. I want to know how to distinguish different nodes.
The program's output looks roughly like this:
The process of setting up body profiles seems rather tedious and laborious. Could we take a page out of other software, such as those AR piano-tutorial apps for the Quest that locate objects in the real world using the controllers?
What I'm suggesting is: instead of trying to figure out where the trackers are via a lengthy dance, or manually clicking around in the body proportions screen, I would like to use the pointiest end of the controller to define where each tracker is in space on my body.
Workflow would be like this:
I know nothing about OpenVR or what it would take to accomplish this, but if possible it could be super beneficial when sharing SlimeVR with other people/guests.
The ability to save and reload body config profiles would be invaluable for households where multiple people use the same VR setup, instead of having to write down and re-enter their body config each time.
While the server/firmware is being developed, this would also be incredibly useful for members of the DIY community like thebutlah, who has to use a specific setup for dancing to isolate the hips, but that setup doesn't work for day-to-day activities that involve bending the spine.
Need to implement an installer and binary wrapper for Java. We can make our own (I have code for the wrapper) or use a third-party one.
Need to add functionality to record BVH files with animations. This should have start and stop buttons/commands, and record all tracker animations to a new file.
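A minimal sketch of the recording side, building only the BVH MOTION section (the HIERARCHY section and the mapping of trackers to channels are omitted; class and method names are hypothetical):

```java
// Sketch: accumulate tracker frames and emit a BVH MOTION section.
// BVH's MOTION block is "MOTION", a frame count, a frame time, then one
// line of space-separated channel values per frame. Names hypothetical.
public class BvhRecorder {
    private final StringBuilder motion = new StringBuilder();
    private final double frameTime; // seconds per frame
    private int frames = 0;

    public BvhRecorder(double frameTime) {
        this.frameTime = frameTime;
    }

    // Called once per frame with all channel values already flattened.
    public void addFrame(double[] channels) {
        for (int i = 0; i < channels.length; i++) {
            if (i > 0) motion.append(' ');
            motion.append(channels[i]);
        }
        motion.append('\n');
        frames++;
    }

    public String motionSection() {
        return "MOTION\nFrames: " + frames
             + "\nFrame Time: " + frameTime + "\n" + motion;
    }
}
```

Start/stop commands would just create and finalize one of these, writing the hierarchy header first and the motion section on stop.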
So I'm seeing around 50 people a day in DIY and tech support with "my trackers aren't connecting to the server", but it turns out they are connecting and the real issue is their defines, IMU wiring, etc. It would be nice to have a bit more information.
The suggestion: instead of just not showing up in the server window, the tracker shows up when it can connect, but with something like an "IMU connection error". This would help narrow down whether the problem is the tracker connecting or just the IMU, for easier troubleshooting.
Instead of connecting via Wi-Fi, you can use any Arduino with an IMU.
There are a lot of possible improvements to calibration (reset) and skeleton settings that would be more precise than pressing reset while standing still, or asking users to input their measurements or align trackers to their body. This issue will hold a list of ideas. We will need a framework to implement them in the future, and to add more.
@ButterscotchVanilla had some success improving precision by adding an additional bone between the chest and waist, as an average between the two, during her AutoBone experiments. That's probably because the lower back bends less than the upper back, and the additional average bone reduces the curve on the lower back. A basic solution like this should be easy to implement, but there might be better ways to do it.
Need to add calibration code and GUI for trackers that require calibration. It should be a multi-step process controlled by the server. For example, the user clicks "Calibrate gyroscope", the server asks the user to put the trackers down, and then records some data for the gyroscope offset. The same applies to accelerometer and magnetometer calibration. Different calibrations should depend on the board and sensor used. The owoTrack app should tell the server it doesn't need calibration, just move your phone in a figure-8 pattern, etc.
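The gyroscope-offset step could be as simple as averaging stationary samples to estimate the bias; a sketch, not the actual calibration code (names hypothetical):

```java
// Sketch: estimate gyroscope bias by averaging samples recorded while the
// tracker sits still. The server would subtract this bias from subsequent
// readings. Each sample is [x, y, z] angular rate; names hypothetical.
public class GyroCalibration {
    public static double[] estimateBias(double[][] samples) {
        double[] bias = new double[3];
        for (double[] s : samples) {
            for (int i = 0; i < 3; i++) bias[i] += s[i];
        }
        for (int i = 0; i < 3; i++) bias[i] /= samples.length;
        return bias;
    }
}
```

The multi-step flow would wrap this: the server prompts "put the tracker down", waits, records a few seconds of samples, then stores the result per sensor.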
SEE UPDATE 1: THIS SHOULDN'T BE TOP PRIORITY, AS IT'S ACTUALLY PRETTY MUCH VRCHAT'S FAULT.
While the default Head Offset value of 10 is great for increasing skeleton stability, it seems to affect VRChat avatar bind-in because of how it shifts the whole skeleton back. Most noticeably, some avatars' legs don't bend completely, or bend too much when raised above hip level. This can be fixed by simply decreasing Head Offset to 0, which also offers more consistency when binding into different avatars, but at the cost of skeleton instability.
An additional Skeleton Offset or similar feature would be neat to balance out the skeleton shift caused by Head Offset, helping bind-in consistency in VRChat while maintaining skeleton stability.
Update 1: The problem described above happens on avatars whose view position is a bit toward the back; the same avatar with a view position farther forward works fine. So this doesn't have much to do with binding consistency; it's more about coping with different VRChat avatars' view positions, and getting the best body motion without editing the avatar or moving the view position to a potentially undesirable location.
When I try to connect more than one iPhone as a tracker, it doesn't work.
Re: discussion in the development Discord.
Plan to overhaul the UDP and IPC protocols, maybe using what we learned implementing the new Driver-Server protocol to help build a new system that is both backwards and forwards compatible. The plan is to use FlatBuffers for ease of development in any language and for backwards compatibility.
A few notes:
Add support for JoyCons as SlimeTracker.
Using the PlatformIO CLI, the server could build specific firmware through just a GUI: defines.h would be changed according to the GUI selections.
This would be a totally optional feature and would not install PlatformIO and Python by default; the user would have to enable it themselves.
The server should check whether another instance is running and close itself.
Implement some sort of auto-updater for the server, for easy distribution.
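One common cross-platform trick for this is to hold a bound loopback port as an instance lock: the second server fails to bind and can exit. A sketch (the port number and names are arbitrary/hypothetical):

```java
import java.io.IOException;
import java.net.InetAddress;
import java.net.ServerSocket;

// Sketch: single-instance check via a loopback port. The first server
// binds and holds the socket for its lifetime; a second instance fails
// to bind, gets null, and can close itself. Names are hypothetical.
public class SingleInstanceLock {
    // Returns the held lock socket, or null if another instance owns it.
    public static ServerSocket tryLock(int port) {
        try {
            return new ServerSocket(port, 1, InetAddress.getLoopbackAddress());
        } catch (IOException alreadyRunning) {
            return null;
        }
    }

    // Release the lock (e.g. on shutdown).
    public static void unlock(ServerSocket lock) {
        try {
            if (lock != null) lock.close();
        } catch (IOException ignored) {
        }
    }
}
```

Binding to the loopback address keeps the lock port invisible to the network, and the OS releases it automatically if the server crashes.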
Add support for Virtual Motion Capture (VMC) trackers in the server, for both reading and writing data, to support motion capture and VTubing software.
In case of a misclick, or if the user still has a tracker connected that they forgot to turn off.
The slime commons library should be a git submodule in this repo. We need to adjust the workflow if needed, and update the README with the new instructions too.
The current process is not ideal; it requires restarting everything manually, in a certain order, each time.
All it needs is an input field. It would be easy to build into the doc pages: while setting up the firmware, the user could change the port field there.
Implement installing and updating (after #36) the SteamVR driver, either by button press, notification or automatically.
Reset and fast reset should use vectors instead of Euler angles to align yaw properly; otherwise they hit gimbal lock in some positions, especially fast reset, which should be usable in any position. Both modes should match the forward vector of the HMD with the forward vectors of the trackers.
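A sketch of the vector-based approach: project both forward vectors onto the horizontal plane and take the signed angle between them, which stays well-defined at any pitch, then build the yaw correction as a rotation about the up axis (names hypothetical):

```java
// Sketch: vector-based yaw reset. Forward vectors are [x, y, z] with Y up;
// the Y component is discarded by the horizontal projection, so the result
// is the same whether the tracker is level or pitched - no gimbal lock.
// Names are hypothetical, not the server's actual reset code.
public class VectorReset {
    // Signed yaw (radians) rotating the tracker's forward onto the HMD's.
    public static double yawCorrection(double[] hmdFwd, double[] trkFwd) {
        double a = Math.atan2(hmdFwd[0], hmdFwd[2]);
        double b = Math.atan2(trkFwd[0], trkFwd[2]);
        double d = a - b;
        while (d <= -Math.PI) d += 2 * Math.PI;
        while (d > Math.PI) d -= 2 * Math.PI;
        return d;
    }

    // The correction as a quaternion [w, x, y, z] about the up (Y) axis,
    // to be composed with the tracker's raw rotation.
    public static double[] yawQuaternion(double yaw) {
        return new double[] {Math.cos(yaw / 2), 0, Math.sin(yaw / 2), 0};
    }
}
```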
An option could be added for feet to always try to touch the floor when the user raises their legs, even without feet trackers. If the ankle is slightly above the floor, the foot can snap down, emulating the user standing on their toes. This should be opt-in, since it can interfere with some users' intentions.
Right now, all interaction with the trackers' serial interface happens when the user presses the Wi-Fi button. This should change: a new background worker should monitor serial devices and wait for trackers to connect. When it finds a newly connected tracker, it should start talking to it. A few features that should be implemented:
They conflict for ports, pipes, and other resources, and there is no need for this.
Right now, if a tracker reconnects to a running server with a new IP, the server creates a new tracker even if its serial (MAC address) matches the old one. This situation should be handled specifically: the old tracker should either be removed, or the new tracker should be assigned to it and the old connection closed.
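The handover could key trackers by MAC address and simply swap the network address on reconnect; a sketch with hypothetical names:

```java
import java.net.InetSocketAddress;
import java.util.HashMap;
import java.util.Map;

// Sketch: keep one tracker per MAC address. On a handshake from a known
// MAC, reuse the existing tracker and just update its address, instead of
// creating a duplicate. All names here are hypothetical.
public class TrackerRegistry {
    public static final class Tracker {
        public final String mac;
        public InetSocketAddress address;

        Tracker(String mac, InetSocketAddress address) {
            this.mac = mac;
            this.address = address;
        }
    }

    private final Map<String, Tracker> byMac = new HashMap<>();

    public Tracker onHandshake(String mac, InetSocketAddress from) {
        Tracker t = byMac.get(mac);
        if (t == null) {
            t = new Tracker(mac, from);
            byMac.put(mac, t);
        } else {
            // Reconnect with a new IP: keep the tracker, drop the old
            // connection by pointing it at the new address.
            t.address = from;
        }
        return t;
    }

    public int size() {
        return byMac.size();
    }
}
```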
Add support for one leg. For this, the multiple skeletons (SkeletonWithWaist, SkeletonWithLegs) should be merged into one that can be used with any configuration of trackers, returning warnings when chains are missing. This will also be a nice refactoring.
(Moved from #128 to its own issue to keep progress moving)
This issue is for listing tasks to improve upon AutoBone, especially in its accuracy, simplicity, and consistency
- Make the AutoBone error function more accurate and consistent
- Make AutoBone more object-oriented, to support a variable number of error functions
- Make AutoBone use SkeletonConfig internally instead of EnumMaps
- Document AutoBone within the code to make it easier to use

A different approach to issue #75 would be having prebuilt firmware and storing defines.h configs in flash, with serial commands to change them. But there is a problem with board type; prebuilding for different hardware is not ideal.
To support Natural Locomotion, SlimeVR needs to report speed (or acceleration) values in the driver. This requires support from the firmware, server, and driver. Creating the issue here, but it needs to be implemented everywhere.
There is a way to tell SteamVR through the driver that a device is disconnected. I don't know if it will actually work in games, but we should still use it and notify the driver and SteamVR when the user, for example, unchecks Knees in the SteamVR config in the server.
Need to add a system to save and load configuration profiles, so different people could use the same PC to play, and to do easy experiments.
Some people's legs twist when they bend or twist them, like in the pictures below:
We need to add a coefficient, configured while the person sits, that compensates for the twist. The coefficient should untwist the leg based on its pitch. The compensation should be applied to the base rotation of the tracker.
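Once the per-user coefficient is known, the compensation itself is a small correction proportional to pitch; a sketch assuming angles in radians (names hypothetical):

```java
// Sketch: untwist the leg proportionally to its pitch. The coefficient is
// per-user, configured while sitting (pitch near 90 degrees), and the
// correction is applied to the tracker's base yaw. Names hypothetical.
public class LegTwistCompensation {
    // baseYaw/pitch in radians; coefficient is dimensionless (yaw per pitch).
    public static double compensatedYaw(double baseYaw, double pitch,
                                        double coefficient) {
        return baseYaw - coefficient * pitch;
    }
}
```

With the leg straight (pitch 0) the correction vanishes, and while sitting it removes the full configured twist, which is how the coefficient would be fitted.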