
rsvfx's Introduction

Rsvfx

Rsvfx is an example that shows how to connect an Intel RealSense depth camera to a Unity Visual Effect Graph.


System requirements

  • Unity 2019.2 or later
  • Intel RealSense D400 series

How it works

(screenshot: PointCloudBaker inspector)

The PointCloudBaker component converts a point cloud stream sent from a RealSense device into two dynamically animated attribute maps: a position map and a color map. These maps can be used with the "Set Position/Color from Map" blocks in a VFX graph, in the same way as attribute maps imported from a point cache file.

(screenshot: Set Position/Color from Map blocks)
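If you prefer to bind the maps from a script rather than through the inspector, a minimal sketch might look like this (the exposed property names "PositionMap" and "ColorMap" are hypothetical; use whatever names your graph defines):

    using UnityEngine;
    using UnityEngine.VFX;

    // Pushes the attribute maps baked by PointCloudBaker into exposed
    // texture properties on a Visual Effect Graph.
    public sealed class AttributeMapBinder : MonoBehaviour
    {
        [SerializeField] VisualEffect _vfx = null;
        [SerializeField] RenderTexture _positionMap = null; // written by PointCloudBaker
        [SerializeField] RenderTexture _colorMap = null;    // written by PointCloudBaker

        void Update()
        {
            // Property names are hypothetical; they must match the names
            // exposed in the graph's blackboard.
            if (_vfx.HasTexture("PositionMap")) _vfx.SetTexture("PositionMap", _positionMap);
            if (_vfx.HasTexture("ColorMap")) _vfx.SetTexture("ColorMap", _colorMap);
        }
    }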

Frequently asked questions

Is it possible to use this with an Azure Kinect?

There is a project for that: check out Akvfx.

Which RealSense model fits best?

I personally recommend the D415 because it gives the best sample density among the current models. Please see this thread for further information.

rsvfx's People

Contributors

keijiro


rsvfx's Issues

Releasing vertex data

Thanks for such a great tool.

I reviewed the RealSense SDK code and the improvements you made on top of it. I assume that the MemCopy native function is now embedded on the native side.

How do you release the vertex data in your version of the RS SDK? Previously you needed to call .Release().

[Request] Motion responsive

Hi Keijiro, thank you for this amazing project!

If it is not asking too much, could you develop an example where the particles are motion responsive?

Cheers!

Erratic PointCloudMap (on iGPU)

Hi, I'm experiencing various issues with both the 2018.3.11 and 2019.1.0 versions.
The PointCloudMap is incorrect, as visible in the GIF; since the RealSense SDK and VFX both work otherwise, it seems to me to be related to the type of graphics card (on my system it's an Intel Iris 650).

(GIF: erratic point cloud output)

Could the creator or any user of this repository verify this theory by checking whether it works on a recent Intel iGPU?

PS: the ColorMap doesn't work either, but that's less relevant for me and probably related to the core issue with the PointCloudMap.

Trying with Orbbec Astra Depth Camera

Hello,
Hope you are doing fine. I am trying to make this code work with the Orbbec Astra depth camera, but Unity crashes when it runs the UnsafeUtility code that sets the buffer data. I also tried to set the data with Unity's own SetData by converting the IntPtr to a managed array, but the output texture comes out empty that way. Any help would be much appreciated.
Thanks!

/////Code//////
    void UpdateColorData(Astra.ColorFrame frame)
    {
        if (frame.DataPtr == System.IntPtr.Zero) return;

        var size = frame.Width * frame.Height;

        if (_colorBuffer != null && _colorBuffer.count != size)
        {
            _colorBuffer.Dispose();
            _colorBuffer = null;
        }

        if (_colorBuffer == null)
            _colorBuffer = new ComputeBuffer(size, 4);

        // Me trying to set color buffer data using SetData...
        byte[] managedArray = new byte[size];
        Marshal.Copy(frame.DataPtr, managedArray, 0, size);
        _colorBuffer.SetData(managedArray);
        _dimensions = new Vector2Int(frame.Width, frame.Height);

        // Unsafe utility code that causes the Unity crash...
        //UnsafeUtility.SetUnmanagedData(_colorBuffer, frame.DataPtr, size, 4);
    }
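One thing worth checking, offered as a guess rather than a confirmed fix: size counts pixels, while the buffer holds size * 4 bytes, so the Marshal.Copy above fills only a quarter of the buffer. A sketch with the byte count matched to a 32-bit color frame (this assumes the Astra frame really is 4 bytes per pixel; if it is 24-bit RGB, a repack like the one in the "Using RealSense recordings" issue below would be needed):

    // Sketch: copy all Width * Height * 4 bytes, not just Width * Height.
    int byteCount = size * 4; // 4 bytes per RGBA pixel (assumption)
    byte[] managedArray = new byte[byteCount];
    Marshal.Copy(frame.DataPtr, managedArray, 0, byteCount);
    _colorBuffer.SetData(managedArray); // buffer was created as (size, 4)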

Support for Intel RealSense L515 (LiDAR scanner)

I have tried to get the LiDAR L515 working with your project, but it doesn't recognise the device. I updated the RealSense SDK with no luck. I am going to see if I get the same issues with the RealSense package on its own (I suspect it hasn't been updated for this device).

I will keep you informed of what I find, and may fork the repo to make adjustments if I can. It would be great to get your fantastic VFX working with this new device.

External Data Question

I had a question pertaining to external data usage. As long as the format of the data is the same (the point source and color source), would it be possible to stream external data from ROS using something like ros-sharp? Or is the RealSense too integral to be able to use other data sources? For reference, I'm trying to stream point cloud data from ROS into Unity and display the cloud in AR. I also know this is fairly similar to this issue, so I'm sorry if any info is repeated here.

Device not detected while using Intel RealSense D455 Model

Hello, I was trying this project. My RealSense model is the D455, and when I press play it shows a "device not detected" error. Also, I can only open the project in the 2019.2.5f1 editor (it won't upgrade to 2020 or newer). I tried importing the newest RealSense SDK, and it still doesn't work; it just shows another error when I play. Is there anything I can do to fix this? Thanks a lot!

Using URP

Hi Keijiro,

Is it still possible to use this repo with URP instead of HDRP? If so, what settings and materials must be updated for this to work? Thank you.

Work with Spout

Hi! I want to use your asset with KlakSpout. Can I use it in this project?

Changing resolutions?

Hello, I have this working with a RealSense D410, and I was trying to change the resolution, but I can't seem to get any other resolution to work. I tried 1280x720, both at 60 fps and 30 fps, which I thought the D410 supported, but that doesn't work. Is it possible to set a resolution other than 640x480?
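For reference, a minimal sketch of requesting an explicit stream profile through the librealsense C# wrapper; this is a plain-SDK sketch, not necessarily how this project's RsDevice prefab configures its streams:

    using Intel.RealSense;

    // Ask for 1280x720 depth at 30 fps explicitly. Pipeline.Start throws
    // "Couldn't resolve requests" if the device can't satisfy the profile.
    var cfg = new Config();
    cfg.EnableStream(Stream.Depth, 1280, 720, Format.Z16, 30);
    var pipe = new Pipeline();
    var profile = pipe.Start(cfg);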

Particle Demo not working on Mac

I know this is not designed for Mac, but I was able to get all of the example scenes working on Mac OSX except for Particles.

I figured out what is wrong: the Quad Output module requires a Scale.X Random or Scale.X From Map block. Without that block, the effect works while editing the VFX graph, but does not compile or work in play mode.

Apple Silicon support

Is there any chance of updating this to work with M1 Apple Silicon Macs?

I haven't managed to get librealsense working on M1 yet, but my D415 does work in TouchDesigner with librealsense2.2.38.dylib.

Thanks

Maximum distance from the camera

Hello, I was playing around with this asset and an Intel RealSense D435 camera, and noticed that it only works when you are at most about 2 feet away from the camera; once you go further than that, it seems to stop detecting you. Is there a way to increase the detection range to something like 7 or 8 feet? Looking forward to your response, and thank you for the great asset.

mapping the waveform to a particle system

Hello, I tried to map a waveform to a particle system in the VFX graph. The particle system is positioned in an AABox (size 10 0.1 10). It responds to the input sound on the Y axis by going up and down, but at the moment every particle responds in the same way (they all get positioned at the same height).

I tried to use the x position of each particle to sample the waveform texture. The VFX effect is positioned at the origin of the world (0,0,0). I divided the x position by 5 (because the AABox is 10 units wide on the x axis), used remap to get a value between 0 and 1 (remap(value,-1,1,0,1)), and multiplied that value by 1024 to get a different sample index for each particle. What am I missing?
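For reference, the mapping described above works out to:

    u = (x / 5 + 1) / 2     // x in [-5, 5] maps to u in [0, 1]
    index = u * 1024        // sample index into the 1024-texel waveform texture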

You can find the vfx here: https://www.dropbox.com/s/72n05py86ldbxyr/Wavos.vfx?dl=0

How can I import external packages via git?

After downloading the repo, the console displays many missing-package errors. I installed Git, but I can't figure out how to import these packages into the project.
Also, I use 2019.2.0f1; the VFX files inside the project are outdated, and it says they need to be rebuilt. But I can neither open any of these VFX files nor create a new VFX file, even though the VFX and HDRP packages are already in the project.


Best,
Hendrix

RSDevice Pipeline.Start Error

I have an issue where RsDevice throws an error when calling Pipeline.Start:

ExternalException: rs2_pipeline_start_with_config(pipe:000001B822500F50, config:000001B822501070)
Rethrow as Exception: Couldn't resolve requests
Intel.RealSense.ErrorMarshaler.MarshalNativeToManaged (System.IntPtr pNativeData) (at <1ab02197d096429db5e25d131c03a77b>:0)
Intel.RealSense.Pipeline.Start (Intel.RealSense.Config cfg) (at <1ab02197d096429db5e25d131c03a77b>:0)
RsDevice.OnEnable () (at Assets/RealSenseSDK2.0/Scripts/RsDevice.cs:71)

I am running a D435, and the camera works every time within the same project with the RealSense scenes, but it has only worked once with your RealSense prefab. That time it worked, I could run any of your scenes without issue.
I am running Unity 2019.1.0f2 with the latest RealSense SDK 2.21.0 and the latest 5.11.6.200 firmware installed on the camera.

Enabling RealSense on OSX

Your examples work well on OSX, once the right dependencies have been bound.

For that, I merely pulled the most recent librealsense2 via brew and copied the files:

$ brew install librealsense2
$ cp /usr/local/Cellar/librealsense/2.20.0/lib/librealsense2.dylib $PROJECT_PATH/Assets/RealSenseSDK2.0/Plugins/librealsense2.bundle

I suspect you've used an older RealSenseSDK2.0 Unity package, which seems to be incompatible. Because of this, I removed all other files in Plugins, downloaded the 2.20.0 .unitypackage from https://github.com/IntelRealSense/librealsense/releases/tag/v2.20.0, and imported only the Plugins part. After restarting the editor, all your samples work fine with a D435.

The D435i seems not to work at the time of writing, due to incompatibility issues with librealsense2.

Thanks!

Question, not an issue

Great work! I'm working on something similar but with a slightly different approach.
Questions:
1. In the compute shader code, what value does the RemapBuffer hold? I couldn't understand the SetUnmanagedData() function. Could you please explain it a bit?

2. I can't use VFX Graph in my project, so I have to do everything in a compute shader. My problem is doing the math for the point cloud positions while using a tracking camera: I need to read the depth (position) from the depth map, add the tracking camera pose to it, and then get the final data on the CPU. Combining the depth info with the tracking is where my problem is. Can you help me with that, or refer me to somewhere I can understand the math, please?
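As a sketch of the math for question 2 (the standard rigid-body transform; the names here are made up, with the tracking camera's pose given as a rotation and a position):

    using UnityEngine;

    // Transforms a point from the depth camera's local space into world
    // space using the tracking camera's pose: world = R * local + t.
    static Vector3 DepthToWorld(Vector3 localPoint, Quaternion trackingRotation, Vector3 trackingPosition)
    {
        return trackingRotation * localPoint + trackingPosition;
    }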

Cheers

Using RealSense recordings

Hi, I'm using some RealSense recordings I found online. I found and fixed a bug, but things are still not perfect.

The recordings are linked from https://github.com/IntelRealSense/librealsense/blob/master/doc/sample-data.md

The bug was caused by recordings loading the color data in RGB format, not RGBA, and I couldn't find a way to get the RealSense SDK to do the conversion. I needed something working fast, so I modified DepthConverter.cs to allow RGB:
I added a member, private uint[] rgbToRgbaBuffer;, and modified LoadColorData() as follows:

int stride = frame.BitsPerPixel / 8;
if (stride == 3)
{
    // 24-bit RGB frame: repack into a temporary 32-bit buffer on the CPU.
    if (rgbToRgbaBuffer == null || rgbToRgbaBuffer.Length != size) rgbToRgbaBuffer = new uint[size];
    unsafe
    {
        fixed (uint* p = rgbToRgbaBuffer)
        {
            byte* inp = (byte*)frame.Data;
            for (int i = 0; i < size; i++)
            {
                // Pack R into the low byte, then G and B; alpha stays zero.
                p[i] = (uint)(*inp + ((*(inp + 1)) << 8) + ((*(inp + 2)) << 16));
                inp += 3;
            }
            IntPtr ptr = (IntPtr)p;
            UnsafeUtility.SetUnmanagedData(_colorBuffer, ptr, size, 4);
        }
    }
}
else
{
    // Already 32-bit: upload the frame data directly.
    UnsafeUtility.SetUnmanagedData(_colorBuffer, frame.Data, size, 4);
}

I was hoping there would be an API to fill a compute buffer with differently-strided data so that the conversion could happen on upload (e.g. the way glTexImage() allows the source format to differ from the texture format), but alas SetUnmanagedData complains if the input and destination strides are not multiples of each other; it's not really an item-aware copy. So I created a temp buffer to unpack from RGB to RGBA.

In order for the code to work I needed unsafe code to read in the frame, so I had to modify Rsvfx.asmdef and change "allowUnsafeCode" to true. The Player Settings option to allow unsafe code should also be set to true (the default is true anyhow, at least in 2019.3).
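For anyone following along, the relevant asmdef field looks like this (other fields omitted):

    {
        "name": "Rsvfx",
        "allowUnsafeCode": true
    }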

In PointCloudBaker.cs, I modified RetrieveColorFrame(Frame frame) to allow Rgb8 as well as Rgba8: I changed profile.Format == Format.Rgba8 to (profile.Format == Format.Rgba8 || profile.Format == Format.Rgb8) in the relevant if statement.

So, I did all of this and I'm getting a correct color stream, but depth is still wonky... Not sure why...

Also, my way is a bit dirty; if you have a suggestion for a cleaner way to deal with this, I could implement it and send a PR.

Using VFX graph to visualise wirelessly streamed data from Realsense

I am using the RealSense D435, but instead of plugging it in and using the Unity package, I am streaming the data from a RealSense connected to another computer to the Unity computer.
The data I am receiving is the color (RGB8) and position (Vector3) data for each particle, in two separate arrays.

I am using the implementation in this repo as a basis for visualising wirelessly streamed RealSense data in VFX Graph, but I have a few questions:

  1. How does the 'Set Position from Map' node get position information from a render texture, and which render texture format must it be?
  2. I am not sure what is being done with the remapBuffer. Why is remapping required?
  3. Is there a way to directly use the received arrays (RGB8 color and Vector3 position) in the 'Set Position/Color' nodes of VFX Graph, or is the only way to bake this data into render textures that can be used in the 'Set Position/Color from Map' nodes? (See the sketch after this list.)
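Not an authoritative answer, but a minimal sketch of the baking route from question 3: pack the Vector3 positions into a float texture and bind it to the map property ("Set Position from Map" reads each texel's RGB as a position, so a float format such as RGBAFloat avoids quantization; the property name "PositionMap" is hypothetical):

    using UnityEngine;
    using UnityEngine.VFX;

    // Bakes an array of positions into a float texture usable by
    // "Set Position from Map". Assumes positions.Length == width * height.
    static Texture2D BakePositionMap(Vector3[] positions, int width, int height)
    {
        var tex = new Texture2D(width, height, TextureFormat.RGBAFloat, false);
        var pixels = new Color[positions.Length];
        for (var i = 0; i < positions.Length; i++)
        {
            var p = positions[i];
            pixels[i] = new Color(p.x, p.y, p.z, 1); // xyz -> rgb
        }
        tex.SetPixels(pixels);
        tex.Apply();
        return tex;
    }

    // Usage (the exposed property name is hypothetical):
    // vfx.SetTexture("PositionMap", BakePositionMap(positions, w, h));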

Any help will be really appreciated! Also, if someone who understands the code could add comments, that would be very helpful for anyone trying to reuse and modify it.

Multiple RealSense cameras

Is it possible to use multiple RealSense cameras connected to the same PC in this project?

Thank you

Antonio

Any chance of uploading a Realsense recording somewhere?

I don't have access to a RealSense, but I'd love to try out this project. As the RealSense Unity component has 3 modes (live, record, and playback), I'm guessing I could get a good idea of how it works if I could use a recorded file.

Is there any chance of uploading one somewhere? It's going to be too big for GitHub, but I'd be happy to provide hosting for it if that helps.

Not seeing anything using playback with .bag files. No D400 connected

I downloaded the project and get no errors. I got some sample .bag files from here: https://github.com/IntelRealSense/librealsense/blob/master/doc/sample-data.md

Is a connected D400-type camera a requirement even for playback?
Sorry if this is not an issue with the code; maybe I don't have the right kind of .bag files, or some setting is wrong for playback. I just get a black screen, with no errors.

*Just realizing: is playback only for .bag files created with this project's record function?

Rsvfx doesn't use full frame data

Hey Keijiro,
First of all, very nice work with Rsvfx.
I noticed that your implementation only uses part of the information captured by the RealSense camera. The frame seems to be cropped on all 4 edges when compared to the visuals in the Intel RealSense Viewer.
I tried 2 resolutions, 640x480 px as well as 1280x720 px. The setup works fine except for the crop.
Any hints on that?

ARKit

Would it be possible to adapt this effect to work on iOS using ARKit?

Nothing on screen, using D435

Hi, after importing the project in Unity 2019.4.2 and fixing some errors, I tried playing the scenes and there is no response in any of them. I'm using the D435 sensor, and the latest unitypackage from the RealSense repository works without issues on its own.

I also tried importing that unitypackage into your project, but realsense2.dll isn't copied because it prompts an access-denied error.

Do I need to change any project settings? I've used your HDRP config file in the graphics settings.

Is incompatible with Unity 2019.1.7f1

Hello, I get the error below while using Unity 2019.1.7f1:

ExternalException: rs2_pipeline_start_with_config(pipe:000002821EF7BD60, config:000002821EF7C1E0)
Rethrow as Exception: Couldn't resolve requests
Intel.RealSense.ErrorMarshaler.MarshalNativeToManaged (System.IntPtr pNativeData) (at <39ff202c67d64a63acbba4ae542e7167>:0)
Intel.RealSense.Pipeline.Start (Intel.RealSense.Config cfg) (at <39ff202c67d64a63acbba4ae542e7167>:0)
Rsvfx.CombinedDriver.Start () (at Assets/Rsvfx/Runtime/CombinedDriver.cs:114)

What can I do?

Rotation issue

Great work!

If you rotate the camera around the z axis, the points end up in the wrong positions. This does not happen in the RealSense sample scene, though. I think the problem occurs because the camera image also rotates, but I couldn't figure out how to fix this in your sample. Any thoughts?

Cheers!

Reverse movement

Is there a way to reverse the movement? I.e., if I walk past the camera from left to right, it simulates right to left.

Is incompatible with Unity 2019.2.0a9?

I tried to open the 688fefc version of Rsvfx with Unity 2019.2.0a9 on Windows 10.
However, I faced some package-related errors.
Is Rsvfx incompatible with Unity 2019.2.0a9?

[Edited] I've also tried with 2019.1.0a9. There was no problem at all!

Best regards,

Update to 2020?

Any chance of updating this project to 2020 or 2021? This is the only SRP project that I can run in Unity. The Intel sample project only runs on much older versions of Unity.

doesn't work with D435i

(screenshot: D435i connected)

The depth camera is on, but it outputs nothing.
(Unity version 2019.1.0b10, everything at defaults)

wiper vfx can not be edited in another project

Hello, when I try to edit the wiper VFX inside another project, the effect stops working as soon as I open it. This is not the case for the other VFX assets (like particle, scanner, or simple).

I can edit it in the original project but not in a new one. I get these errors:

Remove 2 child(ren) that couldnt be deserialized from of type UnityEditor.VFX.VFXGraph
UnityEditor.VFX.VFXGraph:OnEnable()

Remove 2 child(ren) that couldnt be deserialized from of type UnityEditor.VFX.VFXGraph
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

Any ideas on how to solve the problem?
