googlesamples / arcore-depth-lab

ARCore Depth Lab is a set of Depth API samples that provides assets using depth for advanced geometry-aware features in AR interaction and rendering. (UIST 2020)

Home Page: https://augmentedperception.github.io/depthlab/

License: Apache License 2.0

ShaderLab 18.37% C# 77.62% HLSL 2.53% GLSL 1.48%
ar arcore arcore-unity depth depth-api depthlab interaction mobile

arcore-depth-lab's People

Contributors: baobao, cs-util, kidavid, ruofeidu

arcore-depth-lab's Issues

How to access the dense depth texture

Hello

I read in this publication that "a dense depth texture created by arcore sdk in the gpu has every pixel in the color camera image with a depth value mapped to it" (page 5).

How can I get the dense depth texture and its depth values? This would help me get the depth value for each pixel in the high resolution camera background rgb texture.

I am guessing that pairing a depth value with each pixel of the camera background RGB texture can be hard because the depth texture has a very low resolution (160x120) versus the camera background RGB texture resolution (2560x1440 on a Pixel 2 XL), i.e. the pixels are not 1:1.
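One generic way to pair every RGB pixel with a depth value despite the resolution mismatch is to sample the low-resolution depth map at normalized coordinates with bilinear interpolation. A minimal NumPy sketch of the idea (not the Depth Lab implementation):

```python
import numpy as np

def sample_depth_bilinear(depth, u, v):
    """Sample a low-resolution depth map at normalized coords (u, v) in [0, 1]
    with bilinear interpolation, so every pixel of a higher-resolution RGB
    image can be paired with an interpolated depth value."""
    h, w = depth.shape
    # Map normalized coords to continuous pixel coords in the depth map.
    x = u * (w - 1)
    y = v * (h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = depth[y0, x0] * (1 - fx) + depth[y0, x1] * fx
    bot = depth[y1, x0] * (1 - fx) + depth[y1, x1] * fx
    return top * (1 - fy) + bot * fy

# Pair a pixel of a 2560x1440 RGB image with a 160x120 depth map:
depth = np.ones((120, 160), dtype=np.float32)  # placeholder depth in meters
rgb_px = (1280, 720)                           # (x, y) in the RGB image
u, v = rgb_px[0] / 2560.0, rgb_px[1] / 1440.0
d = sample_depth_bilinear(depth, u, v)
```

Interpolation smooths the depth but cannot recover detail the sensor never captured, so edges will still be soft at depth discontinuities.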

Thank you for your help, Sergio

[Question] Collision Detection object placement

First of all, this is a very impressive collection of different use cases for the Depth API. The localized depth examples in particular are very interesting.
I'm developing with ARCore and still with Sceneform, so I place anchors the traditional way: a hit test followed by processing the hit results. That works great, but there are areas where it is not possible to retrieve a hit result, or it is a pain until something is returned. For exactly that case I adopted some ideas from the Oriented 3D reticles and Collision checking for AR object placement samples. So far it works, but I'm not satisfied with the anchoring. I use the session to create an anchor from a ray that results from my depth computation. The problem is that the anchoring is not very stable and tends to move with the camera. The anchoring in the Collision checking for AR object placement example feels more solid to me. I'm not a Unity person, and I have some difficulty understanding all the pieces of the code.

  • How is the anchoring of the object achieved?
  • Does anyone have a workaround for more stable anchoring?
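One common remedy for jittery depth-derived anchors is to smooth the hit position over several frames before creating the anchor. A minimal sketch of the idea (a hypothetical helper with exponential smoothing, not part of the Depth Lab API):

```python
class AnchorStabilizer:
    """Exponentially smooth a stream of depth-derived hit positions before
    creating an anchor, to reduce frame-to-frame jitter.
    Hypothetical helper, not part of Depth Lab."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha      # weight of the newest observation
        self.position = None    # smoothed (x, y, z)

    def update(self, hit):
        """Feed one (x, y, z) hit position; returns the smoothed position."""
        if self.position is None:
            self.position = hit
        else:
            self.position = tuple(
                (1 - self.alpha) * p + self.alpha * h
                for p, h in zip(self.position, hit))
        return self.position
```

A lower alpha gives a steadier but more sluggish anchor; you would create the actual anchor only once the smoothed position settles.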

Continuous environment mesh building

Hello @ruofeidu, thank you for all the effort you put into making this repo and the research behind it. I am doing something similar to #31, and I am looking at KinectFusion and some SLAM alternatives.

One question though: how could I reuse the primitives you have for obtaining the depth information and the point cloud? I found resources online saying I have to write my own shaders etc., and was wondering if there is an API I could use. (I have not tried the ARCore Raw Depth API; could you maybe explain what is and isn't possible with it? What did you have to do on your own that I could reuse from your work?)
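For context, the core math for obtaining a point cloud from a depth image is a pinhole back-projection. A generic NumPy sketch (not the repo's shader code) looks like this:

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a metric depth image into a camera-space point cloud
    using the pinhole model:
      X = (u - cx) * d / fx,  Y = (v - cy) * d / fy,  Z = d."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop pixels with no depth
```

The intrinsics (fx, fy, cx, cy) come from the camera; transforming the result by the camera pose gives world-space points that can be fused across frames.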

Why exclude the Packages directory?

It makes this much more fragile and prone to breakage. The whole point of putting an entire Unity project in a repo is to have a reliable, reproducible setup. The vast majority of Unity projects on GitHub include Assets, Packages and ProjectSettings, and in my experience the ones that don't are much less likely to work without fixes on my side.

Device compatibility

Are all ARCore-compatible devices compatible with this API? I have a Samsung Galaxy S8 and an S9.

Camera Choppiness/Jittering

When running the Depth API there is very noticeable choppiness/jittering - https://youtu.be/1KoE8fglirM
When running the Depth Lab app from the Play Store everything is very smooth - https://youtu.be/iRogGT21ke8
I tested this on an S9+ and an S20 Ultra and had the same issue.
I believe the camera is, for some reason, running at a very low FPS.
I checked the FPS and camera settings within the Depth API and everything appeared correct.
Not sure what is causing this.

Is the depth-solving algorithm open source?

Thanks for the excellent open-source work.
Is the algorithm used for depth solving in this project open source? If so, which file contains the relevant code?

Linking ToF Depth data with main camera's image.

For doing turntable photogrammetry, I'm wondering if there's an easy way to save the depth map from my LG V60's ToF sensor along with the RAW picture from my main camera. There's a way to sort of do it with an annoying workaround: take the photo in portrait mode, then extract the depth map from the metadata. This is very time consuming, and you also lose all control of the camera's manual settings. I'm taking the dataset back to Metashape on my PC, so all the hardware-intensive computing will be done on that machine. I can't believe this simple task of tagging depth information alongside a photo is so far impossible to do. How did that dumb fruit company implement ToF/Lidar hardware so much better than Android? Like, it's not even close. Embarrassing.

Continuous environment mesh building

Hello, first of all - thank you for such a great project!

I have a question about continuous environment mesh building: could you point me to a way to implement an environment "scanner"? I want to create an approximate model of the surrounding environment. I picture it like this: every new frame fills in and deforms the already "scanned" mesh and makes it more accurate (in the future, with environment texture mapping). I think the simplest way is to subtract the newly scanned mesh from the one already scanned.

Maybe this could be a good and very interesting objective for a new sample!
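One simple way to make repeated scans refine rather than duplicate geometry is to accumulate world-space points into a voxel grid with running averages, so each new frame updates existing cells instead of adding layers. A toy sketch of the fusion idea (far simpler than KinectFusion, and not part of this repo):

```python
import numpy as np

class VoxelFusion:
    """Minimal running-average point fusion: each observed world-space point
    updates its voxel's averaged position, so repeated scans of the same
    surface refine the estimate instead of stacking duplicate points."""

    def __init__(self, voxel_size=0.05):
        self.voxel_size = voxel_size
        self.voxels = {}  # voxel key -> (mean_point, observation_count)

    def integrate(self, points):
        for p in points:
            p = np.asarray(p, dtype=float)
            key = tuple(np.floor(p / self.voxel_size).astype(int))
            mean, w = self.voxels.get(key, (np.zeros(3), 0.0))
            # Incremental mean over all observations in this voxel.
            self.voxels[key] = ((mean * w + p) / (w + 1), w + 1)

    def cloud(self):
        return np.array([m for m, _ in self.voxels.values()])
```

A full pipeline would mesh this fused cloud (e.g. via marching cubes over a TSDF), but the voxel accumulator above already captures the "every new frame refines the scan" behavior.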

Depth API iOS Platform support

I'm using an iPhone 7 running iOS 15.8 with a single camera. My question is: does the ARCore Depth API support iOS devices? If it does, I can use it in my app for sensing depth data.

Collider runtime error after local Unity 2018 build/install (was: Build error in Unity)

I'm trying to Build and Run the ARRealismDemos/Collider/Scenes/Collider
for Android Platform
on Windows 10 version of
Unity 2018.4.22f1 Personal

I'm seeing build errors

Error building Player because scripts have compile errors in the editor
Build completed with a result of 'Failed'
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr) (at C:/buildslave/unity/build/Modules/IMGUI/GUIUtility.cs:179)
UnityEditor.BuildPlayerWindow+BuildMethodException: Error building Player because scripts have compile errors in the editor
at UnityEditor.BuildPlayerWindow+DefaultBuildMethods.BuildPlayer (UnityEditor.BuildPlayerOptions options) [0x00242] in C:\buildslave\unity\build\Editor\Mono\BuildPlayerWindowBuildMethods.cs:194
at UnityEditor.BuildPlayerWindow.CallBuildMethods (System.Boolean askForBuildLocation, UnityEditor.BuildOptions defaultBuildOptions) [0x0007f] in C:\buildslave\unity\build\Editor\Mono\BuildPlayerWindowBuildMethods.cs:97
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr) (at C:/buildslave/unity/build/Modules/IMGUI/GUIUtility.cs:179)

Any suggestions on how to fix this?

ScreenSpaceDepthMesh example broken?

Hi,
I am trying the ScreenSpaceDepthMesh example but nothing happens. I loaded the ScreenSpaceDepthMesh and StereoPhoto scenes.

Is this expected? What am I missing?

Point Cloud Example Crash when using Raw Depth

The point cloud example built on an Android device crashes when pressing the update button, which calls ComputePointCloud() in PointCloudGenerator.cs.

This example works fine without checking "Use Raw Depth" in the inspector.

Branch : arcore_sdk_unity
Unity Version: 2020.3.9f1
Device : pixel4a

How does Depth Lab provide 0-65m depth values?

I noticed that Depth Lab uses AR Foundation v4 but provides depth estimation for 0-65 m, although AR Foundation v4 does not support this.
Currently I am working with AR Foundation v5, since I need the whole 0-65 m depth range, but I also need the ARCore Extensions, which are only available for AR Foundation v4.
When I use AR Foundation v4, I cannot access the 0-65 m depth estimation but only 0-8 m, since everything above 8 m is set to 0. However, when using the Depth Lab app and activating the depth map, Depth Lab seems able to detect depth even beyond 8 m.
I looked into https://github.com/googlesamples/arcore-depth-lab/blob/22cd7f1ce4eb2ed73bda19a0ea1bf3e636831799/Assets/ARRealismDemos/Common/Scripts/DepthSource.cs, but I don't understand why Depth Lab is able to provide the 65 m depth values.

It would be great if someone could help me with that. Could someone explain where values greater than 8 m are set to 0, how Depth Lab bypasses this, and how I can do the same?

I really appreciate any help you can provide.
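One plausible explanation for the 65 m figure (my assumption, not confirmed in this repo): the depth image stores each distance as a 16-bit unsigned millimeter value, whose maximum of 65535 mm is roughly 65.5 m; the 8 m limit would then come from a code path that zeroes values beyond 8 m rather than from the encoding itself. Decoding the raw values is just a unit conversion:

```python
def decode_depth_mm(raw_uint16):
    """Assumed encoding: a 16-bit unsigned millimeter distance, giving a
    representable range of 0..65535 mm (~65.5 m). Convert to meters."""
    return raw_uint16 / 1000.0

def clamp_legacy_8m(depth_m):
    """Hypothetical reproduction of the behavior described above, where
    values beyond 8 m come back as 0."""
    return depth_m if depth_m <= 8.0 else 0.0
```

If this is the mechanism, the question reduces to finding where in the AR Foundation v4 pipeline such a clamp is applied, versus Depth Lab reading the raw 16-bit values directly.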

Implementation of ARCore Depth API functions in AR Foundation 4.x.x?

I want to implement some depth functions of the ARCore Depth API into an already existing AR Foundation project.
Is it possible to copy the scripts and adapt the gradle settings?

I want to extend the functionality to toss a die onto the ground with occlusion: you should be able to cover objects with your hand or with a person.

PointCloud not getting updated each frame

Hi, I was trying to build an application using the RawPointCloud scene. I want to save the generated point cloud to a .ply file. While I was able to create a sample PLY (which can be found here), I noticed that each frame produces new points rather than updating the previous points when the user hasn't moved.

So in my .ply file I can see layers of points with slightly different positions in world space. Are the depth points tracked across frames through some identifiers?

To create the .ply I keep a list of vertices and their colors over the session and save them as .ply:

if (vertex != Vector3.zero)
{
    updatedPoints.Add(vertex);
    updatedColors.Add(color);
}
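For reference, writing the accumulated vertices and colors out as an ASCII PLY can be sketched like this (a generic Python example with hypothetical variable names, independent of the Unity code above):

```python
def write_ply(path, points, colors):
    """Write an ASCII PLY file with vertex positions and 8-bit RGB colors.
    `points` is a list of (x, y, z) floats, `colors` a matching list of
    (r, g, b) ints in 0..255."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        for (x, y, z), (r, g, b) in zip(points, colors):
            f.write(f"{x} {y} {z} {r} {g} {b}\n")
```

To avoid the layered duplicates described above, you could deduplicate the accumulated points (e.g. by snapping them to a small voxel grid) before calling a writer like this, since the raw depth points themselves carry no cross-frame identifiers as far as I know.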

URP?

Doesn't it support URP?

Collider demo is not working properly

Hi there,
When I build the Collider scene to test the physics simulation, the app does not respond when clicking the throw button. I tried tweaking the project settings, but that didn't yield any result. The project is set up on Unity 2019.4 LTS and I ran the app on a Pixel 2 XL. Can anybody help me get this working?
Thanks in advance,
Vishal

Building Error (Unity 2019.3.3f1)

I'm using Unity 2019.3.3f1 Personal. After opening arcore-depth-lab and importing arcore-unity-sdk-1.20.0.unitypackage, I set the version of ARCore Foundation from 2.0.2 to 3.1.3.

But when I build the app I always receive the following errors:

BuildFailedException: GoogleARCore detected. Google's "ARCore SDK for Unity" and Unity's "ARCore XR Plugin" package cannot be used together. If you have already removed GoogleARCore, you may need to restart the Editor.
UnityEditor.XR.ARCore.ARCorePreprocessBuild.EnsureGoogleARCoreIsNotPresent () (at Library/PackageCache/[email protected]/Editor/ARCoreBuildProcessor.cs:79)
UnityEditor.XR.ARCore.ARCorePreprocessBuild.OnPreprocessBuild (UnityEditor.Build.Reporting.BuildReport report) (at Library/PackageCache/[email protected]/Editor/ARCoreBuildProcessor.cs:30)
UnityEditor.Build.BuildPipelineInterfaces+<>c__DisplayClass15_0.b__1 (UnityEditor.Build.IPreprocessBuildWithReport bpp) (at <9a184ab867bb42c296d20ace04f48df3>:0)
UnityEditor.Build.BuildPipelineInterfaces.InvokeCallbackInterfacesPair[T1,T2] (System.Collections.Generic.List`1[T] oneInterfaces, System.Action`1[T] invocationOne, System.Collections.Generic.List`1[T] twoInterfaces, System.Action`1[T] invocationTwo, System.Boolean exitOnFailure) (at <9a184ab867bb42c296d20ace04f48df3>:0)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

BuildFailedException: 'Preferences > External Tools > Android > Gradle' is empty. ARCore SDK for Unity requires a customized Gradle with version >= 5.6.4.
GoogleARCoreInternal.ARCoreAndroidSupportPreprocessBuild.CheckGradleVersion () (at Assets/GoogleARCore/SDK/Scripts/Editor/ARCoreAndroidSupportPreprocessBuild.cs:123)
GoogleARCoreInternal.ARCoreAndroidSupportPreprocessBuild.OnPreprocessBuild (UnityEditor.BuildTarget target, System.String path) (at Assets/GoogleARCore/SDK/Scripts/Editor/ARCoreAndroidSupportPreprocessBuild.cs:59)
GoogleARCoreInternal.PreprocessBuildBase.OnPreprocessBuild (UnityEditor.Build.Reporting.BuildReport report) (at Assets/GoogleARCore/SDK/Scripts/Editor/PreprocessBuildBase.cs:52)
UnityEditor.Build.BuildPipelineInterfaces+<>c__DisplayClass15_0.b__1 (UnityEditor.Build.IPreprocessBuildWithReport bpp) (at <9a184ab867bb42c296d20ace04f48df3>:0)
UnityEditor.Build.BuildPipelineInterfaces.InvokeCallbackInterfacesPair[T1,T2] (System.Collections.Generic.List`1[T] oneInterfaces, System.Action`1[T] invocationOne, System.Collections.Generic.List`1[T] twoInterfaces, System.Action`1[T] invocationTwo, System.Boolean exitOnFailure) (at <9a184ab867bb42c296d20ace04f48df3>:0)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

Error building Player: 2 errors

Build completed with a result of 'Failed'
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

UnityEditor.BuildPlayerWindow+BuildMethodException: 3 errors
at UnityEditor.BuildPlayerWindow+DefaultBuildMethods.BuildPlayer (UnityEditor.BuildPlayerOptions options) [0x00275] in <9a184ab867bb42c296d20ace04f48df3>:0
at UnityEditor.BuildPlayerWindow.CallBuildMethods (System.Boolean askForBuildLocation, UnityEditor.BuildOptions defaultBuildOptions) [0x00080] in <9a184ab867bb42c296d20ace04f48df3>:0
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

I always receive the following Android Resolver message (see attached screenshot):

I'm using Android Studio 4.0.1 and I already have JDK 14.0.2. I don't understand the error.
How can I fix it?

Device should be compatible

Hi,

I set up the DepthLab project as described on my Samsung S8 (which should be compatible). In fact, I ran a sample that used the ARCore depth feature through AR Foundation and it worked fine. However, in the DepthLab project the app compiles but states that my device is incompatible, and it does not receive a camera image (the background is black).
I ran this on Unity 2019.4.4f1 with the XR Legacy Input Helpers and Multiplayer HLAPI packages. I did not change anything other than adding those packages and switching the platform to Android.
Any ideas on what is going wrong?

Thanks for your hard work,
ARPatrick

Relighting implementation ignores camera intrinsic parameters

Hi, I observe that your implementation ignores the camera intrinsic parameters at:

float3 result = SampleColor(uv);
float depth = SampleDepth(uv);
if (_RenderMode == kRenderCameraImage) {
    return result;
}
if (_RenderMode == kRenderDepthMap) {
    return TurboColormap(depth);
}
result = lerp(result, result * 0.5, _GlobalDarkness);

// Common inputs:
float2 aspectRatio = CalculateAspectRatio(uResolution);
// aspectRatio = _AspectRatio;
float2 normalizedUv = NormalizeCoord(uv, aspectRatio);
float3 samplePos = float3(normalizedUv, depth);

How much are the relighting results affected by this calculation? Since the pinhole camera is an approximation of world imaging, and the vertical and horizontal focal lengths are mostly equal:

Given:
P = vec3(uv, depth)
P = K * W    // K is a 3x4 projection matrix
so,
W = K^-1 * P // W is [x, y, z, 1]

Calculating W to perform raymarching may not be necessary at all.
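The unprojection described above, W = K^-1 * P, can be sketched numerically with generic pinhole math (using a square 3x3 intrinsics matrix so the inverse exists; this is not the repo's shader code):

```python
import numpy as np

def unproject(u, v, depth, K):
    """Invert the pinhole projection using a 3x3 intrinsics matrix K:
    W = K^-1 * [u * d, v * d, d] recovers the camera-space point."""
    p = np.array([u * depth, v * depth, depth], dtype=float)
    return np.linalg.inv(K) @ p
```

For example, a pixel at the principal point always unprojects onto the optical axis, so whether the intrinsics matter for relighting depends on how far the sampled positions stray from that axis.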

Depth-lab in WebXR

Can we use arcore-depth-lab with WebXR? I want to try a 3D cursor that uses arcore-depth-lab for accurate placement and orientation of the reticle.

Unreal engine

What about Unreal Engine?
It has been a long time, and Unreal Engine support is still missing many features.

Is it deprecated for UE4? Or is a UE5 release with Depth API support planned?

MaterialWrap and ScreenSpaceDepthMesh issue on Samsung S20+

Hi there,

I have a small issue regarding the MaterialWrap and ScreenSpaceDepthMesh scenes of the Depth Lab.
Instead of getting a sculpted mesh that corresponds to the colored depth map of the DepthEffects scene, I get a flat, oval-shaped mesh (see image below). This happens in the MaterialWrap scene, and also in the ScreenSpaceDepthMesh scene when I press 'freeze'.

[screenshot]

The issue happens in both the Play Store version of the app and the version I built in Unity. I have a Samsung Galaxy S20+ with a ToF sensor that seems to work in other scenes.

Is there a fix for this issue?

Thanks, Nick

Can the sample app run on Apple silicon?

Hello, I ran the sample app with Unity 2020.3.29f1 / 2021.3.0f1 (Apple silicon) and got the following errors:

Assets/ARRealismDemos/Common/Scripts/Recorder.cs(24,7): error CS0246: The type or namespace name 'Google' could not be found (are you missing a using directive or an assembly reference?)

Assets/ARRealismDemos/Common/Scripts/Recorder.cs(38,12): error CS0246: The type or namespace name 'ARCoreExtensions' could not be found (are you missing a using directive or an assembly reference?)

Assets/ARRealismDemos/Common/Scripts/Recorder.cs(40,13): error CS0246: The type or namespace name 'ARRecordingManager' could not be found (are you missing a using directive or an assembly reference?)

What might be the problem?

Depth in WebCam on Web Browser

Great work with the project!

We would like to apply real time 3D AR interactions with the video stream from a WebCam in a browser.

Is this package available for JavaScript? It would greatly improve accessibility.

I was unable to find the 3D portrait that only works on humans. We are looking for the entire 3D surface so that we can create our robotic interface.

Many thanks!

Create a simple working demo

Hi everybody,
I'm having some problems creating a simple working demo from scratch.
My goal is to put a GameObject with a custom material in the scene and use the depth map to occlude the part of the cube that is not visible. What are the steps to achieve this?

  • Add prefabs for the camera (ARCore Device) and for illumination (Environmental Light).

  • Enable Depth Mode in the session config attached to the ARCore Device.

  • Add a DepthSource component to the ARCore Device.

  • Add some cubes with the DepthTarget component attached.

When I run it, no depth info is used and the cubes appear the old way.
What am I doing wrong?

v1.18.0?

I am excited about these samples, but v1.18.0 is still not available. When will it be published?

No Shadows in Collision Scene

Nice work. I'm trying to get shadows from the thrown objects in the Collision scene, but they seem to be missing. They are there in the non-AR Foundation version. It looks like the shadow-receiving mesh is obscured by the AR Foundation occlusion.

Problem with updated packages

I faced difficulties assembling this project as per the provided instructions. The suggested versions of Unity (2020.3.6f1), AR Foundation (4.2.0-pre.7), and ARCore Extensions (1.24) didn't work for me. I had to use the latest available versions to make it functional: Unity (2023.1.0b20), AR Foundation (5.0.6), and ARCore Extensions (1.38.0).

Unfortunately, I encountered some issues with this updated configuration. Compared to the Google Play version of the project, objects tend to partly fall through surfaces and behave worse in my build. Overall, the project's performance is considerably worse than the Google Play version on the same device.

Given the nature of library updates, I understand that such discrepancies are expected. However, being new to this technology, I'm curious to know which specific aspect is causing the most difficulties.

I haven't made any changes to the code. The issues arose solely from updating the libraries, which was necessary to get the project to function.

If necessary, I can provide full details of the build failure when using the versions required by the README.

Material doesn't have a texture property '_CurrentDepthTexture'

Hi,
I'm running the Depth Lab samples on Unity 2019.4.9, ARCore SDK 1.18, on a Pixel 2. It seems SetDepthTexture(DepthTarget) assumes the target material has a _CurrentDepthTexture property, which is not always the case. In the Collider sample, none of the depth target occlusion materials have a _CurrentDepthTexture property. This results in huge numbers of log errors on the device monitor.

09-09 09:17:47.552: E/Unity(8720): Material doesn't have a texture property '_CurrentDepthTexture'
09-09 09:17:47.552: E/Unity(8720): DepthSource:SetDepthTexture(DepthTarget)
09-09 09:17:47.552: E/Unity(8720): DepthSource:Update()
09-09 09:17:47.552: E/Unity(8720): [./Runtime/Shaders/Material.cpp line 1452]
09-09 09:17:47.552: E/Unity(8720): (Filename: ./Runtime/Shaders/Material.cpp Line: 1452)

Offending code in DepthSource.cs:

private static readonly string k_CurrentDepthTexturePropertyName = "_CurrentDepthTexture";

........

private static void SetDepthTexture(DepthTarget target)
{
    Texture2D depthTexture = DepthTexture;

    if (target.SetAsMainTexture)
    {
        if (target.DepthTargetMaterial.mainTexture != depthTexture)
        {
            target.DepthTargetMaterial.mainTexture = depthTexture;
        }
    }
    else if (target.DepthTargetMaterial.GetTexture(k_CurrentDepthTexturePropertyName) !=
        depthTexture)
    {
        target.DepthTargetMaterial.SetTexture(k_CurrentDepthTexturePropertyName,
            depthTexture);
    }
}

BUG: Occlusions is not working properly into HelloAR using Depth API on android app

Hello, I am using:
Unity 2018.4.22 on Windows 10
ARCore arcore-unity-sdk-1.18.0
Huawei P30 Pro (Depth API supported)

I pasted the arcore-depth-lab-master folder, including the project settings, into a new Unity project.
I could build the Depth Lab APK and run it on my phone without a problem.

But when I build the HelloAR example with the Depth API enabled, the occlusion does not work as well as in the Depth Lab app built from Unity and run on the same phone.

[screenshots]

Sometimes it gets dark like a shadow, or takes on the color of the object. But I think the object's body should disappear when a physical object is between it and the camera, as in the Avatar Depth Lab demo.
Any solution for this?

S20 Ultra - InvalidOperationException("Invalid depth value");

When testing on an S20 Ultra, the app throws InvalidOperationException("Invalid depth value").
Objects and wraps are not being pinned and just float around when moving the phone.
I did not encounter these issues when testing on an S9+.
Also, the S20 Ultra has no issue running the Depth Lab app.

Noisy Point Cloud

Hi,
I tried the point cloud sample and noticed "flying points" where there are depth discontinuities in the scene. For example, if I visualize the point cloud of a chair in a room, there are points flying in the air behind the chair in addition to the ones correctly visualized. As you know, this is a common problem with ToF sensors, where you can adjust a confidence threshold to minimize the anomalies.
I tried the sample with a Samsung Galaxy S8 and with a Samsung Galaxy S20+ (which has its own VGA-resolution ToF camera); they have the same problem:
https://www.youtube.com/watch?v=YlbYWrjLWAQ

Is there a way to access the confidence of the DepthSource?
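In the absence of an exposed confidence channel, one common workaround is to reject flying points by masking pixels that sit on large depth discontinuities, since interpolated points between foreground and background show up as big jumps to their neighbors. A generic NumPy sketch (not a DepthSource API):

```python
import numpy as np

def mask_flying_pixels(depth, max_jump=0.1):
    """Return a boolean mask that is True for keepable pixels: those whose
    depth differs from every 4-neighbor by at most max_jump meters.
    Pixels on larger discontinuities tend to produce 'flying points'."""
    d = depth.astype(np.float32)
    jump = np.zeros_like(d)
    # Largest absolute depth difference to each 4-neighbor.
    jump[1:, :]  = np.maximum(jump[1:, :],  np.abs(d[1:, :] - d[:-1, :]))
    jump[:-1, :] = np.maximum(jump[:-1, :], np.abs(d[:-1, :] - d[1:, :]))
    jump[:, 1:]  = np.maximum(jump[:, 1:],  np.abs(d[:, 1:] - d[:, :-1]))
    jump[:, :-1] = np.maximum(jump[:, :-1], np.abs(d[:, :-1] - d[:, 1:]))
    return jump <= max_jump
```

Applying such a mask before back-projecting removes most points hanging in the air behind foreground objects, at the cost of thinning the cloud along silhouettes.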

App opens on Pixel 2 XL but does not run AR - Google Play Services for AR Required

Hello all

I'm on 2018.4.19f1 with a Pixel 2 XL running Android 11, and the build installed on the smartphone does not run the AR experience.

I have Google Play Services for AR installed and updated.

I can see the sprites to choose the AR experience, but the rest of the screen is black. If I build & run a single scene, when the app opens I get "this application requires the latest version of Google Play Services for AR", even though I have Play Services for AR installed and updated.

If I build all scenes with the carousel, I get the same message that Play Services for AR is required; after pressing OK it shows me the Play Services for AR app to install, but there is nothing to do because it's already installed. So I go back to the app I built, which is still open, and it says "depth api is not supported on this device. Please make sure your device is compatible".

Pixel 2 XL is indeed an AR enabled device.

Furthermore, I can confirm I can run AR on this phone with Unity's AR Foundation and with the ARCore Depth Lab app from the Play Store.

I did a factory reset on the phone but I still get this behaviour.

I thought it might be a problem with the arcore-unity-sdk version, but when trying to upgrade from arcore-unity-sdk-1.18 to 1.21 I get Gradle errors and can't build.

Is anyone else experiencing this? Any guidance is really appreciated.

Got an error when I tried to build the master code on my device (NullReferenceException)

I've been stuck on the issue below for a long time. How can I fix it?

I got the Android sample app code (master) from this page and tried to build it on my Android device.
I imported the ARCore Extensions package using the Package Manager.
Thank you!

⑴ NullReferenceException: Object reference not set to an instance of an object
Google.XR.ARCoreExtensions.Internal.RuntimeConfig+<>c.b__7_0 (UnityEngine.Object x) (at Library/PackageCache/com.google.ar.core.arfoundation.extensions@c3bc1636a644-1622675792614/Runtime/Scripts/Internal/RuntimeConfig.cs:79)

⑵ Error building Player: NullReferenceException: Object reference not set to an instance of an object

⑶ Build completed with a result of 'Failed' in 0 seconds (268 ms)
UnityEditor.EditorApplication:Internal_CallGlobalEventHandler () (at /Users/bokken/buildslave/unity/build/Editor/Mono/EditorApplication.cs:428)

⑷ UnityEditor.BuildPlayerWindow+BuildMethodException: 2 errors
at UnityEditor.BuildPlayerWindow+DefaultBuildMethods.BuildPlayer (UnityEditor.BuildPlayerOptions options) [0x002be] in /Users/bokken/buildslave/unity/build/Editor/Mono/BuildPlayerWindowBuildMethods.cs:190
at UnityEditor.BuildPlayerWindow.CallBuildMethods (System.Boolean askForBuildLocation, UnityEditor.BuildOptions defaultBuildOptions) [0x00080] in /Users/bokken/buildslave/unity/build/Editor/Mono/BuildPlayerWindowBuildMethods.cs:9

⑸ Asset Packages/com.google.ar.core.arfoundation.extensions/Editor/BuildResources/DependenciesTempFolder has no meta file, but it's in an immutable folder. The asset will be ignored.

NullReference to DepthSource

Hi @ruofeidu ,

I tried installing the master code with the demo carousel as well as with separate scenes, but every time I get a NullReference to the DepthSource object. I tried adding loggers in the Awake method of DepthSource, but they never get printed.

In your documentation, you mention that at least one DepthTarget should be present in the scene for DepthSource to work. Can you please explain which game object has DepthTarget attached by default? (I searched for it but couldn't find it.)

NullReferenceException stack trace:
Scene: Collider
Script: DepthMeshCollider.cs
Method: private void Update()
Line 252:

else
{
    if (DepthSource.Initialized)

Can you please check?
Thanks and Regards,
Mihir

Depth API not supported: OnePlus 11

I'm using a OnePlus 11 and I installed this app from the Google Play Store. Upon opening the app, I get the warning: "Depth API is not supported on this device. Please make sure your device is compatible."
However, OnePlus 11 is listed as supporting the Depth API: https://developers.google.com/ar/devices
What could be causing this issue?

Can't build

Hello,
Can't build on Unity 2018.4.19f1 for Android 11 on Windows 10. Getting this error:
CommandInvokationFailure: Gradle build failed.
C:\Program Files\Unity\2018.4.29f1\Editor\Data\PlaybackEngines\AndroidPlayer/Tools\OpenJDK\Windows\bin\java.exe -classpath "C:\Program Files\Android\gradle-5.6.4\lib\gradle-launcher-5.6.4.jar" org.gradle.launcher.GradleMain "-Dorg.gradle.jvmargs=-Xmx4096m" "assembleDebug"

stderr[

FAILURE: Build failed with an exception.

  • Where:
    Build file 'E:\dirs\arcore-depth-lab\Temp\gradleOut\build.gradle' line: 136

  • What went wrong:
    Could not compile build file 'E:\dirs\arcore-depth-lab\Temp\gradleOut\build.gradle'.

startup failed:
build file 'E:\dirs\arcore-depth-lab\Temp\gradleOut\build.gradle': 136: expecting '}', found '' @ line 136, column 1.
1 error

Is it possible to play recorded ARSessions in the Unity Editor or on an emulator?

Hello, first of all really great work on the arcore-depth-lab!

I was wondering whether it is possible to play recorded ARSessions with the ARPlaybackManager in the Unity Editor or in an Android emulator. This would allow people to iterate faster when developing AR applications using ARFoundation.
When trying to test our application in an Android Studio emulator, the camera didn't show anything in our application, despite being set to a virtual scene. It did work in the camera app.
In case this feature doesn't work in either the emulator or the Editor, is it a planned feature?
Thank you in advance!

Rodion

Doesn't compile in Unity 2019.3 - several scripts are missing

This is the output after importing the ARCore SDK in Unity with the realism examples. The project won't compile past this, and I checked to make sure all the scripts were imported. (None of the scripts below are included in this repo.)

Assets/ARRealismDemos/Common/Scripts/AttachDepthTexture.cs(64,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/Common/Scripts/DepthTextureController.cs(90,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/Common/Scripts/DepthVisualizationEffect.cs(130,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/Common/Scripts/AttachDepthTextureToMaterial.cs(83,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/Common/Scripts/NoticeHelper.cs(101,22): error CS0117: 'Session' does not contain a definition for 'IsDepthModeSupported'
Assets/ARRealismDemos/Common/Scripts/NoticeHelper.cs(101,43): error CS0103: The name 'DepthMode' does not exist in the current context
Assets/ARRealismDemos/ScreenSpaceDepthMesh/Scripts/DepthPulseEffect.cs(213,27): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
Assets/ARRealismDemos/CollisionDetection/Scripts/FreeSpaceRenderer.cs(157,35): error CS0117: 'Frame.CameraImage' does not contain a definition for 'UpdateDepthTexture'
