
calvr-for-android's Issues

[Note] ARCore


Enable ARCore in an NDK Project

Refer to this commit to see how to include the ARCore lib in an NDK project.
We can then use native ARCore functions in native code.

What ARCore is supposed to do

  • Motion Tracking: use the Trackable class
  • Environment Understanding: still unclear exactly what this covers
  • Lighting Estimation: use the ArLightEstimate class

Concepts and APIs

Concepts

  1. Value types and reference types: the former are owned by the app (e.g. ArPose, ArFrame) and use create/destroy; the latter are owned by ARCore (e.g. ArAnchor) and use acquire/release. An acquired object can be long-lived, or acquired each frame for transient large data such as the camera image (see the sketch below).
  2. ArPose: a model matrix, with translation in meters
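
A minimal sketch of the two ownership patterns in the ARCore C API (assuming a valid ArSession* and the usual arcore_c_api.h header; error handling omitted):

    #include "arcore_c_api.h"

    void onDrawFrame(ArSession* session) {
        // Value type: owned by the app -- create/destroy.
        ArFrame* frame = nullptr;
        ArFrame_create(session, &frame);
        ArSession_update(session, frame);

        // Reference type: owned by ARCore -- acquire/release.
        ArCamera* camera = nullptr;
        ArFrame_acquireCamera(session, frame, &camera);
        // ... read the camera pose / view and projection matrices ...
        ArCamera_release(camera);

        ArFrame_destroy(frame);
    }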

[8.18-8.21] Progress

Modules that are at least partially synced

  • CVRViewer
  • SceneManager
  • MenuManager
  • ConfigManager

Next Step

  • Figure out the aim of the Interaction module.
  • Decide which "kernel" modules are definitely not needed:
    • ComController: multi-node communication and synchronization
    • Tracking system
    • Navigation system: originally used to move the camera, but here the camera needs to follow the real device camera
    • Screen
    • CollaborativeManager and ThreadedLoader (ignored for now)
    • InteractionManager: closely related to tracking and navigation. Necessary?
    • Load Profile design. Necessary?

[Notes] Build OpenCV library for Android

OpenCV provides pre-built SDKs for various Android platforms; however, those APKs and static libs are compiled with the GNU toolchain, which is not compatible with our setup.
To build from scratch :

  • Download OpenCV from the OpenCV repo.
  • Set cppFlags to -std=c++17 and pass -DANDROID_STL=c++_static in the arguments in the Gradle files.
  • Compile and build. There are some incompatibilities between C++11 and C++17 in the third-party libs, for example throw() and the register keyword; use #define register to neutralize the latter (see the sketch after this list).
  • Create the .mk file by referring to the official SDK files, or just replace the static libs and third-party libs yourself.
  • Include OpenCV in your project:
      set(OpenCV_DIR "${EXT_PROJ_DIR}/opencv-3.4.3/build/${ANDROID_ABI}/sdk/native/jni")
      find_package(OpenCV REQUIRED)
      message(STATUS "OpenCV libraries: ${OpenCV_LIBS}")
      target_link_libraries(cvrUtil ${OpenCV_LIBS})
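
For illustration, the register workaround mentioned above (the included header name is hypothetical):

    // 'register' was removed as a storage-class specifier in C++17; defining it
    // away lets legacy third-party headers compile. Dynamic exception
    // specifications such as throw(x) have to be patched in the source instead.
    #define register
    #include "legacy_3rdparty_header.h"  // hypothetical header still using 'register'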

[Updates] 7.03-7.10

  1. Make the ARCore samples as well as our project work on the S9:
    See the environment details listed in the README file.

  2. Render AR planes:
    ARCore effectively detects horizontal surfaces but has limited depth perception. How can vertical planes be detected?

    • Use anchor/point clouds (works automatically): the wall should contain contrasting colors or patterns. I believe this is what Google refers to as "environment understanding". A possible open-source point cloud library: PCL.
    • Search for a plane from a scene-view point: detect a horizontal surface first and get its borders, then go through the scene along the Y-axis pixel by pixel to find the intersection point.
    • Search for a plane normal using ray casting: cast rays along the Y-axis until a hit test succeeds or the angle reaches perpendicular to the ground, meaning there are no planes in front.
      reference

  3. Render point clouds and anchors.

  4. Render objects with lighting estimation:
    Use of the light estimate: pass color correction to the fragment shader via ArLightEstimate_getColorCorrection, and retrieve the pixel intensity in gamma space of the current view via ArLightEstimate_getPixelIntensity (see the sketch at the end of this list).


  5. Add actual touch actions.


  6. Add an Android button to switch between normal camera rendering and Sobel edge detection.
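
A minimal sketch of the light-estimate calls from step 4 (assuming a valid session and an updated frame; the shader uniform name is hypothetical):

    ArLightEstimate* light_estimate = nullptr;
    ArLightEstimate_create(session, &light_estimate);
    ArFrame_getLightEstimate(session, frame, light_estimate);

    ArLightEstimateState state = AR_LIGHT_ESTIMATE_STATE_NOT_VALID;
    ArLightEstimate_getState(session, light_estimate, &state);
    if (state == AR_LIGHT_ESTIMATE_STATE_VALID) {
        float color_correction[4];  // RGB scale factors + average pixel intensity
        ArLightEstimate_getColorCorrection(session, light_estimate, color_correction);
        float intensity = 1.0f;     // pixel intensity in gamma space
        ArLightEstimate_getPixelIntensity(session, light_estimate, &intensity);
        // Pass both to the fragment shader, e.g. as a "u_ColorCorrection" uniform.
    }
    ArLightEstimate_destroy(light_estimate);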

[Updates] 6.27-7.02

  1. Learn to use Android Studio: build a simple pure-Java Android app with two activities and simple interactions.

  2. Get to know NDK/JNI: build a simple NDK project.

  3. Use OpenGL ES: build an NDK project with the OpenGL ES lib #3

    • 062702
  4. Get to know ARCore: #4

    • Integrate ARCore into an NDK project with OpenGL ES (this is a loaded APK)
      - 062701

    • Create an ARCore application. Try the ARCore samples (NDK projects).

      • Automatically detect surfaces to draw a grid, and click to generate figures

  5. All together: the MyGLES project.
    Refer to the sample ARCore project to put all the parts together.



Discussion

  • Is ARCore open source?
    No.

  • Does the emulator work?
    Yes, and yes for ARCore as well (there will be a virtual scene).

  • What role does ARCore play in the rendering process?
    It is not the controller. We can easily access the camera, trackable planes, points, etc. to get to know the environment via the ARCore APIs. Currently we create an activity with a surface -> surface.setRenderer(GLSurfaceView.Renderer) -> do the actual rendering in native code using an OpenGL ES renderer, which could be encapsulated by CalVR.

  • What's next

    • Finish this ARCore application using pure OpenGL ES.
    • Start to work on integration with CalVR

[Notes from SIGGRAPH18]

Amazon Sumerian

Create AR/VR/3D apps on the web easily.

  • Web-based and cloud-based
  • Platform-agnostic: once created, runs on multiple platforms
  • Graphics rendering: WebGL
  • VR support: WebVR
  • AR support: ARKit and ARCore
  • Scripts: JS
  • Powered by the same technology as Alexa
  • Demo: Sumerian room and WeatherBug (real-time augmentation)
    For AR it relies on ARKit and ARCore, so it is not fully web-based.

Train on Synthetic Data

Specifically useful for rare data, e.g. a red left-turn traffic light.

Sensors that could be simulated:

  • Camera vision detection
  • Camera vision segmentation
  • Lidar semantic segmentation
  • Radar detection

Application

  • Zoox: autonomous mobility. Start from a point cloud map -> Lidar lane map -> Houdini-created lane map -> generate other details.

[Reconstruction Notes] VERY IMPORTANT

For GL 2.0 and 2.1, you can make use of built-in uniforms such as gl_ProjectionMatrix and gl_ModelViewProjectionMatrix, among many others.
It is strongly recommended that you get rid of these and use your own uniforms.
In core contexts (version 3.1 or greater), most of the built-in uniforms were removed.

We do not update the scene camera; instead, we manually set those uniforms using createXXUniform from AndroidHelper.
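
A minimal sketch of setting such a uniform by hand with stock OSG calls (createXXUniform is our helper; the uniform name and the stateSet/mvp variables here are illustrative):

    #include <osg/StateSet>
    #include <osg/Uniform>

    // Attach the uniform once:
    osg::ref_ptr<osg::Uniform> mvpUniform =
        new osg::Uniform(osg::Uniform::FLOAT_MAT4, "uModelViewProjection");
    stateSet->addUniform(mvpUniform.get());

    // Update it each frame, e.g. with the ARCore camera's view/projection:
    mvpUniform->set(osg::Matrixf(mvp));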

osgUtil::IntersectVisitor is deprecated; use osgUtil::IntersectionVisitor instead.
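
For reference, the replacement pattern (a sketch; start, end, and root are assumed to exist):

    #include <osgUtil/LineSegmentIntersector>
    #include <osgUtil/IntersectionVisitor>

    osg::ref_ptr<osgUtil::LineSegmentIntersector> picker =
        new osgUtil::LineSegmentIntersector(start, end);  // segment in world coords
    osgUtil::IntersectionVisitor iv(picker.get());
    root->accept(iv);
    if (picker->containsIntersections()) {
        osg::Vec3d hit = picker->getFirstIntersection().getWorldIntersectPoint();
    }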

[Progress till 8.13] CalVR Integration

This documents our work on building CalVR on Android, based on Weichen's notes. Here is a brief summary of the steps to build CalVR and an abstract of what I've achieved.

Basic Facts

  • Our CalVR alteration branch on GitHub is the android-support branch.
  • Basically, we tried to make minimal modifications to the original CalVR, using #ifndef __ANDROID__ in most cases.

Build Steps

  • Prerequisites: the OSG and MXML libraries.
  • For the steps to dynamically build OSG, MXML, and CalVR, please refer to Weichen's repo. A successfully building version can be obtained from this commit.
  • The statically linked version of those CMakeLists can be obtained from my repo. You can switch the REBUILD variable to choose whether to rebuild or use the static libs.

Progress For Now

  • Got CalVR statically built on Android.
  • Process config files: currently, I put all the resources needed by CalVR in the asset folder CalVRAsset. On creation of the Activity, those resources are copied to the runtime file location.
  • Get environment variables: environment variables are grabbed via getenv(). On Android, we need to get the runtime file location, so we created our own environment processor to manually set environment variables and override the getenv method (see the sketch after this list).
  • Correctly display the initial BoardMenu without touch events: currently I discard CalVR.cpp, create my own calvrController, and use the CalVR viewer, scene, menu, and config tools. For the board menu, a fixed position is used for now just to get it displayed on screen.
  • FreeType fonts: I noticed that CalVR uses osgText::Font to load .ttf fonts from resources, so I first built FreeType on Android. As a drawable under the OSG framework, it succeeds in rendering some strings; however, OSG still cannot get a .ttf reader.
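
A minimal sketch of the idea behind the environment processor (names are hypothetical; the real implementation lives in the repo):

    #include <cstdlib>
    #include <map>
    #include <string>

    // Variables set at runtime (e.g. paths of resources copied from the APK
    // assets); anything not set here falls back to the system environment.
    static std::map<std::string, std::string> gEnv;

    void setEnvVariable(const std::string& key, const std::string& value) {
        gEnv[key] = value;
    }

    const char* getEnvVariable(const char* key) {
        auto it = gEnv.find(key);
        if (it != gEnv.end()) return it->second.c_str();
        return ::getenv(key);  // fall back to the real environment
    }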

TODOs and Problems:

  • Failed to get the menu back with the proper menu buttons (like this); actually, I have no idea where those menu button items come from...

  • OSG fails to get a .ttf reader even though the FreeType libs are successfully linked (ttf text can be rendered the GL-drawable way).

  • We may need to figure out which components of CalVR are not required on Android.

  • Started to integrate Karen's project. The submenu entrance is here.

[Notes] NDK

NDK (Native Development Kit):

  • Use C/C++ with Android
  • Access native activities / physical device components

Supporting SDK Tools

  • NDK, CMake, LLDB

Structure

  • *.cpp files go under cpp, and CMakeLists.txt under External Build Files

Build your own Native Proj

  • Gradle calls upon CMakeLists.txt
  • C++ source is compiled into a .so (shared object library), which Gradle then packages into the APK
  • Runtime: the Activity loads the native library using the following; C++ functions can then be called from Java
   static {
       System.loadLibrary("native-lib");
   }

Configure CMake

  • Sets the minimum version of CMake required to build the native library.
    cmake_minimum_required(VERSION 3.4.1)
  • Build your C++ as a lib: specify the library and its properties
    	add_library(yourOwnLibName 
    				SHARED 
    				src/main/cpp/cppFileName.cpp)
    
    Notice that you can use ${ANDROID_NDK} to include local NDK libs
  • Import a prebuilt library
     	add_library(importedLibName
     				SHARED
     				IMPORTED)
     	set_target_properties(importedLibName
     						  PROPERTIES IMPORTED_LOCATION
     						  ${ANDROID_ABI}/importedLibName.so)
    
  • Include the path to the header files
    	include_directories(path-to-include-folder)
    
  • Include an NDK library (e.g. log) and store its path in a variable (e.g. named log-lib)
     	find_library(log-lib
     				 log)
    
  • Link your lib to the NDK libs
    	target_link_libraries(yourOwnLibName
       				 	  ${log-lib}
       				 	  ${ndklib2}
       				 	  ${ndklib3}) 
    

Import a Native Project

Build a PURE Native App

Template for NDK

  • Create a Java class to call the lib
     	public class libClass{
     		// load shared lib
     		static{
     			System.loadLibrary("yourCppSharedLibName");
     		}
     		public static native void init(int width, int height);
     		public static native void step();
     	}
  • C++ Code
    The .cpp files contain the functions that will be called from Java. For example, the file linkerCppJava.cpp contains the function getCppLinker(), which will be called by the Java class javaClass.java under the path projectName/src/com/example/myProj/javaClass.java. Then the C++ code should be
     	extern "C" JNIEXPORT jstring JNICALL
     	Java_com_example_myProj_javaClass_getCppLinker(JNIEnv * env, jobject)
    where JNIEnv* points to the VM and jobject points to this; extern "C" prevents C++ name mangling. The naming pattern is Java_PATHofJAVA_functionName.
  • Pass a Java class to C++ via JNI
    Java part
     	public static native void JNI_FUNC_CALL(JAVACLASS instance);
    C++ interface function
     	JNI_METHOD(void, JNI_FUNC_CALL)(JNIEnv* env, jclass, jobject jobj){
     	    // Cast the jobject to the object used in native code; here the
     	    // Java AssetManager is converted to a native AAssetManager:
     	    AAssetManager * cpp_asset_manager = AAssetManager_fromJava(env, jobj);
     	    cpp_actual_func(cpp_asset_manager);
     	}
     	void cpp_actual_func(AAssetManager * manager);
    Notice that AAssetManager_fromJava provides an interface to obtain the native AAssetManager object corresponding to the Java-side AssetManager. A more general method is
     jclass localClass = env->FindClass(InterestedClass);
     // General pattern: reinterpret_cast between a native pointer and a jlong handle
     dest_type dst = reinterpret_cast<dest_type>(src_value);
     // For example, storing a native pointer as a jlong handle:
     myNativeClass * native_cls;
     jlong addr = reinterpret_cast<intptr_t>(native_cls);
     // And recovering the pointer from the handle:
     jlong addr;
     myNativeClass * native_cls = reinterpret_cast<myNativeClass *>(addr);

Add a new interface function

  1. Add the native function (plus a Java helper) to the Java class, so it can be called from an Activity.
  2. Declare and implement the native function in the C++ code (see the sketch below).
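
For illustration, both halves of such an interface function (the class, package, and function names are hypothetical):

    // Java side (in JniInterface.java, callable from an Activity):
    //     public static native void setScale(float scale);

    // C++ side -- the function name must encode the Java package/class path:
    extern "C" JNIEXPORT void JNICALL
    Java_com_example_myProj_JniInterface_setScale(JNIEnv* env, jclass, jfloat scale) {
        nativeController->setScale(scale);  // forward to the actual native code
    }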

[Notes] JNI Callback Template

Reference:
Signature
Calling Java Methods

    // Look up the Activity class, promote it to a global reference, and call a
    // no-argument void instance method on the stored Activity object:
    jclass helper_class = env->FindClass( "com/samsung/arcalvr/MainActivity" );
    if(helper_class){
        helper_class = static_cast<jclass>(env->NewGlobalRef(helper_class));
        jmethodID button_move_method = env->GetMethodID(helper_class, funcName, "()V");
        jobject object = GetMainActivityObj();
        env->CallVoidMethod(object, button_move_method);
    }

        /* 
         * Lambda func in cpp
         * [ capture clause ] (parameters) -> return-type{definition of method}
         */
static struct JNIData{
        jclass helper_class;
        jmethodID test_method;
        jmethodID button_move_method;
    }jniIds = [env]()->JNIData{
        constexpr char kHelperClassName[] = "com/samsung/arcalvr/JniInterface";
        constexpr char kTestFunctionName[] = "testCallBack";
        constexpr char kButtonMoveFunction[]="popButtons";
        constexpr char kSignature[] = "()V";

        jclass helper_class = env->FindClass( kHelperClassName );

        if(helper_class){
            helper_class = static_cast<jclass>(env->NewGlobalRef(helper_class));
            jmethodID test_method = env->GetStaticMethodID(
                    helper_class, kTestFunctionName, kSignature);
            jmethodID button_move_method = env->GetMethodID(
                    helper_class, kButtonMoveFunction, kSignature);
            return {helper_class, test_method, button_move_method};
        }
        LOGE("===CAN'T FIND HELPER CLASS");
        return {};
    }();
    if(!jniIds.helper_class)
        return;
    jmethodID constructor = env->GetMethodID(jniIds.helper_class, "<init>", "()V");
    env->CallStaticVoidMethod(jniIds.helper_class, jniIds.test_method);
    jobject object = env->NewObject(jniIds.helper_class, constructor);
    env->CallVoidMethod(object,jniIds.button_move_method);
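
The signature strings follow the standard JNI type encoding; a few illustrative examples:

    // "()V"                    -> void f()
    // "(I)Z"                   -> boolean f(int)
    // "(Ljava/lang/String;)V"  -> void f(String)
    // "([F)I"                  -> int f(float[])
    jmethodID m = env->GetMethodID(jniIds.helper_class, "popButtons", "()V");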

[Milestone] OSG + GLES + ARCore

Tasks

  1. CalVR uses the OpenSceneGraph (OSG) framework, and some drawables are implemented directly in old-version OpenGL (please refer to Weichen's note).
    In order to make CalVR work on the Android platform, we need to:
    • Figure out how OpenGL can be embedded into the OSG framework.
    • Figure out how GLES works inside this framework instead of OpenGL 1.0 (glBegin/glEnd, etc.).
    • Pay attention to dynamic drawing, which is not well supported by pure OSG without a GL drawable (for example, changing both the length and the contents of an IBO each frame).
  2. Fit the ARCore detection results into OSG and reimplement those features under the new framework: background from the phone camera, dynamic point clouds and planes, objects, light estimation and effects.
  3. Figure out how to align the real camera with the scene (viewer) camera in OSG.

Method Notes

  • GLES + OSG (see the sketch at the end of these notes)
    • Create a glDrawable class as the parent class of all drawables that use GL commands directly.
    • The subclass needs to override void Initialization(AAssetManager * manager, std::stack<utils::glState>* stateStack)
      and
      void drawImplementation(osg::RenderInfo&) const; For dynamic drawing, you may also need to write an updateOnFrame function to update buffer data.
    • Remember to use the state stack to save/restore GL state. This is a reimplementation of glPushAttrib and glPopAttrib (GLES removed those functions).
    • We MUST use Vertex Array Objects (VAOs) and Vertex Buffer Objects (VBOs) to draw GL geometry under this framework. As in desktop OpenGL, we can set the buffer usage to DYNAMIC and update the buffer data as needed.
  • Coordinate Difference
    ARCore uses the same world coordinates as OpenGL (+x to the right, +y up, eye toward -z), while OSG uses a different convention (+z up, eye toward +y). For GL drawables, feel free to keep everything in OpenGL coordinates (the final position is computed in the shader with the view/projection from the ARCore camera). Otherwise, use OSG coordinates.
  • Background occlusion
    // Draw the camera background first (bin 1) with depth testing off, then the
    // scene (bin 2) with depth testing on, so the scene draws over the background:
    backgroundNode->getOrCreateStateSet()->setMode(GL_DEPTH_TEST, osg::StateAttribute::OFF);
    backgroundNode->getOrCreateStateSet()->setRenderBinDetails(1, "RenderBin");
    sceneGroup->getOrCreateStateSet()->setRenderBinDetails(2, "RenderBin");
    sceneGroup->getOrCreateStateSet()->setMode(GL_DEPTH_TEST, osg::StateAttribute::ON);
    root->getOrCreateStateSet()->setMode(GL_DEPTH_TEST, osg::StateAttribute::OFF);
  • Results for now
    Image 1 shows the point cloud detection results and multiple-plane detection, with an OSG sphere under a dynamic light position.
    Image 2 shows the object with correct texture/lighting.
    Image 3 puts the OSG sphere in the real world by updating the OSG camera every frame.

  • TODOs:
    • RE-IMPLEMENT MY OWN MANIPULATOR
    • Figure out whether it is necessary, and how, to move the viewer camera along with the real camera. Without the change, the OSG sphere (without a shader) sticks to the same position on screen; with the change, the background easily disappears when the user's position/rotation changes.
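
A minimal sketch of a drawable following the GLES + OSG notes above. Our real base class is glDrawable with its Initialization/updateOnFrame hooks; to stay self-contained, this illustration derives straight from osg::Drawable, and all names are hypothetical:

    #include <osg/Drawable>
    #include <GLES3/gl3.h>

    class PointCloudDrawable : public osg::Drawable {
    public:
        PointCloudDrawable() { setUseDisplayList(false); }  // always call drawImplementation

        // Call once a GL context exists: create the VAO/VBO with dynamic usage.
        void init(size_t maxBytes) {
            glGenVertexArrays(1, &vao_);
            glGenBuffers(1, &vbo_);
            glBindVertexArray(vao_);
            glBindBuffer(GL_ARRAY_BUFFER, vbo_);
            glBufferData(GL_ARRAY_BUFFER, maxBytes, nullptr, GL_DYNAMIC_DRAW);
            glEnableVertexAttribArray(0);
            glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 0, nullptr);  // xyz + confidence
            glBindVertexArray(0);
        }

        // Per-frame update: both the length and the contents may change.
        void updateOnFrame(const float* xyzc, size_t count) {
            pointCount_ = count;
            glBindBuffer(GL_ARRAY_BUFFER, vbo_);
            glBufferSubData(GL_ARRAY_BUFFER, 0, count * 4 * sizeof(float), xyzc);
        }

        void drawImplementation(osg::RenderInfo&) const override {
            // Save/restore GL state here via the project's state stack
            // (the glPushAttrib/glPopAttrib replacement), then draw raw GLES:
            glBindVertexArray(vao_);
            glDrawArrays(GL_POINTS, 0, static_cast<GLsizei>(pointCount_));
            glBindVertexArray(0);
        }

    private:
        GLuint vao_ = 0, vbo_ = 0;
        size_t pointCount_ = 0;
    };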

[Note] OpenGLES-NDK proj

OpenGL ES

Setup Environment

Android Project Declaration

In the manifest file:

<uses-feature android:glEsVersion="0x00020000" android:required="true"/>

Also, when setting up the context, call
surfaceView.setEGLContextClientVersion(2)

to use OpenGL ES 2.0. To use 3.x+, check for the availability of the 3.x API at run time and then use those features if supported (see the sketch below).
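
A minimal native-side sketch of that runtime check (it assumes an EGL context is already current on the thread):

    #include <GLES2/gl2.h>
    #include <cstdio>

    // Parse the GL_VERSION string, e.g. "OpenGL ES 3.2 ...", to detect ES 3.x.
    bool supportsGles3() {
        const char* version = reinterpret_cast<const char*>(glGetString(GL_VERSION));
        int major = 0, minor = 0;
        return version &&
               std::sscanf(version, "OpenGL ES %d.%d", &major, &minor) == 2 &&
               major >= 3;
    }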

<supports-gl-texture android:name="COMPRESSION_METHOD"/>

to support texture compression

Related Classes

Two foundational classes in the Android framework let you create and manipulate graphics with the OpenGL ES API:

  • GLSurfaceView
    Like SurfaceView, it can be passed to setContentView() in an Activity.
    Extend it to implement touch events, etc.
  • GLSurfaceView.Renderer
    An interface defining the methods required for drawing graphics in a GLSurfaceView. Implement it and attach it to the GLSurfaceView via GLSurfaceView.setRenderer().
    • onSurfaceCreated(): set OpenGL environment parameters and do initialization
    • onDrawFrame(): called on each redraw
    • onSurfaceChanged(): called when the view geometry changes (size/orientation)
