
react-xr's Introduction

xr

Turn any R3F app into an interactive immersive experience.



npm install three @react-three/fiber @react-three/xr@latest

What does it look like?

A simple scene with a mesh that toggles its material color between "red" and "blue" when clicked, via touching or pointing.
import { Canvas } from '@react-three/fiber'
import { XR, createXRStore } from '@react-three/xr'
import { useState } from 'react'

const store = createXRStore()

export function App() {
  const [red, setRed] = useState(false)
  return (
    <>
      <button onClick={() => store.enterAR()}>Enter AR</button>
      <Canvas>
        <XR store={store}>
          <mesh pointerEventsType={{ deny: 'grab' }} onClick={() => setRed(!red)} position={[0, 1, -1]}>
            <boxGeometry />
            <meshBasicMaterial color={red ? 'red' : 'blue'} />
          </mesh>
        </XR>
      </Canvas>
    </>
  )
}

How to enable XR for your @react-three/fiber app?

  1. const store = createXRStore() creates an XR store.
  2. store.enterAR() enters AR when a button is clicked.
  3. <XR>...</XR> wraps your content in the XR component.

... or read this guide for converting a react-three/fiber app to XR.

Tutorials

Roadmap

  • 🤳 XR Gestures
  • 🕺 Tracked Body
  • ↕ react-three/controls

Migration guides

Sponsors

This project is supported by a few companies and individuals building cutting-edge 3D Web & XR experiences. Check them out!

Sponsors Overview

react-xr's People

Contributors

abernier, bbohlender, sawa-zen, shpowley, thesavior



react-xr's Issues

useXR() hook providing nothing

I am working in an ARCanvas. I include <DefaultXRControllers> in my canvas, and in my component I create const { controllers } = useXR() as shown in the documentation. However, when I console.log the controllers, they are undefined. I am trying to access the controllers' information so that I can understand what they take in, display it as text on the screen, and use it to develop more easily.

Is there a specific way to use the controllers to expose their information in a console.log?

useHitTest not mapping to a plane?

I have created hit-test reticles with a flat ring in three.js and WebXR, without React. One thing I am able to do, using the Matrix4 returned by the hit test, is lay a ring mesh flat on the physical surface. However, when doing the same thing with react-xr, I get a hit test, but it does not map the reticle flat to the plane.

ref.current.matrix.fromArray(hitMatrix.elements) is the callback I am using to copy the hit result's Matrix4 into the ref's matrix; however, it does not seem to be picking up the rotation of the plane. Does this have something to do with XRHitTestOptions.entityTypes? Would I need to somehow set this to planes, and if so, how does one do that with the react-xr component?

Support for Bloom in AR Canvas?

Hello Everyone!

I am trying out XR for the first time, with little experience in the react-three-fiber ecosystem. I have made a simple scene with a glowing orange sphere. The scene renders fine in a normal Canvas, as expected.

Live Demo | CodeSandbox


When entering the AR Session, no bloom effect is rendered.


I suspect this is an issue with alpha blending or something? Thanks in advance.

How to always show VR controller ray?

I'm using the Interactive component, but the ray only shows when the controller ray is hovering over the interactive object. It would be good UX to have a prop somewhere to always show the ray from the controller.

I can see that this is where the ray visibility gets set, but can it be overridden so that the ray is always visible?

how to apply hit-test to model

The sample uses useResource, which doesn't work with v6, so I switched to useRef.

useHitTest seems to be working, but how do I make a tap event "plant" the object at the hit-test location? Is there an example of that?

I tried adding hitMatrix to the callback:

useHitTest((hitMatrix, hit) => {
  // use hitMatrix to position any object on the real-life surface
})

but when I add hitMatrix, I get this error:
Uncaught TypeError: hit.decompose is not a function at App.js:60 at XRSession.<anonymous> (index.js:357)

If I remove hitMatrix and just do:

useHitTest((hit) => {
  hit.decompose(ref.current.position, ref.current.rotation, ref.current.scale)
})

the hit testing works, but I am not sure how to make the model "stick" to the plane.
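One way to make the model "stick" is to keep applying hit results until the user taps, then freeze the object. Below is a minimal plain-JavaScript sketch of that state machine; the names createPlacementState, onHit, and plant are illustrative, not part of react-xr, and a plain array stands in for a three.js Matrix4:

```javascript
// Illustrative placement state: follow the hit-test result until the user
// taps ("plants" the object), then ignore further hit results.
function createPlacementState() {
  let planted = false
  let lastHit = null
  return {
    // Called from the hit-test callback with the latest matrix elements.
    onHit(matrixElements) {
      if (!planted) lastHit = matrixElements
    },
    // Called from a select/tap handler: freeze the object where it is.
    plant() {
      if (lastHit !== null) planted = true
      return planted
    },
    get current() {
      return lastHit
    },
    get isPlanted() {
      return planted
    },
  }
}
```

In a component you would call onHit from inside useHitTest, call plant from a select/tap handler, and stop copying state.current onto the mesh once state.isPlanted is true.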

thanks,
Dale

Some of the demos in the ReadMe need to be updated

I did a quick overview using Oculus Browser v16, or my phone when testing AR.

https://codesandbox.io/s/react-xr-paddle-demo-v4uet ✅ might need to update dependencies but it still works

https://codesandbox.io/s/react-xr-simple-demo-8i9ro?file=/src/index.tsx ✅ might need to update dependencies but it still works

https://codesandbox.io/s/react-xr-simple-ar-demo-8w8hm?file=/src/App.tsx ✅ might need to update dependencies but it still works

https://codesandbox.io/s/react-xr-hands-demo-gczkp ✅❌ I tested the hands feature "locally" yesterday and it worked fine, but the hands don't show here, even after a quick dependency update. Not sure why; probably something minor.

https://codesandbox.io/s/react-xr-hands-physics-demo-tp97r ❌ The physics one probably needs the most work, as it seems to have outdated dependencies. After updating everything, the hands do show, but they pass through the cubes without doing anything.

Can't use GLB with AR

I was trying to use the test example with a GLB object, but I have the following problem: if I don't press the AR button, I can see the GLB model loaded; but if I press the "Go to AR" button, the camera does not start and I only see the same page in full screen.

Does anyone know why this happens?

The code:

const ThreeViewer = () => {
    function LoadGLB() {
        const gltf = useGLTF("MODEL-URL" , true)
        return <primitive object={gltf.scene} />
    }
    function HitTestExample() {
        const hitPoint = useRef()

        useHitTest((hitMatrix, hit) => {
            hitMatrix.decompose(hitPoint.current.position, hitPoint.current.rotation, hitPoint.current.scale)
        })

        return <Box ref={hitPoint} args={[0.1, 0.1, 0.1]} />
    }

    return (
        <>
            <ARCanvas sessionInit={{ requiredFeatures: ['hit-test'] }} camera={{ position: [0, 0, -0.3] }}>
                <ambientLight />
                <mesh position={[0, 0, 0]}>
                    <LoadGLB/>
                </mesh>
                <pointLight position={[10, 10, 10]} />
                <HitTestExample />
                <DefaultXRControllers />
            </ARCanvas>
        </>
    )
}


Access camera feed element on a smartphone

Hello,
I would like to access the camera feed and use it as a source for other functions. Is there any way to gain access to the video element while running a WebXR session on a smartphone?

react-three-gui compatibility

If I use an app that uses react-three-gui and I'm not in VR, I see the menu, but the controls don't seem to be added.
Example: <ControlsProvider><Canvas>...</Canvas><Controls /></ControlsProvider> works, but <ControlsProvider><VRCanvas>...</VRCanvas><Controls /></ControlsProvider> does not.

Note: I understand that in VR (after pressing the button) we can't use react-three-gui, so for VR this may be a minor issue.

To patch my app, I do something like this, but it isn't clean:

function Canvas2({ children, ...rest }) {
  const vr = true // in reality, vr comes from a URL parameter (http://app?vr=true/false)
  return vr ? (
    <VRCanvas>{children}</VRCanvas>
  ) : (
    <ControlsProvider>
      <Canvas>{children}</Canvas>
      <Controls />
    </ControlsProvider>
  )
}

virtual tour

Does this library contain a feature for virtual tours, like Google Street View?

Help with useHitTest

Hello, I'm a beginner developer. For a week now I have been struggling to understand how useHitTest works. As I understand it, this hook lets you detect hits in the AR environment. But how do I use it to place an object on a real surface? I would be glad for any help or hints.

XR seems to be taking control over any camera

I have created a simple AR demo with a portal; it works fine in a web browser with AR disabled. The portal renders an object placed in a parallel scene, and when I move around, it doesn't change position (as intended).

When I enter AR and launch this app on my phone, strange things happen. It seems like the portal renders based on my phone's position. I added a THREE.CameraHelper to see what is going on, and it appears that XR takes control of my other-dimension camera and snaps its position and rotation to my device's position and rotation.

I thought that maybe it just takes the first camera created in code and sets it as the XR camera used for display, so I created another dummyCamera. It turns out that no matter how many cameras I create (even cameras that aren't in the current THREE.Scene), XR always takes control of them and snaps them to the device transform.

On PC with AR disabled, I can see the cameras' wireframes via THREE.CameraHelper; on the phone, the helper just sticks to my position. How do I tell XR not to control these cameras?

VRButton overwrites sessionInit.optionalFeatures with an XRSession promise -> Bug when Entering, Exiting then Entering VR again with non default optionalFeatures (like layers)

First, thanks for the lib; it is a huge time saver!

Now about the bug: entering, exiting, and then entering VR again causes errors if you use WebXR layers, because VRButton overwrites sessionInit.optionalFeatures when clicked.

This means that any non default feature (again like WebXR layers) will break the second time you enter VR.

Just removing the sessionInit.optionalFeatures assignment should fix it. I will propose a PR shortly.
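The failure mode can be reproduced without WebXR at all: once a handler overwrites a field on the shared sessionInit object, the next call no longer sees the original features. A plain-JavaScript sketch of the pattern (requestSessionBuggy is a made-up stand-in for the button's click handler, not library code):

```javascript
// A shared init object, as passed to the canvas once at startup.
const sessionInit = { optionalFeatures: ['local-floor', 'layers'] }

// Stand-in for the buggy handler: it reads the features, then clobbers
// the field with something that is no longer an array (here, a Promise).
function requestSessionBuggy(init) {
  const granted = Array.isArray(init.optionalFeatures) ? [...init.optionalFeatures] : []
  init.optionalFeatures = Promise.resolve(granted) // the overwrite described above
  return granted
}
```

The first call returns both features; the second returns an empty list, which matches the "works once, breaks on re-entry" symptom. Copying the init object instead of mutating it avoids the problem.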

Can you view HTML or use react-three-gestures in AR?

I'm working on a project to display a few meshes in AR, but useDrag from react-three-gestures doesn't seem to work, so I opted for an HTML slider to move the position around; however, HTML does not seem to appear in AR mode. Does anyone know how to achieve this? Any help or pointers in the right direction would be great. Thanks in advance 🙏

When I use Next.js to build the demo code

/node_modules/three/examples/jsm/webxr/XRControllerModelFactory.js:1
import {
       ^

SyntaxError: Unexpected token {
    at Module._compile (internal/modules/cjs/loader.js:872:18)
    at Module._compile (/Users/rich/.config/yarn/global/node_modules/pirates/lib/index.js:99:24)
    at Module._extensions..js (internal/modules/cjs/loader.js:947:10)
    at Object.newLoader [as .js] (/Users/rich/.config/yarn/global/node_modules/pirates/lib/index.js:104:7)
    at Module.load (internal/modules/cjs/loader.js:790:32)
    at Function.Module._load (internal/modules/cjs/loader.js:703:12)
    at Module.require (internal/modules/cjs/loader.js:830:19)
    at require (internal/modules/cjs/helpers.js:68:18)
    at Object.<anonymous> (/Users/rich/i18n-fe/event/node_modules/react-xr/dist/index.cjs.js:12:32)
    at Module._compile (internal/modules/cjs/loader.js:936:30)

The source demo code is:

import React, { useRef, useState } from 'react'
import ReactDOM from 'react-dom'
import { VRCanvas } from 'react-xr'
import { Canvas, useFrame } from 'react-three-fiber'
import * as three from 'three'

function Box(props) {
  // This reference will give us direct access to the mesh
  const mesh = useRef()

  // Set up state for the hovered and active state
  const [hovered, setHover] = useState(false)
  const [active, setActive] = useState(false)

  // Rotate mesh every frame, this is outside of React without overhead
  useFrame(() => (mesh.current.rotation.x = mesh.current.rotation.y += 0.01))

  return (
      <mesh
        {...props}
        ref={mesh}
        scale={active ? [1.5, 1.5, 1.5] : [1, 1, 1]}
        onClick={(e) => setActive(!active)}
        onPointerOver={(e) => setHover(true)}
        onPointerOut={(e) => setHover(false)}>
        <boxBufferGeometry attach="geometry" args={[1, 1, 1]} />
        <meshStandardMaterial attach="material" color={hovered ? 'hotpink' : 'orange'} />
      </mesh>
  )
}

export default () => {
  return (<VRCanvas>
    <ambientLight />
    <pointLight position={[10, 10, 10]} />
    <Box position={[-1.2, 0, 0]} />
    <Box position={[1.2, 0, 0]} />
  </VRCanvas>)
}

Background scene gets faded (colour contrast) after changing <Canvas> to <VRCanvas>

Hello,

I am using react-three-fiber with three, and I want my site, which has a 360° panoramic background, to be viewable in VR by changing <Canvas> to <VRCanvas>:

import { Canvas} from 'react-three-fiber'
<Canvas >
    <ambientLight />
    <Controls />
    <Box imageURLs={imageURLs}/>
</Canvas>

Screen Shot 2563-10-15 at 15 49 11

to

import { VRCanvas } from '@react-three/xr'
<VRCanvas >
        <ambientLight />
        <Controls />
        <Box imageURLs={imageURLs}/>
</VRCanvas>

Screen Shot 2563-10-15 at 15 49 24

but the background colour is faded in the VRCanvas version compared to the normal one. I don't know what the problem is. Can anyone suggest how to solve it? Thank you a lot.

Ray geometry artifact running simple AR demo on WebXR Viewer by Mozilla

Hi there!

First of all, thanks for an amazing library! I've been fiddling around with the AR demo on WebXR Viewer by Mozilla (iOS). When running the demo, I noticed a mesh being rendered that covers a large portion of the screen; it occurs when the Select and Hover components are used on a raycast hit. Digging into the code base, I realized that this is because the raycast mesh (rendered once per controller by DefaultXRControllers) is too close to the camera when visible. See this line, which places the mesh's closest side at the same position as the camera. By adding an offset that moves the mesh away from the camera, the artifact is gone, e.g.:

      const OFFSET = 0.25
      ray.visible = true
      ray.scale.y = it.hoverRayLength - OFFSET / 2
      ray.position.z = -it.hoverRayLength / 2 - OFFSET

I'm quite new to three.js and WebXR, so I'm not 100% sure this is the correct approach to solving the problem. Another approach might be, for example, to set a clipping option on the camera.

Furthermore, the artifact is not present on Android with Chrome (latest release).

Let me know if you need any further info. Wouldn't mind creating a PR for whatever solution we come up with.

WebXR Viewer

Hi,

The one demo linked in the readme looks like pass-through mobile AR. How do I check out that demo? Should I use WebXR Viewer?

Grabbing

Since this library contains the pointing interaction, I think a grabbing interaction would be a very good fit to implement here, especially because both interaction paradigms should be able to coexist regardless of the input (controller, hand, mouse, ...).

Maybe we can discuss whether grabbing would fit into this library and how it could interoperate with the current pointing interaction.

(I've already started a POC for two- and one-handed grabbing, but it would need some refinement before being submitted as a PR.)

Support for Hololens

Hi,

I apologize for posting this question here, in case this is not appropriate.

I have looked into the examples, and I'm trying to understand whether react-xr supports Microsoft HoloLens hand recognition/interaction. If not, do you have any plans to support it?

Thanks in advance,
EM.

Controller Help

I'm trying to add a squeeze event on each controller, but I lose the context of the state; I'm not sure if I'm doing this right. I am able to attach a cube to the controller, though.

function Controls() {
  const [control, setControl] = useState(null)

  const { controllers } = useXR()

  const onSqueeze = useCallback((e) => { control.material.color.set('blue') }, [])
  useXREvent('squeeze', onSqueeze)

  useEffect(() => {
    controllers.forEach(({ controller, grip, inputSource }) => {
      const geometry = new THREE.BoxGeometry(0.2, 0.2, 0.2)
      const material = new THREE.MeshBasicMaterial({ color: 0x00ff00 })
      const cube = new THREE.Mesh(geometry, material)
      controller.add(cube)
      console.log(controller.id)
      setControl({ control: cube })
    })
  }, [controllers])

  return null
}

Refresh Rate is Low (Quest 2)

hi,

When trying either the demo with the blue box you can point at or the demo with the stack of cubes, I get a choppy frame rate. I'm on a Quest 2. I ran the demos both in the view where you can also see the code and in the dedicated preview window.

P.S. I did not try the other demos, but I assume they'd have the same issue.

@react-spring/three useSpring stops updating in XR session

This is a duplicate of bug report pmndrs/react-spring#1518.

I'm not sure if this a bug related to @react-spring/three, @react-three/fiber, @react-three/xr or a combination of all three.

Feel free to close if you think this bug relates to react-spring only and not this library.

To Reproduce

  1. Load repro link below
  2. The mesh changes scale on a timer every 1s; this is animated with a spring
  3. Start a VR session
  4. The mesh stops animating scale and stays at the scale it was when the xr session was started

Expected behavior

The mesh should continue to animate scale inside the XR session.

Link to repro

https://codesandbox.io/s/react-xr-react-spring-bug-fkzt3?file=/src/index.tsx

Environment

  • @react-spring/three >= v9.0.0
  • @react-three/fiber >= v6.0.3
  • @react-three/xr >= v2.1.0
  • react v17.x

Works as expected with the following old versions:

  • @react-spring/three v9.0.0-rc.3
  • react-three-fiber v5.x.x
  • @react-three/xr v2.0.1

Using drei HTML is not showing anything

I have tried to add HTML when using react-xr. I am using Html from drei but cannot get anything to show up.

import React from "react";
import { VRCanvas, DefaultXRControllers } from "@react-three/xr";
import { Html } from "drei";

function App() {
  const style = {
    height: "100px",
    width: "100px",
    backgroundColor: "red",
  };
  return (
    <div className="App">
      <VRCanvas>
        <ambientLight intensity={0.5} />
        <pointLight position={[5, 5, 5]} />
        <mesh position={[1, 1, 1]}>
          <Html
            prepend
            center
            fullscreen
            scaleFactor={10}
            zIndexRange={[100, 0]}
          >
            <h1 style={style}>hello</h1>
            <p>world</p>
          </Html>
        </mesh>
        <DefaultXRControllers />
      </VRCanvas>
    </div>
  );
}

export default App;

Class constructor Loader cannot be invoked without 'new'

Hello!

Getting this error with version 3.0.1. Error disappears when rolling back to 3.0.0.

Deps:

    "@react-spring/three": "9.1.2",
    "@react-three/drei": "4.3.2",
    "@react-three/fiber": "6.0.17",
    "@react-three/xr": "3.0.1",
    "prop-types": "15.7.2",
    "react": "17.0.2",
    "react-dom": "17.0.2",
    "react-scripts": "4.0.3",
    "three": "0.128.0"

Live controller position

Hello, I am trying to get the live position of the controllers so I can attach an object to them.

Currently I am using the useController() hook; it works great for getting the position and rotation data, but the values don't update as the controller moves.

By the way, I'm currently working on an open-source project about learning WebXR (by building a VR drumkit), and react-xr has been incredibly useful and easy to get up to speed with. I'll leave the links if anyone wants to check it out.
https://learning-webxr.now.sh/
https://github.com/Marguelgtz/learning-webxr

const leftController = useController('left')

return leftController ? (
  <Box
    castShadow
    args={[0.04, 0.04, 0.5]}
    position={[
      leftController.controller.position.x,
      leftController.controller.position.y,
      leftController.controller.position.z,
    ]}
    rotation={[
      leftController.controller.rotation.x,
      leftController.controller.rotation.y,
      leftController.controller.rotation.z,
    ]}
  >
    <meshStandardMaterial color="#966F33" />
  </Box>
) : null
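The values go stale because the controller object is mutated in place every frame, so anything copied once at render time never updates; the usual fix is to copy the transform every frame (e.g. inside r3f's useFrame) or to parent the object directly to the controller. A plain-JavaScript sketch of the difference, with plain objects standing in for three.js Object3Ds (makeFrameLoop is illustrative, not library code):

```javascript
// Minimal stand-in for a per-frame loop such as r3f's useFrame.
function makeFrameLoop() {
  const callbacks = []
  return {
    onFrame(cb) { callbacks.push(cb) },
    tick() { callbacks.forEach((cb) => cb()) },
  }
}

const controller = { position: { x: 0, y: 0, z: 0 } }
const box = { position: { x: 0, y: 0, z: 0 } }

const loop = makeFrameLoop()
// Copy the live controller position onto the box every frame:
loop.onFrame(() => { Object.assign(box.position, controller.position) })

// A one-time snapshot, like reading the position during render:
const snapshot = { ...controller.position }

controller.position.x = 1.5 // the controller moved
loop.tick()
```

After the tick, box.position follows the controller while the snapshot still holds the old value, which is exactly the symptom described above.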

Add access to XRFrame

Hi,

I made a little demo which includes react-xr, and for that I wrote a useXRFrame hook. It basically passes the current XRFrame to a callback in the loop of an XR session, so it can be used in a component like this:
useXRFrame(callback)

Maybe it could also be useful for this package? XRFrame holds some useful information about an active XR session.

selected interaction doesn't report intersection field

I'm trying to implement a TextInput in VR space, and to place a caret I need to know where the selection happened in local space. In non-immersive mode I use an onClick handler and the point and object fields of the event to calculate it. XRInteractionEvent has a nullable intersection field that contains these fields; however, I'm struggling to produce an event where intersection is not undefined.

Trigger select event on closest object only

I am trying to build an interaction where one of many cubes in the scene can be selected. All the cubes are wrapped separately in an <Interactive> component.

However what I've noticed is that when multiple objects are on the hit ray's path, the interaction will be triggered for all of them. This is a common case since the scene is fully packed with objects.

Would it be desirable to introduce some mechanism that specifies that only the closest object hit by the ray should have an interaction triggered?

I think the workaround right now is to add onHover/onBlur handlers, store the distances extracted from the intersection objects externally, and work from there... but that seems a bit involved for such a common use case.
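The workaround described above can be sketched in plain JavaScript: track the hover distance per object and only react when the object is the closest one currently hit. The helper name createClosestHitTracker and its methods are illustrative, not part of react-xr:

```javascript
// Illustrative tracker: remember each hovered object's ray distance and
// let a select handler ask whether its object is the closest hit.
function createClosestHitTracker() {
  const distances = new Map()
  return {
    // Wire these into the <Interactive> onHover/onBlur callbacks.
    onHover(object, intersection) { distances.set(object, intersection.distance) },
    onBlur(object) { distances.delete(object) },
    // Call this at the top of an onSelect handler; bail out if false.
    isClosest(object) {
      const d = distances.get(object)
      if (d === undefined) return false
      for (const other of distances.values()) {
        if (other < d) return false
      }
      return true
    },
  }
}
```

Each cube's select handler would then start with if (!tracker.isClosest(mesh)) return, so only the nearest object on the ray's path reacts.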

useHitTest doesn't work

When trying to use the useHitTest hook, I encounter this error message:

TypeError: Cannot read property 'requestAnimationFrame' of null
    at Object.current (index.js:346)
    at renderGl (web.js:70)
    at web.js:1156
    at onAnimationFrame (three.module.js:22288)
    at Object.onAnimationFrame [as callback] (three.module.js:12286)
    at XRSession$1.<computed>.onDeviceFrame (<anonymous>:1281:42)

Looking at the useHitTest function (XR.tsx), sessionRef.current returns null on line 77. Unfortunately I'm not exactly sure how all of this is working so it's a bit tricky for me to debug any further.

I am using the appropriate sessionInit={{ requiredFeatures: ['hit-test'] }} property on the ARCanvas tag. I think it would be great to get a demo up and running for this feature.

Moving camera on XR load

Hello!

Thank you for developing this library; I'm really enjoying the react-three-fiber ecosystem for learning three.js! I have been playing around with the ARCanvas component and was wondering if there is a way to move the initial camera after activating AR. Is there a hook, input, or technique to move the initial starting point from (0,0,0) to a target coordinate?

Cheers! 😄

Input events API discussion

I can see two types of events that would be useful in XR apps:

  1. Events that can be attached to in-game objects (e.g., hovering over a button)
  2. 'Standalone' events (e.g., a trigger press)

For standalone events I think this should be straightforward, exposing a hook like this:

useXREvent(() => console.log('pew!'), { type: "select", handedness: "left" }); 

Now, for events attached to objects, I have a few ideas.

hooks could look something like this:

const ref1 = useXRInput("pointerover", () => setHover(true));
const ref2 = useXRInput("pointerout", () => setHover(false));

<mesh ref={mergeRefs([ref1, ref2])} />

or, to limit the number of refs, it might look like this:

const ref = useXRInputs({
  'pointerover': () => ...
  'pointerout': () => ...
})

<mesh ref={ref} />

but I also propose exposing more high-level components to handle events:

<Hover onChange={value => setHover(value)}>
  <mesh />
</Hover>

and, under the hood, attaching the events to the three.js group object inside those components.

what do you think?
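The standalone-event proposal above boils down to dispatching with a filter on type and handedness. A plain-JavaScript sketch of that matching logic (createXREventBus is a made-up name for illustration, not the library's internals):

```javascript
// Illustrative event bus: listeners register with an optional filter,
// and dispatch only invokes the handlers whose filter matches.
function createXREventBus() {
  const listeners = []
  return {
    // filter may constrain type and/or handedness; omitted keys match anything.
    on(handler, filter = {}) { listeners.push({ handler, filter }) },
    dispatch(event) {
      for (const { handler, filter } of listeners) {
        if (filter.type && filter.type !== event.type) continue
        if (filter.handedness && filter.handedness !== event.handedness) continue
        handler(event)
      }
    },
  }
}
```

A useXREvent-style hook would then just register its callback on such a bus with the filter it was given, and unregister on unmount.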

Cannot open locally hosted site in WebXR Viewer

I followed the instructions exactly as listed in the react-xr examples: clone the repo, yarn, then start. I am using an iPhone XS with the latest versions of WebXR Viewer and iOS. However, when I try to access the locally hosted instance by going to https://<computer's-ip-address> with both my computer and iPhone on the same network, WebXR Viewer is not able to access the site. Is there something missing from the instructions for accessing WebXR from a localhost? Do I need to do anything specific to be able to access a locally hosted instance?

Using Select Component when child object is positioned at [0,0,0] (AR)

If an object is positioned at the origin of the scene (0,0,0) and wrapped by a Select component, the controller's hovering state for that object is set to true at start, before any event handler is hooked up. As a result, the user has to click beside the object once before the Select component works. This behavior can be seen in the demo AR scene if you position the Box at [0,0,0].

The reason for this behavior is that the controller's initial position is [0,0,0], so the controller's ray intersects the object at the origin from the start.

My suggestion is to add a check that inputSource is defined before adding an object to a controller's hovering state:

            if (it.inputSource && !hovering.has(eventObject) && handlers.onHover.has(eventObject)) {
              hovering.add(eventObject)
              handlers.onHover.get(eventObject)?.({ controller: it, intersection })
            }

At: https://github.com/react-spring/react-xr/blob/299273209decc8d6c49351b2e4df799b475f0380/src/XR.tsx#L91

This works because inputSource is first defined in the controller's connected event handler. When running in AR mode (I haven't tested VR mode), inputSource is only defined while the user is touching the screen (I'm not sure if this is the intended behaviour; if not, my fix above might not be the best solution).

Tested this on Android in Chrome 84.

Let me know if this is a desired solution. If it is I can make a PR.

ARCanvas is pixelated

I have the following two repos and associated GitHub Pages sites that build nearly the same thing, once in three.js/WebXR and once in react-xr with react-three-fiber.

In react-three-fiber/xr, the ARCanvas renders everything in a very pixelated form.
This component is where I initiate my ARCanvas in React, and it has the pixelation issues,

and this is the basic three.js/WebXR version where I initiate my scene and have no pixelation problems.

Can someone please help me understand what configuration I am missing in my React version to prevent pixelation?

The device I use is an iPhone XS with the WebXR Viewer app.
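Pixelation on a high-DPI phone usually means the drawing buffer is smaller than the element's CSS size. Plain three.js examples typically call renderer.setPixelRatio(window.devicePixelRatio), and react-three-fiber's Canvas exposes a similar setting (pixelRatio in v5, dpr in later versions, if I'm reading the docs right) that ARCanvas should forward. The arithmetic that setting controls, as a sketch:

```javascript
// Compute the drawing-buffer size from CSS size and device pixel ratio.
// With dpr left at 1 on a 3x display, the buffer is a third of the
// physical resolution, which shows up as pixelation.
function bufferSize(cssWidth, cssHeight, dpr) {
  return {
    width: Math.round(cssWidth * dpr),
    height: Math.round(cssHeight * dpr),
  }
}
```

The example iPhone XS numbers here (390×844 CSS pixels, dpr 3) are illustrative assumptions, not measurements from the reporter's app.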

<Hands> doesn't show in the example using oculus browser v16

Hello,

I'm barely starting with VR so I might be reporting something wrong.

I opened one of the hands demos in the readme (https://github.com/pmndrs/react-xr) with my Oculus Quest 2, and the hands didn't show. I tried again after updating all dependencies, but still couldn't see the hands.

I believe I'm using Oculus Browser v16, since navigator.userAgent returned:
"Mozilla/5.0 (X11; Linux x86_64; Quest 2) AppleWebKit/537.36 (KHTML, like Gecko) **OculusBrowser/16.0.0.2.13.297174309** SamsungBrowser/4.0 Chrome/91.0.4472.88 Safari/537.36"

I recorded what I saw after opening the current demo from the readme. You can see that the hands don't show when I close the Oculus menu.


I looked at the browser log through Chrome remote debugging and got only these two warnings:

Warning: meshopt_decoder is using experimental SIMD support
Warning: Each child in a list should have a unique "key" prop.

If it's really a bug and not just me, I'd love to try fixing it myself if you want; I'll probably just need some pointers on where to look.

Let me know if I can at least provide more information or run more tests.

How to use AR.js with react-xr?

I am learning to switch from A-frame to RTF's Ecosystem.
In @react-three/xr 's README.md, I found an ar 's example, which is great. I believe that AR interaction is already a huge demand.It connects the real space and adds great value to the objects in the real space.
Now I hope to develop an AR app like Aframe to make some fun recognition behaviors, such as Image Tracking, Location Based, Marker Based,
but in @react-three/xr I have not found any support or examples. Is anyone with similar development experience can give me some guidance here? thank you~

useXRFrame does not run if the hook is initially called in a non XR session

Relatively simple example with this issue: https://codesandbox.io/s/lingering-wood-6i2zj?file=/src/App.tsx

Here the useXRFrame animation only works after something else forces a re-render of the component.

Looking at the useXRFrame PR (https://github.com/pmndrs/react-xr/pull/39/files), I can see that the useEffect that sets things up has gl.xr.isPresenting as a dependency; however, it appears this value is not reactive and won't change when the XR status changes.

What is the recommended way to deal with this? I suppose ideally useXRFrame would observe this status and start calling the callback when it changes.

Is there a recommended workaround?

Suggested solution

Perhaps it is possible to create a useIsPresenting hook, then it could both be used within useXRFrame and available generally?

Having a look at source code seems like it could be added like:

function useIsPresenting() {
  return useContext(XRContext).isPresenting;
}

If this makes any sense (I am very new to this library), I can make a PR. Or maybe just change gl.xr?.isPresenting (and the dependency) to useContext(XRContext).isPresenting.

Allow custom optional features for VRButton

VRButton does not allow any customization of the session the way ARButton does with sessionInit.

Since VRButton's options parameter has been removed, I propose allowing users to specify sessionInit (or at least optional features) on VRButton as well, given the addition of layers support in three.

// Configures layers with R3F (see https://threejs.org/examples/#webxr_vr_layers)
<VRCanvas sessionInit={{ optionalFeatures: [ 'local-floor', 'bounded-floor', 'hand-tracking', 'layers' ] }} />
