expo / expo-three
Utilities for using THREE.js on Expo
License: MIT License
I can't get expo to properly suppress the warnings. I put this at the top of my root file:
import { THREE } from 'expo-three';
THREE.suppressExpoWarnings(true);
...but I still get tons of webgl warnings. Am I doing something wrong here? Using Android.
I wasn't sure if this was intentional, but this issue gets me a lot when I'm prototyping w/ random images off the internet :)
If I use a URL that includes anything after the file extension, I get the "Unrecognized File Type" red screen. It's pretty easy to reproduce.
It's not a huge issue, since there's a workaround (strip the stuff after the file extension).
// fails with query string (cache busters, etc)
const url = 'http://2.bp.blogspot.com/-NpG5ASzb7Uo/URjmnl0PTGI/AAAAAAAAEOs/UbGARn9OTsE/s1600/heart_pixelart_grid.png?haha=haha';
// fails with hash
const url = 'http://2.bp.blogspot.com/-NpG5ASzb7Uo/URjmnl0PTGI/AAAAAAAAEOs/UbGARn9OTsE/s1600/heart_pixelart_grid.png#hashie';
// works
const url = 'http://2.bp.blogspot.com/-NpG5ASzb7Uo/URjmnl0PTGI/AAAAAAAAEOs/UbGARn9OTsE/s1600/heart_pixelart_grid.png';
const spriteTexture = await ExpoTHREE.loadAsync(url);
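Until the loader handles this, the workaround can be wrapped in a small helper (a hypothetical `stripUrlExtras`, not part of expo-three) that drops the query string and hash before the URL reaches the loader:

```javascript
// Hypothetical helper (not part of expo-three): strip query strings and
// hashes so the loader sees a URL ending in the real file extension.
function stripUrlExtras(url) {
  // everything before the first '?' or '#'
  return url.split(/[?#]/)[0];
}

const dirty = 'http://example.com/heart_pixelart_grid.png?haha=haha#hashie';
const clean = stripUrlExtras(dirty);
// clean === 'http://example.com/heart_pixelart_grid.png'
```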
Not sure how to approach this, tried:
const pickedImage = await ImagePicker.launchImageLibraryAsync({
base64: true,
allowsEditing: false
});
if (pickedImage.cancelled) { return; }
const base64 = "data:image/jpeg;base64," + pickedImage.base64;
const uri = pickedImage.uri.replace("file://", "");
const file = pickedImage.uri;
const texture1 = await ExpoTHREE.createTextureAsync(base64); // error
const texture2 = await ExpoTHREE.createTextureAsync(uri); // error
const texture3 = await ExpoTHREE.createTextureAsync(file); // error
const texture4 = THREE.ImageUtils.loadTexture(base64); // white mesh
const texture5 = THREE.ImageUtils.loadTexture(uri); // white mesh
const geometry = new THREE.PlaneGeometry(0.1, 0.1);
const material = new THREE.MeshLambertMaterial({
map: texture // 1..5
});
const mesh = new THREE.Mesh(geometry, material);
mesh.position.z = -0.1;
this.scene.add(mesh);
Nothing seems to work :)
Images can be loaded as a material using code like the following:
const img = require('./assets/images/myImage.jpg');
const material = new THREE.MeshBasicMaterial({
map: await ExpoTHREE.createTextureAsync({
asset: Expo.Asset.fromModule(img),
})
});
How can we create a material from a remote file determined at runtime, where require() is not available and we cannot create an Expo.Asset?
Here are some attempts I've made, with no luck:
var expoAsset = new Expo.Asset();
expoAsset.uri = "https://s3.amazonaws.com/exp-brand-assets/ExponentEmptyManifest_192.png";
const material = new THREE.MeshBasicMaterial({
map: await ExpoTHREE.createTextureAsync({
asset: expoAsset
})
});
var textureLoader = new THREE.TextureLoader();
textureLoader.load("https://s3.amazonaws.com/exp-brand-assets/ExponentEmptyManifest_192.png", function(loadedTexture) {
const material = new THREE.MeshBasicMaterial({ map: loadedTexture });
});
var loadingManager = new THREE.LoadingManager();
var imageLoader = new THREE.ImageLoader(loadingManager);
imageLoader.load("https://s3.amazonaws.com/exp-brand-assets/ExponentEmptyManifest_192.png", function(image) {
var texture = new THREE.Texture();
texture.image = image;
var material = new THREE.MeshBasicMaterial( {map: texture});
});
I am using expo-three to create a 3D space and handling touches with a PanResponder. How would I approach converting the PanResponder x, y coordinates to identify where they land in my 3D model with respect to the camera?
My assumption was that this would be built in, but I have not been able to find a suitable solution.
Thanks in advance.
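For what it's worth, three.js covers half of this with THREE.Raycaster.setFromCamera, which expects normalized device coordinates. A minimal sketch of the conversion, assuming the touch coordinates are view-relative pixels with the origin at the top-left and the GLView has the same width/height:

```javascript
// Convert view-space touch coordinates (origin top-left, in pixels) into
// normalized device coordinates in [-1, 1], as expected by
// THREE.Raycaster.setFromCamera. Hypothetical helper; assumes the touch
// coordinates and the GLView share the same width/height.
function touchToNDC(x, y, width, height) {
  return {
    x: (x / width) * 2 - 1,
    y: -(y / height) * 2 + 1, // flip: screen y grows downward, NDC y grows upward
  };
}

// Usage with a raycaster (sketch):
// const ndc = touchToNDC(gestureState.x0, gestureState.y0, width, height);
// raycaster.setFromCamera(new THREE.Vector2(ndc.x, ndc.y), camera);
// const hits = raycaster.intersectObjects(scene.children, true);
```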
I have faced this problem many times: it works at first when I run the example you provide, and then this suddenly appears, and it keeps appearing even after restarting my laptop.
I haven't figured out why it happens. Everything is set up correctly, and I used the latest version from the master branch.
Hello, thank you for your wonderful library.
I have spent so many days trying to figure out how to use the camera to be ThreeJS background.
Right now, we have two options:
createCameraTextureAsync, a function on the Expo.GLView instance, used to get a new WebGLTexture instance.
ExpoTHREE.AR.BackgroundTexture, which also returns a new WebGLTexture instance; however, it currently only supports iOS (mine is Android).
I have also found two examples. One does not use createCameraTextureAsync; instead, it just passes the Expo.Camera instance as a uniform, and it works. I have no idea why that happens. Right now my code is this:
onContextCreate = async gl => {
const cameraTexture = await this.GLView.current.createCameraTextureAsync(
this.Camera.current
);
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
75,
gl.drawingBufferWidth / gl.drawingBufferHeight,
0.1,
1000
);
const renderer = new ExpoTHREE.Renderer({ gl });
renderer.setSize(gl.drawingBufferWidth, gl.drawingBufferHeight);
camera.position.z = 4;
const texture = new THREE.Texture();
const properties = renderer.properties.get(texture);
properties.__webglInit = true;
properties.__webglTexture = cameraTexture;
scene.background = texture;
}
After running this code, my app only shows the first image the camera captured, and then it stops.
My GLView does not auto-refresh; it only ever shows the first frame my camera captured.
I want to ask: why does ExpoTHREE.AR.BackgroundTexture auto-refresh, but createCameraTextureAsync won't? What is the right way to use createCameraTextureAsync with Three.js?
Thank you for reading my question!
Hi, it's me again.
If I were using three in a browser, I would call the toDataURL method on the canvas. Is there anything similar available in expo-three?
I need to get the rendered buffer and save it to a high quality png.
Edit: I used Expo.takeSnapshotAsync from Expo on the view containing the Expo.GLView and it worked for iOS but on Android I get only a black png.
Thanks.
I was following this tutorial: https://blog.expo.io/introducing-expo-ar-mobile-augmented-reality-with-javascript-powered-by-arkit-b0d5a02ff23
Then, when I ran this command: npm i -S three expo-three, I got a lot of errors.
Command line errors:
PS D:\Moji dokumenti\diplomska\app> npm i -S three expo-three
@expo/[email protected] postinstall D:\Moji dokumenti\diplomska\app\node_modules@expo\browser-polyfill
find ../ -name .babelrc -delete
FIND: Parameter format not correct
my-new-project@ D:\Moji dokumenti\diplomska\app
`-- [email protected]
npm WARN [email protected] requires a peer of eslint@^3.17.0 || ^4.0.0 but none was installed.
npm WARN [email protected] requires a peer of expo@^25.0.0 but none was installed.
npm WARN [email protected] requires a peer of react-native@^0.51 || ^0.52 || ^0.53 || ^0.54 but none was installed.
npm WARN [email protected] requires a peer of [email protected] but none was installed.
npm WARN [email protected] requires a peer of react-native@^0.44.1 but none was installed.
npm WARN [email protected] requires a peer of expo@^28.0.0 but none was installed.
npm WARN [email protected] requires a peer of [email protected] but none was installed.
npm WARN @expo/[email protected] requires a peer of expo@^28.0.0 but none was installed.
npm WARN @expo/[email protected] requires a peer of [email protected] but none was installed.
npm ERR! code ELIFECYCLE
npm ERR! errno 2
npm ERR! @expo/[email protected] postinstall: find ../ -name .babelrc -delete
npm ERR! Exit status 2
npm ERR!
npm ERR! Failed at the @expo/[email protected] postinstall script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm update check failed
Try running with sudo or get access to the local update config store via
sudo chown -R
npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\User\AppData\Roaming\npm-cache_logs\2018-08-02T08_21_58_080Z-debug.log
What can I do to solve this problem?
My installed versions:
Is there a way to use shaders and THREE.EffectComposer along with ExpoTHREE?
The actual wanted behaviour is something like this:
// In the onContextCreate method
AR.setPlaneDetection(AR.PlaneDetectionTypes.Horizontal);
this.renderer = new ExpoTHREE.Renderer({ gl, width, height, pixelRatio });
this.scene = new THREE.Scene();
this.scene.background = new ThreeAR.BackgroundTexture(this.renderer);
this.camera = new ThreeAR.Camera(width, height, 0.01, 1000);
this.renderPass = new THREE.RenderPass(this.scene, this.camera);
this.shaderPass = new THREE.ShaderPass(CustomGrayScaleShader);
this.shaderPass.renderToScreen = true;
this.shaderPass.enabled = true;
this.effectComposer = new THREE.EffectComposer(this.renderer);
this.effectComposer.addPass(this.renderPass);
this.effectComposer.addPass(this.shaderPass);
// And in the render method
this.effectComposer.render();
Hello,
Versions:
Expo: 28
Expo-Three: 3.0.0-alpha.2
Firebase: 5.0.4
Issue:
Whenever I import anything from expo-three, some components of the Firebase SDK do not work properly. Specifically, querying the realtime database does not work. The callback functions do not get called, but no errors are logged.
firebase.database().ref("test").on("value", fnSuccess, fnFailure);
I've narrowed it down to expo-three and put a simple app here:
https://github.com/ericmorgan1/expo-three-firebase
In App.js, commenting/uncommenting the expo-three import will cause the Firebase callbacks to either fire or not fire.
I know this issue is related to a third-party product, and I would be happy to do some debugging or other testing. But since there were no error messages, it was difficult figuring out where to begin.
I also tried this with Firebase SDK v4.10 and got the same issues.
Here's the code: https://github.com/abdielou/npot-bug-demo
When rendering models with NPOT texture images, the app breaks with the following stack trace:
Can't find variable: HTMLImageElement
makePowerOfTwo
uploadTexture three.js:18101:25
setTexture2D
setTexture2D
setValueT1 three.js:4971:2
...
Version:
"dependencies": {
"expo": "^25.0.0",
"expo-graphics": "^0.0.3",
"expo-three": "^2.2.2-alpha.1",
"react": "16.2.0",
"react-native": "0.52.0",
"three": "^0.90.0"
}
Hi,
I was trying to test out the augmented reality portion, but I only have an Android device. Given that ARCore recently released a new version, will you also make this compatible with ARCore?
I really like what has been done to get three.js working properly on react native and appreciate that this project is literally about three.js on expo but I don't use expo and don't want to add 30 odd megabytes to my app size just to get WebGL and three.js working.
I tried installing it, and even though expo isn't in the dependency list (it should be, as it stands), it fails to work because it needs expo.
Trawling through the code, I see most of the references are to small expo utility functions, so it seems like the dependency could be unwound pretty easily.
Is there any interest from the maintainer(s) in doing this, or is it kept this way for business reasons of platform lock-in?
Sorry, I am having a hard time learning how to use this method. Can someone recommend where I can find examples of placing an object on a detected plane (such as a length-measurer example)?
I have googled many times, but all the examples I find mainly involve three.js only, without any plane manipulation with objects.
I am sorry; I hope I am not asking these questions in the wrong place. I also tried to find [expo-three] on Stack Overflow, but with no luck.
Kevin
Hi, I am having an issue getting shadows to work on my ios device(iphone se, ios 11.3). Made a simple scene to illustrate the problem, the topleft image on the screenshots below is the output of the shadowcamera.
When running examples from threejs.org in the browser the shadows works as expected.
Anyone else had problems like that?
Any ideas what could be causing it?
Thanks!
How can I use a ShaderMaterial as a texture? I do not want to use the WebGL APIs directly.
Using a localUri like this: file:///data/user/0/host.exp.exponent/cache/ExperienceData/%2540arttu%252Fsignlab-ar-prototype/ExponentAsset-edefa90125f1739a3f962678f2406a92.bin
Android outputs an error from THREE.FileLoader:
https://github.com/mrdoob/three.js/blob/master/examples/js/loaders/GLTFLoader.js#L1778
The problem sounded similar to the one in the link below, being an issue only on Android and related to localUri. However, Expo.FileSystem is not being used, and the Expo.Asset.downloaded flag is set to true.
https://github.com/expo/expo-three/issues/10/
I also noticed that AssimpLoaderExample.js from expo-three outputs a similar error on Android. I guess they are somehow related, as both use THREE.FileLoader and a binary buffer file format.
All of this works perfectly on iOS!
Displaying text on a plane with Three requires placing the text on a canvas, which is then used as a texture.
However, doing this in Expo causes the error 'Invalid pixel data argument for gl.textImage2D()', I imagine because ExpoTHREE needs to wrap this texture data somehow.
Am I missing something, or is native magic needed for this?
Steps to Reproduce:
"assetBundlePatterns": [
"assets/sprites/bg.png"
],
to app.json.
At this step, the local project worked properly, but in the app installed from the built APK there is no background image, just a black screen (with 2.x expo-three versions) or a white screen (with 3.0.0-alpha.2). There is a bg.png in the folder extracted by apktool, and I can use it, for example, for an RN Image like this:
<Image source={Asset.fromModule(Files.sprites.bg)} style={{width: '100%', height: '100%'}}/>
and it appears properly in the standalone app.
I tried removing Asset.fromModule().downloadAsync() since we have a bundled file; I tried creating the texture with the THREE.Texture constructor and with THREE.TextureLoader().load, passing them an Expo.Asset, a require(), or just a string path; and I tried different paths like "assets/sprites/bg.png", "./assets/sprites/bg.png", "/assets/sprites/bg.png", and so on.
There is another game I develop that uses a different asset system than the Flappy example; I also tried different ways to get assets and create textures there, but it has exactly the same issue.
I tested this on an Android 5.0 device.
Getting an error -> "ExpoTHREE.loadAsync: OMG this asset couldn't be downloaded! Open an issue on GitHub"
App.json ->
{
"expo": {
"sdkVersion": "27.0.0",
"packagerOpts": {
"assetExts": [
"dae",
"obj",
"amf",
"mtl",
"3DS"
]
}
}
}
uncapitalised, I'm getting an unexpected character error
I'm trying to load an obj with an mtl, but for some reason the mtl is never applied.
code ->
import Expo from 'expo';
import React from 'react';
import * as THREE from 'three';
import ExpoTHREE from 'expo-three';
console.disableYellowBox = true;
export default class App extends React.Component {
render() {
return (
<Expo.GLView
ref={(ref) => this._glView = ref}
style={{ flex: 1 }}
onContextCreate={this._onGLContextCreate}
/>
);
}
_onGLContextCreate = async (gl) => {
const width = gl.drawingBufferWidth;
const height = gl.drawingBufferHeight;
const arSession = await this._glView.startARSessionAsync();
const scene = new THREE.Scene();
const camera = ExpoTHREE.createARCamera(arSession, width, height, 0.01, 1000);
const renderer = ExpoTHREE.createRenderer({ gl });
renderer.setSize(gl.drawingBufferWidth, gl.drawingBufferHeight);
scene.background = ExpoTHREE.createARBackgroundTexture(arSession, renderer);
const ambientLight = new THREE.AmbientLight(0xaaaaaa);
scene.add(ambientLight);
// //model files
const model = {
'artest.obj': require('./models/artest.obj'),
'artest.mtl': require('./models/artest.mtl'),
};
/// Load model
const mesh = await ExpoTHREE.loadAsync(
[model['artest.obj'], model['artest.mtl']],
null,
name => model[name],
);
/// Update size and position
ExpoTHREE.utils.scaleLongestSideToSize(mesh, 5);
ExpoTHREE.utils.alignMesh(mesh, { y: 1 });
mesh.scale.set(0.00001, 0.00001, 0.00001);
/// Add the mesh to the scene
scene.add(mesh);
const animate = () => {
requestAnimationFrame(animate);
renderer.render(scene, camera);
gl.endFrameEXP();
}
animate();
}
}
If you have any input as to what I might be doing wrong, that would be great.
I need to use bones created in Blender to control my mesh in ExpoTHREE. Is there an example of loading .json files and rendering them in the scene?
I tried await ExpoTHREE.parseAsync(require('../../Assets/models/hand.json')) to load the mesh, but I am getting a strange error: Requiring unknown module "3324".
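One possible cause (an assumption, not a confirmed fix): Metro may bundle the .json file as a JavaScript module rather than a downloadable asset. Mirroring the packagerOpts approach used for obj/mtl files elsewhere in these issues, the extension could be declared as an asset in app.json:

```json
{
  "expo": {
    "packagerOpts": {
      "assetExts": ["json"]
    }
  }
}
```

Whether parseAsync then accepts the resulting asset is untested here.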
npm ERR! @expo/[email protected] postinstall: find ../ -name .babelrc -delete
npm ERR! Exit status 2
npm ERR!
npm ERR! Failed at the @expo/[email protected] postinstall script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
Hi !
I just started using expo-three and I was wondering if we could use materials like:
const material = new THREE.MeshBasicMaterial( { color: 0xffffff, envMap: this.scene.background } );
It would make sense in three, I guess, but here all I get is a black texture.
Thanks !
After I use the Expo.Camera or Expo.ImagePicker camera component in one of my app tabs, the ThreeAR.Camera in my Expo AR Scene in the other tab freezes.
Is there a way to reset the ThreeAR.Camera after using the phone's camera in another screen?
Thanks
Hi,
I'm having a weird issue. I wrote my own app in Expo, and one of its views uses expo-three with AR.
I was testing with the Expo app, and expo-three with AR works great, but when I export my project with exp build:ios, everything works except the AR part: I am not able to see the camera and the app freezes.
I tried the example AR code in this snack: https://snack.expo.io/Hk1C_YqjW and I have the same problem. I also granted the camera permission.
I don't believe there is a problem with the dependencies, because creating ExpoTHREE.createARCamera without an arSession works great.
Thanks.
I'm looking at the examples but I could not find a way to load remote textures. Is it possible?
Hi there, if I use this library, is there any problem with publishing the app as an APK?
Next, I'm looking for a human body model that can be rotated 360 degrees; is it possible to define some parts of the body that can be clicked to show an alert?
Thanks
I want to add a THREE.LensFlare to an expo-three scene, but it does not work. I dug deep into the three.js and expo-three build code. Everything is okay and there is no error. Textures load via await ExpoTHREE.loadTextureAsync({ asset: Assets.textures[file] }) as well, but nothing appears while rendering.
async getTexture(file) {
return await ExpoTHREE.loadTextureAsync({ asset: Assets.textures[file] });
}
async initLights() {
let dirLight = new THREE.DirectionalLight(0xffffff, 0.125);
dirLight.position.set(0, -1, 0).normalize();
this.scene.add(dirLight);
let materialLoader = new MaterialLoader();
let textureFlare0 = await materialLoader.getTexture('lensflare_0');
let textureFlare1 = await materialLoader.getTexture('lensflare_1');
let textureFlare2 = await materialLoader.getTexture('lensflare_2');
let light = new THREE.PointLight(0xffffff, 1.5, 4500);
light.position.set(0, 0, -20);
this.scene.add(light);
light.color = new THREE.Color(0xffffff);
let flareColor = new THREE.Color(0xffffff);
flareColor.copy(light.color);
let lensflare = new THREE.LensFlare(textureFlare0, 700, 0.0, THREE.AdditiveBlending, flareColor);
lensflare.add(textureFlare1, 512, 0.0, THREE.AdditiveBlending);
lensflare.add(textureFlare1, 512, 0.0, THREE.AdditiveBlending);
lensflare.add(textureFlare1, 512, 0.0, THREE.AdditiveBlending);
lensflare.add(textureFlare2, 60, 0.6, THREE.AdditiveBlending);
lensflare.add(textureFlare2, 70, 0.7, THREE.AdditiveBlending);
lensflare.add(textureFlare2, 120, 0.9, THREE.AdditiveBlending);
lensflare.add(textureFlare2, 70, 1.0, THREE.AdditiveBlending);
lensflare.position.copy(light.position);
this.scene.add(lensflare);
}
Here is what I want to implement in expo-three:
https://github.com/mrdoob/three.js/blob/r88/examples/webgl_lensflares.html
Demo:
https://threejs.org/examples/webgl_lensflares.html
Result (there is no error):
It works in Android and iOS browsers.
Hi,
I'm totally new to react native and expo/expo-three.
I am trying to build a simple 3d scene, and now i'm just loading in some test obj.
I ran into some issues:
Error: You tried to download an Expo.Asset and for some reason it didn't cache... Known reasons are: it's an .mtl file
and console.error"ExpoThree.loadAsync:OMG,this asset couldn't be downloaded! Open an issue on GitHub."
If I remove the mtl line from loading the mesh, the obj loads fine, but without any textures. My app.json also includes mtl.
How do i load texture without the mtl? Is that possible?
https://snack.expo.io/@wmts/batman-obj
Here's the snack version of my js file.
Hi, thanks for your hard work, loving the project!
Raycasting and handling taps is straightforward, but could you maybe point me in the right direction as to how to implement drag & drop? I can't seem to figure out how to "pick up" a model and drag it with my finger. Linking up PanResponder with Three.js isn't exactly obvious, and I can't use Animated.View, so the stuff I find online isn't very relevant. Thanks again for this project!
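For the drag itself, one common approach (a sketch, not expo-three API) is: on each PanResponder move event, raycast through the touch point and intersect the ray with a horizontal ground plane, then move the picked object to the hit point. The plane intersection is just a little vector math, shown here in plain JS with the ray given by an origin and direction:

```javascript
// Intersect a ray (origin o, direction d, each {x, y, z}) with the
// horizontal plane y = planeY. Returns the hit point, or null if the ray
// is parallel to the plane or the hit is behind the origin. In a real app
// the ray would come from THREE.Raycaster.setFromCamera on each pan-move.
function intersectGroundPlane(o, d, planeY) {
  if (Math.abs(d.y) < 1e-8) return null; // ray parallel to the plane
  const t = (planeY - o.y) / d.y;        // ray parameter at the plane
  if (t < 0) return null;                // plane is behind the ray origin
  return { x: o.x + t * d.x, y: planeY, z: o.z + t * d.z };
}

// e.g. a camera at (0, 2, 0) looking straight down hits y = 0 at (0, 0, 0):
// intersectGroundPlane({x: 0, y: 2, z: 0}, {x: 0, y: -1, z: 0}, 0)
```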
I recently upgraded to Expo v28.
When I import { THREE } from 'expo-three', I get this error:
Couldn't find preset "es2015" relative to directory "/<project>/node_modules/gl-matrix"
I was able to fix it (I think) by running:
npm install babel-cli babel-preset-es2015
However, can this dependency be packaged with the expo-three install?
Here is my package.json:
{
"main": "node_modules/expo/AppEntry.js",
"private": true,
"scripts": {
"test": "node ./node_modules/jest/bin/jest.js --watchAll"
},
"jest": {
"preset": "jest-expo"
},
"dependencies": {
"@expo/samples": "2.1.1",
"expo": "^28.0.0",
"expo-graphics": "^1.0.0-alpha.1",
"expo-three": "^3.0.0-alpha.1",
"firebase": "^5.0.4",
"react": "16.3.1",
"react-native": "https://github.com/expo/react-native/archive/sdk-28.0.0.tar.gz",
"react-navigation": "2.3.1"
},
"devDependencies": {
"jest-expo": "^28.0.0"
}
}
"dependencies": {
"expo": "^26.0.0",
"expo-three": "^2.2.2-alpha.1",
"react": "16.3.0-alpha.1",
"react-native": "0.54.0"
}
Started a brand new project via CRNA and installed expo-three, getting the following error:
Went through all the recommended debugging suggestions (cleared watchman, reinstalled node_modules, reset cache, etc)
The expo-three sample runs fine on an emulator running android 7.1.1
On my real device, a Nexus 5, it fails with:
THREE.WebGLShader: Shader couldn't compile.
THREE.WebGLShader: gl.getShaderInfoLog() vertex Vertex shader compilation failed.
ERROR: 0:93: 'const' : overloaded functions must have the same parameter qualifiers
ERROR: 1 compilation errors. No code generated.
The same Nexus 5 can successfully run the snack at https://snack.expo.io/rkpPMg8ie via qr code, but that's not using expo-three.
Is my phone just too old?
Hi, I can't actually detect any planes with the getPlanes function. I can get feature points using getRawFeaturePoints, but when I try to detect planes, planes.length always returns 0. I'm developing with the latest expo and expo-three. I am new to Expo and React; any help is much appreciated, thanks.
const { planes } = ExpoTHREE.getPlanes(this.arSession);
if (planes.length > 0) { console.log("found plane"); }
planes.length is always 0
Hi folks! Thanks for maintaining this project!
I'm wondering if you have an example of loading an OBJ file and applying a texture? All I want to achieve is to be able to load a model, apply a texture, and rotate it.
This is what I have so far:
import React, { Component } from 'react';
import { ScrollView, Text, StyleSheet } from 'react-native';
import Expo, { Asset, GLView } from 'expo';
import * as THREE from 'three';
import ExpoTHREE from 'expo-three';
global.THREE = THREE;
require('./OBJLoader');
console.disableYellowBox = true;
export default class ModelScreen extends Component {
static navigationOptions = {
title: '3D Model',
};
state = {
loaded: false,
}
componentWillMount() {
this.preloadAssetsAsync();
}
async preloadAssetsAsync() {
await Promise.all([
require('../assets/suzuki.obj'),
require('../assets/MotociklySuzuki_LOW001.jpg'),
require('../assets/male02.obj'),
require('../assets/UV_Grid_Sm.jpg'),
].map((module) => Asset.fromModule(module).downloadAsync()));
this.setState({ loaded: true });
}
onContextCreate = async (gl) => {
const width = gl.drawingBufferWidth;
const height = gl.drawingBufferHeight;
console.log(width, height);
gl.createRenderbuffer = () => {};
gl.bindRenderbuffer = () => {};
gl.renderbufferStorage = () => {};
gl.framebufferRenderbuffer = () => {};
const camera = new THREE.PerspectiveCamera( 45, width / height, 1, 2000 );
camera.position.z = 250;
const scene = new THREE.Scene();
const ambient = new THREE.AmbientLight( 0x101030 );
const directionalLight = new THREE.DirectionalLight( 0xffeedd );
directionalLight.position.set( 0, 0, 1 );
scene.add(ambient);
scene.add(directionalLight);
// Texture
const textureAsset = Asset.fromModule(require('../assets/MotociklySuzuki_LOW001.jpg'));
const texture = new THREE.Texture();
texture.image = {
data: textureAsset,
width: textureAsset.width,
height: textureAsset.height,
};
texture.needsUpdate = true;
texture.isDataTexture = true;
const material = new THREE.MeshPhongMaterial({ map: texture });
// Object
const modelAsset = Asset.fromModule(require('../assets/suzuki.obj'));
const loader = new THREE.OBJLoader();
const model = loader.parse(
await Expo.FileSystem.readAsStringAsync(modelAsset.localUri));
model.traverse((child) => {
if (child instanceof THREE.Mesh) {
child.material = material;
}
});
model.position.y = - 95;
scene.add(model);
const renderer = ExpoTHREE.createRenderer({ gl });
renderer.setPixelRatio(2);
renderer.setSize(width, height);
const animate = () => {
// camera.position.x += ( 7.25 - camera.position.x ) * .05;
// camera.position.y += ( 62.75 - camera.position.y ) * .05;
camera.lookAt( scene.position );
renderer.render( scene, camera );
gl.endFrameEXP();
requestAnimationFrame(animate);
};
animate();
};
render() {
return (
<ScrollView style={styles.container}>
<Text>This is your avatar:</Text>
{ this.state.loaded &&
<GLView
ref={(ref) => this.glView = ref}
style={styles.glview}
onContextCreate={this.onContextCreate}
/>
}
<Text>Something here!</Text>
</ScrollView>
);
}
}
const styles = StyleSheet.create({
container: {
flex: 1,
paddingTop: 15,
backgroundColor: '#fff',
},
glview: {
width: 350,
height: 500
},
});
I don't get any errors; the app builds and runs OK, but I don't see anything in the GLView. Any pointers?
Thanks in advance!
I made a Snack here: https://snack.expo.io/ryABMCsLf
When you press the Add button, it should position an image at (0, 0, -0.7) [Slightly in front of the camera].
However, if I walk somewhere, or rotate the device, and press Add again, it adds the image back in the original position.
It seems that the device's position gets initialized once and never updated, even if the device moves.
How can I refresh that position so that adding new images will be relative to the camera's current position?
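In plain three.js terms, "slightly in front of the camera" has to be recomputed from the camera's current pose on every press: rotate the (0, 0, -0.7) offset by the camera's current orientation and add the camera's current world position. The rotate-then-translate step is just vector math; here is a sketch in plain JS (in the app you would read camera.quaternion and camera.position at press time):

```javascript
// Rotate vector `offset` by quaternion q ({x, y, z, w}) and add the camera
// position; this is the math behind three's v.applyQuaternion(q).add(camPos).
// Using the camera's CURRENT pose at press time places the object relative
// to where the device is now, not where it started.
function pointInFrontOfCamera(camPos, q, offset) {
  const { x, y, z } = offset;
  // q * v * q^-1, expanded (standard quaternion-vector rotation)
  const ix = q.w * x + q.y * z - q.z * y;
  const iy = q.w * y + q.z * x - q.x * z;
  const iz = q.w * z + q.x * y - q.y * x;
  const iw = -q.x * x - q.y * y - q.z * z;
  return {
    x: camPos.x + ix * q.w + iw * -q.x + iy * -q.z - iz * -q.y,
    y: camPos.y + iy * q.w + iw * -q.y + iz * -q.x - ix * -q.z,
    z: camPos.z + iz * q.w + iw * -q.z + ix * -q.y - iy * -q.x,
  };
}

// With the identity quaternion, the object lands 0.7 units down -z:
// pointInFrontOfCamera({x:0,y:0,z:0}, {x:0,y:0,z:0,w:1}, {x:0,y:0,z:-0.7})
```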
I found a bug on iOS 10.3.3 and Android 5.1.0: it crashes when I create an Expo.GLView component.
I initialized an empty project with this package.json, all latest version
"main": "node_modules/expo/AppEntry.js",
"private": true,
"dependencies": {
"expo": "^26.0.0",
"expo-three": "^2.2.2-alpha.1",
"react": "16.3.0-alpha.1",
"react-native": "https://github.com/expo/react-native/archive/sdk-26.0.0.tar.gz",
"three": "^0.91.0"
}
}
Then I created a simple App.js (it's the example in the readme of https://github.com/expo/expo-three) with an Expo.GLView component; it crashes and exits to the phone's home screen every time I open it. Please help me with this.
My tests:
iPhone 5, iOS 10.3.3, real device: crash
iPhone 6S Plus, iOS 11.2.6, real device: ok
So I have tried the example from "Creating a scene", and encountered some performance issues. I have tried running the same example in browser to compare performance.
On iOS (both an iPhone 5S and iPhone8) the framerate seems to be smooth, except that it stutters a little bit every second or so. The Perf Monitor says about 60 for both JS and UI on both phones.
On Android the framerate is so bad that no FPS counter or profiler is needed to tell. I ran the example on a Nexus 6 and got roughly between 40 and 55 FPS for JS and 60 for UI. On the same Nexus, if I run a similar example in Chrome, it seems to run much smoother.
I have also run the "Animation Cloth" demo, and I get similar results. Much better framerate in the browsers, especially on Android.
What kind of performance can be expected when using Expo.GLView as opposed to webgl in a webview? I read somewhere that Expo.GLView somehow bypasses the javascript bridge, but I cannot see how that can be as it seems like the game loop is run from javascript.
Are there any possible solutions to this on the horizon?
Is this possible? I've tried the things that work on the web, with no success.
The first thing is to change the pixel ratio while conserving the same size (like in this web example)
this.renderer.setPixelRatio(scale * 0.5);
this.renderer.setSize(width, height);
Try snack
And this gave me a half screen sized render, not lower resolution full screen render.
The second thing I tried is the three.js documentation way, which is:
If you wish to keep the size of your app but render it at a lower resolution, you can do so by calling setSize with false as updateStyle (the third argument). For example, setSize(window.innerWidth/2, window.innerHeight/2, false) will render your app at half resolution, given that your canvas has 100% width and height.
Kinda translate to this web example
Which (kind of) translates to:
// this.renderer.setPixelRatio(scale);
this.renderer.setSize(width, height);
this.renderer.setSize(width * 0.5, height * 0.5, false);
Try snack
And the same result as above.
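For reference, on the web three.js derives the drawing-buffer size from the style size multiplied by the pixel ratio, so half-resolution rendering means the buffer shrinks while the on-screen size stays fixed. The arithmetic the two attempts above are aiming for (a sketch of three's setSize/setPixelRatio behavior on the web, not of the Expo implementation):

```javascript
// three.js WebGLRenderer sizing on the web: the canvas drawing buffer is
// the style size multiplied by the pixel ratio, floored. Half-resolution
// full-screen rendering keeps the style size and halves one of the factors.
function drawingBufferSize(styleWidth, styleHeight, pixelRatio) {
  return {
    width: Math.floor(styleWidth * pixelRatio),
    height: Math.floor(styleHeight * pixelRatio),
  };
}

// Full resolution on a 2x device for a 375x667 view:
// drawingBufferSize(375, 667, 2) → { width: 750, height: 1334 }
// Half resolution, same on-screen size:
// drawingBufferSize(375, 667, 1) → { width: 375, height: 667 }
```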
So my first impression is that this is a small bug.
If there is a way to do this please provide me a expo snack.
Lastly but equally important: This is an awesome tool, thanks for your support.
I am getting this error when loading remote images.
"THREE.WebGLState:", "EXGL: Invalid pixel data argument for gl.textImage2D()!"
Error can be dismissed, but the image appears as all black.
Snack:
https://snack.expo.io/By2XW0oUG
I believe similar code was working the last time I tried it.
I was trying to use expo-three within Viro React, but I couldn't get it to work. It seems like it would be perfect if this were possible, given that Viro React has augmented reality + virtual reality on both iOS and Android.
For a project, I need to integrate part of an existing three.js website into my React Native AR app.
Any knowledge about this matter? I can provide detailed logs and explanations if required.
Great work on this.
I see you mentioned plane detection on this issue, but has anything been done in this direction?
Currently considering trying to extract some features from react-native-arkit which apparently does have plane detection of some sort.
This project is amazing; I've really been enjoying using Three.js with ARKit! I saw some references to geometry faces, but didn't see anything about accessing the front-facing camera and, by extension, ARFaceGeometry. It would be really cool to manipulate face meshes using Three! Any plans to support this, or did I miss something? Thanks!
Hello there.
I get an error when I'm trying to work with this package; it says that it is unable to resolve the module react-native-console-time-polyfill. It also gave me some advice: remove node_modules, clear $TMPDIR, etc. I've tried all of these, but none of them works.
I'm just following the example you provided in the readme for this repo. I do believe something is wrong with the expo-three package, as it gives me the error when I try to include it.
I couldn't find any resolved issues about this, so I apologise if this is unnecessary.
Thanks in advance!
Is it possible to use a video as a texture with this?
Hi there,
I'll start by saying thanks for the great work :)
I am playing around with a low-poly flat water shader using the Three.js ShaderMaterial class.
The process to make this work is as follows (PS: I've been Frankensteining code from all over the place, but I've simplified the shader for illustration purposes):
Here's how the shader looks in Chrome (desktop):
And here is how it looks on iOS and Android (notice the weird static):
If I use mediump or lowp precision, it looks like this:
The only thing I can think of is an issue with the dFdx and dFdy extension functions in the EXGLView implementation or in my setup, but I don't have enough experience with WebGL/OpenGL (or Three, for that matter) to say for sure.
Here is a JS Fiddle to the web example (the shader is identical to the one I use on mobile): https://jsfiddle.net/bberak/jgprdtLs/
And here is the shader code (copied and pasted from the Fiddle):
var material = new THREE.ShaderMaterial({
vertexShader: `
precision highp float;
//-- START NOISE FUNCTIONS
vec3 mod289(vec3 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; } vec4 mod289(vec4 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; } vec4 permute(vec4 x) { return mod289(((x*34.0)+1.0)*x); } vec4 taylorInvSqrt(vec4 r) { return 1.79284291400159 - 0.85373472095314 * r; } vec3 fade(vec3 t) { return t*t*t*(t*(t*6.0-15.0)+10.0); } float cnoise(vec3 P) { vec3 Pi0 = floor(P); vec3 Pi1 = Pi0 + vec3(1.0); Pi0 = mod289(Pi0); Pi1 = mod289(Pi1); vec3 Pf0 = fract(P); vec3 Pf1 = Pf0 - vec3(1.0); vec4 ix = vec4(Pi0.x, Pi1.x, Pi0.x, Pi1.x); vec4 iy = vec4(Pi0.yy, Pi1.yy); vec4 iz0 = Pi0.zzzz; vec4 iz1 = Pi1.zzzz; vec4 ixy = permute(permute(ix) + iy); vec4 ixy0 = permute(ixy + iz0); vec4 ixy1 = permute(ixy + iz1); vec4 gx0 = ixy0 * (1.0 / 7.0); vec4 gy0 = fract(floor(gx0) * (1.0 / 7.0)) - 0.5; gx0 = fract(gx0); vec4 gz0 = vec4(0.5) - abs(gx0) - abs(gy0); vec4 sz0 = step(gz0, vec4(0.0)); gx0 -= sz0 * (step(0.0, gx0) - 0.5); gy0 -= sz0 * (step(0.0, gy0) - 0.5); vec4 gx1 = ixy1 * (1.0 / 7.0); vec4 gy1 = fract(floor(gx1) * (1.0 / 7.0)) - 0.5; gx1 = fract(gx1); vec4 gz1 = vec4(0.5) - abs(gx1) - abs(gy1); vec4 sz1 = step(gz1, vec4(0.0)); gx1 -= sz1 * (step(0.0, gx1) - 0.5); gy1 -= sz1 * (step(0.0, gy1) - 0.5); vec3 g000 = vec3(gx0.x,gy0.x,gz0.x); vec3 g100 = vec3(gx0.y,gy0.y,gz0.y); vec3 g010 = vec3(gx0.z,gy0.z,gz0.z); vec3 g110 = vec3(gx0.w,gy0.w,gz0.w); vec3 g001 = vec3(gx1.x,gy1.x,gz1.x); vec3 g101 = vec3(gx1.y,gy1.y,gz1.y); vec3 g011 = vec3(gx1.z,gy1.z,gz1.z); vec3 g111 = vec3(gx1.w,gy1.w,gz1.w); vec4 norm0 = taylorInvSqrt(vec4(dot(g000, g000), dot(g010, g010), dot(g100, g100), dot(g110, g110))); g000 *= norm0.x; g010 *= norm0.y; g100 *= norm0.z; g110 *= norm0.w; vec4 norm1 = taylorInvSqrt(vec4(dot(g001, g001), dot(g011, g011), dot(g101, g101), dot(g111, g111))); g001 *= norm1.x; g011 *= norm1.y; g101 *= norm1.z; g111 *= norm1.w; float n000 = dot(g000, Pf0); float n100 = dot(g100, vec3(Pf1.x, Pf0.yz)); float n010 = dot(g010, vec3(Pf0.x, Pf1.y, Pf0.z)); float n110 = dot(g110, 
vec3(Pf1.xy, Pf0.z)); float n001 = dot(g001, vec3(Pf0.xy, Pf1.z)); float n101 = dot(g101, vec3(Pf1.x, Pf0.y, Pf1.z)); float n011 = dot(g011, vec3(Pf0.x, Pf1.yz)); float n111 = dot(g111, Pf1); vec3 fade_xyz = fade(Pf0); vec4 n_z = mix(vec4(n000, n100, n010, n110), vec4(n001, n101, n011, n111), fade_xyz.z); vec2 n_yz = mix(n_z.xy, n_z.zw, fade_xyz.y); float n_xyz = mix(n_yz.x, n_yz.y, fade_xyz.x); return 2.2 * n_xyz; } float pnoise(vec3 P, vec3 rep) { vec3 Pi0 = mod(floor(P), rep); vec3 Pi1 = mod(Pi0 + vec3(1.0), rep); Pi0 = mod289(Pi0); Pi1 = mod289(Pi1); vec3 Pf0 = fract(P); vec3 Pf1 = Pf0 - vec3(1.0); vec4 ix = vec4(Pi0.x, Pi1.x, Pi0.x, Pi1.x); vec4 iy = vec4(Pi0.yy, Pi1.yy); vec4 iz0 = Pi0.zzzz; vec4 iz1 = Pi1.zzzz; vec4 ixy = permute(permute(ix) + iy); vec4 ixy0 = permute(ixy + iz0); vec4 ixy1 = permute(ixy + iz1); vec4 gx0 = ixy0 * (1.0 / 7.0); vec4 gy0 = fract(floor(gx0) * (1.0 / 7.0)) - 0.5; gx0 = fract(gx0); vec4 gz0 = vec4(0.5) - abs(gx0) - abs(gy0); vec4 sz0 = step(gz0, vec4(0.0)); gx0 -= sz0 * (step(0.0, gx0) - 0.5); gy0 -= sz0 * (step(0.0, gy0) - 0.5); vec4 gx1 = ixy1 * (1.0 / 7.0); vec4 gy1 = fract(floor(gx1) * (1.0 / 7.0)) - 0.5; gx1 = fract(gx1); vec4 gz1 = vec4(0.5) - abs(gx1) - abs(gy1); vec4 sz1 = step(gz1, vec4(0.0)); gx1 -= sz1 * (step(0.0, gx1) - 0.5); gy1 -= sz1 * (step(0.0, gy1) - 0.5); vec3 g000 = vec3(gx0.x,gy0.x,gz0.x); vec3 g100 = vec3(gx0.y,gy0.y,gz0.y); vec3 g010 = vec3(gx0.z,gy0.z,gz0.z); vec3 g110 = vec3(gx0.w,gy0.w,gz0.w); vec3 g001 = vec3(gx1.x,gy1.x,gz1.x); vec3 g101 = vec3(gx1.y,gy1.y,gz1.y); vec3 g011 = vec3(gx1.z,gy1.z,gz1.z); vec3 g111 = vec3(gx1.w,gy1.w,gz1.w); vec4 norm0 = taylorInvSqrt(vec4(dot(g000, g000), dot(g010, g010), dot(g100, g100), dot(g110, g110))); g000 *= norm0.x; g010 *= norm0.y; g100 *= norm0.z; g110 *= norm0.w; vec4 norm1 = taylorInvSqrt(vec4(dot(g001, g001), dot(g011, g011), dot(g101, g101), dot(g111, g111))); g001 *= norm1.x; g011 *= norm1.y; g101 *= norm1.z; g111 *= norm1.w; float n000 = dot(g000, Pf0); 
float n100 = dot(g100, vec3(Pf1.x, Pf0.yz)); float n010 = dot(g010, vec3(Pf0.x, Pf1.y, Pf0.z)); float n110 = dot(g110, vec3(Pf1.xy, Pf0.z)); float n001 = dot(g001, vec3(Pf0.xy, Pf1.z)); float n101 = dot(g101, vec3(Pf1.x, Pf0.y, Pf1.z)); float n011 = dot(g011, vec3(Pf0.x, Pf1.yz)); float n111 = dot(g111, Pf1); vec3 fade_xyz = fade(Pf0); vec4 n_z = mix(vec4(n000, n100, n010, n110), vec4(n001, n101, n011, n111), fade_xyz.z); vec2 n_yz = mix(n_z.xy, n_z.zw, fade_xyz.y); float n_xyz = mix(n_yz.x, n_yz.y, fade_xyz.x); return 2.2 * n_xyz; }
//-- END NOISE FUNCTIONS
uniform float time;
uniform float terrain_seed;
uniform float terrain_height;
varying vec4 pos;
void main() {
float displacement = cnoise(terrain_seed * position + vec3(time));
vec3 newPosition = vec3(position.xy, displacement * terrain_height);
pos = modelViewMatrix * vec4(newPosition, 1.0);
gl_Position = projectionMatrix * modelViewMatrix * vec4(newPosition, 1.0);
}`,
fragmentShader: `
precision highp float;
varying vec4 pos;
void main() {
vec3 normal = normalize(cross(dFdx(pos.xyz), dFdy(pos.xyz)));
gl_FragColor = vec4(normal, 1.0);
}`,
uniforms: {
time: {
value: 0.0
},
terrain_seed: {
value: 0.5
},
terrain_height: {
value: 1.0
},
},
extensions: {
derivatives: true
}
});
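Since the fragment shader leans on dFdx/dFdy, one hedged diagnostic (a sketch, with the helper name assumed) is to check whether the GL context actually advertises the standard-derivatives extension before suspecting the shader itself; on WebGL1-style contexts those functions live behind OES_standard_derivatives:

```javascript
// Sketch: query the context's extension list. A missing entry means
// dFdx/dFdy may be unimplemented (or poorly emulated) on this device.
function supportsDerivatives(gl) {
  const exts = gl.getSupportedExtensions() || [];
  return exts.indexOf('OES_standard_derivatives') !== -1;
}
```

In the GLView's onContextCreate you could log `supportsDerivatives(gl)` on the affected devices; if it returns false, the static is more likely the extension than the precision qualifier.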
Any help would be greatly appreciated, and let me know if I can provide more info.
Kind Regards,
Boris