xeokit / xeokit-convert

Convert various AEC model formats for efficient viewing in the browser with xeokit.

Home Page: https://xeokit.github.io/xeokit-convert/docs/

License: Other

Languages: JavaScript 99.94%, Handlebars 0.06%
Topics: gltf, converter, ifc, 3dxml, cityjson, bim, webgl, xkt, las, laz

xeokit-convert's People

Contributors

aidan2129, amoki, db-ec-hamburg-bim, raukie, tmarti, woweh, xeolabs


xeokit-convert's Issues

memory access out of bounds

Hi,

I followed these instructions:

(screenshot: Capture2)

I built the wasm successfully.

Then I replaced the "web-ifc.wasm" in xeokit-convert/dist and launched the conversion, but it did not work. I got this:

(screenshot: Capture)

Any ideas?

Add textures to XKT

A plan for adding textures to XKT

Overview

  • add textures to a new XKT v10-alpha format,
  • extend convert2xkt to extract all the material types from glTF and encode them into XKT using this new format
  • extend XKTLoaderPlugin to load XKT 10-alpha
  • extend xeokit’s 3D viewer scene representation and renderer (specifically, PerformanceModel and its renderer) to store and render textures

Resources

Adding textures to XKT

To start with, add these low-level fields to the XKT format:

  • Add uvs array, which holds all UVs
  • Add eachGeometryUVsPortion array
  • Add textures array, which holds data for all textures

Then, to map each mesh to an arbitrary set of textures in the textures array, add:

  • eachMaterialType - for each material, its type: 0 = custom, 1 = metallic/rough PBR, 2 = specular/rough PBR, 3 = Blinn/Phong etc.
  • eachMaterialTextures - for each material, an arbitrary-length array of indices of textures used by that material - the interpretation of the textures depends on the material type
  • eachMaterialAttributes - for each material, an arbitrary-length array of floats - the interpretation depends on the material type

Then we re-purpose the existing eachMeshMaterial field, making it index the eachMaterialXXX arrays, instead of defining per-mesh material colors inline.
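As a rough sketch of how a reader of the format could resolve a mesh's material under this proposal (the xkt container object and the helper are hypothetical; the field names follow the lists above):

// Hypothetical sketch of reading the proposed per-material arrays for one mesh.
// The "xkt" container and this helper are illustrative only, not part of the format spec.
function getMeshMaterial(xkt, meshIndex) {
    const materialIndex = xkt.eachMeshMaterial[meshIndex];          // re-purposed: now an index into the eachMaterialXXX arrays
    const type = xkt.eachMaterialType[materialIndex];               // 0 = custom, 1 = metallic/rough, 2 = specular/rough, 3 = Blinn/Phong
    const textureIndices = xkt.eachMaterialTextures[materialIndex]; // indices into the textures array; meaning depends on type
    const attributes = xkt.eachMaterialAttributes[materialIndex];   // floats; meaning depends on type
    return {
        type,
        textures: textureIndices.map((i) => xkt.textures[i]),
        attributes
    };
}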

xeokit-convert JS API

Some additions to the JS programming API within xeokit-convert:

  • Add XKTPhongMaterial, XKTMetalRoughMaterial and XKTSpecularRoughMaterial classes
  • Add XKTModel#createMaterial method
  • Add materialId param to XKTModel#createMesh method
  • Add XKTModel#materials map
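A hypothetical usage sketch of these additions (the import path and parameter names are assumptions, not the final API):

// Sketch only - illustrates the proposed additions, not the current xeokit-convert API.
const { XKTModel } = require("@xeokit/xeokit-convert"); // assumed import

const xktModel = new XKTModel();

xktModel.createMaterial({                 // proposed XKTModel#createMaterial
    materialId: "doorMaterial",
    type: "MetalRoughMaterial",           // would instantiate the proposed XKTMetalRoughMaterial
    baseColorTextureId: "doorBaseColor",  // assumed parameter name
    metallic: 0.1,
    roughness: 0.8
});

xktModel.createMesh({
    meshId: "doorMesh",
    geometryId: "doorGeometry",
    materialId: "doorMaterial"            // proposed new materialId param
});

console.log(Object.keys(xktModel.materials)); // proposed XKTModel#materials map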

WIP

expected: params.primitiveType

fix this https://github.com/xeokit/xeokit-gltf-to-xkt/issues/39

(node:23672) UnhandledPromiseRejectionWarning: Parameter expected: params.primitiveType
(Use node --trace-warnings ... to show where the warning was created)
(node:23672) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag --unhandled-rejections=strict (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:23672) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

Issues with las to xkt-conversion

I had several issues converting a LAS file to XKT. I'm not an expert in Node.js, 3D file formats, or IFC. Our goal is to load an IFC file and a point cloud scan of the built model into the same viewer. Unfortunately, I can't provide the raw input data. Both files are in the same coordinate system; if I open the raw data in Blender, both models sit on top of each other and everything is fine.

I did some edits directly in dist/convert2xkt.cjs.js to get the conversion to work:

a) (see #26): I just added "typeof Response !== 'undefined'" to some ifs in various places. (The main issue was in the function validHTTPResponse(), somewhere around line 50960. Can Response ever be defined in a CLI tool?)

b) The --rotatex parameter is not applied for LAS files, only for LAZ. (There is a big switch block over the various file formats in the function convert2xkt, somewhere around line 83500.) I don't know whether this was intentional or just forgotten, but my point cloud didn't work without it. From my understanding, LAS and LAZ are basically the same format; LAZ just has better compression?

c) The entire point cloud was mirrored. To fix this I changed the coordinate shifting in the function readPositions (around line 71450) to positionsValue[i + 2] = temp * -1;. If this is an issue for some input data but not all, maybe this should be another CLI parameter?

Not sure if this is helpful, but with these changes everything seems to be working fine for me so far.
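For reference, the kind of guard described in (a) looks roughly like this (a sketch only; the actual code in convert2xkt.cjs.js differs):

// Sketch of the guard from (a): only reference the browser Response global when it
// exists, since a Node.js CLI has no Response unless fetch is polyfilled.
function isValidHTTPResponse(response) {      // illustrative name - not the function in the dist bundle
    if (typeof Response === "undefined") {    // running under Node: skip the browser-specific check
        return true;
    }
    return (response instanceof Response) ? response.ok : true;
}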

Problems consolidating packages

When I started testing, we had problems restoring packages. I'm flagging this as a caution for the release and a warning about possible related problems.

Test 1:

Installing with npm i @xeokit/xeokit-convert

The install breaks when fetching the package.

Test 2:

Cloning the release

Files such as convert2xkt.js are missing.

Test 3:

Downloading the release as a ZIP

When restoring the packages, some dependencies are discontinued and fail to download. The packages are:

puppeteer-firefox
@percy/script


Another point of concern is dependency versions: although xeokit-convert was released at the end of April, several packages have since become outdated.

Package Available Latest
@loaders.gl/core 3.0.13 3.2.3
@loaders.gl/json 3.0.13 3.2.3
@loaders.gl/las 3.0.13 3.2.3
@loaders.gl/obj 3.2.3 3.2.3
@loaders.gl/ply 3.2.3 3.2.3
@loaders.gl/polyfills 3.2.3 3.2.3
@zip.js/zip.js 2.2.38 2.4.12
autoprefixer 9.8.5 10.4.7
earcut 2.2.2 2.2.3
jszip 3.6.0 3.10.0
puppeteer 10.4.0 14.1.2
request 2.79.0 2.88.2
web-ifc 0.0.30 0.0.35
xmldom 0.6.0 0.6.0
rollup 2.46.0 2.75.5
rollup-plugin-commonjs 10.1.0 22.0.0
rollup-plugin-node-resolve 5.2.0 13.3.0

Terminal

PS D:\Downloads\xeokit-convert-1.0.7> npm update
npm WARN deprecated @percy/[email protected]: PercyScript has been deprecated in favor of Percy CLI's snapshot command: https://docs.percy.io/docs/percy-snapshot Please upgrade to use the CLI rather than this package, as it's no longer actively supported.
npm WARN deprecated [email protected]: request-promise has been deprecated because it extends the now deprecated request package, see https://github.com/request/request/issues/3142
npm WARN deprecated [email protected]: this library is no longer supported
npm WARN deprecated [email protected]: This package has been deprecated and is no longer maintained. Please use @rollup/plugin-node-resolve.
npm WARN deprecated [email protected]: jsSHA versions < 3.0.0 will no longer receive feature updates
npm WARN deprecated [email protected]: This package has been deprecated and is no longer maintained. Please use @rollup/plugin-commonjs.
npm WARN deprecated [email protected]: Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic. See https://v8.dev/blog/math-random for details.
npm WARN deprecated [email protected]: request has been deprecated, see https://github.com/request/request/issues/3142
npm WARN deprecated [email protected]: support for ECMAScript is superseded by uglify-js as of v3.13.0
npm WARN deprecated [email protected]: This package is unmaintained and deprecated. See the GH Issue 259.
npm WARN deprecated [email protected]: Firefox support is gradually transitioning to the puppeteer package. As of puppeteer v2.1.0 you can interact with Firefox Nightly. The puppeteer-firefox package will remain available until the transition is complete, but it is no longer actively maintained. For more information visit https://wiki.mozilla.org/Remote
npm WARN deprecated [email protected]: Package no longer supported. Contact Support at https://www.npmjs.com/support for more info.
npm WARN deprecated @percy/[email protected]: @percy/agent is no longer supported. Please upgrade to @percy/cli https://docs.percy.io/docs/migrating-to-percy-cli
npm WARN deprecated [email protected]: core-js@<3.4 is no longer maintained and not recommended for usage due to the number of issues. Because of the V8 engine whims, feature detection in old core-js versions could cause a slowdown up to 100x even if nothing is polyfilled. Please, upgrade your dependencies to the actual version of core-js.
npm WARN deprecated [email protected]: core-js@<3.4 is no longer maintained and not recommended for usage due to the number of issues. Because of the V8 engine whims, feature detection in old core-js versions could cause a slowdown up to 100x even if nothing is polyfilled. Please, upgrade your dependencies to the actual version of core-js.
npm ERR! code 1
npm ERR! path D:\Downloads\xeokit-convert-1.0.7\node_modules\puppeteer-firefox
npm ERR! command failed
npm ERR! command C:\Windows\system32\cmd.exe /d /s /c node install.js
npm ERR! ERROR: Failed to download Firefox rv0.0.1!
npm ERR! Error: Download failed: server returned code 404. URL: https://github.com/puppeteer/juggler/releases/download/v0.0.1/firefox-win64.zip
npm ERR! at D:\Downloads\xeokit-convert-1.0.7\node_modules\puppeteer-firefox\lib\BrowserFetcher.js:264:21
npm ERR! at ClientRequest.requestCallback (D:\Downloads\xeokit-convert-1.0.7\node_modules\puppeteer-firefox\lib\BrowserFetcher.js:320:7)
npm ERR! at Object.onceWrapper (node:events:642:26)
npm ERR! at ClientRequest.emit (node:events:527:28)
npm ERR! at HTTPParser.parserOnIncomingClient (node:_http_client:631:27)
npm ERR! at HTTPParser.parserOnHeadersComplete (node:_http_common:128:17)
npm ERR! at TLSSocket.socketOnData (node:_http_client:494:22)
npm ERR! at TLSSocket.emit (node:events:527:28)
npm ERR! at addChunk (node:internal/streams/readable:315:12)
npm ERR! at readableAddChunk (node:internal/streams/readable:289:9)
npm ERR! -- ASYNC --
npm ERR! at BrowserFetcher. (D:\Downloads\xeokit-convert-1.0.7\node_modules\puppeteer-firefox\lib\helper.js:32:15)
npm ERR! at Object. (D:\Downloads\xeokit-convert-1.0.7\node_modules\puppeteer-firefox\install.js:47:16)
npm ERR! at Module._compile (node:internal/modules/cjs/loader:1105:14)
npm ERR! at Object.Module._extensions..js (node:internal/modules/cjs/loader:1159:10)
npm ERR! at Module.load (node:internal/modules/cjs/loader:981:32)
npm ERR! at Function.Module._load (node:internal/modules/cjs/loader:822:12)
npm ERR! at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:77:12)
npm ERR! at node:internal/main/run_main_module:17:47

npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\thiago.marques\AppData\Local\npm-cache\_logs\2022-06-01T17_28_25_233Z-debug-0.log

Remove 3DXML support

Removing this because

  • we don't have any use for 3DXML -> XKT conversion, and the instancing transforms have been broken since it was added
  • fewer dependencies and less maintenance

Convert glTF using loaders.gl

Use loaders.gl to help with glTF parsing.

This has the following benefits:

  • Converts .glb
  • Handles more geometry formats (interleaving etc)
  • Easy to support Draco mesh compression
  • Easy to support Basis texture compression
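A minimal sketch of parsing with loaders.gl (assuming the @loaders.gl/core and @loaders.gl/gltf packages; this is not necessarily how convert2xkt would wire it up):

// Sketch: parse a glTF/GLB payload with loaders.gl. GLTFLoader accepts both JSON (.gltf)
// and binary (.glb) data; external buffers and Draco need the appropriate options/loaders.
const fs = require("fs");
const { parse } = require("@loaders.gl/core");
const { GLTFLoader } = require("@loaders.gl/gltf");

async function parseGLTF(filePath) {
    const data = fs.readFileSync(filePath);   // a Buffer (a Uint8Array), accepted by parse()
    return await parse(data, GLTFLoader);
}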

Fix conversion of glTF with external buffers

convert2xkt glTF->XKT conversion mode breaks when the glTF file provides its geometry in a separate binary file attachment.

This can be reproduced by running the open source converter tools like this:

IfcConvert-v0.6.0-517b819-linux64/IfcConvert ./myModel.ifc ./myModel.dae

COLLADA2GLTF/build/COLLADA2GLTF-bin -i ./myModel.dae -o ./myModel.gltf -s 

convert2xkt.js -s ./myModel.gltf -o ./myModel.xkt -l

convert2xkt uses fs.readFileSync to load the binary file, and expects the loaded data to be an ArrayBuffer. However, that function returns a Buffer.

The solution is simply to convert the Buffer to an ArrayBuffer.
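A minimal sketch of that conversion, using only standard Node.js APIs:

// fs.readFileSync returns a Node Buffer; slice out the exact region of its
// underlying ArrayBuffer that the Buffer occupies.
const fs = require("fs");

function readFileAsArrayBuffer(filePath) {
    const buf = fs.readFileSync(filePath);
    return buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength);
}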

STL conversion error

XKT version: 9
node version: v14.17.0

Converted a file to STL with Blender 2.79

Conversion output:

[convert2xkt] Reading input file: house.stl
[convert2xkt] Input file size: 123.48 kB
[convert2xkt] Converting...
TypeError: First argument to DataView constructor must be an ArrayBuffer
at new DataView ()
at isBinary (/home/####/node_modules/@xeokit/xeokit-convert/dist/convert2xkt.cjs.js:54156:20)
at /home/####/node_modules/@xeokit/xeokit-convert/dist/convert2xkt.cjs.js:54131:13
at new Promise ()
at parseSTLIntoXKTModel (/home/####/node_modules/@xeokit/xeokit-convert/dist/convert2xkt.cjs.js:54090:12)
at convert (/home/####/node_modules/@xeokit/xeokit-convert/dist/convert2xkt.cjs.js:64016:13)
at convertForFormat (/home/####/node_modules/@xeokit/xeokit-convert/dist/convert2xkt.cjs.js:64000:21)
at /home/####/node_modules/@xeokit/xeokit-convert/dist/convert2xkt.cjs.js:63916:13
at new Promise ()
at convert2xkt (/home/####/node_modules/@xeokit/xeokit-convert/dist/convert2xkt.cjs.js:63844:12)

Option to disable geometry reuse

Geometry instancing is the practice of rendering multiple copies of the same mesh in a scene at once.

WebGL rendering performance can really suffer when the number of instanced meshes is too high, even if the number of instances of each mesh is still low.

For this reason, we need the option to disable geometry instancing for convert2xkt and parseGLTFIntoGeometry.

This would "expand" instanced meshes, creating duplicate copes within the .xkt. The penalty is a larger .xkt file, where the size increase depends on how geometry reuse happens in the model.

convert2xkt

Add a disablegeoreuse (-g) option to make convert2xkt disable geometry reuse while converting.

Also add info on this new option to the tool's help info:

node convert2xkt.js -h
Usage: convert2xkt [options]

Options:
  -v, --version           output the version number
  -s, --source [file]     path to source file
  -f, --format [string]   source file format (optional); supported formats are gltf, ifc, laz, las, pcd, ply, stl and
                          cityjson
  -m, --metamodel [file]  path to source metamodel JSON file (optional)
  -i, --include [types]   only convert these types (optional)
  -x, --exclude [types]   never convert these types (optional)
  -r, --rotatex           rotate model 90 degrees about X axis (for las and cityjson)
  -g, --disablegeoreuse   disable geometry reuse (for ifc and gltf)
  -o, --output [file]     path to target .xkt file; creates directories on path automatically if not existing
  -l, --log               enable logging
  -h, --help              display help for command


XKT version: 9

Usage example

node convert2xkt.js \ 
-s IFC_Schependomlaan.gltf \
-m IFC_Schependomlaan.json \
-o IFC_Schependomlaan.xkt -l -g

Missing elements in the conversion

During conversion, some models (large and small) end up with elements missing.

Using the pattern described by bimspot, the correct elements are present, as authored by the modeler.

Is there any specific configuration to solve this problem?

Link to the model: Link

node convert2xkt.js -s HID-R00.ifc -o HID-R00.xkt -l

...
XKTEntity has no meshes - won't create: 0C$jkpSpT1e9D_WKWEOFDV
XKTEntity has no meshes - won't create: 0C$jkpSpT1e9D_WKWEOF1G
XKTEntity has no meshes - won't create: 0C$jkpSpT1e9D_WKWEOFUf
XKTEntity has no meshes - won't create: 2QR6j3eqD1nBRYB9JI5DfZ
XKTEntity has no meshes - won't create: 2QR6j3eqD1nBRYB9JI5Dfk
XKTEntity has no meshes - won't create: 2QR6j3eqD1nBRYB9JI5DfR
XKTEntity has no meshes - won't create: 2QR6j3eqD1nBRYB9JI5Dhf
XKTEntity has no meshes - won't create: 2QR6j3eqD1nBRYB9JI5DhK
XKTEntity has no meshes - won't create: 2fTRVXJyfDtgdyqgs_GApO
XKTEntity has no meshes - won't create: 2fTRVXJyfDtgdyqgs_GAp6
XKTEntity has no meshes - won't create: 1H7Rle_gX6VukYdFGRTXRs
XKTEntity has no meshes - won't create: 0$80N_8mz639VZV5SqE5kl
XKTEntity has no meshes - won't create: 1gZfGABc1B7e4mLF2w0Tcr
XKTEntity has no meshes - won't create: 3kEVZD8cHFGOx3b1jmWCUJ
[convert2xkt] Converted to: XKT v9
[convert2xkt] XKT size: 956.79 kB
[convert2xkt] Compression ratio: 27.50
[convert2xkt] Conversion time: 82.68 s
[convert2xkt] Converted metaobjects: 1647
[convert2xkt] Converted property sets: 23082
[convert2xkt] Converted drawable objects: 4961
[convert2xkt] Converted geometries: 732
[convert2xkt] Converted triangles: 37916
[convert2xkt] Converted vertices: 81394
[convert2xkt] reuseGeometries: true
[convert2xkt] minTileSize: 1000
[convert2xkt] Writing XKT file: HID-R00.xkt

[IFC->XKT] Parse units from IFC

If the IFC uses a conversion-based measure unit, it is not reflected in the XKT model.
E.g.:
#43= IFCSIUNIT(*,.LENGTHUNIT.,$,.METRE.);
#44= IFCDIMENSIONALEXPONENTS(1,0,0,0,0,0,0);
#45= IFCMEASUREWITHUNIT(IFCRATIOMEASURE(0.3048),#43);
#46= IFCCONVERSIONBASEDUNIT(#44,.LENGTHUNIT.,'FOOT',#45);

expected:
viewer.scene.metrics.units === 'feet'
actual:
viewer.scene.metrics.units === 'meters'
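A sketch of the kind of mapping that would be needed, assuming the unit name ('FOOT' in the example above) has already been read from the IFC; the viewer unit strings follow xeokit's Metrics component:

// Sketch: translate an IFC length-unit name into the string expected by
// viewer.scene.metrics.units. How the name is extracted from the IFC
// (IFCSIUNIT / IFCCONVERSIONBASEDUNIT) is assumed to happen elsewhere.
const IFC_TO_VIEWER_UNITS = {
    "METRE": "meters",
    "CENTIMETRE": "centimeters",
    "MILLIMETRE": "millimeters",
    "FOOT": "feet",
    "INCH": "inches"
};

function resolveViewerUnits(ifcUnitName) {
    return IFC_TO_VIEWER_UNITS[(ifcUnitName || "").toUpperCase()] || "meters"; // current default
}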

Create -r option for convert2xkt to rotate LAS & CityJSON about X

Create a new convert2xkt option, -r, to rotate LAS and CityJSON models 90 degrees about the X-axis as they are converted.

Usage

node convert2xkt -h

Usage: convert2xkt [options]

Options:

    -v, --version           output the version number
    -s, --source [file]     path to source file
    -f, --format [string]   source file format (optional); supported formats are gltf, ifc, laz, las, stl and cityjson
    -m, --metamodel [file]  path to source metamodel JSON file (optional)
    -i, --include [types]   only convert these types (optional)
    -x, --exclude [types]   never convert these types (optional)
    -r, --rotatex           rotate model 90 degrees about X axis (for las and cityjson)
    -o, --output [file]     path to target .xkt file; creates directories on path automatically if not existing
    -l, --log               enable logging
    -h, --help              output usage information

XKT version: 9

To rotate a sample model:

node convert2xkt -s assets/models/laz/indoor.0.1.laz -o indoor.0.1.laz.xkt -r -l

Update npm package dependencies

I tried to install as described in https://www.notion.so/Converting-Models-to-XKT-with-convert2xkt-fa567843313f4db8a7d6535e76da9380

git clone https://github.com/xeokit/xeokit-convert.git
cd xeokit-convert
npm install

Installation throws errors:

D:\Trabajo\NPM\BIM\xeokit-convert\xeokit-convertPS>npm install
npm WARN deprecated @percy/[email protected]: PercyScript has been deprecated in favor of Percy CLI's snapshot command: https://docs.percy.io/docs/percy-snapshot Please upgrade to use the CLI rather than this package, as it's no longer actively supported.
npm WARN deprecated [email protected]: request-promise has been deprecated because it extends the now deprecated request package, see request/request#3142
npm WARN deprecated [email protected]: this library is no longer supported
npm WARN deprecated [email protected]: This version has been deprecated in accordance with the hapi support policy (hapi.im/support). Please upgrade to the latest version to get the best features, bug fixes, and security patches. If you are unable to upgrade at this time, paid support is available for older versions (hapi.im/commercial).
npm WARN deprecated [email protected]: This package has been deprecated and is no longer maintained. Please use @rollup/plugin-node-resolve.
npm WARN deprecated [email protected]: This version has been deprecated in accordance with the hapi support policy (hapi.im/support). Please upgrade to the latest version to get the best features, bug fixes, and security patches. If you are unable to upgrade at this time, paid support is available for older versions (hapi.im/commercial).
npm WARN deprecated [email protected]: jsSHA versions < 3.0.0 will no longer receive feature updates
npm WARN deprecated [email protected]: This module moved to @hapi/sntp. Please make sure to switch over as this distribution is no longer supported and may contain bugs and critical security issues.
npm WARN deprecated [email protected]: This version has been deprecated in accordance with the hapi support policy (hapi.im/support). Please upgrade to the latest version to get the best features, bug fixes, and security patches. If you are unable to upgrade at this time, paid support is available for older versions (hapi.im/commercial).
npm WARN deprecated [email protected]: request has been deprecated, see request/request#3142
npm WARN deprecated [email protected]: This package has been deprecated and is no longer maintained. Please use @rollup/plugin-commonjs.
npm WARN deprecated [email protected]: Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic.
See https://v8.dev/blog/math-random for details.
npm WARN deprecated [email protected]: request has been deprecated, see request/request#3142
npm WARN deprecated [email protected]: this library is no longer supported
npm WARN deprecated [email protected]: This module moved to @hapi/hawk. Please make sure to switch over as this distribution is no longer supported and may contain bugs and critical security issues.
npm WARN deprecated [email protected]: support for ECMAScript is superseded by uglify-js as of v3.13.0
npm WARN deprecated [email protected]: This package is unmaintained and deprecated. See the GH Issue 259.
npm WARN deprecated [email protected]: Firefox support is gradually transitioning to the puppeteer package. As of puppeteer v2.1.0 you can interact with Firefox Nightly. The puppeteer-firefox package will remain available until the transition is complete, but it is no longer actively maintained. For more information visit https://wiki.mozilla.org/Remote
npm WARN deprecated [email protected]: core-js@<3.3 is no longer maintained and not recommended for usage due to the number of issues. Because of the V8 engine whims, feature detection in old core-js versions could cause a slowdown up to 100x even if nothing is polyfilled. Please, upgrade your dependencies to the actual version of core-js.
npm WARN deprecated [email protected]: core-js@<3.3 is no longer maintained and not recommended for usage due to the number of issues. Because of the V8 engine whims, feature detection in old core-js versions could cause a slowdown up to 100x even if nothing is polyfilled. Please, upgrade your dependencies to the actual version of core-js.
npm ERR! code 1
npm ERR! path D:\Trabajo\NPM\BIM\xeokit-convert\xeokit-convert\node_modules\puppeteer-firefox
npm ERR! command failed
npm ERR! command C:\WINDOWS\system32\cmd.exe /d /s /c node install.js
npm ERR! ERROR: Failed to download Firefox rv0.0.1!
npm ERR! Error: Download failed: server returned code 404. URL: https://github.com/puppeteer/juggler/releases/download/v0.0.1/firefox-win64.zip
npm ERR! at D:\Trabajo\NPM\BIM\xeokit-convert\xeokit-convert\node_modules\puppeteer-firefox\lib\BrowserFetcher.js:264:21
npm ERR! at ClientRequest.requestCallback (D:\Trabajo\NPM\BIM\xeokit-convert\xeokit-convert\node_modules\puppeteer-firefox\lib\BrowserFetcher.js:320:7)
npm ERR! at Object.onceWrapper (node:events:646:26)
npm ERR! at ClientRequest.emit (node:events:526:28)
npm ERR! at HTTPParser.parserOnIncomingClient (node:_http_client:618:27)
npm ERR! at HTTPParser.parserOnHeadersComplete (node:_http_common:128:17)
npm ERR! at TLSSocket.socketOnData (node:_http_client:482:22)
npm ERR! at TLSSocket.emit (node:events:526:28)
npm ERR! at addChunk (node:internal/streams/readable:315:12)
npm ERR! at readableAddChunk (node:internal/streams/readable:289:9)
npm ERR! -- ASYNC --
npm ERR! at BrowserFetcher. (D:\Trabajo\NPM\BIM\xeokit-convert\xeokit-convert\node_modules\puppeteer-firefox\lib\helper.js:32:15)
npm ERR! at Object. (D:\Trabajo\NPM\BIM\xeokit-convert\xeokit-convert\node_modules\puppeteer-firefox\install.js:47:16)
npm ERR! at Module._compile (node:internal/modules/cjs/loader:1103:14)
npm ERR! at Object.Module._extensions..js (node:internal/modules/cjs/loader:1157:10)
npm ERR! at Module.load (node:internal/modules/cjs/loader:981:32)
npm ERR! at Function.Module._load (node:internal/modules/cjs/loader:822:12)
npm ERR! at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:77:12)
npm ERR! at node:internal/main/run_main_module:17:47

npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\garenovich\AppData\Local\npm-cache\_logs\2022-04-22T14_54_44_386Z-debug-0.log

2022-04-22T14_54_44_386Z-debug-0.log

Node script convert2xkt issue with options

I have an issue with the Node script convert2xkt.js where the options were not recognised (e.g. I set "-s filename" and the script acts as if it's not set).

In convert2xkt.js we access the options through "program." (e.g. program.source), and that doesn't work. I changed that to "options.source", and all the other "program.xxx" to "options.xxx" too, and now it works.

The funny thing is that it works as-is on Windows but didn't work on Ubuntu. I would still suggest changing "program." to "options.", as in the commander docs.

Seems like it's a bug, or maybe I am doing something wrong?
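For reference, the pattern from the commander docs that this fix follows (a sketch assuming a recent commander version, where parsed options are read via program.opts() rather than as properties on program):

// Sketch: read CLI options via program.opts(), per the commander docs.
const { program } = require("commander");

program
    .option("-s, --source [file]", "path to source file")
    .option("-o, --output [file]", "path to target .xkt file")
    .parse(process.argv);

const options = program.opts();   // e.g. { source: "model.gltf", output: "model.xkt" }
console.log(options.source);      // use options.source instead of program.source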

convert2xkt.cjs.js - las/laz converting is not working correct

There are several errors in the file.
The function _flagSolidGeometries() runs into an error because sometimes
let maxNumPositions = -1; let maxNumIndices = -1;
are never changed, and an Array with a negative length cannot be created.

In the function convert2xkt, in
case "laz":
the variable rotateX is missing,
and the async function parse() cannot get a loader when the data is a Buffer.
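For the first point, a minimal guard would look like this (a sketch; the variable names follow the _flagSolidGeometries snippet further down this page):

// Sketch: guard against geometry lists with no triangle primitives, where
// maxNumPositions / maxNumIndices stay at -1 and new Array(-1) would throw a RangeError.
function allocateScratchArrays(maxNumPositions, maxNumIndices) {
    if (maxNumPositions <= 0 || maxNumIndices <= 0) {
        return null;                               // caller skips solid-flagging entirely
    }
    return {
        vertexIndexMapping: new Array(maxNumPositions / 3),
        edges: new Array(maxNumIndices)
    };
}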

Error converting VectorWorks 20 file

When converting IFC files exported from VectorWorks 20 we get the following error:

Debugger listening on ws://127.0.0.1:9229/ae17d9a7-733d-4ea3-be1f-05c87148462e
For help, see: https://nodejs.org/en/docs/inspector
web-ifc: 0.0.29 threading: 0
Error: TypeError: Cannot read property 'value' of null
    at /home/gidoca/src/third_party/xeokit/xeokit-convert/xeokit-convert/dist/convert2xkt.cjs.js:48846:81
    at Array.forEach (<anonymous>)
    at parseRelatedItemsOfType (/home/gidoca/src/third_party/xeokit/xeokit-convert/xeokit-convert/dist/convert2xkt.cjs.js:48844:25)
    at parseSpatialChildren (/home/gidoca/src/third_party/xeokit/xeokit-convert/xeokit-convert/dist/convert2xkt.cjs.js:48792:5)
    at /home/gidoca/src/third_party/xeokit/xeokit-convert/xeokit-convert/dist/convert2xkt.cjs.js:48848:21
    at Array.forEach (<anonymous>)
    at parseRelatedItemsOfType (/home/gidoca/src/third_party/xeokit/xeokit-convert/xeokit-convert/dist/convert2xkt.cjs.js:48844:25)
    at parseSpatialChildren (/home/gidoca/src/third_party/xeokit/xeokit-convert/xeokit-convert/dist/convert2xkt.cjs.js:48784:5)
    at /home/gidoca/src/third_party/xeokit/xeokit-convert/xeokit-convert/dist/convert2xkt.cjs.js:48848:21
    at Array.forEach (<anonymous>)

The reason appears to be that the file contains lines like the following:

#66033=IFCRELDEFINESBYPROPERTIES('01b$bsqtH0eBDv9j4xF5mW',#1,$,$,($),#64425);

Specifically, ($) seems to cause problems here (it does not really make sense, but it still ends up in real-world files). This commit on my fork contains a hack that allows the file to run through.
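The hack on the fork isn't shown here, but the general shape of such a guard is (sketch only, with illustrative names):

// Sketch: tolerate "($)" placeholders by skipping related items whose value is null,
// instead of dereferencing them. Names are illustrative, not the actual convert2xkt.cjs.js code.
function forEachRelatedItemId(relatedItems, callback) {
    (relatedItems || []).forEach((item) => {
        if (!item || item.value === null || item.value === undefined) {
            return; // "($)" from real-world files ends up as a null entry
        }
        callback(item.value);
    });
}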

Fix extraneous lines for (glTF+JSON)->XKT workflow

The problem happens when triangles have irregular vertex winding orders, which defeats the edge generation algorithm:
(screenshot: 2022-04-12 15-16-28)

Fix by enhancing the edge generation algorithm to handle the case of two triangles that face in different directions.
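One way to make the face-angle test robust to winding is to compare the adjacent triangles' normals via the absolute dot product (a sketch of the general idea, not the actual fix):

// Sketch: decide whether the shared edge between two triangles is a "hard" edge from
// the angle between their normals, ignoring winding direction by taking the absolute
// value of the dot product.
function isHardEdge(normalA, normalB, thresholdDegrees = 10) {
    const dot = Math.abs(
        normalA[0] * normalB[0] + normalA[1] * normalB[1] + normalA[2] * normalB[2]
    );
    const angleDegrees = (Math.acos(Math.min(1, dot)) * 180) / Math.PI;
    return angleDegrees > thresholdDegrees;
}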

Result:
(screenshot: 2022-04-12 15-17-26)

IFCSpace in xkt v9

I have problems with IFCSpace when I convert IFC files to XKT v9.
A metamodel is created, but no entities or “normal” objects are created in the viewer.
I also downloaded the example XKT files “Duplex.ifc.xkt” in version v8 (https://github.com/xeokit/xeokit-sdk/blob/master/assets/models/xkt/v8/ifc/Duplex.ifc.xkt) and v9 (https://github.com/xeokit/xeokit-sdk/blob/master/assets/models/xkt/v9/ifc/Duplex.ifc.xkt) to check my problem.
When I load the v8 file into the viewer I get the IFCSpaces (as visible entities), but with v9 I don't (only the metamodel).

Fix incompatibility with glTF and metadata from IFC2GLTFCxConverter

Issue

This fixes an incompatibility in xeokit-sdk and xeokit-bim-viewer with the metadata JSON and glTF created by IFC2GLTFCxConverter_v2.1.0_210616 ("the converter").

For each IFC element that has a geometric representation (e.g. IfcDoor), the converter currently creates a composite of objects, in which the root object has an ID that matches the IFC element, and its child objects represent its parts (e.g. frame and panel). The child objects have random IDs that don't match any IFC elements, and the same type as their root.

This confuses xeokit-sdk and xeokit-bim-viewer, which expect every object in a model to correspond to an IFC element.

Fix

  • Published in npm @xeokit/xeokit-convert 1.0.3

In this fix, convert2xkt automatically combines such composite objects into single objects whose IDs match the IFC elements.

The screenshot below shows the result of this combination. Using a TreeViewPlugin configured in "containment" hierarchy mode, we've made a single object visible: an IfcDoor that contains a frame and a panel, which are separate meshes within the viewer. Toggling visibility on the IfcDoor, for example, will toggle both the frame and panel in unison.

Within the metadata created by IFC2GLTFCxConverter_v2.1.0_210616, that object was represented as a composite of three metaobjects of type IfcDoor, in which the root had an IFC UUID and the children had random IDs. By combining these, we're able to represent the whole door as one object in the tree view.

If we call myViewer.scene.setObjectsSelected(["<IFCDoor element ID>"], true), then that will select our object and set both the panel and frame to appear selected.

(screenshot: 2022-03-03 22-02-07)

Add option to set minimum RTC tile size

Status

Released in @xeokit/xeokit-convert 1.0.9

Background

convert2xkt automatically partitions each model's objects into tiles.

The vertex coordinates of the objects within each tile may then be stored in XKT as single-precision values that are relative to the center of their tiles. XKT also compresses the coordinates to integers, by quantizing them to an unsigned 16-bit integer range that maps to the extents of their tiles.

As convert2xkt converts each object, it selects a tile to place it in, converting the object's coordinates to RTC (relative-to-center-of-tile) and quantizing them to the tile boundary.
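For illustration, quantizing one axis of an RTC coordinate into the tile's unsigned 16-bit range works roughly like this (a sketch of the technique, not the converter's code):

// Sketch: map a coordinate (already made relative to the tile center) from the
// tile's extents [min, max] onto the unsigned 16-bit range [0, 65535].
function quantizeAxisValue(rtcValue, tileMin, tileMax) {
    const extent = (tileMax - tileMin) || 1.0;          // avoid divide-by-zero for degenerate tiles
    const normalized = (rtcValue - tileMin) / extent;   // 0..1 within the tile
    return Math.max(0, Math.min(65535, Math.round(normalized * 65535)));
}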

Accuracy Loss Issue

We lose accuracy in the following case:

  • the model's objects are very distant from the origin, and
  • the allowable minimum tile size is too large to retain accuracy for such distant coordinates

This is because, within the IEEE754 floating point standard, the availability of precise floating point values becomes more sparse as the magnitude of the value increases.

Ultimately, it makes sense to automatically select the tile size as a function of this magnitude.

For now, however, we'll expose a new mintilesize option to convert2xkt.

Usage

node convert2xkt.js -h
Usage: convert2xkt [options]

Options:
  -v, --version               output the version number
  -s, --source [file]         path to source file
  -f, --format [string]       source file format (optional); supported formats are gltf, ifc, laz, las, pcd, ply, stl and cityjson
  -m, --metamodel [file]      path to source metamodel JSON file (optional)
  -i, --include [types]       only convert these types (optional)
  -x, --exclude [types]       never convert these types (optional)
  -r, --rotatex               rotate model 90 degrees about X axis (for las and cityjson)
  -g, --disablegeoreuse       disable geometry reuse (for ifc and gltf)
  -t, --mintilesize [number]  minimum diagonal tile size (optional, default 1000)
  -o, --output [file]         path to target .xkt file; creates directories on path automatically if not existing
  -l, --log                   enable logging
  -h, --help                  display help for command
convert2xkt.js -s ./Railway.json -o Railway.xkt -t 2000 -l
[convert2xkt] Reading input file: ./Railway.json
[convert2xkt] Input file size: 4521.41 kB
[convert2xkt] Converting...
[convert2xkt] Converting CityJSON 1.0
[convert2xkt] Converted to: XKT v9
[convert2xkt] XKT size: 846.34 kB
[convert2xkt] Compression ratio: 5.34
[convert2xkt] Conversion time: 2.44 s
[convert2xkt] Converted metaobjects: 123
[convert2xkt] Converted property sets: 0
[convert2xkt] Converted drawable objects: 120
[convert2xkt] Converted geometries: 36984
[convert2xkt] Converted triangles: 113537
[convert2xkt] Converted vertices: 170281
[convert2xkt] reuseGeometries: true
[convert2xkt] minTileSize: 2000
[convert2xkt] Writing XKT file: /Railway.xkt

Improved memory stability for solid mesh tests

This one will be a bit technical, just grab your 🍿!

Introduction: isTriangleMeshSolid function

Looking at the code, flagSolidGeometries invokes the isTriangleMeshSolid method here...

const isTriangleMeshSolid = (indices, positions) => {

This function seems to do the following:

It scans all edges in a triangle-defined surface and, if each edge is used exactly twice, it considers the surface watertight. This means that the surface completely encloses the volume in such a way that water inside the volume could not escape through any hole in the surface (hence watertight).

How it works today

  1. It is considered that each triangle in the geometry defines exactly 3 edges: 1st=>2nd vertex indices; 2nd => 3rd vertex indices; 3rd => 1st vertex indices.

  2. In order to check that each edge is found exactly twice in the surface, it needs to count edges.
    Problem 1: in order to count how many times each edge is used, it uses a string hash key composed of...
    {edge vertex index 1}-{edge vertex index 2}
    ... and then it iterates over all the triangles in the geometry, using a dictionary as an auxiliary structure for counting how many times each edge is used.
    The problem is that this generates a massive number of string keys (plus insertions into a dictionary), and the number of generated string hashes is num-triangles-in-model * 3.

  3. In order for 2. to work well, the code needs to make sure that any distinct vertices sharing the same position coordinates are effectively considered the same, so it has a pre-processing stage to ensure that e.g. two indices referring to the same coordinates (in case of duplicate vertices) are re-mapped to the same index. Otherwise 2. would not work.
    Problem 2: an auxiliary dictionary is used to "uniquify" the vertices, and this again involves the generation of a tremendous number of string keys. In this case the key is composed of...
    {vertex-pos-X},{vertex-pos-Y},{vertex-pos-Z}
    ... and the number of keys generated is exactly the same as the total number of vertices in the model.
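To make the cost concrete, the string-keyed counting described above looks roughly like this (a sketch of the existing approach, not the literal code):

// Sketch of the current approach: one string key, and one dictionary entry,
// per edge of every triangle - i.e. num-triangles * 3 keys in total.
function countEdgesWithStringKeys(indices) {
    const edgeCounts = {};
    for (let i = 0; i < indices.length; i += 3) {
        const tri = [indices[i], indices[i + 1], indices[i + 2]];
        for (let e = 0; e < 3; e++) {
            const a = tri[e];
            const b = tri[(e + 1) % 3];
            const key = a < b ? `${a}-${b}` : `${b}-${a}`; // order-independent edge key
            edgeCounts[key] = (edgeCounts[key] || 0) + 1;
        }
    }
    return edgeCounts;
}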

The problem today

Even though isTriangleMeshSolid only uses temporary dictionaries (in both steps 2 and 3) to do its job, the massive amount of strings generated and stored in dictionaries causes the JavaScript engine to suffer performance and RAM issues, due to (I guess) the extensive GC work needed to free the dictionaries plus the strings after each execution of isTriangleMeshSolid.

For example, the following RAM measurement has been taken from Node.js before/after executing _flagSolidGeometries...

let usedMb = process.memoryUsage().heapUsed / 1024 / 1024;

(https://nodejs.org/api/process.html#processmemoryusage)

... and for a moderate IFC file (around 126 MB), the memory used by the process increases by around 107 MB after executing flagSolidGeometries.

Sketching a solution

I suppose the question at this point is: is there any way to count duplicate items (as this is needed both in 2. to count edge occurrences and in 3. to remap indices corresponding to unique vertex positions) while avoiding RAM-hungry data structures?

(and this includes dictionaries of course if they need string hashes per-counted-item)

Well, here I propose you a conceptually very easy solution. The reasoning is the following:

What if, starting from a list of items, we sort the list so that semantically identical items end up grouped together? Then, in the counting phase, we iterate over the sorted list knowing that if two items are to be considered the same, they will be adjacent in the list!

With this solution, if we stick to the JS .sort(...) method (which sorts in place), we don't need to allocate any additional memory for auxiliary structures 😃!

The results, before going into the proposed code

One natural objection to the proposed approach is that:

  • the current approach (by code inspection), using dictionaries, is O(N)
  • but a solution involving quicksort (see here) would be O(N · log N)

So the natural question is whether performance would be much affected by the proposed algorithm.

I measured with a really small IFC file (9 MB) and a moderate one (126 MB), and the results for the flagSolidGeometries method are the following:

  • 9 MB IFC file: processing time reduced from 163 ms to 90 ms; not able to measure RAM increase (really simple geometry in that model)
  • 126 MB IFC file: processing time reduced from 8 s to 4.3 s; RAM increase reduced from 107 MB to 16 MB.

So, apart from reducing execution time, the solution is MUCH more RAM friendly, which is all about the topic of this issue.

The proposed code

And, finally, the code! ⚙️

Sorry for not creating a PR; I preferred to include the code here with the comments above.

You will notice that two additional arguments are supplied to the new isTriangleMeshSolid method:

  • vertexIndexMapping and edges

Those are used just to avoid allocating big lists per processed geometry!

In order to generate those two arguments, add the following to the beginning of the _flagSolidGeometries method:

    _flagSolidGeometries() {
        let maxNumPositions = -1;
        let maxNumIndices = -1;

        for (let i = 0, len = this.geometriesList.length; i < len; i++) {
            const geometry = this.geometriesList[i];
            if (geometry.primitiveType === "triangles") {
                if (geometry.positionsQuantized.length > maxNumPositions) {
                    maxNumPositions = geometry.positionsQuantized.length;
                }
                if (geometry.indices.length > maxNumIndices) {
                    maxNumIndices = geometry.indices.length;
                }
            }
        }

        let vertexIndexMapping = new Array (maxNumPositions / 3);
        let edges = new Array (maxNumIndices);

        // ...
        geometry.solid = isTriangleMeshSolid(
                    geometry.indices,
                    geometry.positionsQuantized,
                    vertexIndexMapping,
                    edges
                ); // Better memory/cpu performance with quantized values
        // ...
}

The previous code will make sure to pre-allocate arrays big enough to be used as temporary lists in the new isTriangleMeshSolid code. (Their size can't be statically determined, as it depends on the processed model.)

const isTriangleMeshSolid = (indices, positions, vertexIndexMapping, edges) => {

    function compareIndexPositions(a, b)
    {
        let posA, posB;

        for (let i = 0; i < 3; i++) {
            posA = positions [a*3+i];
            posB = positions [b*3+i];

            if (posA !== posB) {
                return posB - posA;
            }
        }

        return 0;
    };

    // Group together indices corresponding to same position coordinates
    let newIndices = indices.slice ().sort (compareIndexPositions);
    
    // Calculate the mapping:
    // - from original index in indices array
    // - to indices-for-unique-positions
    let uniqueVertexIndex = null;

    for (let i = 0, len = newIndices.length; i < len; i++) {
        if (i == 0 || 0 != compareIndexPositions (
            newIndices[i],
            newIndices[i-1],
        )) {
            // different position
            uniqueVertexIndex = newIndices [i];
        }

        vertexIndexMapping [
            newIndices[i]
        ] = uniqueVertexIndex;
    }

    // Generate the list of edges
    for (let i = 0, len = indices.length; i < len; i += 3) {

        const a = vertexIndexMapping[indices[i]];
        const b = vertexIndexMapping[indices[i+1]];
        const c = vertexIndexMapping[indices[i+2]];

        let a2 = a;
        let b2 = b;
        let c2 = c;

        if (a > b && a > c) {
            if (b > c) {
                a2 = a;
                b2 = b;
                c2 = c;
            } else {
                a2 = a;
                b2 = c;
                c2 = b;
            }
        } else if (b > a && b > c) {
            if (a > c) {
                a2 = b;
                b2 = a;
                c2 = c;
            } else {
                a2 = b;
                b2 = c;
                c2 = a;
            }
        } else if (c > a && c > b) {
            if (a > b) {
                a2 = c;
                b2 = a;
                c2 = b;
            } else {
                a2 = c;
                b2 = b;
                c2 = a;
            }
        }

        edges[i+0] = [
            a2, b2
        ];
        edges[i+1] = [
            b2, c2
        ];

        if (a2 > c2) {
            const temp = c2;
            c2 = a2;
            a2 = temp;
        }

        edges[i+2] = [
            c2, a2
        ];
    }

    // Group semantically equivalent edges together
    function compareEdges (e1, e2) {
        let a, b;

        for (let i = 0; i < 2; i++) {
            a = e1[i];
            b = e2[i];

            if (b !== a) {
                return b - a;
            }
        }

        return 0;
    }

    edges = edges.slice(0, indices.length);

    edges.sort (compareEdges);

    // Make sure each edge is used exactly twice
    let sameEdgeCount = 0;

    for (let i = 0; i < edges.length; i++)
    {
        if (i === 0 || 0 !== compareEdges (
            edges[i], edges[i-1]
        )) {
            // different edge
            if (0 !== i && sameEdgeCount !== 2)
            {
                return false;
            }

            sameEdgeCount = 1;
        }
        else
        {
            // same edge
            sameEdgeCount++;
        }
    }

    if (edges.length > 0 && sameEdgeCount !== 2)
    {
        return false;
    }

    // Each edge is used exactly twice, this is a 
    // watertight surface and hence a solid geometry.
    return true;
};

What do you think @xeolabs? 😃

Any release 1.0.8 planned soon ?

Hi @xeolabs,

I'm wondering if a stable release is coming up soon, as multiple dependencies have been bumped, some due to security issues. I know that this is an open source project, so I'm not asking for a precise ETA, but could this come within a month or so? Maybe a few weeks or months?

Point cloud conversion options to reduce file size

Possibilities:

  • Convert only every Nth point

  • For each n points, output an average point

  • Option to specify a maximum file size / memory size / point number, so we can ensure the XKT will fit in browser memory
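As an illustration of the first two options (sketch only; none of this exists in convert2xkt yet):

// Sketch: keep only every Nth point of a flat [x0, y0, z0, x1, y1, z1, ...] array.
function keepEveryNthPoint(positions, n) {
    const out = [];
    for (let i = 0; i < positions.length; i += 3 * n) {
        out.push(positions[i], positions[i + 1], positions[i + 2]);
    }
    return new Float64Array(out);
}

// Sketch: replace each run of n points with their average.
function averageEveryNPoints(positions, n) {
    const out = [];
    for (let i = 0; i < positions.length; i += 3 * n) {
        let x = 0, y = 0, z = 0, count = 0;
        for (let j = i; j < Math.min(i + 3 * n, positions.length); j += 3) {
            x += positions[j];
            y += positions[j + 1];
            z += positions[j + 2];
            count++;
        }
        out.push(x / count, y / count, z / count);
    }
    return new Float64Array(out);
}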

convert2xkt.js IFC to XKT has error web-ifc: 0.0.34 threading: 0 Error: ${err}

I'm trying this command:

$ node ./node_modules/@xeokit/xeokit-convert/convert2xkt.js -s foo.ifc -o hyd.xkt -l                                                                                           
[convert2xkt] Reading input file: foo.ifc
[convert2xkt] Input file size: 48258.68 kB
[convert2xkt] Converting...
web-ifc: 0.0.34 threading: 0
Error: ${err}

Any ideas how I can help debug?

Note: that was installed via npm i @xeokit/xeokit-convert

If I clone from main and npm install, then I get this instead:

$ node convert2xkt.js -s hyd.ifc -o hyd.xkt -l 
[convert2xkt] Reading input file: hyd.ifc
[convert2xkt] Input file size: 48258.68 kB
[convert2xkt] Converting...
web-ifc: 0.0.34 threading: 0
Error: TypeError: FromRawLineData[rawLineData.type] is not a function

Error executing, missing module commander

When I execute

node convert2xkt.js -h

I get an error

node:internal/modules/cjs/loader:936
throw err;
^

Error: Cannot find module 'commander'
Require stack:

  • D:\Trabajo\NPM\BIM\xeokit-convert\xeokit-convert\convert2xkt.js
    at Function.Module._resolveFilename (node:internal/modules/cjs/loader:933:15)
    at Function.Module._load (node:internal/modules/cjs/loader:778:27)
    at Module.require (node:internal/modules/cjs/loader:1005:19)
    at require (node:internal/modules/cjs/helpers:102:18)
    at Object. (D:\Trabajo\NPM\BIM\xeokit-convert\xeokit-convert\convert2xkt.js:3:19)
    at Module._compile (node:internal/modules/cjs/loader:1103:14)
    at Object.Module._extensions..js (node:internal/modules/cjs/loader:1157:10)
    at Module.load (node:internal/modules/cjs/loader:981:32)
    at Function.Module._load (node:internal/modules/cjs/loader:822:12)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:77:12) {
    code: 'MODULE_NOT_FOUND',
    requireStack: [
    'D:\Trabajo\NPM\BIM\xeokit-convert\xeokit-convert\convert2xkt.js'
    ]
    }

Fix broken metaobject -> propertyset links in metadata

When converting IFC models from glTF and JSON metadata into XKT, the property sets are missing from the XKT.

This bug occurs when converting glTF+JSON like so:

node convert2xkt.js -s Schependomlaan.gltf -m Schependomlaan.json -o Schependomlaan.xkt 

Property sets are also incorrectly described within the XKT v9 schema specification. This fix also fixes the specification, as described below.

On each metaobject, the spec incorrectly defines a property "propertySetId", which is a string. That property should instead be an array of strings, called "propertySetIds".

Incorrect schema spec:

  "metaObjects": {
            "type": "array",
            "items": [
                {
                    "type": "object",
                    "properties": {
                        "name": {
                            "type": "string"
                        },
                        "type": {
                            "type": "string"
                        },
                        "id": {
                            "type": "string"
                        },
                        "parent": {
                            "type": "string"
                        },
                        "propertySetId": {
                            "type": "string"
                        }
                    },
                    "required": [
                        "name",
                        "type",
                        "id"
                    ]
                }
            ]
        }

Corrected schema spec:

  "metaObjects": {
            "type": "array",
            "items": [
                {
                    "type": "object",
                    "properties": {
                        "name": {
                            "type": "string"
                        },
                        "type": {
                            "type": "string"
                        },
                        "id": {
                            "type": "string"
                        },
                        "parent": {
                            "type": "string"
                        },
                        "propertySetIds": {
                            "type": "array",
                            "items": [
                                 {
                                      "type": "string"
                                 }
                             ]
                        }
                    },
                    "required": [
                        "name",
                        "type",
                        "id"
                    ]
                }
            ]
        }

Error running "node convert2xkt.js -s geometry.gltf -o geometry.xkt - m metadata.json"

This is the error message:

(node:5620) Warning: To load an ES module, set "type": "module" in the package.json or use the .mjs extension.
(Use node --trace-warnings ... to show where the warning was created)
D:\lib\xeokit-convert\src\convert2xkt.js:1
import {parseMetaModelIntoXKTModel} from "./parsers/parseMetaModelIntoXKTModel.js";
^^^^^^

SyntaxError: Cannot use import statement outside a module
at Object.compileFunction (node:vm:352:18)
at wrapSafe (node:internal/modules/cjs/loader:1026:15)
at Module._compile (node:internal/modules/cjs/loader:1061:27)
at Object.Module._extensions..js (node:internal/modules/cjs/loader:1151:10)
at Module.load (node:internal/modules/cjs/loader:975:32)
at Function.Module._load (node:internal/modules/cjs/loader:822:12)
at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:77:12)
at node:internal/main/run_main_module:17:47
Node.js v17.6.0


Dependencies are installed:
xeokit-convert>npm install --only=prod

operating system:
Windows 10, 64 bit
