
city-super / bungeenerf

557 stars · 11 watchers · 64 forks · 1.41 MB

[ECCV22] BungeeNeRF: Progressive Neural Radiance Field for Extreme Multi-scale Scene Rendering

Home Page: https://city-super.github.io/citynerf

License: MIT License

Python 100.00%
city multiscale nerf neural-rendering

bungeenerf's People

Contributors

eveneveno, kam1107


bungeenerf's Issues

How to convert data from GES export

Greetings!

I followed the instructions to reproduce the experiments, with great results.

But I ran into some problems when using a custom dataset. How should the 3D Camera data extracted from Google Earth Studio be converted into the format used for model training?

I tried simply remapping the "rotation" angles via x' = -x, y' = 180 - y, z' = 180 + z to get Euler angles, converting them to a rotation matrix, and combining that with the (scaled-down) "position". But this causes errors during training and does not reproduce the pose representation in the example dataset.

Could you please describe in detail how to convert the data exported from GES?
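For reference, the remapping described in this issue can be sketched as follows. The 'xyz' Euler order, the degree units, and the helper name are assumptions about the GES export, not a confirmed convention from the authors:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def ges_rotation_to_c2w(rotation_deg, position, scale=1.0):
    """Hypothetical sketch: map a GES 'rotation' triple (x, y, z Euler
    angles in degrees) to a 4x4 camera-to-world matrix using the remapping
    described above (x' = -x, y' = 180 - y, z' = 180 + z). The 'xyz' axis
    order is an assumption; GES may require a different order or an
    intrinsic/extrinsic convention."""
    x, y, z = rotation_deg
    euler = [-x, 180.0 - y, 180.0 + z]
    R = Rotation.from_euler('xyz', euler, degrees=True).as_matrix()
    c2w = np.eye(4)
    c2w[:3, :3] = R
    c2w[:3, 3] = np.asarray(position) * scale  # scaled-down position
    return c2w
```

Diffing the output of a sketch like this against one known-good pose from poses_enu.json is a quick way to locate which axis or sign convention is off.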

Blender Synthetic Data

First, thank you for your outstanding work. Training progressively with residual blocks really inspires me.

I am now running some comparisons between these methods. I notice that the style of the data (city scenes) you provide differs from the original NeRF's, and that the paper includes experiments comparing against mip-NeRF. Could you share the Blender synthetic data in your format, or the code to convert that dataset into your format?

Thank you very much!

Questions about using GES2pose.py

I have a question about generating trajectory pose files. I used the 56leonard.esp file you provided, imported it into Google Earth Studio, exported the pose file 56leonard.json, and converted it with the GES2pose.py script you provided, but the output poses do not match the poses_enu.json file shipped with the dataset.

[Screenshots: Export 3D Tracking Data; Comparison]

Export point cloud

Hi.

Thank you for your useful code. Is there any way to export a point cloud, e.g. to a *.ply file?

Regards
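A dependency-free way to dump points to *.ply is a plain ASCII writer like the sketch below. How you obtain the points in the first place (e.g. by thresholding the densities the NeRF predicts on a 3D grid) is model-specific and not shown here:

```python
import numpy as np

def write_ply(path, points):
    """Write an (N, 3) float array as an ASCII .ply point cloud.
    The points themselves would come from thresholding the densities
    (sigma) the model predicts on a 3D grid -- that sampling step is
    model-specific and omitted from this sketch."""
    points = np.asarray(points, dtype=np.float32)
    with open(path, 'w') as f:
        f.write('ply\nformat ascii 1.0\n')
        f.write(f'element vertex {len(points)}\n')
        f.write('property float x\nproperty float y\nproperty float z\n')
        f.write('end_header\n')
        for x, y, z in points:
            f.write(f'{x} {y} {z}\n')
```

The resulting file opens in viewers such as MeshLab or CloudCompare.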

Why is my GPU utilization low when training on 56Leonard?

When I train stages 0 and 1, utilization on my RTX 4080 and CPU is very low, even though torch.cuda.is_available() returns True.
GPU utilization is essentially 0% across the board, apart from 4.6 GB of video memory in use.
Is this normal?

Some questions about "scene_scaling_factor" and "scene_origin"

Great work! To obtain the poses the network expects, I used software such as COLMAP to compute the drone image parameters and exported JSON files, following the original NeRF's LLFF workflow. How do I obtain the "scene_scaling_factor" and "scene_origin" values?
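One common normalization in multi-scale NeRF pipelines is to take the scene origin as the centroid of the camera centers and the scale as the inverse of the farthest camera distance, so all cameras fall inside a unit sphere. Whether BungeeNeRF defines these two values exactly this way is an assumption that should be checked against its data loader; this is only a sketch:

```python
import numpy as np

def estimate_origin_and_scale(cam_positions):
    """Hypothetical helper: treat scene_origin as the centroid of the
    camera centers and scene_scaling_factor as the inverse of the max
    distance from that centroid, so all cameras fit in a unit sphere.
    This mirrors a common NeRF normalization; BungeeNeRF's exact
    definition may differ."""
    cams = np.asarray(cam_positions, dtype=np.float64)
    origin = cams.mean(axis=0)
    scale = 1.0 / np.linalg.norm(cams - origin, axis=1).max()
    return origin, scale
```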

the last 2 dims of Poses information

Great work! What do the last 2 dimensions of the pose NumPy array mean? I notice each pose has 17 elements and the last 2 are discarded; in the first pose, for example, they are 0.00358713 and 32.06129449. Thank you!

'retraw' is not defined

Reporting a bug:
File "run_bungee.py", line 327, in render_rays
if retraw:
NameError: name 'retraw' is not defined
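The NameError means render_rays references `retraw` without the name being bound in that scope. A minimal sketch of one possible fix, assuming the intended behavior is simply to make the flag an explicit argument (the real rendering body is elided):

```python
# Minimal sketch of one possible fix (an assumption about the intended
# behavior, not the authors' patch): bind `retraw` as an explicit keyword
# argument so the name is always defined inside render_rays.
def render_rays(ray_batch, retraw=False, **kwargs):
    out = {}
    raw = None  # placeholder for the raw network output computed above
    # ... the actual volume-rendering logic would populate `out` here ...
    if retraw:
        out['raw'] = raw
    return out
```

Alternatively, the callers of render_rays may already pass retraw through a kwargs dict, in which case the fix is to forward that flag correctly.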

How to run NeRF?

Hey,
I'm a newcomer to the NeRF world, so please help me get it running.
Here is my Anaconda console output:

(bungee) C:\BungeeNeRF>python run_bungee.py --config configs/EXP_CONFIG_FILE
Traceback (most recent call last):
File "C:\BungeeNeRF\run_bungee.py", line 632, in
torch.set_default_tensor_type('torch.cuda.FloatTensor')
File "C:\ProgramData\Anaconda3\envs\bungee\lib\site-packages\torch\__init__.py", line 323, in set_default_tensor_type
_C._set_default_tensor_type(t)
TypeError: type torch.cuda.FloatTensor not available. Torch not compiled with CUDA enabled.

I watched the intro video on YouTube and got fantastic inspiration.
Thank you for the great work, guys!
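The traceback means this PyTorch build was installed without CUDA support. The real fix is to reinstall a CUDA-enabled build from pytorch.org; as a stopgap, the offending call can be guarded so the script at least starts on CPU (training will be extremely slow). A sketch:

```python
import torch

# "Torch not compiled with CUDA enabled" means the installed wheel is
# CPU-only. Guarding the default-tensor-type call lets the script start
# on such installs; a CUDA-enabled PyTorch build is the proper fix.
if torch.cuda.is_available():
    torch.set_default_tensor_type('torch.cuda.FloatTensor')
else:
    torch.set_default_tensor_type('torch.FloatTensor')
```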

The multiscale_google_Transamerica.zip is incomplete

Hi, developers,

The zip file multiscale_google_Transamerica.zip is incomplete and cannot be extracted.

# unzip multiscale_google_Transamerica.zip 
Archive:  multiscale_google_Transamerica.zip
  End-of-central-directory signature not found.  Either this file is not
  a zipfile, or it constitutes one disk of a multi-part archive.  In the
  latter case the central directory and zipfile comment will be found on
  the last disk(s) of this archive.
unzip:  cannot find zipfile directory in one of multiscale_google_Transamerica.zip or
        multiscale_google_Transamerica.zip.zip, and cannot find multiscale_google_Transamerica.zip.ZIP, period.

Can you check it again, and please upload a complete archive if possible?

Thanks!
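A quick way to verify a downloaded archive before (re)extracting is Python's standard zipfile module: is_zipfile checks for the end-of-central-directory record that the unzip error complains about, and testzip CRC-checks every member. A small sketch:

```python
import zipfile

def check_zip(path):
    """Verify a downloaded archive: is_zipfile checks the
    end-of-central-directory signature, testzip CRC-checks members."""
    if not zipfile.is_zipfile(path):
        return 'not a valid zip (likely a truncated download)'
    with zipfile.ZipFile(path) as zf:
        bad = zf.testzip()  # returns first bad member name, or None
    return 'ok' if bad is None else f'corrupt member: {bad}'
```

If the check fails, comparing the local file size against the hosted one usually confirms a truncated download.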

How to run self-captured data

Hello, and thank you for sharing your research! How can I run data I captured myself, rather than the Google data, the way the original NeRF does (generating poses with COLMAP and converting them to .npy)? Thanks!

Rendering the results

Hi, thank you for sharing the codes of your amazing research!

Just curious: right now I am using the default render poses_enu.json file, and it only produces 29 frames.
How can I generate one from a custom .esp from Google Earth and convert that into a poses_enu.json file?
I understand you wrote a description of how to convert it, but I seem to be failing to replicate it.
Any help would be appreciated!

Why is the quality of my training and rendering poor?

Hello, thanks for sharing this fantastic work.
I downloaded the example data and followed the demo, but the 11 output PNG images do not look like the ones shown in the paper. I don't know whether my configuration parameters are suboptimal; my configuration is below. Could you give me a suggestion?

Even the render quality is lower than the original images.

Hardware: RTX 3080 (24 GB) GPU

Configuration: defaults, 200,000 iterations.

Regards

[Attached: 011 — 11 images in total.]

Something about the data

First of all, thank you to the authors for providing the source code. I want to run my own data with BungeeNeRF. I have many UAV images, and I can use software such as COLMAP to compute the image parameters and export a JSON file. But the exported JSON file differs from yours. How can I convert it to the JSON format BungeeNeRF expects?

Export mesh

Thanks for your method; this is very interesting. I was able to run it, but unfortunately I can't export a 3D mesh.
Is it possible to export a mesh?
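NeRF-family models can usually yield a mesh by evaluating the predicted density on a regular 3D grid and running marching cubes on it. The sketch below uses scikit-image; `density_fn` is a hypothetical callable mapping (N, 3) points to (N,) sigmas, and the threshold needs per-scene tuning:

```python
import numpy as np
from skimage import measure

def density_to_mesh(density_fn, resolution=128, bound=1.0, threshold=10.0):
    """Sketch of the usual NeRF mesh-extraction recipe: evaluate the
    model's density (sigma) on a regular grid, then run marching cubes.
    `density_fn` is a hypothetical callable mapping (N, 3) points to (N,)
    sigmas; `threshold` is the iso-level and needs tuning per scene."""
    xs = np.linspace(-bound, bound, resolution)
    grid = np.stack(np.meshgrid(xs, xs, xs, indexing='ij'), axis=-1)
    sigma = density_fn(grid.reshape(-1, 3)).reshape(
        resolution, resolution, resolution)
    verts, faces, _, _ = measure.marching_cubes(sigma, level=threshold)
    # Map voxel indices back to world coordinates in [-bound, bound].
    verts = verts / (resolution - 1) * (2 * bound) - bound
    return verts, faces
```

The resulting vertices and faces can then be written out with any mesh library (e.g. trimesh) or a hand-rolled .ply/.obj writer.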

Cannot run BungeeNeRF on RTX 3080 Ti

Dear authors,
When I run: python run_bungee.py --config configs/Transamerica.txt

it raises:
NVIDIA GeForce RTX 3080 Ti with CUDA capability sm_86 is not compatible with the current PyTorch installation.
The current PyTorch install supports CUDA capabilities sm_37 sm_50 sm_60 sm_70.
If you want to use the NVIDIA GeForce RTX 3080 Ti GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/

warnings.warn(incompatible_device_warn.format(device_name, capability, " ".join(arch_list), device_name))
Traceback (most recent call last):
File "run_bungee.py", line 634, in
train()
File "run_bungee.py", line 476, in train
render_kwargs_train, render_kwargs_test, start_iter, total_iter, grad_vars, optimizer = create_nerf(args)
File "run_bungee.py", line 120, in create_nerf
embed_fn, input_ch = get_mip_embedder(args.multires, args.min_multires, args.i_embed, log_sampling=True)
File "/media/zyan/sandiskSSD/NeRF_methods/BungeeNeRF-main/run_nerf_helpers.py", line 140, in get_mip_embedder
embedder_obj = MipEmbedder(**embed_kwargs)
File "/media/zyan/sandiskSSD/NeRF_methods/BungeeNeRF-main/run_nerf_helpers.py", line 53, in init
self.create_embedding_fn()
File "/media/zyan/sandiskSSD/NeRF_methods/BungeeNeRF-main/run_nerf_helpers.py", line 68, in create_embedding_fn
freq_bands_y = 2.**torch.linspace(min_freq, max_freq, steps=N_freqs)
RuntimeError: CUDA error: no kernel image is available for execution on the device
CUDA kernel errors might be asynchronously reported at some other API call,so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1.

And when I updated my torch version to 1.12 or 1.13, it raised another error:
Bungee_NeRF_block(
(baseblock): Bungee_NeRF_baseblock(
(pts_linears): ModuleList(
(0): Linear(in_features=63, out_features=256, bias=True)
(1): Linear(in_features=256, out_features=256, bias=True)
(2): Linear(in_features=256, out_features=256, bias=True)
(3): Linear(in_features=256, out_features=256, bias=True)
)
(views_linear): Linear(in_features=283, out_features=128, bias=True)
(feature_linear): Linear(in_features=256, out_features=256, bias=True)
(alpha_linear): Linear(in_features=256, out_features=1, bias=True)
(rgb_linear): Linear(in_features=128, out_features=3, bias=True)
)
(resblocks): ModuleList()
)
Found ckpts []
Traceback (most recent call last):
File "run_bungee.py", line 634, in
train()
File "run_bungee.py", line 511, in train
rays = np.stack([get_rays_np(H, W, focal, p) for p in poses], 0)
File "run_bungee.py", line 511, in
rays = np.stack([get_rays_np(H, W, focal, p) for p in poses], 0)
File "/media/zyan/sandiskSSD/NeRF_methods/BungeeNeRF-main/run_nerf_helpers.py", line 231, in get_rays_np
rays_d = np.sum(dirs[..., np.newaxis, :] * c2w[:3,:3], -1)
TypeError: unsupported operand type(s) for *: 'numpy.ndarray' and 'Tensor'.

Is there any solution for this?
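The TypeError in the second traceback suggests that, under the newer PyTorch version, `poses` reaches get_rays_np as a torch.Tensor rather than a NumPy array. A minimal reproduction of the failing line (mirroring the standard NeRF get_rays_np) with a defensive cast, as one possible workaround rather than the authors' fix:

```python
import numpy as np

def get_rays_np_safe(H, W, focal, c2w):
    """Mirror of the standard NeRF get_rays_np with a defensive cast:
    if c2w arrives as a torch.Tensor (as newer PyTorch versions may hand
    it over), move it to NumPy before mixing it with NumPy arrays."""
    if hasattr(c2w, 'detach'):  # duck-type torch.Tensor without importing torch
        c2w = c2w.detach().cpu().numpy()
    i, j = np.meshgrid(np.arange(W, dtype=np.float32),
                       np.arange(H, dtype=np.float32), indexing='xy')
    dirs = np.stack([(i - W * .5) / focal,
                     -(j - H * .5) / focal,
                     -np.ones_like(i)], -1)
    # The line that failed: NumPy dirs multiplied by the (now NumPy) rotation.
    rays_d = np.sum(dirs[..., np.newaxis, :] * c2w[:3, :3], -1)
    rays_o = np.broadcast_to(c2w[:3, -1], rays_d.shape)
    return rays_o, rays_d
```

Equivalently, converting `poses` to NumPy once, before the list comprehension in train(), avoids touching the helper at all.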

How to use custom datasets

I have some multi-scale satellite imagery that I want to run through BungeeNeRF's code. How should I process it so that it conforms to BungeeNeRF's expected input?
