RetrievalFuse: Neural 3D Scene Reconstruction with a Database
Yawar Siddiqui, Justus Thies, Fangchang Ma, Qi Shan, Matthias Nießner, Angela Dai
ICCV 2021
This repository contains the code for the ICCV 2021 paper RetrievalFuse, a novel approach for 3D reconstruction from low-resolution distance field grids and from point clouds.
In contrast to traditional generative learned models, which encode the full generative process into a neural network and can struggle to maintain local details at the scene level, we introduce a new method that directly leverages scene geometry from the training database.
The broad code structure is as follows:
File / Folder | Description |
---|---|
config/super_resolution | Super-resolution experiment configs |
config/surface_reconstruction | Surface reconstruction experiment configs |
config/base | Defaults for configurations |
config/config_handler.py | Config file parser |
data/splits | Training and validation splits for different datasets |
dataset/scene.py | SceneHandler class for managing access to scene data samples |
dataset/patched_scene_dataset.py | PyTorch dataset class for scene data |
external/ChamferDistancePytorch | For calculating a rough Chamfer distance between prediction and target during training |
model/attention.py | Attention, folding and unfolding modules |
model/loss.py | Loss functions |
model/refinement.py | Refinement network |
model/retrieval.py | Retrieval network |
model/unet.py | U-Net model used as the backbone of the refinement network |
runs/ | Checkpoints and visualizations for experiments are dumped here |
trainer/train_retrieval.py | Lightning module for training the retrieval network |
trainer/train_refinement.py | Lightning module for training the refinement network |
util/arguments.py | Argument parsing (additional arguments apart from those in config) |
util/filesystem_logger.py | Copies the source code for each run into the experiment log directory |
util/metrics.py | Rough metrics for logging during training |
util/mesh_metrics.py | Final metrics on meshes |
util/retrieval.py | Script to dump retrievals once retrieval networks have been trained; needed for training refinement. |
util/visualizations.py | Utility scripts for visualizations |
Further, the data/ directory has the following layout:
data                  # root data directory
├── sdf_008           # low-res (8^3) distance fields
│   ├── <dataset_0>
│   │   ├── <sample_0>
│   │   ├── <sample_1>
│   │   ├── <sample_2>
│   │   └── ...
│   ├── <dataset_1>
│   └── ...
├── sdf_016           # low-res (16^3) distance fields
│   ├── <dataset_0>
│   │   ├── <sample_0>
│   │   ├── <sample_1>
│   │   ├── <sample_2>
│   │   └── ...
│   ├── <dataset_1>
│   └── ...
├── sdf_064           # high-res (64^3) distance fields
│   ├── <dataset_0>
│   │   ├── <sample_0>
│   │   ├── <sample_1>
│   │   ├── <sample_2>
│   │   └── ...
│   ├── <dataset_1>
│   └── ...
├── pc_20K            # point cloud inputs
│   ├── <dataset_0>
│   │   ├── <sample_0>
│   │   ├── <sample_1>
│   │   ├── <sample_2>
│   │   └── ...
│   ├── <dataset_1>
│   └── ...
├── splits            # train/val splits
├── size              # data needed by SceneHandler class (autocreated on first run)
└── occupancy         # data needed by SceneHandler class (autocreated on first run)
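To make the layout concrete, here is a minimal sketch (the dataset and sample names are hypothetical placeholders, not names used by the repository) of how one sample's inputs and target line up under this tree:

```python
from pathlib import Path

# Hypothetical dataset/sample names purely for illustration;
# replace them with the folder names actually present under data/.
data_root = Path("data")
dataset, sample = "ShapeNetV2", "sample_0000"

paths = {
    "low-res input (8^3)": data_root / "sdf_008" / dataset / sample,
    "point cloud input": data_root / "pc_20K" / dataset / sample,
    "high-res target (64^3)": data_root / "sdf_064" / dataset / sample,
}

for role, path in paths.items():
    print(f"{role:25s} -> {path} (exists: {path.exists()})")
```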
Install the dependencies using pip:
pip install -r requirements.txt
Be sure that you pull the ChamferDistancePytorch submodule in external.
For ShapeNetV2 and Matterport3D, get the appropriate meshes from the datasets. For 3D-FRONT, get the 3D-FUTURE meshes and the 3D-FRONT scripts, and use our fork of 3D-FRONT-ToolBox to create room meshes.
Once you have the meshes, use our fork of sdf-gen to create the low-res distance field inputs and the high-res targets. For creating point cloud inputs, simply use trimesh.sample.sample_surface (check util/misc/sample_scene_point_clouds; a sketch follows the list below). Place the processed data in the appropriate directories:
data/sdf_008/<dataset> or data/sdf_016/<dataset> for low-res inputs
data/pc_20K/<dataset> for point cloud inputs
data/sdf_064/<dataset> for targets
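As a minimal sketch of the point cloud step (the file paths and on-disk format below are assumptions; the 20K count mirrors the pc_20K folder name, and util/misc/sample_scene_point_clouds remains the reference):

```python
import numpy as np
import trimesh

# Load a scene/shape mesh (path is hypothetical).
mesh = trimesh.load("meshes/ShapeNetV2/sample_0000.obj", force="mesh")

# Sample 20K points uniformly on the surface, matching the pc_20K naming.
points, face_indices = trimesh.sample.sample_surface(mesh, 20000)

# Save the point cloud; the actual format expected by the repo may differ.
np.save("data/pc_20K/ShapeNetV2/sample_0000.npy", points.astype(np.float32))
```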
Make sure that the CUDA_HOME variable is set. To train the retrieval networks, use the following command:
python trainer/train_retrieval.py --config config/<config> --val_check_interval 5 --experiment retrieval --wandb_main --sanity_steps 1
We provide some sample configurations for retrieval.
For super-resolution, e.g.
config/super_resolution/ShapeNetV2/retrieval_008_064.yaml
config/super_resolution/3DFront/retrieval_008_064.yaml
config/super_resolution/Matterport3D/retrieval_016_064.yaml
For surface-reconstruction, e.g.
config/surface_reconstruction/ShapeNetV2/retrieval_128_064.yaml
config/surface_reconstruction/3DFront/retrieval_128_064.yaml
config/surface_reconstruction/Matterport3D/retrieval_128_064.yaml
Once trained, create the retrievals for the train/validation sets using the following commands:
python util/retrieval.py --mode map --retrieval_ckpt <trained_retrieval_ckpt> --config <retrieval_config>
python util/retrieval.py --mode compose --retrieval_ckpt <trained_retrieval_ckpt> --config <retrieval_config>
Use the following command to train the refinement network:
python trainer/train_refinement.py --config <config> --val_check_interval 5 --experiment refinement --sanity_steps 1 --wandb_main --retrieval_ckpt <retrieval_ckpt>
Again, sample configurations for refinement are provided in the config directory.
For super-resolution, e.g.
config/super_resolution/ShapeNetV2/refinement_008_064.yaml
config/super_resolution/3DFront/refinement_008_064.yaml
config/super_resolution/Matterport3D/refinement_016_064.yaml
For surface-reconstruction, e.g.
config/surface_reconstruction/ShapeNetV2/refinement_128_064.yaml
config/surface_reconstruction/3DFront/refinement_128_064.yaml
config/surface_reconstruction/Matterport3D/refinement_128_064.yaml
Visualizations and checkpoints are dumped in the runs/<experiment> directory. Logs are uploaded to the user's Weights & Biases dashboard.
Download the processed data for the ShapeNetV2 dataset using the following command:
bash data/download_shapenet_processed.sh
This will populate the data/sdf_008, data/sdf_064, data/pc_20K, data/occupancy, and data/size folders with processed ShapeNet data.
To download the trained models for ShapeNetV2, use the following script:
bash data/download_shapenet_models.sh
This downloads the checkpoints for retrieval and refinement on ShapeNet for both the super-resolution and surface-reconstruction tasks, plus the already computed retrievals. You can resume training these with the --resume flag in the appropriate scripts (or run inference with the --sanity_steps flag). For example, to resume (and/or dump inferences from data/splits/ShapeNetV2/main/val_vis.txt), use the following commands:
# super-resolution
python trainer/train_refinement.py --config config/super_resolution/ShapeNetV2/refinement_008_064.yaml --sanity_steps -1 --resume runs/checkpoints/superres_refinement_ShapeNetV2.ckpt --retrieval_ckpt runs/07101959_superresolution_ShapeNetV2_upload/_ckpt_epoch=79.ckpt --current_phase 3 --max_epoch 161 --new_exp_for_resume
# surface-reconstruction
python trainer/train_refinement.py --config config/surface_reconstruction/ShapeNetV2/refinement_128_064.yaml --sanity_steps -1 --resume runs/checkpoints/surfacerecon_refinement_ShapeNetV2.ckpt --retrieval_ckpt runs/07101959_surface_reconstruction_ShapeNetV2_upload/_ckpt_epoch=59.ckpt --current_phase 3 --max_epoch 161 --new_exp_for_resume
If you find our work useful in your research, please consider citing:
@inproceedings{siddiqui2021retrievalfuse,
title = {RetrievalFuse: Neural 3D Scene Reconstruction with a Database},
author = {Siddiqui, Yawar and Thies, Justus and Ma, Fangchang and Shan, Qi and Nie{\ss}ner, Matthias and Dai, Angela},
booktitle = {Proc. International Conference on Computer Vision (ICCV)},
month = oct,
year = {2021},
doi = {},
month_numeric = {10}
}
The code from this repository is released under the MIT license.
Author: nihalsid
Source Code: https://github.com/nihalsid/retrieval-fuse
In SSMS, many of us may have noticed the System Databases node under the Databases folder, but how many of us know their purpose? In this article, let's discuss the system databases in SQL Server.
Fig. 1 System Databases
There are five system databases; they are created when SQL Server is installed. A quick way to list them is sketched below.
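As a hedged illustration only (the pyodbc driver choice and the connection string are assumptions, not part of this article), the visible system databases can be listed with a query against sys.databases:

```python
import pyodbc  # assumes the Microsoft ODBC Driver for SQL Server is installed

# Connection details are placeholders; adjust server and credentials for your setup.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=master;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# master, tempdb, model, and msdb have the lowest database_ids;
# the hidden Resource database does not appear in sys.databases.
cursor.execute("SELECT database_id, name FROM sys.databases WHERE database_id <= 4")
for database_id, name in cursor.fetchall():
    print(database_id, name)
```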
A simple boilerplate to set up authentication using django-allauth, with custom templates for login and registration using django-crispy-forms.
# clone the repo
$ git clone https://github.com/yezz123/Django-Authentication
# move to the project folder
$ cd Django-Authentication
Set up a virtual environment for this project:
# creating a virtual environment for python 3
$ virtualenv venv
# activating the virtual environment
$ cd venv/bin  # in a Windows environment, activate from the Scripts folder
# if you have multiple python 3 versions installed
$ source ./activate
SECRET_KEY = #random string
DEBUG = #True or False
ALLOWED_HOSTS = #localhost
DATABASE_NAME = #database name (You can just use the default if you want to use SQLite)
DATABASE_USER = #database user for postgres
DATABASE_PASSWORD = #database password for postgres
DATABASE_HOST = #database host for postgres
DATABASE_PORT = #database port for postgres
ACCOUNT_EMAIL_VERIFICATION = #mandatory or optional
EMAIL_BACKEND = #email backend
EMAIL_HOST = #email host
EMAIL_HOST_PASSWORD = #email host password
EMAIL_USE_TLS = # True if your email host uses TLS
EMAIL_PORT = #email port
Change all the environment variables in .env.sample, and don't forget to rename it to .env.
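As a minimal, hedged sketch of how these variables typically end up in settings.py (the exact setting names and loading mechanism in this boilerplate may differ; this sketch uses only the standard library and assumes the .env values are exported into the process environment):

```python
# settings.py (sketch) - read configuration from the process environment.
# Variable names mirror the .env.sample above.
import os

SECRET_KEY = os.environ["SECRET_KEY"]
DEBUG = os.environ.get("DEBUG", "False") == "True"
ALLOWED_HOSTS = os.environ.get("ALLOWED_HOSTS", "localhost").split(",")

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("DATABASE_NAME", ""),
        "USER": os.environ.get("DATABASE_USER", ""),
        "PASSWORD": os.environ.get("DATABASE_PASSWORD", ""),
        "HOST": os.environ.get("DATABASE_HOST", "localhost"),
        "PORT": os.environ.get("DATABASE_PORT", "5432"),
    }
}

ACCOUNT_EMAIL_VERIFICATION = os.environ.get("ACCOUNT_EMAIL_VERIFICATION", "optional")
EMAIL_BACKEND = os.environ.get("EMAIL_BACKEND", "django.core.mail.backends.console.EmailBackend")
EMAIL_HOST = os.environ.get("EMAIL_HOST", "")
EMAIL_HOST_PASSWORD = os.environ.get("EMAIL_HOST_PASSWORD", "")
EMAIL_USE_TLS = os.environ.get("EMAIL_USE_TLS", "False") == "True"
EMAIL_PORT = int(os.environ.get("EMAIL_PORT", "587"))
```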
After setting up the environment, you can run the project using the Makefile provided in the project folder.
help:
@echo "Targets:"
@echo " make install" #install requirements
@echo " make makemigrations" #prepare migrations
@echo " make migrations" #migrate database
@echo " make createsuperuser" #create superuser
@echo " make run_server" #run the server
@echo " make lint" #lint the code using black
@echo " make test" #run the tests using Pytest
The boilerplate includes preconfigured packages that kick-start Django-Authentication; you just need to set the appropriate configuration.
Package | Usage |
---|---|
django-allauth | Integrated set of Django applications addressing authentication, registration, account management as well as 3rd party (social) account authentication. |
django-crispy-forms | django-crispy-forms provides you with a crispy filter and {% crispy %} tag that will let you control the rendering behavior of your Django forms in a very elegant and DRY way. |
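For reference, here is a minimal sketch of the settings these two packages typically require (the boilerplate may already ship equivalents, and the template pack below is an assumption):

```python
# settings.py (sketch) - typical django-allauth / django-crispy-forms configuration.
INSTALLED_APPS += [
    "django.contrib.sites",
    "allauth",
    "allauth.account",
    "crispy_forms",
]

AUTHENTICATION_BACKENDS = [
    "django.contrib.auth.backends.ModelBackend",             # default Django auth
    "allauth.account.auth_backends.AuthenticationBackend",   # allauth logins
]

SITE_ID = 1
LOGIN_REDIRECT_URL = "/"
CRISPY_TEMPLATE_PACK = "bootstrap4"  # assumption; match the template pack you actually use
```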
Download Details:
Author: yezz123
Source Code: https://github.com/yezz123/Django-Authentication
License: MIT License
The pandemic has brought a period of transformation across businesses globally, pushing data and analytics to the forefront of decision making. From enabling advanced data-driven operations to creating intelligent workflows, enterprise leaders have been looking to transform every part of their organisation.
SingleStore is one of the leading companies in the world, offering a unified database to facilitate fast analytics for organisations looking to embrace diverse data and accelerate their innovations. It provides an SQL platform to help companies aggregate, manage, and use the vast trove of data distributed across silos in multiple clouds and on-premise environments.
With the advancement of technology, many products have found a dire need to be showcased virtually, and to make the virtual experience as clear as the real thing, a technology called 3D is used. 3D technology allows a business to showcase its products virtually in three dimensions.
Want to develop an app that showcases anything in 3D?
WebClues Infotech, with its expertise in mobile app development, can seamlessly integrate a technology capable of changing an industry into a mobile app. After successfully delivering more than 950 projects, WebClues Infotech and its highly skilled development team are prepared to serve you.
Want to know more about our 3D design app development?
Visit us at https://www.webcluesinfotech.com/3d-design-services/
Share your requirements https://www.webcluesinfotech.com/contact-us/
View Portfolio https://www.webcluesinfotech.com/portfolio/