====== Photometric Depth Super-Resolution ======

<html><div style="text-align:center"></html>
  
//[[members:haefner|Bjoern Haefner]]<sup>1</sup>\ \ \ \ [[https://pengsongyou.github.io/|Songyou Peng]]<sup>2</sup>\ \ \ \ Alok Verma<sup>1</sup>\ \ \ \ [[https://sites.google.com/view/yvainqueau|Yvain Quéau]]<sup>3</sup>\ \ \ \ [[members:cremers|Daniel Cremers]]<sup>1</sup>\\
<sup>1</sup>Technical University of Munich\ \ \ \ <sup>2</sup>University of Illinois
at Urbana-Champaign\ \ \ \ <sup>3</sup>GREYC, UMR CNRS 6072//

{{:data:datasets:photometricdepthsr:photometricdepthsr_teaser.png?nolink&700|}}
  
This study explores the use of photometric techniques (shape-from-shading and uncalibrated photometric stereo) for upsampling the low-resolution depth map from an RGB-D sensor to the higher resolution of the companion RGB image. A single-shot variational approach is first put forward, which is effective as long as the target's reflectance is piecewise-constant. It is then shown that this dependency upon a specific reflectance model can be relaxed by focusing on a specific class of objects (e.g., faces) and delegating reflectance estimation to a deep neural network. Finally, a multi-shot strategy based on randomly varying lighting conditions is discussed. It requires no training or prior on the reflectance, but this comes at the price of a dedicated acquisition setup. Both quantitative and qualitative evaluations illustrate the effectiveness of the proposed methods on synthetic and real-world scenarios.
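For orientation, the single-shot approach can be sketched as a joint optimization over the super-resolved depth z, the albedo ρ and the lighting l, given the high-resolution RGB image I, the low-resolution depth z₀ and a downsampling operator K. The formula below is our illustrative summary of that idea, not a formula reproduced from this page; the weights μ and ν are placeholders:

<code latex>
\min_{z,\,\rho,\,l}\;
\underbrace{\bigl\| \rho\,\langle l,\, m(\nabla z) \rangle - I \bigr\|_2^2}_{\text{shading matches the RGB image}}
\;+\;\mu\,\underbrace{\bigl\| K z - z_0 \bigr\|_2^2}_{\text{fidelity to low-res depth}}
\;+\;\nu\,\underbrace{\bigl\| \nabla \rho \bigr\|_0}_{\text{piecewise-constant albedo prior}}
</code>

Here m(∇z) denotes the normal-dependent shading basis, and the ℓ⁰-norm on the albedo gradient is what makes the piecewise-constant reflectance assumption explicit.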

===== Code =====

The code that generated the data shown here is available on GitHub:\\
https://github.com/BjoernHaefner/DepthSRfromShading\\
https://github.com/pengsongyou/SRmeetsPS
  
===== Dataset =====
  * Intrinsic parameters (default factory calibration).
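Since the archives below ship MATLAB mat-files, here is a minimal loading sketch in Python (not part of the original page): it lists the stored arrays and back-projects a depth map through the pinhole intrinsics. The keys ''z0'' and ''K'' and the filename are assumptions about the .mat layout, so inspect the printed keys of your actual download first.

<code python>
# Minimal sketch (filename and keys are assumptions, not documented here):
# load a mat-file with SciPy and back-project depth via the intrinsics.
import numpy as np
import scipy.io

data = scipy.io.loadmat("rucksack_sf2_sfs.mat")           # hypothetical name
print(sorted(k for k in data if not k.startswith("__")))  # inspect contents

z = data["z0"]  # low-resolution depth map     -- assumed key
K = data["K"]   # 3x3 pinhole intrinsic matrix -- assumed key

# Back-project every valid depth pixel: p = z(u,v) * K^{-1} [u, v, 1]^T
v, u = np.nonzero(z > 0)
rays = np.linalg.inv(K) @ np.vstack([u, v, np.ones_like(u)])
points = (rays * z[v, u]).T  # (N, 3) point cloud
print(points.shape)
</code>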
  
==== Rucksack ====
  
{{data:datasets:photometricdepthsr:realsense_rucksack1_teaser.png?nolink&700|}}
  
Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_rucksack1_sf2_sfs.zip|rucksack_sf2_sfs.zip}} | {{data:datasets:photometricdepthsr:realsense_rucksack1_sf2_deep.zip|rucksack_sf2_deep.zip}} | {{data:datasets:photometricdepthsr:realsense_rucksack1_sf2_ups.zip|rucksack_sf2_ups.zip}}
Download (png-file of deep net albedo estimate):
  * {{data:datasets:photometricdepthsr:realsense_rucksack1_sf2_deep.png?linkonly|rucksack_albedo_deep.png}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_rucksack1_sf2_sfs_input_mesh.zip|rucksack_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_rucksack1_sf2_sfs_mesh.zip|rucksack_sf2_sfs_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_rucksack1_sf2_deep_mesh.zip|rucksack_sf2_deep_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_rucksack1_sf2_ups_mesh.zip|rucksack_sf2_ups_mesh.zip}}
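The mesh archives here and in the sections below contain standard Wavefront obj files, so any mesh viewer (e.g. MeshLab) opens them; for a quick programmatic sanity check, a dependency-free parser of the vertex and face records is enough. A minimal sketch follows (the filename is a placeholder for whatever the extracted archive contains):

<code python>
# Minimal obj reader: keeps only vertex ("v") and face ("f") records.
def load_obj(path):
    vertices, faces = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":    # vertex: v x y z
                vertices.append([float(c) for c in parts[1:4]])
            elif parts[0] == "f":  # face: f v1[/vt/vn] v2 v3 (1-based indices)
                faces.append([int(tok.split("/")[0]) - 1 for tok in parts[1:]])
    return vertices, faces

verts, faces = load_obj("rucksack_sf2_sfs_mesh.obj")  # placeholder filename
print(len(verts), "vertices,", len(faces), "faces")
</code>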
  
==== Android ====
  
{{data:datasets:photometricdepthsr:realsense_android_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_android_sf2_sfs.zip|android_sf2_sfs.zip}}

Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_android_sf2_sfs_input_mesh.zip|android_sf2_sfs_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_android_sf2_sfs_mesh.zip|android_sf2_sfs_mesh.zip}}
  
==== Basecap ====
  
{{data:datasets:photometricdepthsr:realsense_basecap_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_basecap_sf2_sfs.zip|basecap_sf2_sfs.zip}}

Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_basecap_sf2_sfs_input_mesh.zip|basecap_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_basecap_sf2_sfs_mesh.zip|basecap_sf2_sfs_mesh.zip}}
  
==== Minion ====
  
{{data:datasets:photometricdepthsr:realsense_minion_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_minion_sf2_sfs.zip|minion_sf2_sfs.zip}}

Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_minion_sf2_sfs_input_mesh.zip|minion_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_minion_sf2_sfs_mesh.zip|minion_sf2_sfs_mesh.zip}}
  
==== Blanket ====

{{data:datasets:photometricdepthsr:xtion_blanket_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:xtion_blanket_sf2_sfs.zip|blanket_sf2_sfs.zip}}
  * {{data:datasets:photometricdepthsr:xtion_blanket_sf4_sfs.zip|blanket_sf4_sfs.zip}}

Download (obj-files):
  * {{data:datasets:photometricdepthsr:xtion_blanket_sf2_sfs_input_mesh.zip|blanket_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_blanket_sf2_sfs_mesh.zip|blanket_sf2_sfs_mesh.zip}}
  * {{data:datasets:photometricdepthsr:xtion_blanket_sf4_sfs_input_mesh.zip|blanket_sf4_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_blanket_sf4_sfs_mesh.zip|blanket_sf4_sfs_mesh.zip}}
  
  
==== Clothes ====

{{data:datasets:photometricdepthsr:xtion_clothes_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:xtion_clothes_sf2_sfs.zip|clothes_sf2_sfs.zip}}
  * {{data:datasets:photometricdepthsr:xtion_clothes_sf4_sfs.zip|clothes_sf4_sfs.zip}}

Download (obj-files):
  * {{data:datasets:photometricdepthsr:xtion_clothes_sf2_sfs_input_mesh.zip|clothes_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_clothes_sf2_sfs_mesh.zip|clothes_sf2_sfs_mesh.zip}}
  * {{data:datasets:photometricdepthsr:xtion_clothes_sf4_sfs_input_mesh.zip|clothes_sf4_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_clothes_sf4_sfs_mesh.zip|clothes_sf4_sfs_mesh.zip}}
  
  
==== Monkey ====

{{data:datasets:photometricdepthsr:xtion_monkey_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:xtion_monkey_sf2_sfs.zip|monkey_sf2_sfs.zip}}
  * {{data:datasets:photometricdepthsr:xtion_monkey_sf4_sfs.zip|monkey_sf4_sfs.zip}}

Download (obj-files):
  * {{data:datasets:photometricdepthsr:xtion_monkey_sf2_sfs_input_mesh.zip|monkey_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_monkey_sf2_sfs_mesh.zip|monkey_sf2_sfs_mesh.zip}}
  * {{data:datasets:photometricdepthsr:xtion_monkey_sf4_sfs_input_mesh.zip|monkey_sf4_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_monkey_sf4_sfs_mesh.zip|monkey_sf4_sfs_mesh.zip}}
  
  
==== Wool ====

{{data:datasets:photometricdepthsr:xtion_wool_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:xtion_wool_sf2_sfs.zip|wool_sf2_sfs.zip}}
  * {{data:datasets:photometricdepthsr:xtion_wool_sf4_sfs.zip|wool_sf4_sfs.zip}}

Download (obj-files):
  * {{data:datasets:photometricdepthsr:xtion_wool_sf2_sfs_input_mesh.zip|wool_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_wool_sf2_sfs_mesh.zip|wool_sf2_sfs_mesh.zip}}
  * {{data:datasets:photometricdepthsr:xtion_wool_sf4_sfs_input_mesh.zip|wool_sf4_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_wool_sf4_sfs_mesh.zip|wool_sf4_sfs_mesh.zip}}
  
  
==== Face 1 ====

{{data:datasets:photometricdepthsr:realsense_face1_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_face1_sf2_sfs.zip|face1_sf2_sfs.zip}} | {{data:datasets:photometricdepthsr:realsense_face1_sf2_deep.zip|face1_sf2_deep.zip}} | {{data:datasets:photometricdepthsr:realsense_face1_sf2_ups.zip|face1_sf2_ups.zip}}
Download (png-file of deep net albedo estimate):
  * {{data:datasets:photometricdepthsr:realsense_face1_sf2_deep.png?linkonly|face1_albedo_deep.png}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_face1_sf2_deep_input_mesh.zip|face1_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_face1_sf2_sfs_mesh.zip|face1_sf2_sfs_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_face1_sf2_deep_mesh.zip|face1_sf2_deep_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_face1_sf2_ups_mesh.zip|face1_sf2_ups_mesh.zip}}
  
==== Face 2 ====

{{data:datasets:photometricdepthsr:realsense_face2_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_face2_sf2_deep.zip|face2_sf2_deep.zip}} | {{data:datasets:photometricdepthsr:realsense_face2_sf2_ups.zip|face2_sf2_ups.zip}}
Download (png-file of deep net albedo estimate):
  * {{data:datasets:photometricdepthsr:realsense_face2_sf2_deep.png?linkonly|face2_albedo_deep.png}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_face2_sf2_deep_input_mesh.zip|face2_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_face2_sf2_deep_mesh.zip|face2_sf2_deep_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_face2_sf2_ups_mesh.zip|face2_sf2_ups_mesh.zip}}

==== Face 3 ====
  
{{data:datasets:photometricdepthsr:realsense_face3_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_face3_sf2_deep.zip|face3_sf2_deep.zip}}
Download (png-file of deep net albedo estimate):
  * {{data:datasets:photometricdepthsr:realsense_face3_sf2_deep.png?linkonly|face3_albedo_deep.png}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_face3_sf2_deep_input_mesh.zip|face3_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_face3_sf2_deep_mesh.zip|face3_sf2_deep_mesh.zip}}

==== Face 4 ====
  
{{data:datasets:photometricdepthsr:realsense_face4_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_face4_sf2_deep.zip|face4_sf2_deep.zip}}
Download (png-file of deep net albedo estimate):
  * {{data:datasets:photometricdepthsr:realsense_face4_sf2_deep.png?linkonly|face4_albedo_deep.png}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_face4_sf2_deep_input_mesh.zip|face4_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_face4_sf2_deep_mesh.zip|face4_sf2_deep_mesh.zip}}

==== Face 5 ====
  
{{data:datasets:photometricdepthsr:realsense_face5_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_face5_sf2_deep.zip|face5_sf2_deep.zip}}
Download (png-file of deep net albedo estimate):
  * {{data:datasets:photometricdepthsr:realsense_face5_sf2_deep.png?linkonly|face5_albedo_deep.png}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_face5_sf2_deep_input_mesh.zip|face5_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_face5_sf2_deep_mesh.zip|face5_sf2_deep_mesh.zip}}

==== Face 6 ====
  
{{data:datasets:photometricdepthsr:realsense_face6_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_face6_sf2_deep.zip|face6_sf2_deep.zip}}
Download (png-file of deep net albedo estimate):
  * {{data:datasets:photometricdepthsr:realsense_face6_sf2_deep.png?linkonly|face6_albedo_deep.png}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_face6_sf2_deep_input_mesh.zip|face6_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_face6_sf2_deep_mesh.zip|face6_sf2_deep_mesh.zip}}

==== Tabletcase ====
  
{{data:datasets:photometricdepthsr:xtion_tabletcase_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:xtion_tabletcase_sf2_sfs.zip|tabletcase_sf2_sfs.zip}} | {{data:datasets:photometricdepthsr:xtion_tabletcase_sf2_deep.zip|tabletcase_sf2_deep.zip}} | {{data:datasets:photometricdepthsr:xtion_tabletcase_sf2_ups.zip|tabletcase_sf2_ups.zip}}
Download (png-file of deep net albedo estimate):
  * {{data:datasets:photometricdepthsr:xtion_tabletcase_sf2_deep.png?linkonly|tabletcase_albedo_deep.png}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:xtion_tabletcase_sf2_ups_input_mesh.zip|tabletcase_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_tabletcase_sf2_sfs_mesh.zip|tabletcase_sf2_sfs_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_tabletcase_sf2_deep_mesh.zip|tabletcase_sf2_deep_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_tabletcase_sf2_ups_mesh.zip|tabletcase_sf2_ups_mesh.zip}}

==== Shirt ====
  
{{data:datasets:photometricdepthsr:xtion_shirt_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:xtion_shirt_sf2_ups.zip|shirt_sf2_ups.zip}}
  * {{data:datasets:photometricdepthsr:xtion_shirt_sf4_ups.zip|shirt_sf4_ups.zip}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:xtion_shirt_sf2_ups_input_mesh.zip|shirt_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_shirt_sf2_ups_mesh.zip|shirt_sf2_ups_mesh.zip}}
  * {{data:datasets:photometricdepthsr:xtion_shirt_sf4_ups_input_mesh.zip|shirt_sf4_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_shirt_sf4_ups_mesh.zip|shirt_sf4_ups_mesh.zip}}

==== Backpack ====
  
{{data:datasets:photometricdepthsr:xtion_backpack_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:xtion_backpack_sf4_ups.zip|backpack_sf4_ups.zip}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:xtion_backpack_sf4_ups_input_mesh.zip|backpack_sf4_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_backpack_sf4_ups_mesh.zip|backpack_sf4_ups_mesh.zip}}

==== Ovenmitt ====
  
{{data:datasets:photometricdepthsr:xtion_ovenmitt_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:xtion_ovenmitt_sf2_ups.zip|ovenmitt_sf2_ups.zip}}
  * {{data:datasets:photometricdepthsr:xtion_ovenmitt_sf4_ups.zip|ovenmitt_sf4_ups.zip}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:xtion_ovenmitt_sf2_ups_input_mesh.zip|ovenmitt_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_ovenmitt_sf2_ups_mesh.zip|ovenmitt_sf2_ups_mesh.zip}}
  * {{data:datasets:photometricdepthsr:xtion_ovenmitt_sf4_ups_input_mesh.zip|ovenmitt_sf4_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_ovenmitt_sf4_ups_mesh.zip|ovenmitt_sf4_ups_mesh.zip}}

==== Hat ====
  
{{data:datasets:photometricdepthsr:realsense_hat_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:realsense_hat_sf2_ups.zip|hat_sf2_ups.zip}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:realsense_hat_sf2_ups_input_mesh.zip|hat_sf2_input_mesh.zip}} | {{data:datasets:photometricdepthsr:realsense_hat_sf2_ups_mesh.zip|hat_sf2_ups_mesh.zip}}

==== Vase ====
  
{{data:datasets:photometricdepthsr:xtion_vase_teaser.png?nolink&700|}}

Download (mat-files):
  * {{data:datasets:photometricdepthsr:xtion_vase_sf4_ups.zip|vase_sf4_ups.zip}}
Download (obj-files):
  * {{data:datasets:photometricdepthsr:xtion_vase_sf4_ups_input_mesh.zip|vase_sf4_input_mesh.zip}} | {{data:datasets:photometricdepthsr:xtion_vase_sf4_ups_mesh.zip|vase_sf4_ups_mesh.zip}}
  
  
==== License ====
  
Unless stated otherwise, all data in the Photometric Depth Super-Resolution Dataset is licensed under a [[https://creativecommons.org/licenses/by-nc-sa/4.0/|Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0)]].
  
===== Related Publications =====
