Learning Templates - Data

Data

We would like to thank the 3D Warehouse[1], Wang et al.[2], and the users of SketchUp for making these models available. Before you start, take a look at this pre-analyzed example to see what the analyzed results look like (also, see the format description).

Each dataset lists its shape downloads (Shapes), analysis results (Results), and content:

Seat
- Shapes: crawling scripts, instructions, ground truth
- Results: all 850Mb (energy), gt 13Mb (energy)
- Content: 7442 chairs, benches, ... [1][4]

Plane
- Results: all 330Mb (energy), gt 12Mb (energy)
- Content: 3114 planes, jets, ... [1][4]

Bike
- Results: all 55Mb (energy), gt 12Mb (energy)
- Content: 452 bicycles, motorcycles, ... [1][4]

Helicopter
- Results: all 54Mb (energy), gt 13Mb (energy)
- Content: 471 helicopters [1][4]

Fuzzy Corrs Dataset
- Shapes: OFF 37Mb, ground truth
- Results: all 23Mb (energy)
- Content: 111 chairs, 86 planes [1][3]

COSEG Dataset
- Shapes: COSEG website
- Results: all 92Mb, auto 97Mb
- Content: 20-400 objects of various classes [1][2]

Data Format

Getting the models and the ground truth.

Downloading the 3D Warehouse models

Unfortunately, we could not obtain permission from 3D Warehouse to re-distribute the models. The models are still publicly available, so we instead distribute scripts to obtain them. Essentially, the scripts (1) download the *.skp files and (2) convert them to *.off. The second step is complicated because we could not find any command-line converters for *.skp files. Our solution is to run a Ruby script from SketchUp that converts the files to the Google Earth / COLLADA format, and then to convert them further to *.off using OpenCOLLADA. Here are the scripts:
- Step 1: Download scripts
- Step 2: Download SketchUp
- Step 3: Download models by executing DownloadSkp.py. Note that the ids/*.txt files map between our *_export.tgz archives and 3D Warehouse model ids.
- Step 4: Convert SketchUp files to the Google Earth format with ConvertSkp2Kmz.py (this step requires SketchUp and Mac OS X).
- Step 5: Convert the Google Earth format to OFF with ConvertKmz2Off.py (this step requires 64-bit Mac OS X).
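The download-and-convert steps can be chained together; below is a minimal Python sketch of that pipeline. The assumption that each script runs from the current directory with no arguments is mine, so adjust the commands to match your setup:

```python
import subprocess

# Pipeline scripts, in order (steps 3-5 above). Assumed to take no
# arguments; this is a sketch, not part of the official scripts.
PIPELINE = ["DownloadSkp.py", "ConvertSkp2Kmz.py", "ConvertKmz2Off.py"]

def pipeline_commands(python="python"):
    """Build the commands that run the pipeline in order."""
    return [[python, script] for script in PIPELINE]

def run_pipeline():
    # Note: steps 4-5 only work on Mac OS X (step 4 also needs SketchUp).
    for cmd in pipeline_commands():
        subprocess.run(cmd, check=True)
```

Running each stage to completion before the next keeps partially downloaded or partially converted models out of the final *.off set.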

Feel free to contact me if you have trouble with the steps above.

Ground Truth

- _gt.zip contains the ground truth data: each 3D model has a corresponding text file. Only consider models whose first line says "Valid". That line is followed by the feature points, one point per line, ordered consistently across all models. Each line has the format [tid b1 b2 b3 x y z]: it starts with barycentric coordinates (a triangle id and 3 coordinates) and ends with the 3D position of the feature point. Note that (x y z) is the most reliable, since the triangulation might change depending on the conversion steps (e.g. we have no control over the SketchUp -> Google Earth conversion). A line with -1 for the triangle id indicates that there is no feature point.
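A minimal Python sketch for parsing one such ground-truth file (the exact whitespace handling is an assumption; the field layout follows the description above), keeping None for absent feature points:

```python
def parse_gt(text):
    """Parse one ground-truth file: a "Valid" header line, then one
    feature point per line in the format: tid b1 b2 b3 x y z."""
    lines = [ln for ln in text.strip().splitlines() if ln.strip()]
    if not lines or lines[0].strip() != "Valid":
        return None  # skip models that are not marked "Valid"
    points = []
    for ln in lines[1:]:
        tid, b1, b2, b3, x, y, z = ln.split()
        if int(tid) == -1:
            points.append(None)  # no feature point on this model
        else:
            # (x, y, z) is the most reliable; keep barycentric coords too
            points.append((int(tid), float(b1), float(b2), float(b3),
                           float(x), float(y), float(z)))
    return points
```

Because the points are ordered consistently across models, index i of the returned list refers to the same semantic feature point on every model.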

Results format

Archives in the second column contain analysis results, simplified to make parsing easier:
- _export.tgz contains the analysis of ALL models.
- _gt_export.tgz contains the analysis of the 100 models that have ground truth correspondences.
The content of these directories is described below.

_scores.txt lists the models along with analysis meta-data. Each line has the following entries: model_id fitting_energy computation_time template_id 0. You can pick the models with the smallest fitting_energy to keep only positive results (e.g. for an application).
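For example, a minimal Python sketch that picks the best-fitting models from a _scores.txt file (the field order follows the description above; choosing the k smallest energies, rather than a fixed threshold, is my assumption):

```python
def best_models(text, k=3):
    """Return the k model ids with the smallest fitting_energy from a
    _scores.txt file (model_id fitting_energy time template_id 0)."""
    rows = []
    for ln in text.strip().splitlines():
        model_id, energy, _time, _template_id, _zero = ln.split()
        rows.append((float(energy), model_id))
    rows.sort()  # smallest fitting energy first = best fits
    return [model_id for _, model_id in rows[:k]]
```

For an application you would typically tune k (or an energy cutoff) by inspecting a few of the selected models by eye.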

References

You might need to cite some of the papers below if you use the datasets.

[1] Google / Trimble 3D Warehouse
     http://sketchup.google.com/3dwarehouse/

[2] Active Co-Analysis of a Set of Shapes
     Yunhai Wang, Shmulik Asafi, Oliver van Kaick, Hao Zhang, Daniel Cohen-Or, and Baoquan Chen
     SIGGRAPH Asia 2012

[3] Exploring Collections of 3D Models using Fuzzy Correspondences
     Vladimir G. Kim, Wilmot Li, Niloy J. Mitra, Stephen DiVerdi, and Thomas Funkhouser
     SIGGRAPH 2012

[4] Learning Part-based Templates from Large Collections of 3D Shapes
     Vladimir G. Kim, Wilmot Li, Niloy J. Mitra, Siddhartha Chaudhuri, Stephen DiVerdi, and Thomas Funkhouser
     SIGGRAPH 2013