# Preparing assets for annotation
Regardless of what you are trying to annotate, the most important step is to make your assets available to the framework. To do so, you will need to:
1. Decide on a name for your asset, prepare your asset files, and put them somewhere that is web accessible.

   One way to do this is to put your assets under `server/static/assets/<yourassetname>`. Your assets are then accessible at `http://localhost:8010/resources/assets/`.

   Note that everything under `server/static/` is accessible via `http://localhost:8010/`. However, depending on what you have set up in your `server/app/routes/proxy-rules.js` file, other routes may take precedence. We use the `http://localhost:8010/resources/` paths to point directly to `server/static/`.

   A logical way to organize your assets is by asset id. For example, for the semantic segmentation task you will need an input scan (`.ply`), an over-segmentation (`.segs.json`), and optionally a screenshot to preview your asset (`.png`). They can be organized as follows:

   ```
   server/static/assets/<yourassetname>/<assetid>/<assetid>.ply
   server/static/assets/<yourassetname>/<assetid>/<assetid>.segs.json
   server/static/assets/<yourassetname>/<assetid>/<assetid>.png
   ```
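As a concrete sketch, the commands below create this layout for a hypothetical asset source `myscans` with a single asset id `scene0001` (both names are placeholders, and the `touch` calls stand in for copying your real files):

```shell
# Create the per-asset directory layout under server/static/assets
# ("myscans" and "scene0001" are hypothetical placeholder names)
ASSET=myscans
ID=scene0001
mkdir -p "server/static/assets/$ASSET/$ID"
# Stand-ins for your real scan, over-segmentation, and screenshot files
touch "server/static/assets/$ASSET/$ID/$ID.ply"
touch "server/static/assets/$ASSET/$ID/$ID.segs.json"
touch "server/static/assets/$ASSET/$ID/$ID.png"
```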
2. Prepare a metadata file describing your asset and an index listing your assets. The simplest index is a csv file that lists your asset ids and any per-asset information you would like to store. If you plan to have a lot of assets (> 10,000), you should use a database or Solr to index your assets.

   See `server/static/data/nyuv2/nyuv2.json` and `server/static/data/nyuv2/nyuv2.csv` for example json metadata and a csv listing the assets. Copy and modify as needed for your asset.

   Add your assets to `server/static/data/assets-extra.json` (used for web access of your assets) and `ssc/data/assets.json` (used for batch rendering - see step 3 below).
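For example, a minimal csv index for a couple of assets might look like the following (the `sceneType` column is an illustrative per-asset field, and the asset ids are placeholders):

```shell
# Write a minimal csv index for two hypothetical asset ids
# (columns beyond "id" are illustrative per-asset fields)
cat > myscans.csv <<'EOF'
id,sceneType
scene0001,bedroom
scene0002,kitchen
EOF
```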
3. Render screenshots of your assets using `ssc/render.js` or any other external tool. We provide two scripts for rendering (see ssc for more details). You will need to `cd ssc` and do `npm install` before the ssc scripts can be used.

   `ssc/render.js` fetches your assets remotely and requires having your assets registered (as specified in step 2). Once your assets are registered, you can:

   ```
   # Render specific asset
   NODE_BASE_URL=<path to your assets or server> ./render.js --source <yourassetname> --id <assetid>
   # Render all assets for given <yourassetname>
   NODE_BASE_URL=<path to your assets or server> ./render.js --source <yourassetname> --id all
   # Render assets with ids (one per line) specified in <ids_file>
   NODE_BASE_URL=<path to your assets or server> ./render.js --source <yourassetname> --ids_file <ids_file>
   ```

   `ssc/render-file.js` allows for rendering directly from a file (without registering your assets).

   If you skip this step, your assets won't have a preview picture (e.g. the view in step 5 will have a blank space).
4. Make your assets web accessible. You can either:

   - Symlink your data so it resides under `server/static`, or
   - Update `server/app/routes/proxy-rules.js` so your assets are available on the server. This is only necessary if your assets are externally hosted. You can skip this step if you already put your assets under `server/static`.
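The symlink option can be sketched as follows (here `/tmp/externaldata` stands in for wherever your data actually lives, and `externalscans` is a placeholder source name):

```shell
# Symlink externally stored data so it appears under server/static
# (/tmp/externaldata and "externalscans" are hypothetical placeholders)
mkdir -p /tmp/externaldata
mkdir -p server/static/assets
ln -sfn /tmp/externaldata server/static/assets/externalscans
```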
5. Create a view to browse your assets.

   An example view is at `server/proj/scannet/views/nyuv2-annotations.jade`. Run the `getexamples.sh` script in the repository root to download one example scene from NYUv2 and connect it to this view. The view can then be accessed at http://localhost:8010/scans/nyuv2.

   Note that the above URL path is routed through the parameters in the `server/proj/index.js` file. The route is hooked up in `server/proj/scannet/index.js`:

   ```js
   app.get('/nyuv2', function (req, res) {
     res.render('nyuv2-annotations', { baseUrl: config.baseUrl });
   });
   ```

   You can add a link to your view in the main page at `server/static/html/index.html`.
Here is an example of a basic asset metadata file. It assumes the assets are placed under `server/static/assets/yourassetname`, and it includes the special per-asset fields `segment-annotations-manual` and `scan-model-alignments` that are used to export the annotations.
```json
{
  "source": "yourassetname",
  "assetType": "scan",
  "rootPath": "${baseUrl}/assets/yourassetname",
  "screenShotPath": "${rootPath}/${id}/${id}.png",
  "hasThumbnail": false,
  "assetFields": ["segment-annotations-manual", "scan-model-alignments"],
  "formats": [
    { "format": "ply",
      "path": "${rootPath}/${id}/${id}.ply",
      "defaultUp": [ 0, 0, 1 ], "defaultFront": [ -1, 0, 0 ], "defaultUnit": 1,
      "materialSidedness": "Front",
      "useVertexColors": true,
      "computeNormals": true
    }
  ],
  "surfaces": {
    "format": "segmentGroups",
    "file": "${rootPath}/${id}/${id}.segs.json"
  },
  "segment-annotations-manual": {
    "format": "segmentGroups",
    "files": {
      "annIds": "${baseUrl}/scans/segment-annotations/list?itemId=${fullId}&$columns=id,workerId,data&format=json&condition[$in]=manual",
      "segmentGroups": "${baseUrl}/scans/segment-annotations/aggregated?annId=${annId}",
      "segments": "${rootPath}/${id}/${id}.segs.json",
      "annotatedAssetIds": "${baseUrl}/scans/segment-annotations/list?$columns=itemId&format=json&condition[$in]=manual"
    }
  },
  "scan-model-alignments": {
    "files": {
      "annIds": "${baseUrl}/annotations/list?itemId=${fullId}&$columns=id,workerId,data&format=json&type=scan-model-align&condition[$in]=manual",
      "annotatedAssetIds": "${baseUrl}/annotations/list?$columns=itemId&format=json&type=scan-model-align&condition[$in]=manual"
    }
  }
}
```
Here is an example asset metadata json file for mesh reconstructions of the NYUv2 dataset (`${baseUrl}` is the base url path for the scene toolkit webapp). This example is provided with the code (`server/static/data/scannet/nyuv2.json`).
```json
{
  "source": "nyuv2",
  "assetType": "scan",
  "rootPath": "${baseUrl}/nyuv2",
  "screenShotPath": "${rootPath}/${id}/${id}_vh_clean_2.png",
  "hasThumbnails": true,
  "formats": [
    { "format": "ply",
      "path": "${rootPath}/${id}/${id}_vh_clean_2.ply",
      "defaultUp": [ 0, 0, 1 ], "defaultFront": [ 0, -1, 0 ], "defaultUnit": 1,
      "useVertexColors": true,
      "computeNormals": true
    }
  ],
  "surfaces": {
    "format": "segmentGroups",
    "files": {
      "segmentGroups": "${rootPath}/${id}/${id}.aggregation.json",
      "segments": "${rootPath}/${id}/${id}_vh_clean_2.0.010000.segs.json"
    }
  },
  "segment-annotations-raw": {
    "format": "segmentGroups",
    "files": {
      "annIds": "${baseUrl}/segment-annotations/list?itemId=${fullId}&$columns=id,workerId,data&format=json&condition[$in]=nyu-new-batch1",
      "segmentGroups": "${baseUrl}/segment-annotations/aggregated?annId=${annId}",
      "segments": "${rootPath}/${id}/${id}_vh_clean_2.0.010000.segs.json"
    }
  },
  "segment-annotations-clean": {
    "format": "segmentGroups",
    "files": {
      "annIds": "${baseUrl}/segment-annotations/list?itemId=${fullId}&$columns=id,workerId,data&$clean=true&status=cleaned&format=json&condition[$in]=nyu-new-batch1",
      "segmentGroups": "${baseUrl}/segment-annotations/aggregated?annId=${annId}&$clean=true&status=cleaned",
      "segments": "${rootPath}/${id}/${id}_vh_clean_2.0.010000.segs.json"
    }
  }
}
```
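The `${...}` placeholders in the metadata files above are interpolated by the toolkit when assets are loaded. Conceptually, the expansion behaves like this simplified shell sketch (all values here are hypothetical examples; the toolkit performs the real substitution internally):

```shell
# Simplified stand-in for the toolkit's ${...} interpolation
# (all values here are hypothetical examples)
baseUrl="http://localhost:8010"
rootPath="$baseUrl/assets/yourassetname"
id="scene0001"
# "${rootPath}/${id}/${id}.ply" expands to:
echo "$rootPath/$id/$id.ply"
# -> http://localhost:8010/assets/yourassetname/scene0001/scene0001.ply
```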
Please see Scan Annotation Pipeline for details on how to prepare your scans for annotation.