Commit d573322

provide training scripts and user tutorial
1 parent 20c44c9 commit d573322

File tree

1 file changed: README.md (12 additions, 4 deletions)
```diff
@@ -9,11 +9,13 @@ Created by Kevin Lin, Huei-Fang Yang, and Chu-Song Chen at Academia Sinica, Taip
 
 We present a simple yet effective supervised deep hash approach that constructs binary hash codes from labeled data for large-scale image search. SSDH constructs hash functions as a latent layer in a deep network and the binary codes are learned by minimizing an objective function defined over classification error and other desirable hash codes properties. Compared to state-of-the-art results, SSDH achieves 26.30% (89.68% vs. 63.38%), 17.11% (89.00% vs. 71.89%) and 19.56% (31.28% vs. 11.72%) higher precisions averaged over a different number of top returned images for the CIFAR-10, NUS-WIDE, and SUN397 datasets, respectively.
 
+<img src="https://www.csie.ntu.edu.tw/~r01944012/ssdh_intro.png" width="800">
+
 The details can be found in the following [arXiv preprint.](http://arxiv.org/abs/1507.00101)
 
-Deep Learning workshop presentation slide [PDF](http://www.csie.ntu.edu.tw/~r01944012/deepworkshop-slide.pdf)
+Presentation slide can be found [here](http://www.csie.ntu.edu.tw/~r01944012/deepworkshop-slide.pdf)
+
 
-<img src="https://www.csie.ntu.edu.tw/~r01944012/ssdh_intro.png" width="800">
 
 ### Citing the deep hashing work
 
```
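The README paragraph in the hunk above describes SSDH's core idea: hash functions live in a latent layer of the network, and binary codes are obtained from that layer's activations. A minimal NumPy sketch of the binarize-and-rank retrieval step, assuming a sigmoid latent layer thresholded at 0.5; the array names and random values are illustrative, not the repo's actual code:

```python
import numpy as np

# Hypothetical sigmoid activations of a 48-unit latent layer for one query
# image and a small database (values in [0, 1]); in SSDH these would come
# from a forward pass through the trained network.
rng = np.random.default_rng(0)
query_act = rng.random(48)
db_act = rng.random((5, 48))

# Binarize by thresholding at 0.5, a common choice for sigmoid outputs.
query_code = (query_act > 0.5).astype(np.uint8)
db_codes = (db_act > 0.5).astype(np.uint8)

# Rank database items by Hamming distance to the query code.
hamming = (db_codes != query_code).sum(axis=1)
ranking = np.argsort(hamming)
print(ranking)
```

With 48-bit codes, the Hamming distance is an integer in [0, 48], so ranking reduces to a cheap integer sort over the database.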

```diff
@@ -51,7 +53,8 @@ Launch matlab and run `demo.m`. This demo will generate 48-bits binary codes for
 
 >> demo
 
-<img src="https://www.csie.ntu.edu.tw/~r01944012/ssdh_demo.png" width="600">
+
+<img src="https://www.csie.ntu.edu.tw/~r01944012/ssdh_demo.png" width="400">
 
 
 ## Retrieval evaluation on CIFAR10
```
```diff
@@ -73,7 +76,7 @@ Moreover, simply run the following commands to generate the `precision at k` cur
 You will reproduce the precision curves with respect to different number of top retrieved samples when the 48-bit hash codes are
 used in the evaluation.
 
-## Train your SSDH on CIFAR10
+## Train SSDH on CIFAR10
 
 Simply run the following command to train SSDH:
 
```
```diff
@@ -105,6 +108,11 @@ Launch matlab, run `demo.m` and enjoy!
 
 >> demo
 
+## Train SSDH on another dataset
+
+It should be easy to train the model using another dataset as long as that dataset has label annotations. You need to convert the dataset into leveldb/lmdb format using "create_imagenet.sh". We will show you how to do this.
+
+
 ## Contact
 
 Please feel free to leave suggestions or comments to Kevin Lin (kevinlin311.tw@iis.sinica.edu.tw), Huei-Fang Yang (hfyang@citi.sinica.edu.tw) or Chu-Song Chen (song@iis.sinica.edu.tw)
```
