## Information

name: The VGG16 model for generating salient object proposals.

caffemodel: VGG16_SOD_finetune.caffemodel

caffemodel_url: http://www.cs.bu.edu/groups/ivc/Subitizing/model/VGG16/VGG16_SOD_finetune.caffemodel

license: https://opensource.org/licenses/MIT

caffe_version: Tested on the official master branch of Caffe downloaded on 04/01/2016

gist_id: 509111f8a00a9ece2c3d5dde6a750129

## Description

This is a MultiBox-like model that outputs a confidence score for each of 100 prototype bounding boxes. It serves as the object proposal generation stage of our salient object detection system; a proposal subset optimization method is then applied to extract a compact set of detection windows. We recommend downloading the full implementation of the system from the link below:

J. Zhang, S. Sclaroff, Z. Lin, X. Shen, B. Price and R. Mech. "Unconstrained Salient Object Detection via Proposal Subset Optimization." CVPR, 2016.

https://github.com/jimmie33/SOD (It will automatically download the needed models and data.)

Please cite the paper if you use the model.

The input images should be zero-centered by mean pixel (rather than mean image) subtraction. Namely, the following BGR values should be subtracted: [103.939, 116.779, 123.68].
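A minimal preprocessing sketch, assuming OpenCV (whose `imread` returns BGR, matching the mean values above) and the 224x224 input size of the deploy definition below; the exact resizing/cropping strategy used in the SOD repo may differ.

```python
import cv2
import numpy as np

MEAN_BGR = np.array([103.939, 116.779, 123.68], dtype=np.float32)

def preprocess(image_path, size=224):
    img = cv2.imread(image_path).astype(np.float32)   # H x W x 3, BGR order
    img = cv2.resize(img, (size, size))               # match the 224x224 deploy input
    img -= MEAN_BGR                                    # zero-center by mean pixel
    return img.transpose(2, 0, 1)[np.newaxis, ...]     # 1 x 3 x 224 x 224 for Caffe
```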

The 100 prototype bounding boxes used by the model are listed below, one box per line as four comma-separated normalized coordinates; a usage sketch follows the list.

0.41796,0.65167,0.64246,0.95472
0.3343,0.46137,0.82036,0.95106
0.040568,0.046164,0.87636,0.97568
0.27258,0.18867,0.47082,0.43205
0.14697,0.24,0.5171,0.79616
0.30329,0.41203,0.44898,0.62463
0.038553,0.28013,0.93322,0.67902
0.03434,0.077796,0.3449,0.96912
0.10349,0.40725,0.41881,0.80828
0.52827,0.41234,0.6867,0.60466
0.031644,0.064574,0.54493,0.98843
0.45207,0.11801,0.64745,0.36468
0.54593,0.20367,0.96211,0.99138
0.23063,0.23718,0.80116,0.81111
0.20349,0.27242,0.94645,0.97627
0.69474,0.30513,0.95945,0.9444
0.25984,0.42934,0.5141,0.96691
0.185,0.21104,0.48173,0.97905
0.46158,0.27727,0.67703,0.79002
0.19755,0.55881,0.63461,0.92557
0.25139,0.39578,0.6522,0.86763
0.30562,0.090377,0.81135,0.83804
0.44745,0.055822,0.97111,0.96967
0.36313,0.24874,0.73608,0.60221
0.69864,0.18499,0.9596,0.72786
0.033313,0.20888,0.96436,0.98147
0.77936,0.38242,0.95547,0.65094
0.73332,0.55497,0.8999,0.8658
0.074735,0.21314,0.93785,0.83407
0.14624,0.064321,0.97277,0.98152
0.082366,0.38254,0.93694,0.86513
0.21839,0.3305,0.8452,0.65382
0.046851,0.051668,0.95956,0.81198
0.1967,0.082766,0.45653,0.87269
0.091678,0.41796,0.77068,0.97281
0.53436,0.07308,0.92615,0.40924
0.23387,0.25799,0.69223,0.97238
0.034175,0.28059,0.3143,0.96538
0.10243,0.10228,0.56561,0.5702
0.58061,0.25268,0.88168,0.6929
0.53738,0.57892,0.93384,0.94331
0.17352,0.070094,0.80364,0.97378
0.45619,0.069358,0.92488,0.63111
0.14673,0.28036,0.38692,0.91674
0.33599,0.19393,0.55294,0.60216
0.098932,0.12413,0.83883,0.87776
0.018555,0.048838,0.72876,0.9824
0.063828,0.44083,0.25903,0.73135
0.36082,0.29923,0.85825,0.82801
0.15586,0.10403,0.91016,0.44817
0.082706,0.13143,0.33282,0.4755
0.052176,0.075852,0.80403,0.71456
0.47733,0.087477,0.80232,0.96494
0.41816,0.23175,0.76373,0.95425
0.092942,0.080382,0.37181,0.649
0.31069,0.33452,0.50833,0.80896
0.51209,0.42625,0.74918,0.95147
0.66058,0.094443,0.92712,0.87877
0.39424,0.51579,0.5575,0.75225
0.23028,0.13803,0.94234,0.7261
0.05204,0.23645,0.73182,0.97077
0.17134,0.087041,0.61892,0.9664
0.63155,0.063139,0.97525,0.99025
0.49914,0.16354,0.94613,0.82275
0.29558,0.089199,0.75222,0.96918
0.054673,0.32927,0.68033,0.77491
0.018814,0.0424,0.98082,0.98372
0.18672,0.13179,0.67134,0.77178
0.50063,0.15203,0.75368,0.71922
0.66278,0.36968,0.82979,0.56797
0.39835,0.38929,0.60524,0.93794
0.3167,0.060664,0.69166,0.505
0.49714,0.38053,0.9536,0.95673
0.70992,0.15737,0.89101,0.41982
0.2628,0.054726,0.95841,0.97025
0.073771,0.54296,0.28988,0.94732
0.25751,0.37575,0.62275,0.6339
0.22242,0.66536,0.43276,0.89446
0.41524,0.38349,0.58725,0.67172
0.54438,0.17861,0.77942,0.53261
0.54273,0.39478,0.93078,0.73124
0.56772,0.54161,0.73304,0.81408
0.29422,0.25814,0.65849,0.74677
0.21801,0.49898,0.3772,0.73774
0.13146,0.2903,0.43448,0.64725
0.037938,0.41261,0.5101,0.98007
0.34773,0.1264,0.62886,0.79115
0.34937,0.061334,0.62648,0.95723
0.043071,0.19558,0.29064,0.76566
0.34733,0.21852,0.60098,0.95949
0.071752,0.10151,0.52253,0.36612
0.038672,0.088797,0.48427,0.81341
0.10533,0.3238,0.29018,0.55028
0.57825,0.27068,0.82832,0.89316
0.024269,0.23814,0.50095,0.95959
0.42563,0.41256,0.76515,0.74382
0.66399,0.39275,0.81499,0.73897
0.41154,0.33482,0.57357,0.51418
0.11641,0.59948,0.91404,0.92644
0.35923,0.19116,0.96186,0.98331
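A minimal end-to-end sketch, assuming pycaffe, that the deploy definition below is saved as `deploy.prototxt`, and that the 100 rows above are saved to a hypothetical file `prototype_boxes.txt`. The real system in the SOD repo additionally runs the proposal subset optimization step; here the prototype boxes are only ranked by their sigmoid confidence.

```python
import caffe
import cv2
import numpy as np

MEAN_BGR = np.array([103.939, 116.779, 123.68], dtype=np.float32)

def preprocess(path, size=224):
    # Resize, zero-center by the BGR mean pixel, and convert to 1 x 3 x H x W.
    img = cv2.resize(cv2.imread(path).astype(np.float32), (size, size))
    return (img - MEAN_BGR).transpose(2, 0, 1)[np.newaxis, ...]

net = caffe.Net('deploy.prototxt', 'VGG16_SOD_finetune.caffemodel', caffe.TEST)
boxes = np.loadtxt('prototype_boxes.txt', delimiter=',')    # 100 x 4, normalized

net.blobs['data'].reshape(1, 3, 224, 224)                   # deploy default batch is 10; use 1 here
net.blobs['data'].data[...] = preprocess('example.jpg')
scores = net.forward()['prob'][0]                            # 100 sigmoid confidences

order = np.argsort(-scores)                                  # most confident prototypes first
top_boxes = boxes[order[:10]]                                # e.g. keep the 10 best proposals
```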
name: "VGG_ILSVRC_16_layers"
input: "data"
input_dim: 10
input_dim: 3
input_dim: 224
input_dim: 224
layer {
bottom: "data"
top: "conv1_1"
name: "conv1_1"
type: "Convolution"
convolution_param {
num_output: 64
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv1_1"
top: "conv1_1"
name: "relu1_1"
type: "ReLU"
}
layer {
bottom: "conv1_1"
top: "conv1_2"
name: "conv1_2"
type: "Convolution"
convolution_param {
num_output: 64
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv1_2"
top: "conv1_2"
name: "relu1_2"
type: "ReLU"
}
layer {
bottom: "conv1_2"
top: "pool1"
name: "pool1"
type: "Pooling"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
bottom: "pool1"
top: "conv2_1"
name: "conv2_1"
type: "Convolution"
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv2_1"
top: "conv2_1"
name: "relu2_1"
type: "ReLU"
}
layer {
bottom: "conv2_1"
top: "conv2_2"
name: "conv2_2"
type: "Convolution"
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv2_2"
top: "conv2_2"
name: "relu2_2"
type: "ReLU"
}
layer {
bottom: "conv2_2"
top: "pool2"
name: "pool2"
type: "Pooling"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
bottom: "pool2"
top: "conv3_1"
name: "conv3_1"
type: "Convolution"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv3_1"
top: "conv3_1"
name: "relu3_1"
type: "ReLU"
}
layer {
bottom: "conv3_1"
top: "conv3_2"
name: "conv3_2"
type: "Convolution"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv3_2"
top: "conv3_2"
name: "relu3_2"
type: "ReLU"
}
layer {
bottom: "conv3_2"
top: "conv3_3"
name: "conv3_3"
type: "Convolution"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv3_3"
top: "conv3_3"
name: "relu3_3"
type: "ReLU"
}
layer {
bottom: "conv3_3"
top: "pool3"
name: "pool3"
type: "Pooling"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
bottom: "pool3"
top: "conv4_1"
name: "conv4_1"
type: "Convolution"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv4_1"
top: "conv4_1"
name: "relu4_1"
type: "ReLU"
}
layer {
bottom: "conv4_1"
top: "conv4_2"
name: "conv4_2"
type: "Convolution"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv4_2"
top: "conv4_2"
name: "relu4_2"
type: "ReLU"
}
layer {
bottom: "conv4_2"
top: "conv4_3"
name: "conv4_3"
type: "Convolution"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv4_3"
top: "conv4_3"
name: "relu4_3"
type: "ReLU"
}
layer {
bottom: "conv4_3"
top: "pool4"
name: "pool4"
type: "Pooling"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
bottom: "pool4"
top: "conv5_1"
name: "conv5_1"
type: "Convolution"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv5_1"
top: "conv5_1"
name: "relu5_1"
type: "ReLU"
}
layer {
bottom: "conv5_1"
top: "conv5_2"
name: "conv5_2"
type: "Convolution"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv5_2"
top: "conv5_2"
name: "relu5_2"
type: "ReLU"
}
layer {
bottom: "conv5_2"
top: "conv5_3"
name: "conv5_3"
type: "Convolution"
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
bottom: "conv5_3"
top: "conv5_3"
name: "relu5_3"
type: "ReLU"
}
layer {
bottom: "conv5_3"
top: "pool5"
name: "pool5"
type: "Pooling"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
bottom: "pool5"
top: "fc6"
name: "fc6"
type: "InnerProduct"
inner_product_param {
num_output: 4096
}
}
layer {
bottom: "fc6"
top: "fc6"
name: "relu6"
type: "ReLU"
}
layer {
bottom: "fc6"
top: "fc7"
name: "fc7"
type: "InnerProduct"
inner_product_param {
num_output: 4096
}
}
layer {
bottom: "fc7"
top: "fc7"
name: "relu7"
type: "ReLU"
}
layer {
bottom: "fc7"
top: "fc8"
name: "fc8-SOD100"
type: "InnerProduct"
inner_product_param {
num_output: 100
}
}
layer {
bottom: "fc8"
top: "prob"
name: "prob"
type: "Sigmoid"
}
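A quick sanity-check sketch, assuming pycaffe and that the definition above is saved as `deploy.prototxt`: walk the net and print each blob's shape. With the default batch of 10, the final "prob" blob should come out as 10 x 100, one sigmoid confidence per prototype box per image.

```python
import caffe

# Load the deploy definition above together with the fine-tuned weights.
net = caffe.Net('deploy.prototxt', 'VGG16_SOD_finetune.caffemodel', caffe.TEST)

# Print every blob shape, from "data" (10 x 3 x 224 x 224) down to "prob" (10 x 100).
for name, blob in net.blobs.items():
    print(name, blob.data.shape)
```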