Testing the Net class in Caffe

Reference: <深度學習21天實戰caffe>, p. 136. The Boost version used here needs to be boost_1_58_0.

Write the file net_demo.cpp and save it under /home/sf/demo:

#include <vector>
#include <iostream>
#include "caffe/net.hpp"
using namespace caffe;
using namespace std;

int main(void)
{
    // Build the net from deploy.prototxt in TEST phase
    // (only the topology is loaded, no trained weights).
    std::string proto("deploy.prototxt");
    Net<float> nn(proto, caffe::TEST);

    // Print the name of every blob registered in the net.
    vector<string> bn = nn.blob_names();
    for (size_t i = 0; i < bn.size(); i++)
    {
        cout << "Blob #" << i << " : " << bn[i] << endl;
    }

    return 0;
}
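The Net object exposes more than the blob names. As a variation on the demo above (a sketch I have not run here, assuming the standard Net<Dtype> accessors layer_names(), blobs() and Blob::shape_string() from the same Caffe headers), the loop can also print each blob's shape and then list the layers in forward order:

#include <vector>
#include <iostream>
#include "caffe/net.hpp"
using namespace caffe;
using namespace std;

int main(void)
{
    std::string proto("deploy.prototxt");
    Net<float> nn(proto, caffe::TEST);

    // Each blob name together with its shape, e.g. "data : 10 3 227 227 (1545870)".
    const vector<string>& bn = nn.blob_names();
    for (size_t i = 0; i < bn.size(); i++)
    {
        cout << "Blob #" << i << " : " << bn[i]
             << " : " << nn.blobs()[i]->shape_string() << endl;
    }

    // Layer names in forward order (more entries than blobs, because
    // in-place layers such as relu1 and drop6 reuse their bottom blob).
    const vector<string>& ln = nn.layer_names();
    for (size_t i = 0; i < ln.size(); i++)
    {
        cout << "Layer #" << i << " : " << ln[i] << endl;
    }

    return 0;
}

It compiles with the same g++ command used below.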

sf@ubuntu:~/demo$ g++ -o net_app ./net_demo.cpp -I $CAFFE_ROOT/include/ -D CPU_ONLY -I $CAFFE_ROOT/.build_release/src -L $CAFFE_ROOT/build/lib/ -lcaffe -lglog -lboost_system -lprotobuf 

sf@ubuntu:~/demo$ ./net_app 
WARNING: Logging before InitGoogleLogging() is written to STDERR
F0731 11:08:11.340579 15561 io.cpp:36] Check failed: fd != -1 (-1 vs. -1) File not found: deploy.prototxt
*** Check failure stack trace: ***

Aborted

This happens because deploy.prototxt was not copied into the demo directory; copy caffe/models/bvlc_reference_caffenet/deploy.prototxt into demo and rerun.
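As a small optional tweak (my own addition, not from the book), the prototxt path can be probed with std::ifstream before constructing the Net, so a missing file produces a plain error message instead of the glog CHECK failure and abort shown above. A minimal sketch:

#include <fstream>
#include <iostream>
#include "caffe/net.hpp"
using namespace caffe;

int main(void)
{
    std::string proto("deploy.prototxt");

    // Probe the file first so a missing prototxt gives a readable error
    // instead of the FATAL "File not found" abort.
    std::ifstream probe(proto.c_str());
    if (!probe.good())
    {
        std::cerr << "Cannot open " << proto
                  << "; copy it into the current directory first." << std::endl;
        return 1;
    }
    probe.close();

    Net<float> nn(proto, caffe::TEST);
    std::cout << "Loaded net with " << nn.blob_names().size()
              << " blobs." << std::endl;
    return 0;
}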

sf@ubuntu:~/demo$ ./net_app 
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0731 11:09:40.718194 15743 net.cpp:58] Initializing net from parameters: 
name: "CaffeNet"
state {
  phase: TEST
  level: 0
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8"
  inner_product_param {
    num_output: 1000
  }
}
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
I0731 11:09:40.723222 15743 layer_factory.hpp:77] Creating layer data
I0731 11:09:40.723254 15743 net.cpp:100] Creating Layer data
I0731 11:09:40.723271 15743 net.cpp:408] data -> data
I0731 11:09:40.723369 15743 net.cpp:150] Setting up data
I0731 11:09:40.723402 15743 net.cpp:157] Top shape: 10 3 227 227 (1545870)
I0731 11:09:40.723413 15743 net.cpp:165] Memory required for data: 6183480
I0731 11:09:40.723428 15743 layer_factory.hpp:77] Creating layer conv1
I0731 11:09:40.723449 15743 net.cpp:100] Creating Layer conv1
I0731 11:09:40.723464 15743 net.cpp:434] conv1 <- data
I0731 11:09:40.723480 15743 net.cpp:408] conv1 -> conv1
I0731 11:09:40.723665 15743 net.cpp:150] Setting up conv1
I0731 11:09:40.723687 15743 net.cpp:157] Top shape: 10 96 55 55 (2904000)
I0731 11:09:40.723698 15743 net.cpp:165] Memory required for data: 17799480
I0731 11:09:40.723726 15743 layer_factory.hpp:77] Creating layer relu1
I0731 11:09:40.723742 15743 net.cpp:100] Creating Layer relu1
I0731 11:09:40.723754 15743 net.cpp:434] relu1 <- conv1
I0731 11:09:40.723768 15743 net.cpp:395] relu1 -> conv1 (in-place)
I0731 11:09:40.723800 15743 net.cpp:150] Setting up relu1
I0731 11:09:40.723819 15743 net.cpp:157] Top shape: 10 96 55 55 (2904000)
I0731 11:09:40.723829 15743 net.cpp:165] Memory required for data: 29415480
I0731 11:09:40.723839 15743 layer_factory.hpp:77] Creating layer pool1
I0731 11:09:40.723853 15743 net.cpp:100] Creating Layer pool1
I0731 11:09:40.723865 15743 net.cpp:434] pool1 <- conv1
I0731 11:09:40.723877 15743 net.cpp:408] pool1 -> pool1
I0731 11:09:40.723906 15743 net.cpp:150] Setting up pool1
I0731 11:09:40.723922 15743 net.cpp:157] Top shape: 10 96 27 27 (699840)
I0731 11:09:40.723933 15743 net.cpp:165] Memory required for data: 32214840
I0731 11:09:40.723943 15743 layer_factory.hpp:77] Creating layer norm1
I0731 11:09:40.723958 15743 net.cpp:100] Creating Layer norm1
I0731 11:09:40.723970 15743 net.cpp:434] norm1 <- pool1
I0731 11:09:40.723984 15743 net.cpp:408] norm1 -> norm1
I0731 11:09:40.724002 15743 net.cpp:150] Setting up norm1
I0731 11:09:40.724016 15743 net.cpp:157] Top shape: 10 96 27 27 (699840)
I0731 11:09:40.724027 15743 net.cpp:165] Memory required for data: 35014200
I0731 11:09:40.724038 15743 layer_factory.hpp:77] Creating layer conv2
I0731 11:09:40.724052 15743 net.cpp:100] Creating Layer conv2
I0731 11:09:40.724064 15743 net.cpp:434] conv2 <- norm1
I0731 11:09:40.724077 15743 net.cpp:408] conv2 -> conv2
I0731 11:09:40.725082 15743 net.cpp:150] Setting up conv2
I0731 11:09:40.728862 15743 net.cpp:157] Top shape: 10 256 27 27 (1866240)
I0731 11:09:40.728886 15743 net.cpp:165] Memory required for data: 42479160
I0731 11:09:40.728912 15743 layer_factory.hpp:77] Creating layer relu2
I0731 11:09:40.728932 15743 net.cpp:100] Creating Layer relu2
I0731 11:09:40.728943 15743 net.cpp:434] relu2 <- conv2
I0731 11:09:40.728955 15743 net.cpp:395] relu2 -> conv2 (in-place)
I0731 11:09:40.728972 15743 net.cpp:150] Setting up relu2
I0731 11:09:40.729006 15743 net.cpp:157] Top shape: 10 256 27 27 (1866240)
I0731 11:09:40.729017 15743 net.cpp:165] Memory required for data: 49944120
I0731 11:09:40.732414 15743 layer_factory.hpp:77] Creating layer pool2
I0731 11:09:40.732501 15743 net.cpp:100] Creating Layer pool2
I0731 11:09:40.732564 15743 net.cpp:434] pool2 <- conv2
I0731 11:09:40.732614 15743 net.cpp:408] pool2 -> pool2
I0731 11:09:40.732689 15743 net.cpp:150] Setting up pool2
I0731 11:09:40.732744 15743 net.cpp:157] Top shape: 10 256 13 13 (432640)
I0731 11:09:40.732777 15743 net.cpp:165] Memory required for data: 51674680
I0731 11:09:40.732858 15743 layer_factory.hpp:77] Creating layer norm2
I0731 11:09:40.732908 15743 net.cpp:100] Creating Layer norm2
I0731 11:09:40.732947 15743 net.cpp:434] norm2 <- pool2
I0731 11:09:40.732982 15743 net.cpp:408] norm2 -> norm2
I0731 11:09:40.733024 15743 net.cpp:150] Setting up norm2
I0731 11:09:40.733062 15743 net.cpp:157] Top shape: 10 256 13 13 (432640)
I0731 11:09:40.733093 15743 net.cpp:165] Memory required for data: 53405240
I0731 11:09:40.733114 15743 layer_factory.hpp:77] Creating layer conv3
I0731 11:09:40.733140 15743 net.cpp:100] Creating Layer conv3
I0731 11:09:40.733162 15743 net.cpp:434] conv3 <- norm2
I0731 11:09:40.733186 15743 net.cpp:408] conv3 -> conv3
I0731 11:09:40.759501 15743 net.cpp:150] Setting up conv3
I0731 11:09:40.762949 15743 net.cpp:157] Top shape: 10 384 13 13 (648960)
I0731 11:09:40.763232 15743 net.cpp:165] Memory required for data: 56001080
I0731 11:09:40.763977 15743 layer_factory.hpp:77] Creating layer relu3
I0731 11:09:40.765498 15743 net.cpp:100] Creating Layer relu3
I0731 11:09:40.766160 15743 net.cpp:434] relu3 <- conv3
I0731 11:09:40.766926 15743 net.cpp:395] relu3 -> conv3 (in-place)
I0731 11:09:40.769448 15743 net.cpp:150] Setting up relu3
I0731 11:09:40.769975 15743 net.cpp:157] Top shape: 10 384 13 13 (648960)
I0731 11:09:40.770359 15743 net.cpp:165] Memory required for data: 58596920
I0731 11:09:40.770670 15743 layer_factory.hpp:77] Creating layer conv4
I0731 11:09:40.770933 15743 net.cpp:100] Creating Layer conv4
I0731 11:09:40.771000 15743 net.cpp:434] conv4 <- conv3
I0731 11:09:40.771065 15743 net.cpp:408] conv4 -> conv4
I0731 11:09:40.772445 15743 net.cpp:150] Setting up conv4
I0731 11:09:40.776958 15743 net.cpp:157] Top shape: 10 384 13 13 (648960)
I0731 11:09:40.777078 15743 net.cpp:165] Memory required for data: 61192760
I0731 11:09:40.777261 15743 layer_factory.hpp:77] Creating layer relu4
I0731 11:09:40.777587 15743 net.cpp:100] Creating Layer relu4
I0731 11:09:40.777786 15743 net.cpp:434] relu4 <- conv4
I0731 11:09:40.778097 15743 net.cpp:395] relu4 -> conv4 (in-place)
I0731 11:09:40.778388 15743 net.cpp:150] Setting up relu4
I0731 11:09:40.778656 15743 net.cpp:157] Top shape: 10 384 13 13 (648960)
I0731 11:09:40.778726 15743 net.cpp:165] Memory required for data: 63788600
I0731 11:09:40.778782 15743 layer_factory.hpp:77] Creating layer conv5
I0731 11:09:40.778859 15743 net.cpp:100] Creating Layer conv5
I0731 11:09:40.778914 15743 net.cpp:434] conv5 <- conv4
I0731 11:09:40.778980 15743 net.cpp:408] conv5 -> conv5
I0731 11:09:40.780534 15743 net.cpp:150] Setting up conv5
I0731 11:09:40.780606 15743 net.cpp:157] Top shape: 10 256 13 13 (432640)
I0731 11:09:40.780656 15743 net.cpp:165] Memory required for data: 65519160
I0731 11:09:40.780722 15743 layer_factory.hpp:77] Creating layer relu5
I0731 11:09:40.780781 15743 net.cpp:100] Creating Layer relu5
I0731 11:09:40.780867 15743 net.cpp:434] relu5 <- conv5
I0731 11:09:40.780927 15743 net.cpp:395] relu5 -> conv5 (in-place)
I0731 11:09:40.780988 15743 net.cpp:150] Setting up relu5
I0731 11:09:40.781044 15743 net.cpp:157] Top shape: 10 256 13 13 (432640)
I0731 11:09:40.781092 15743 net.cpp:165] Memory required for data: 67249720
I0731 11:09:40.781142 15743 layer_factory.hpp:77] Creating layer pool5
I0731 11:09:40.781195 15743 net.cpp:100] Creating Layer pool5
I0731 11:09:40.781245 15743 net.cpp:434] pool5 <- conv5
I0731 11:09:40.781297 15743 net.cpp:408] pool5 -> pool5
I0731 11:09:40.781374 15743 net.cpp:150] Setting up pool5
I0731 11:09:40.781430 15743 net.cpp:157] Top shape: 10 256 6 6 (92160)
I0731 11:09:40.781479 15743 net.cpp:165] Memory required for data: 67618360
I0731 11:09:40.781528 15743 layer_factory.hpp:77] Creating layer fc6
I0731 11:09:40.781584 15743 net.cpp:100] Creating Layer fc6
I0731 11:09:40.781635 15743 net.cpp:434] fc6 <- pool5
I0731 11:09:40.781692 15743 net.cpp:408] fc6 -> fc6
I0731 11:09:40.906853 15743 net.cpp:150] Setting up fc6
I0731 11:09:40.907140 15743 net.cpp:157] Top shape: 10 4096 (40960)
I0731 11:09:40.907177 15743 net.cpp:165] Memory required for data: 67782200
I0731 11:09:40.907218 15743 layer_factory.hpp:77] Creating layer relu6
I0731 11:09:40.907258 15743 net.cpp:100] Creating Layer relu6
I0731 11:09:40.907291 15743 net.cpp:434] relu6 <- fc6
I0731 11:09:40.907326 15743 net.cpp:395] relu6 -> fc6 (in-place)
I0731 11:09:40.907366 15743 net.cpp:150] Setting up relu6
I0731 11:09:40.907399 15743 net.cpp:157] Top shape: 10 4096 (40960)
I0731 11:09:40.907459 15743 net.cpp:165] Memory required for data: 67946040
I0731 11:09:40.907505 15743 layer_factory.hpp:77] Creating layer drop6
I0731 11:09:40.907555 15743 net.cpp:100] Creating Layer drop6
I0731 11:09:40.907605 15743 net.cpp:434] drop6 <- fc6
I0731 11:09:40.907651 15743 net.cpp:395] drop6 -> fc6 (in-place)
I0731 11:09:40.907694 15743 net.cpp:150] Setting up drop6
I0731 11:09:40.907728 15743 net.cpp:157] Top shape: 10 4096 (40960)
I0731 11:09:40.907758 15743 net.cpp:165] Memory required for data: 68109880
I0731 11:09:40.912937 15743 layer_factory.hpp:77] Creating layer fc7
I0731 11:09:40.913004 15743 net.cpp:100] Creating Layer fc7
I0731 11:09:40.913039 15743 net.cpp:434] fc7 <- fc6
I0731 11:09:40.913074 15743 net.cpp:408] fc7 -> fc7
I0731 11:09:40.995509 15743 net.cpp:150] Setting up fc7
I0731 11:09:41.001080 15743 net.cpp:157] Top shape: 10 4096 (40960)
I0731 11:09:41.001133 15743 net.cpp:165] Memory required for data: 68273720
I0731 11:09:41.001178 15743 layer_factory.hpp:77] Creating layer relu7
I0731 11:09:41.001219 15743 net.cpp:100] Creating Layer relu7
I0731 11:09:41.001253 15743 net.cpp:434] relu7 <- fc7
I0731 11:09:41.001296 15743 net.cpp:395] relu7 -> fc7 (in-place)
I0731 11:09:41.001341 15743 net.cpp:150] Setting up relu7
I0731 11:09:41.001376 15743 net.cpp:157] Top shape: 10 4096 (40960)
I0731 11:09:41.001407 15743 net.cpp:165] Memory required for data: 68437560
I0731 11:09:41.001438 15743 layer_factory.hpp:77] Creating layer drop7
I0731 11:09:41.001492 15743 net.cpp:100] Creating Layer drop7
I0731 11:09:41.001552 15743 net.cpp:434] drop7 <- fc7
I0731 11:09:41.001600 15743 net.cpp:395] drop7 -> fc7 (in-place)
I0731 11:09:41.001638 15743 net.cpp:150] Setting up drop7
I0731 11:09:41.001672 15743 net.cpp:157] Top shape: 10 4096 (40960)
I0731 11:09:41.001703 15743 net.cpp:165] Memory required for data: 68601400
I0731 11:09:41.001734 15743 layer_factory.hpp:77] Creating layer fc8
I0731 11:09:41.001766 15743 net.cpp:100] Creating Layer fc8
I0731 11:09:41.001797 15743 net.cpp:434] fc8 <- fc7
I0731 11:09:41.001830 15743 net.cpp:408] fc8 -> fc8
I0731 11:09:41.069466 15743 net.cpp:150] Setting up fc8
I0731 11:09:41.069607 15743 net.cpp:157] Top shape: 10 1000 (10000)
I0731 11:09:41.069641 15743 net.cpp:165] Memory required for data: 68641400
I0731 11:09:41.069681 15743 layer_factory.hpp:77] Creating layer prob
I0731 11:09:41.069721 15743 net.cpp:100] Creating Layer prob
I0731 11:09:41.069754 15743 net.cpp:434] prob <- fc8
I0731 11:09:41.069790 15743 net.cpp:408] prob -> prob
I0731 11:09:41.069839 15743 net.cpp:150] Setting up prob
I0731 11:09:41.069878 15743 net.cpp:157] Top shape: 10 1000 (10000)
I0731 11:09:41.069908 15743 net.cpp:165] Memory required for data: 68681400
I0731 11:09:41.069939 15743 net.cpp:228] prob does not need backward computation.
I0731 11:09:41.069975 15743 net.cpp:228] fc8 does not need backward computation.
I0731 11:09:41.070006 15743 net.cpp:228] drop7 does not need backward computation.
I0731 11:09:41.070106 15743 net.cpp:228] relu7 does not need backward computation.
I0731 11:09:41.070158 15743 net.cpp:228] fc7 does not need backward computation.
I0731 11:09:41.070205 15743 net.cpp:228] drop6 does not need backward computation.
I0731 11:09:41.070255 15743 net.cpp:228] relu6 does not need backward computation.
I0731 11:09:41.070304 15743 net.cpp:228] fc6 does not need backward computation.
I0731 11:09:41.070350 15743 net.cpp:228] pool5 does not need backward computation.
I0731 11:09:41.070394 15743 net.cpp:228] relu5 does not need backward computation.
I0731 11:09:41.070466 15743 net.cpp:228] conv5 does not need backward computation.
I0731 11:09:41.070504 15743 net.cpp:228] relu4 does not need backward computation.
I0731 11:09:41.070570 15743 net.cpp:228] conv4 does not need backward computation.
I0731 11:09:41.070617 15743 net.cpp:228] relu3 does not need backward computation.
I0731 11:09:41.070663 15743 net.cpp:228] conv3 does not need backward computation.
I0731 11:09:41.070710 15743 net.cpp:228] norm2 does not need backward computation.
I0731 11:09:41.070760 15743 net.cpp:228] pool2 does not need backward computation.
I0731 11:09:41.070808 15743 net.cpp:228] relu2 does not need backward computation.
I0731 11:09:41.070843 15743 net.cpp:228] conv2 does not need backward computation.
I0731 11:09:41.070874 15743 net.cpp:228] norm1 does not need backward computation.
I0731 11:09:41.070904 15743 net.cpp:228] pool1 does not need backward computation.
I0731 11:09:41.070935 15743 net.cpp:228] relu1 does not need backward computation.
I0731 11:09:41.070965 15743 net.cpp:228] conv1 does not need backward computation.
I0731 11:09:41.070996 15743 net.cpp:228] data does not need backward computation.
I0731 11:09:41.071058 15743 net.cpp:270] This network produces output prob
I0731 11:09:41.071111 15743 net.cpp:283] Network initialization done.
Blob #0 : data
Blob #1 : conv1
Blob #2 : pool1
Blob #3 : norm1
Blob #4 : conv2
Blob #5 : pool2
Blob #6 : norm2
Blob #7 : conv3
Blob #8 : conv4
Blob #9 : conv5
Blob #10 : pool5
Blob #11 : fc6
Blob #12 : fc7
Blob #13 : fc8
Blob #14 : prob
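The 15 blobs correspond to the distinct tops declared in deploy.prototxt plus the final prob output; in-place layers such as relu1 and drop6 reuse their bottom blob, so they add no new names. To inspect a single blob by name instead of walking the whole list, a hedged sketch using the standard accessors has_blob() and blob_by_name() (compiled with the same g++ line as before) could look like this; based on the log above, prob should report 10 1000 (10000):

#include <iostream>
#include "caffe/net.hpp"
using namespace caffe;

int main(void)
{
    Net<float> nn("deploy.prototxt", caffe::TEST);

    // Query one blob by name and print its dimensions.
    if (nn.has_blob("prob"))
    {
        std::cout << "prob shape: "
                  << nn.blob_by_name("prob")->shape_string() << std::endl;
    }
    return 0;
}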