Caffe configuration notes for running SSGF-for-HRRS-scene-classification-master

/home/sys1710/anaconda3/envs/python27/bin/python2.7 /home/sys1710/PycharmProjects/guo/SSGF-for-HRRS-scene-classification-master/1selftraining.py
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0322 21:10:03.595321 27878 upgrade_proto.cpp:69] Attempting to upgrade input file specified using deprecated input fields: deploy_svm_caffenet.prototxt
I0322 21:10:03.595340 27878 upgrade_proto.cpp:72] Successfully upgraded file specified using deprecated input fields.
W0322 21:10:03.595342 27878 upgrade_proto.cpp:74] Note that future Caffe releases will only support input layers and not input fields.
I0322 21:10:03.595369 27878 net.cpp:53] Initializing net from parameters: 
name: "landuse_on_CaffeNet"
state {
  phase: TEST
  level: 0
}
layer {
  name: "input"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 4096
    weight_filler {
      type: "gaussian"
      std: 0.005
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 4096
    weight_filler {
      type: "gaussian"
      std: 0.005
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
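The upgrade_proto messages at the top of the log refer to the old-style `input:` / `input_dim:` fields, which Caffe rewrites into the `Input` layer form shown above. For reference, the two equivalent ways of declaring the input in a deploy prototxt (a sketch; field names follow the Caffe proto definitions):

```
# Deprecated form (triggers the upgrade warning):
input: "data"
input_dim: 10
input_dim: 3
input_dim: 227
input_dim: 227

# Current form (what the upgraded net uses):
layer {
  name: "input"
  type: "Input"
  top: "data"
  input_param { shape { dim: 10 dim: 3 dim: 227 dim: 227 } }
}
```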
I0322 21:10:03.595485 27878 layer_factory.hpp:77] Creating layer input
I0322 21:10:03.595492 27878 net.cpp:86] Creating Layer input
I0322 21:10:03.595495 27878 net.cpp:382] input -> data
I0322 21:10:03.595507 27878 net.cpp:124] Setting up input
I0322 21:10:03.595510 27878 net.cpp:131] Top shape: 10 3 227 227 (1545870)
I0322 21:10:03.595515 27878 net.cpp:139] Memory required for data: 6183480
I0322 21:10:03.595516 27878 layer_factory.hpp:77] Creating layer conv1
I0322 21:10:03.595521 27878 net.cpp:86] Creating Layer conv1
I0322 21:10:03.595523 27878 net.cpp:408] conv1 <- data
I0322 21:10:03.595527 27878 net.cpp:382] conv1 -> conv1
I0322 21:10:03.595824 27878 net.cpp:124] Setting up conv1
I0322 21:10:03.595829 27878 net.cpp:131] Top shape: 10 96 55 55 (2904000)
I0322 21:10:03.595831 27878 net.cpp:139] Memory required for data: 17799480
I0322 21:10:03.595839 27878 layer_factory.hpp:77] Creating layer relu1
I0322 21:10:03.595842 27878 net.cpp:86] Creating Layer relu1
I0322 21:10:03.595844 27878 net.cpp:408] relu1 <- conv1
I0322 21:10:03.595847 27878 net.cpp:369] relu1 -> conv1 (in-place)
I0322 21:10:03.595852 27878 net.cpp:124] Setting up relu1
I0322 21:10:03.595854 27878 net.cpp:131] Top shape: 10 96 55 55 (2904000)
I0322 21:10:03.595856 27878 net.cpp:139] Memory required for data: 29415480
I0322 21:10:03.595857 27878 layer_factory.hpp:77] Creating layer pool1
I0322 21:10:03.595860 27878 net.cpp:86] Creating Layer pool1
I0322 21:10:03.595862 27878 net.cpp:408] pool1 <- conv1
I0322 21:10:03.595865 27878 net.cpp:382] pool1 -> pool1
I0322 21:10:03.595870 27878 net.cpp:124] Setting up pool1
I0322 21:10:03.595871 27878 net.cpp:131] Top shape: 10 96 27 27 (699840)
I0322 21:10:03.595873 27878 net.cpp:139] Memory required for data: 32214840
I0322 21:10:03.595875 27878 layer_factory.hpp:77] Creating layer norm1
I0322 21:10:03.595880 27878 net.cpp:86] Creating Layer norm1
I0322 21:10:03.595881 27878 net.cpp:408] norm1 <- pool1
I0322 21:10:03.595883 27878 net.cpp:382] norm1 -> norm1
I0322 21:10:03.595887 27878 net.cpp:124] Setting up norm1
I0322 21:10:03.595890 27878 net.cpp:131] Top shape: 10 96 27 27 (699840)
I0322 21:10:03.595891 27878 net.cpp:139] Memory required for data: 35014200
I0322 21:10:03.595893 27878 layer_factory.hpp:77] Creating layer conv2
I0322 21:10:03.595897 27878 net.cpp:86] Creating Layer conv2
I0322 21:10:03.595899 27878 net.cpp:408] conv2 <- norm1
I0322 21:10:03.595902 27878 net.cpp:382] conv2 -> conv2
I0322 21:10:03.598060 27878 net.cpp:124] Setting up conv2
I0322 21:10:03.598064 27878 net.cpp:131] Top shape: 10 256 27 27 (1866240)
I0322 21:10:03.598067 27878 net.cpp:139] Memory required for data: 42479160
I0322 21:10:03.598071 27878 layer_factory.hpp:77] Creating layer relu2
I0322 21:10:03.598075 27878 net.cpp:86] Creating Layer relu2
I0322 21:10:03.598093 27878 net.cpp:408] relu2 <- conv2
I0322 21:10:03.598095 27878 net.cpp:369] relu2 -> conv2 (in-place)
I0322 21:10:03.598098 27878 net.cpp:124] Setting up relu2
I0322 21:10:03.598100 27878 net.cpp:131] Top shape: 10 256 27 27 (1866240)
I0322 21:10:03.598103 27878 net.cpp:139] Memory required for data: 49944120
I0322 21:10:03.598104 27878 layer_factory.hpp:77] Creating layer pool2
I0322 21:10:03.598107 27878 net.cpp:86] Creating Layer pool2
I0322 21:10:03.598109 27878 net.cpp:408] pool2 <- conv2
I0322 21:10:03.598111 27878 net.cpp:382] pool2 -> pool2
I0322 21:10:03.598115 27878 net.cpp:124] Setting up pool2
I0322 21:10:03.598117 27878 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0322 21:10:03.598119 27878 net.cpp:139] Memory required for data: 51674680
I0322 21:10:03.598121 27878 layer_factory.hpp:77] Creating layer norm2
I0322 21:10:03.598125 27878 net.cpp:86] Creating Layer norm2
I0322 21:10:03.598147 27878 net.cpp:408] norm2 <- pool2
I0322 21:10:03.598150 27878 net.cpp:382] norm2 -> norm2
I0322 21:10:03.598153 27878 net.cpp:124] Setting up norm2
I0322 21:10:03.598155 27878 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0322 21:10:03.598157 27878 net.cpp:139] Memory required for data: 53405240
I0322 21:10:03.598160 27878 layer_factory.hpp:77] Creating layer conv3
I0322 21:10:03.598165 27878 net.cpp:86] Creating Layer conv3
I0322 21:10:03.598165 27878 net.cpp:408] conv3 <- norm2
I0322 21:10:03.598168 27878 net.cpp:382] conv3 -> conv3
I0322 21:10:03.604506 27878 net.cpp:124] Setting up conv3
I0322 21:10:03.604516 27878 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0322 21:10:03.604521 27878 net.cpp:139] Memory required for data: 56001080
I0322 21:10:03.604526 27878 layer_factory.hpp:77] Creating layer relu3
I0322 21:10:03.604532 27878 net.cpp:86] Creating Layer relu3
I0322 21:10:03.604534 27878 net.cpp:408] relu3 <- conv3
I0322 21:10:03.604537 27878 net.cpp:369] relu3 -> conv3 (in-place)
I0322 21:10:03.604542 27878 net.cpp:124] Setting up relu3
I0322 21:10:03.604543 27878 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0322 21:10:03.604545 27878 net.cpp:139] Memory required for data: 58596920
I0322 21:10:03.604547 27878 layer_factory.hpp:77] Creating layer conv4
I0322 21:10:03.604553 27878 net.cpp:86] Creating Layer conv4
I0322 21:10:03.604555 27878 net.cpp:408] conv4 <- conv3
I0322 21:10:03.604559 27878 net.cpp:382] conv4 -> conv4
I0322 21:10:03.609283 27878 net.cpp:124] Setting up conv4
I0322 21:10:03.609290 27878 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0322 21:10:03.609293 27878 net.cpp:139] Memory required for data: 61192760
I0322 21:10:03.609297 27878 layer_factory.hpp:77] Creating layer relu4
I0322 21:10:03.609302 27878 net.cpp:86] Creating Layer relu4
I0322 21:10:03.609303 27878 net.cpp:408] relu4 <- conv4
I0322 21:10:03.609308 27878 net.cpp:369] relu4 -> conv4 (in-place)
I0322 21:10:03.609310 27878 net.cpp:124] Setting up relu4
I0322 21:10:03.609311 27878 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0322 21:10:03.609314 27878 net.cpp:139] Memory required for data: 63788600
I0322 21:10:03.609315 27878 layer_factory.hpp:77] Creating layer conv5
I0322 21:10:03.609321 27878 net.cpp:86] Creating Layer conv5
I0322 21:10:03.609323 27878 net.cpp:408] conv5 <- conv4
I0322 21:10:03.609328 27878 net.cpp:382] conv5 -> conv5
I0322 21:10:03.612459 27878 net.cpp:124] Setting up conv5
I0322 21:10:03.612465 27878 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0322 21:10:03.612468 27878 net.cpp:139] Memory required for data: 65519160
I0322 21:10:03.612474 27878 layer_factory.hpp:77] Creating layer relu5
I0322 21:10:03.612478 27878 net.cpp:86] Creating Layer relu5
I0322 21:10:03.612480 27878 net.cpp:408] relu5 <- conv5
I0322 21:10:03.612483 27878 net.cpp:369] relu5 -> conv5 (in-place)
I0322 21:10:03.612486 27878 net.cpp:124] Setting up relu5
I0322 21:10:03.612488 27878 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0322 21:10:03.612490 27878 net.cpp:139] Memory required for data: 67249720
I0322 21:10:03.612493 27878 layer_factory.hpp:77] Creating layer pool5
I0322 21:10:03.612496 27878 net.cpp:86] Creating Layer pool5
I0322 21:10:03.612498 27878 net.cpp:408] pool5 <- conv5
I0322 21:10:03.612501 27878 net.cpp:382] pool5 -> pool5
I0322 21:10:03.612506 27878 net.cpp:124] Setting up pool5
I0322 21:10:03.612509 27878 net.cpp:131] Top shape: 10 256 6 6 (92160)
I0322 21:10:03.612511 27878 net.cpp:139] Memory required for data: 67618360
I0322 21:10:03.612512 27878 layer_factory.hpp:77] Creating layer fc6
I0322 21:10:03.612522 27878 net.cpp:86] Creating Layer fc6
I0322 21:10:03.612524 27878 net.cpp:408] fc6 <- pool5
I0322 21:10:03.612527 27878 net.cpp:382] fc6 -> fc6
I0322 21:10:03.877022 27878 net.cpp:124] Setting up fc6
I0322 21:10:03.877033 27878 net.cpp:131] Top shape: 10 4096 (40960)
I0322 21:10:03.877038 27878 net.cpp:139] Memory required for data: 67782200
I0322 21:10:03.877043 27878 layer_factory.hpp:77] Creating layer relu6
I0322 21:10:03.877049 27878 net.cpp:86] Creating Layer relu6
I0322 21:10:03.877053 27878 net.cpp:408] relu6 <- fc6
I0322 21:10:03.877056 27878 net.cpp:369] relu6 -> fc6 (in-place)
I0322 21:10:03.877060 27878 net.cpp:124] Setting up relu6
I0322 21:10:03.877061 27878 net.cpp:131] Top shape: 10 4096 (40960)
I0322 21:10:03.877064 27878 net.cpp:139] Memory required for data: 67946040
I0322 21:10:03.877065 27878 layer_factory.hpp:77] Creating layer drop6
I0322 21:10:03.877069 27878 net.cpp:86] Creating Layer drop6
I0322 21:10:03.877071 27878 net.cpp:408] drop6 <- fc6
I0322 21:10:03.877074 27878 net.cpp:369] drop6 -> fc6 (in-place)
I0322 21:10:03.877077 27878 net.cpp:124] Setting up drop6
I0322 21:10:03.877079 27878 net.cpp:131] Top shape: 10 4096 (40960)
I0322 21:10:03.877081 27878 net.cpp:139] Memory required for data: 68109880
I0322 21:10:03.877082 27878 layer_factory.hpp:77] Creating layer fc7
I0322 21:10:03.877087 27878 net.cpp:86] Creating Layer fc7
I0322 21:10:03.877089 27878 net.cpp:408] fc7 <- fc6
I0322 21:10:03.877092 27878 net.cpp:382] fc7 -> fc7
I0322 21:10:03.994551 27878 net.cpp:124] Setting up fc7
I0322 21:10:03.994563 27878 net.cpp:131] Top shape: 10 4096 (40960)
I0322 21:10:03.994568 27878 net.cpp:139] Memory required for data: 68273720
I0322 21:10:03.994573 27878 layer_factory.hpp:77] Creating layer relu7
I0322 21:10:03.994580 27878 net.cpp:86] Creating Layer relu7
I0322 21:10:03.994582 27878 net.cpp:408] relu7 <- fc7
I0322 21:10:03.994586 27878 net.cpp:369] relu7 -> fc7 (in-place)
I0322 21:10:03.994590 27878 net.cpp:124] Setting up relu7
I0322 21:10:03.994591 27878 net.cpp:131] Top shape: 10 4096 (40960)
I0322 21:10:03.994593 27878 net.cpp:139] Memory required for data: 68437560
I0322 21:10:03.994596 27878 layer_factory.hpp:77] Creating layer drop7
I0322 21:10:03.994598 27878 net.cpp:86] Creating Layer drop7
I0322 21:10:03.994601 27878 net.cpp:408] drop7 <- fc7
I0322 21:10:03.994603 27878 net.cpp:369] drop7 -> fc7 (in-place)
I0322 21:10:03.994606 27878 net.cpp:124] Setting up drop7
I0322 21:10:03.994607 27878 net.cpp:131] Top shape: 10 4096 (40960)
I0322 21:10:03.994609 27878 net.cpp:139] Memory required for data: 68601400
I0322 21:10:03.994612 27878 net.cpp:202] drop7 does not need backward computation.
I0322 21:10:03.994614 27878 net.cpp:202] relu7 does not need backward computation.
I0322 21:10:03.994616 27878 net.cpp:202] fc7 does not need backward computation.
I0322 21:10:03.994618 27878 net.cpp:202] drop6 does not need backward computation.
I0322 21:10:03.994621 27878 net.cpp:202] relu6 does not need backward computation.
I0322 21:10:03.994622 27878 net.cpp:202] fc6 does not need backward computation.
I0322 21:10:03.994624 27878 net.cpp:202] pool5 does not need backward computation.
I0322 21:10:03.994626 27878 net.cpp:202] relu5 does not need backward computation.
I0322 21:10:03.994628 27878 net.cpp:202] conv5 does not need backward computation.
I0322 21:10:03.994630 27878 net.cpp:202] relu4 does not need backward computation.
I0322 21:10:03.994632 27878 net.cpp:202] conv4 does not need backward computation.
I0322 21:10:03.994634 27878 net.cpp:202] relu3 does not need backward computation.
I0322 21:10:03.994637 27878 net.cpp:202] conv3 does not need backward computation.
I0322 21:10:03.994638 27878 net.cpp:202] norm2 does not need backward computation.
I0322 21:10:03.994642 27878 net.cpp:202] pool2 does not need backward computation.
I0322 21:10:03.994643 27878 net.cpp:202] relu2 does not need backward computation.
I0322 21:10:03.994645 27878 net.cpp:202] conv2 does not need backward computation.
I0322 21:10:03.994647 27878 net.cpp:202] norm1 does not need backward computation.
I0322 21:10:03.994649 27878 net.cpp:202] pool1 does not need backward computation.
I0322 21:10:03.994652 27878 net.cpp:202] relu1 does not need backward computation.
I0322 21:10:03.994653 27878 net.cpp:202] conv1 does not need backward computation.
I0322 21:10:03.994655 27878 net.cpp:202] input does not need backward computation.
I0322 21:10:03.994657 27878 net.cpp:244] This network produces output fc7
I0322 21:10:03.994665 27878 net.cpp:257] Network initialization done.
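The "Top shape" lines in the log follow the standard output-size formulas: convolutions in Caffe use floor((in + 2*pad - kernel) / stride) + 1, while pooling layers use ceil. A quick sketch that reproduces the spatial sizes reported above (55, 27, 27, 13, 13, 6):

```python
import math

def conv_out(n, k, stride=1, pad=0):
    # Convolution output size (floor mode, as Caffe uses for conv layers).
    return (n + 2 * pad - k) // stride + 1

def pool_out(n, k, stride=1, pad=0):
    # Pooling output size (Caffe pooling uses ceil mode).
    return int(math.ceil(float(n + 2 * pad - k) / stride)) + 1

h = conv_out(227, 11, stride=4)   # conv1 -> 55
h = pool_out(h, 3, stride=2)      # pool1 -> 27
h = conv_out(h, 5, pad=2)         # conv2 -> 27 (pool2 -> 13 follows)
h = pool_out(h, 3, stride=2)      # pool2 -> 13
h = conv_out(h, 3, pad=1)         # conv3/conv4/conv5 -> 13
h = pool_out(h, 3, stride=2)      # pool5 -> 6
print(h * h * 256)                # 9216 inputs into fc6 (6*6*256)
```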
Traceback (most recent call last):
  File "/home/sys1710/PycharmProjects/guo/SSGF-for-HRRS-scene-classification-master/1selftraining.py", line 333, in <module>
    main(sys.argv)
  File "/home/sys1710/PycharmProjects/guo/SSGF-for-HRRS-scene-classification-master/1selftraining.py", line 328, in main
    go(num)
  File "/home/sys1710/PycharmProjects/guo/SSGF-for-HRRS-scene-classification-master/1selftraining.py", line 264, in go
    svmTrain(num)
  File "/home/sys1710/PycharmProjects/guo/SSGF-for-HRRS-scene-classification-master/1selftraining.py", line 193, in svmTrain
    image_dims=(imagesWidth, imagesHeight))
  File "/usr/local/caffe/python/caffe/classifier.py", line 26, in __init__
    caffe.Net.__init__(self, model_file, caffe.TEST, weights=pretrained_file)
RuntimeError: Could not open file /usr/local/caffe/models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel

The first run needs network access to download the pretrained model. If this part errors out, it is better to remove it and download manually: run download_model_binary.py from the scripts directory, then check whether bvlc_reference_caffenet.caffemodel exists under models/bvlc_reference_caffenet. The check-and-download snippet (caffe_root is assumed to point at your Caffe installation root; adjust as needed):
import os
caffe_root = '../'  # assumed path to the Caffe root; adjust for your setup
if os.path.isfile(caffe_root + 'models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel'):
    print('CaffeNet found.')
else:
    print('Downloading pre-trained CaffeNet model...')
    os.system('../scripts/download_model_binary.py ../models/bvlc_reference_caffenet')

If the script cannot download it, fetch the model manually from:

http://dl.caffe.berkeleyvision.org/bvlc_reference_caffenet.caffemodel

Traceback (most recent call last):
  File "/home/sys1710/PycharmProjects/guo/SSGF-for-HRRS-scene-classification-master/1selftraining.py", line 338, in <module>
    main(sys.argv)
  File "/home/sys1710/PycharmProjects/guo/SSGF-for-HRRS-scene-classification-master/1selftraining.py", line 333, in main
    go(num)
  File "/home/sys1710/PycharmProjects/guo/SSGF-for-HRRS-scene-classification-master/1selftraining.py", line 268, in go
    svmTrain(num)
  File "/home/sys1710/PycharmProjects/guo/SSGF-for-HRRS-scene-classification-master/1selftraining.py", line 223, in svmTrain
    a=len(prediction)
UnboundLocalError: local variable 'prediction' referenced before assignment
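This UnboundLocalError means `prediction` is only assigned on a code path that never ran, typically because the call that produces it sits inside a loop or try block that was skipped (for example, when the list of images to classify came back empty after the model-loading failure above). A minimal sketch of the failure pattern and the usual fix (function names are illustrative, not taken from the repository):

```python
def svm_train_broken(batches):
    for batch in batches:          # if batches is empty, the body never runs
        prediction = classify(batch)
    return len(prediction)         # UnboundLocalError when batches == []

def svm_train_fixed(batches):
    prediction = []                # initialize before the conditional path
    for batch in batches:
        prediction.extend(classify(batch))
    return len(prediction)

def classify(batch):               # stand-in for net.predict()
    return [0] * len(batch)

print(svm_train_fixed([]))         # 0 instead of a crash
```

So the real fix is to make sure the feature-extraction step actually produced output (i.e. the caffemodel loaded correctly) before `prediction` is used, or to initialize it defensively as above.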
