A Brief Look at Applying FCN to Point Cloud (PCL) Data

Welcome to my personal blog: zengzeyu.com

Preface


FCN (fully convolutional networks) were introduced in the image semantic segmentation paper Fully Convolutional Networks for Semantic Segmentation. That paper was the first to apply a CNN architecture to image semantic segmentation with outstanding results, which earned it a Best Paper Honorable Mention at CVPR 2015. Image semantic segmentation, in short, means classifying every pixel of an image. The figure below shows an example, where different pixel colors denote different classes:

Original image
Semantic segmentation

The UCB FCN source code is on GitHub: https://github.com/shelhamer/fcn.berkeleyvision.org
The source tree contains four kinds of network models: nyud-fcn, pascalcontext-fcn, siftflow-fcn, and voc-fcn. Depending on which convolutional layers the features are taken from, each kind is further divided into 3 to 4 network variants.
In practice, your own data type and format will not necessarily match or resemble the (image) data interface provided by the voc-fcn-alexnet source code. For example, the data fed into the network in this article is point cloud data (.pcd) scanned by a LiDAR sensor. So how do we actually make this work? Let's go through it step by step.

1. LiDAR Data Conversion


1.1 Introduction to LiDAR Point Cloud Data

First, the data format produced by a mechanically rotating LiDAR. A motor inside the LiDAR spins at a fixed angular velocity, and the laser emitters and receivers mounted on it measure the distance from the sensor to obstacles. Take the 16-beam RS-LiDAR-16 made by RoboSense (速騰聚創) as an example: it completes ten 360° rotations per second (10 Hz), each rotation scanning the surrounding scene, and each of the 16 beams yields 2016 points per revolution, stored in a .pcd file. It helps to think of the .pcd file as a 2-D color image (such as a .png): the 16 beams are the image height and the 2016 points are the image width, for a total of 16 x 2016 = 32256 pixels. Each point carries [x, y, z, intensity], analogous to the RGB channels of a 2-D image: each value is one channel.

RoboSense RS-LiDAR-16

Illustration of a LiDAR point cloud
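To make the "point cloud as image" analogy concrete, here is a minimal sketch of the frame layout described above. The parsing of an actual .pcd file is omitted (it would go through PCL or a pcd reader); only the shape and channel order are taken from this section:

import numpy as np

ROWS, COLS = 16, 2016   # 16 beams, 2016 points per revolution
frame = np.zeros((ROWS, COLS, 4), dtype=np.float32)  # channels: [x, y, z, intensity]

# frame[r, c] holds the point seen by beam r at azimuth step c, exactly like
# pixel (r, c) of a color image, with [x, y, z, intensity] in place of RGB.
print(frame.shape[0] * frame.shape[1])  # 32256 points per frame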

1.2 Point Cloud Preprocessing

The point cloud is preprocessed according to its feature attributes. After preprocessing, each point carries the features [row, column, height, range, mark]: its row index, column index, height, range, and class attribute. Here height equals z; range is computed as sqrt(x^2 + y^2 + z^2); and mark is the class assigned to the point by a decision tree: obstacle point (obstacle mark) or ground point (ground mark). The mark plays the same role as a ground-truth image: it is the reference label used to compute the loss when training and evaluating predictions. In effect, we are hand-crafting extra feature channels to make classification and prediction easier.
The preprocessed data is converted with the cnpy library into binary .npy files, which NumPy can read directly; for a cnpy tutorial see my cnpy usage notes and the official examples. Each point cloud frame is stored as one .npy file. Keep the naming as simple as possible so the files are easy to read and sort; this article simply uses sequence numbers as file names: [0.npy, 1.npy, ..., n.npy].
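As a rough illustration of this preprocessing step, the sketch below builds the [mark, row, col, height, range] array for one frame and saves it with np.save (the Python-side equivalent of what cnpy writes from C++). The threshold used for mark here is only a placeholder; the actual labels come from the decision tree described above:

import numpy as np

def preprocess(frame):
    # frame: (16, 2016, 4) array with channels [x, y, z, intensity]
    rows, cols = frame.shape[:2]
    out = np.zeros((rows, cols, 5), dtype=np.float32)  # [mark, row, col, height, range]
    r_idx, c_idx = np.meshgrid(np.arange(rows), np.arange(cols), indexing='ij')
    x, y, z = frame[..., 0], frame[..., 1], frame[..., 2]
    out[..., 1] = r_idx                        # row index
    out[..., 2] = c_idx                        # column index
    out[..., 3] = z                            # height equals z
    out[..., 4] = np.sqrt(x**2 + y**2 + z**2)  # range
    out[..., 0] = (z > 0.2).astype(np.float32) # placeholder mark (real labels: decision tree)
    return out

frame = np.zeros((16, 2016, 4), dtype=np.float32)  # stand-in for a parsed .pcd frame
np.save('0.npy', preprocess(frame))                # one file per frame: 0.npy, 1.npy, ...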

2. Point Cloud Classification with FCN-AlexNet


The FCN-AlexNet point cloud classification project contains:
- 5 Python files: pcl_data_layer.py, net.py, solve.py, surgery.py, score.py
- 3 prototxt files: train.prototxt, val.prototxt, solver.prototxt
- 1 Caffe model file: fcn-alexnet-pascal.caffemodel

2.1 The FCN-AlexNet Data Layer

The file is named pcl_data_layer.py and contains the PCLSegDataLayer class (a Python layer subclassing caffe.Layer):

import caffe
import numpy as np
import random
import os

class PCLSegDataLayer(caffe.Layer):

    def setup(self, bottom, top):

        params = eval(self.param_str)
        self.npy_dir = params["pcl_dir"]
        self.list_name = list()

        # two tops: data and label
        if len(top) != 2:
            raise Exception("Need to define two tops: data and label.")
        # data layers have no bottoms
        if len(bottom) != 0:
            raise Exception("Do not define a bottom.")

        self.load_file_name( self.npy_dir, self.list_name )
        self.idx = 0


    def reshape(self, bottom, top):
        self.data, self.label = self.load_file( self.idx )
        # reshape tops to fit (leading 1 is for batch dimension)
        top[0].reshape(1, *self.data.shape)
        top[1].reshape(1, *self.label.shape)


    def forward(self, bottom, top):
        # assign output
        top[0].data[...] = self.data
        top[1].data[...] = self.label

        # pick next input
        self.idx += 1
        if self.idx == len(self.list_name):
            self.idx = 0

    def backward(self, top, propagate_down, bottom):
        pass

    def load_file(self, idx):
        in_file = np.load(self.list_name[idx])  # channels: [mark, row, col, height, range]
        in_data = in_file[:, :, 1:-1]           # keep row, col, height (drops range)
        in_data = in_data.transpose((2, 0, 1))  # HWC -> CHW, as Caffe expects
        in_label = in_file[:, :, 0]             # the mark channel is the label
        return in_data, in_label

    def load_file_name(self, path, list_name):
        # recursively collect every file path under `path`
        for file in os.listdir(path):
            file_path = os.path.join(path, file)
            if os.path.isdir(file_path):
                self.load_file_name(file_path, list_name)
            else:
                list_name.append(file_path)

  • setup(): parameters set up when the layer is constructed
  • reshape(): reshapes the model's input blobs to fit the current sample
  • forward(): the forward pass; since this is the data input layer, it simply outputs the raw point cloud data and its class labels
  • backward(): the backward pass; a data layer has no backward pass, so this is a no-op
  • load_file_name(): collects the .npy files in the given directory and stores their paths in a list
  • load_file(): loads a single .npy file, splits its attributes by their stored order, and outputs data and label (see the sanity check below)
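A quick sanity check of the slicing in load_file(), using a synthetic frame instead of a real .npy file; note that the slice 1:-1 drops the last channel (range), so only the three channels row, col, and height reach the network. The resulting shapes match the "Top shape" lines in the training log of section 3:

import numpy as np

# synthetic 16 x 2016 x 5 frame with channels [mark, row, col, height, range]
frame = np.random.rand(16, 2016, 5).astype(np.float32)

data = frame[:, :, 1:-1].transpose((2, 0, 1))  # (3, 16, 2016): row, col, height
label = frame[:, :, 0]                         # (16, 2016): the mark channel

assert data.shape == (3, 16, 2016)  # log: "Top shape: 1 3 16 2016"
assert label.shape == (16, 2016)    # log: "Top shape: 1 16 2016"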

2.2 The FCN-AlexNet Model Definition (net.py)

net.py generates the train.prototxt and val.prototxt files, which define the structure of the whole model and the parameters of every layer. Of course, the network structure can also be exported from the officially trained fcn-alexnet-pascal.caffemodel, or generated yourself with net.py; to keep things simple, this article derives the network structure from fcn-alexnet-pascal.caffemodel.

import sys
sys.path.append('../../python')

import caffe
from caffe import layers as L, params as P
from caffe.coord_map import crop

def conv_relu(bottom, ks, nout, stride=1, pad=0, group=1):
    conv = L.Convolution(bottom, kernel_size=ks, stride=stride,
                                num_output=nout, pad=pad, group=group)
    return conv, L.ReLU(conv, in_place=True)

def max_pool(bottom, ks, stride=1):
    return L.Pooling(bottom, pool=P.Pooling.MAX, kernel_size=ks, stride=stride)

def fcn(split):
    # `split` is kept from the original voc-fcn code but unused here:
    # train and val both read the same .npy directory below
    n = caffe.NetSpec()
    pydata_params = dict()
    pydata_params['pcl_dir'] = '../fcn_data_gen/data/npy' #.npy files path
    pylayer = 'PCLSegDataLayer'
    n.data, n.label = L.Python(module='pcl_data_layer', layer=pylayer,
            ntop=2, param_str=str(pydata_params))

    # the base net
    n.conv1, n.relu1 = conv_relu(n.data, 11, 96, stride=4, pad=100)
    n.pool1 = max_pool(n.relu1, 3, stride=2)
    n.norm1 = L.LRN(n.pool1, local_size=5, alpha=1e-4, beta=0.75)
    n.conv2, n.relu2 = conv_relu(n.norm1, 5, 256, pad=2, group=2)
    n.pool2 = max_pool(n.relu2, 3, stride=2)
    n.norm2 = L.LRN(n.pool2, local_size=5, alpha=1e-4, beta=0.75)
    n.conv3, n.relu3 = conv_relu(n.norm2, 3, 384, pad=1)
    n.conv4, n.relu4 = conv_relu(n.relu3, 3, 384, pad=1, group=2)
    n.conv5, n.relu5 = conv_relu(n.relu4, 3, 256, pad=1, group=2)
    n.pool5 = max_pool(n.relu5, 3, stride=2)

    # fully conv
    n.fc6, n.relu6 = conv_relu(n.pool5, 6, 4096)
    n.drop6 = L.Dropout(n.relu6, dropout_ratio=0.5, in_place=True)
    n.fc7, n.relu7 = conv_relu(n.drop6, 1, 4096)
    n.drop7 = L.Dropout(n.relu7, dropout_ratio=0.5, in_place=True)

    n.score_fr = L.Convolution(n.drop7, num_output=21, kernel_size=1, pad=0,
        param=[dict(lr_mult=1, decay_mult=1), dict(lr_mult=2, decay_mult=0)])
    n.upscore = L.Deconvolution(n.score_fr,
        convolution_param=dict(num_output=21, kernel_size=63, stride=32,
            bias_term=False),
        param=[dict(lr_mult=0)])
    n.score = crop(n.upscore, n.data)
    n.loss = L.SoftmaxWithLoss(n.score, n.label,
            loss_param=dict(normalize=True, ignore_label=255))

    return n.to_proto()

def make_net():
    with open('train.prototxt', 'w') as f:
        f.write(str(fcn('train')))

    with open('val.prototxt', 'w') as f:
        f.write(str(fcn('seg11valid')))

if __name__ == '__main__':
    make_net()

  • conv_relu(): defines the input parameters of a convolution + ReLU pair
  • max_pool(): defines the input parameters of a max-pooling layer
  • fcn(): defines the network structure of the model

A closer look at the fcn() model structure

It helps to read this section alongside the original AlexNet paper, ImageNet Classification with Deep Convolutional Neural Networks, and to compare against a diagram of the AlexNet architecture; this makes the meaning of each parameter much easier to follow.

(1). Data input layer
    n = caffe.NetSpec()
    pydata_params = dict()
    pydata_params['pcl_dir'] = '../fcn_data_gen/data/npy' #.npy files path
    pylayer = 'PCLSegDataLayer'
    n.data, n.label = L.Python(module='pcl_data_layer', layer=pylayer,
            ntop=2, param_str=str(pydata_params))

This locates the PCLSegDataLayer class in pcl_data_layer.py and uses that class's data handling as the model's data input layer.

(2). First convolutional layer
    n.conv1, n.relu1 = conv_relu(n.data, 11, 96, stride=4, pad=100)
    n.pool1 = max_pool(n.relu1, 3, stride=2)
    n.norm1 = L.LRN(n.pool1, local_size=5, alpha=1e-4, beta=0.75)

For a detailed explanation of why pad=100 is used, see: FCN學習:Semantic Segmentation (reference 1).
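To see concretely what pad=100 buys us here, the sketch below traces the 16 x 2016 input through the layers defined in this section, assuming Caffe's usual floor rounding for convolution and ceil rounding for pooling. The numbers reproduce the "Top shape" lines in the training log of section 3; with no padding, the 16-row input would shrink to nothing before reaching fc6 (whose kernel is 6):

import math

def conv_out(n, k, s=1, p=0):  # Caffe Convolution rounds down
    return (n + 2 * p - k) // s + 1

def pool_out(n, k, s):         # Caffe Pooling rounds up
    return int(math.ceil((n - k) / float(s))) + 1

def trace(n):
    n = conv_out(n, 11, 4, 100)               # conv1 (pad=100)
    n = pool_out(n, 3, 2)                     # pool1
    n = pool_out(conv_out(n, 5, 1, 2), 3, 2)  # conv2 + pool2
    for _ in range(3):                        # conv3, conv4, conv5
        n = conv_out(n, 3, 1, 1)
    n = pool_out(n, 3, 2)                     # pool5
    return conv_out(n, 6)                     # fc6 (kernel 6, no pad)

print(trace(16), trace(2016))  # -> 1 64, matching "fc6 ... Top shape: 1 4096 1 64"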

(3). Second convolutional layer
    n.conv2, n.relu2 = conv_relu(n.norm1, 5, 256, pad=2, group=2)
    n.pool2 = max_pool(n.relu2, 3, stride=2)
    n.norm2 = L.LRN(n.pool2, local_size=5, alpha=1e-4, beta=0.75)
(4). Third convolutional layer
    n.conv3, n.relu3 = conv_relu(n.norm2, 3, 384, pad=1)
(5). Fourth convolutional layer
    n.conv4, n.relu4 = conv_relu(n.relu3, 3, 384, pad=1, group=2)
(6). Fifth convolutional layer
    n.conv5, n.relu5 = conv_relu(n.relu4, 3, 256, pad=1, group=2)
    n.pool5 = max_pool(n.relu5, 3, stride=2)
(7). Sixth (fully connected) layer
    n.fc6, n.relu6 = conv_relu(n.pool5, 6, 4096)
    n.drop6 = L.Dropout(n.relu6, dropout_ratio=0.5, in_place=True)
(8). Seventh (fully connected) layer
    n.fc7, n.relu7 = conv_relu(n.drop6, 1, 4096)
    n.drop7 = L.Dropout(n.relu7, dropout_ratio=0.5, in_place=True)
(9). Eighth (fully connected) layer: scoring, upsampling, and loss
    n.score_fr = L.Convolution(n.drop7, num_output=21, kernel_size=1, pad=0,
        param=[dict(lr_mult=1, decay_mult=1), dict(lr_mult=2, decay_mult=0)])
    n.upscore = L.Deconvolution(n.score_fr,
        convolution_param=dict(num_output=21, kernel_size=63, stride=32,
            bias_term=False),
        param=[dict(lr_mult=0)])
    n.score = crop(n.upscore, n.data)
    n.loss = L.SoftmaxWithLoss(n.score, n.label,
            loss_param=dict(normalize=True, ignore_label=255))
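The arithmetic of this last block can be checked by hand: a Deconvolution layer turns n inputs into (n - 1) * stride + kernel_size outputs, and the Crop layer then cuts the upsampled map back to the input size (caffe.coord_map.crop computes the required offset, 18 in this net). These values again match the training log of section 3:

def deconv_out(n, k, s):
    return (n - 1) * s + k

h, w = 1, 64                                         # spatial size of score_fr
print(deconv_out(h, 63, 32), deconv_out(w, 63, 32))  # 63 2079 ("upscore ... 1 21 63 2079")
# crop(n.upscore, n.data) then restores 16 x 2016 ("score ... 1 21 16 2016")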

2.3 The FCN-AlexNet Solver (solve.py)

solve.py is the entry point of the whole model: it ties the individual files together, takes in external parameters, runs the solver, and outputs the results. The solver.prototxt file that solve.py loads defines the solver configuration.

import caffe
import surgery, score

import numpy as np
import os
import sys

try:
    import setproctitle
    setproctitle.setproctitle(os.path.basename(os.getcwd()))
except:
    pass

weights = '../ilsvrc-nets/fcn-alexnet-pascal.caffemodel'

# init
# caffe.set_device(int(sys.argv[1]))
# caffe.set_mode_gpu()

solver = caffe.SGDSolver('solver.prototxt')
solver.net.copy_from(weights)

# surgeries
interp_layers = [k for k in solver.net.params.keys() if 'up' in k]
surgery.interp(solver.net, interp_layers)

# scoring
val = np.loadtxt('../data/pascal/seg11valid.txt', dtype=str)

for _ in range(25):
    solver.step(4000)
    score.seg_tests(solver, False, val, layer='score')

  • weights = '../ilsvrc-nets/fcn-alexnet-pascal.caffemodel': imports the pretrained model; you can paste net.prototxt into [Netscope] to visualize the network structure
  • # caffe.set_device(int(sys.argv[1]))
    # caffe.set_mode_gpu(): enables training on the GPU; my machine raises errors in GPU mode, so it is left disabled
  • solver = caffe.SGDSolver('solver.prototxt')
    solver.net.copy_from(weights): sets up the solver model
  • # surgeries: (to be filled in; a sketch of what this step does follows below)
  • # scoring: (to be filled in)
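As a partial fill-in for the "surgeries" step: surgery.interp() initializes every layer whose name contains 'up' (here, the upscore deconvolution) with a fixed bilinear upsampling kernel, which lr_mult=0 in net.py then keeps frozen during training. A sketch of the bilinear kernel, following the upsample_filt() helper in the FCN repo's surgery.py:

import numpy as np

def upsample_filt(size):
    # 2-D bilinear interpolation kernel with side length `size`
    factor = (size + 1) // 2
    center = factor - 1 if size % 2 == 1 else factor - 0.5
    og = np.ogrid[:size, :size]
    return (1 - abs(og[0] - center) / factor) * \
           (1 - abs(og[1] - center) / factor)

print(upsample_filt(63).shape)  # (63, 63): the kernel size of 'upscore' in this net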

3. Point Cloud Segmentation Experiment Results


pydev debugger: process 9249 is connecting

Connected to pydev debugger (build 173.4301.16)
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0313 11:41:39.369604  9249 solver.cpp:45] Initializing solver from parameters: 
train_net: "train.prototxt"
test_net: "val.prototxt"
test_iter: 736
test_interval: 999999999
base_lr: 0.0001
display: 20
max_iter: 100000
lr_policy: "fixed"
momentum: 0.9
weight_decay: 0.0005
snapshot: 4000
snapshot_prefix: "snapshot/train"
test_initialization: false
average_loss: 20
iter_size: 20
I0313 11:41:39.369671  9249 solver.cpp:92] Creating training net from train_net file: train.prototxt
I0313 11:41:39.370101  9249 net.cpp:51] Initializing net from parameters: 
state {
  phase: TRAIN
}
layer {
  name: "data"
  type: "Python"
  top: "data"
  top: "label"
  python_param {
    module: "pcl_data_layer"
    layer: "PCLSegDataLayer"
    param_str: "{\'pcl_dir\': \'/home/zzy/CLionProjects/ROS_Project/ws/src/fcn_data_gen/data/npy\'}"
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    pad: 100
    kernel_size: 11
    group: 1
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
    stride: 1
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 1
    stride: 1
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
    stride: 1
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
    stride: 1
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "Convolution"
  bottom: "pool5"
  top: "fc6"
  convolution_param {
    num_output: 4096
    pad: 0
    kernel_size: 6
    group: 1
    stride: 1
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "Convolution"
  bottom: "fc6"
  top: "fc7"
  convolution_param {
    num_output: 4096
    pad: 0
    kernel_size: 1
    group: 1
    stride: 1
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "score_fr"
  type: "Convolution"
  bottom: "fc7"
  top: "score_fr"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 21
    pad: 0
    kernel_size: 1
  }
}
layer {
  name: "upscore"
  type: "Deconvolution"
  bottom: "score_fr"
  top: "upscore"
  param {
    lr_mult: 0
  }
  convolution_param {
    num_output: 21
    bias_term: false
    kernel_size: 63
    stride: 32
  }
}
layer {
  name: "score"
  type: "Crop"
  bottom: "upscore"
  bottom: "data"
  top: "score"
  crop_param {
    axis: 2
    offset: 18
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "score"
  bottom: "label"
  top: "loss"
  loss_param {
    ignore_label: 255
    normalize: true
  }
}
I0313 11:41:39.370163  9249 layer_factory.hpp:77] Creating layer data
I0313 11:41:39.370743  9249 net.cpp:84] Creating Layer data
I0313 11:41:39.370753  9249 net.cpp:380] data -> data
I0313 11:41:39.370759  9249 net.cpp:380] data -> label
I0313 11:41:39.372340  9249 net.cpp:122] Setting up data
I0313 11:41:39.372354  9249 net.cpp:129] Top shape: 1 3 16 2016 (96768)
I0313 11:41:39.372357  9249 net.cpp:129] Top shape: 1 16 2016 (32256)
I0313 11:41:39.372360  9249 net.cpp:137] Memory required for data: 516096
I0313 11:41:39.372364  9249 layer_factory.hpp:77] Creating layer data_data_0_split
I0313 11:41:39.372370  9249 net.cpp:84] Creating Layer data_data_0_split
I0313 11:41:39.372372  9249 net.cpp:406] data_data_0_split <- data
I0313 11:41:39.372376  9249 net.cpp:380] data_data_0_split -> data_data_0_split_0
I0313 11:41:39.372382  9249 net.cpp:380] data_data_0_split -> data_data_0_split_1
I0313 11:41:39.372387  9249 net.cpp:122] Setting up data_data_0_split
I0313 11:41:39.372391  9249 net.cpp:129] Top shape: 1 3 16 2016 (96768)
I0313 11:41:39.372395  9249 net.cpp:129] Top shape: 1 3 16 2016 (96768)
I0313 11:41:39.372397  9249 net.cpp:137] Memory required for data: 1290240
I0313 11:41:39.372400  9249 layer_factory.hpp:77] Creating layer conv1
I0313 11:41:39.372406  9249 net.cpp:84] Creating Layer conv1
I0313 11:41:39.372409  9249 net.cpp:406] conv1 <- data_data_0_split_0
I0313 11:41:39.372412  9249 net.cpp:380] conv1 -> conv1
I0313 11:41:39.372515  9249 net.cpp:122] Setting up conv1
I0313 11:41:39.372521  9249 net.cpp:129] Top shape: 1 96 52 552 (2755584)
I0313 11:41:39.372524  9249 net.cpp:137] Memory required for data: 12312576
I0313 11:41:39.372531  9249 layer_factory.hpp:77] Creating layer relu1
I0313 11:41:39.372535  9249 net.cpp:84] Creating Layer relu1
I0313 11:41:39.372539  9249 net.cpp:406] relu1 <- conv1
I0313 11:41:39.372541  9249 net.cpp:367] relu1 -> conv1 (in-place)
I0313 11:41:39.372546  9249 net.cpp:122] Setting up relu1
I0313 11:41:39.372550  9249 net.cpp:129] Top shape: 1 96 52 552 (2755584)
I0313 11:41:39.372552  9249 net.cpp:137] Memory required for data: 23334912
I0313 11:41:39.372555  9249 layer_factory.hpp:77] Creating layer pool1
I0313 11:41:39.372558  9249 net.cpp:84] Creating Layer pool1
I0313 11:41:39.372560  9249 net.cpp:406] pool1 <- conv1
I0313 11:41:39.372565  9249 net.cpp:380] pool1 -> pool1
I0313 11:41:39.372573  9249 net.cpp:122] Setting up pool1
I0313 11:41:39.372576  9249 net.cpp:129] Top shape: 1 96 26 276 (688896)
I0313 11:41:39.372579  9249 net.cpp:137] Memory required for data: 26090496
I0313 11:41:39.372581  9249 layer_factory.hpp:77] Creating layer norm1
I0313 11:41:39.372586  9249 net.cpp:84] Creating Layer norm1
I0313 11:41:39.372588  9249 net.cpp:406] norm1 <- pool1
I0313 11:41:39.372593  9249 net.cpp:380] norm1 -> norm1
I0313 11:41:39.372599  9249 net.cpp:122] Setting up norm1
I0313 11:41:39.372602  9249 net.cpp:129] Top shape: 1 96 26 276 (688896)
I0313 11:41:39.372604  9249 net.cpp:137] Memory required for data: 28846080
I0313 11:41:39.372607  9249 layer_factory.hpp:77] Creating layer conv2
I0313 11:41:39.372611  9249 net.cpp:84] Creating Layer conv2
I0313 11:41:39.372613  9249 net.cpp:406] conv2 <- norm1
I0313 11:41:39.372617  9249 net.cpp:380] conv2 -> conv2
I0313 11:41:39.373008  9249 net.cpp:122] Setting up conv2
I0313 11:41:39.373013  9249 net.cpp:129] Top shape: 1 256 26 276 (1837056)
I0313 11:41:39.373015  9249 net.cpp:137] Memory required for data: 36194304
I0313 11:41:39.373021  9249 layer_factory.hpp:77] Creating layer relu2
I0313 11:41:39.373025  9249 net.cpp:84] Creating Layer relu2
I0313 11:41:39.373028  9249 net.cpp:406] relu2 <- conv2
I0313 11:41:39.373030  9249 net.cpp:367] relu2 -> conv2 (in-place)
I0313 11:41:39.373034  9249 net.cpp:122] Setting up relu2
I0313 11:41:39.373039  9249 net.cpp:129] Top shape: 1 256 26 276 (1837056)
I0313 11:41:39.373040  9249 net.cpp:137] Memory required for data: 43542528
I0313 11:41:39.373042  9249 layer_factory.hpp:77] Creating layer pool2
I0313 11:41:39.373046  9249 net.cpp:84] Creating Layer pool2
I0313 11:41:39.373049  9249 net.cpp:406] pool2 <- conv2
I0313 11:41:39.373052  9249 net.cpp:380] pool2 -> pool2
I0313 11:41:39.373057  9249 net.cpp:122] Setting up pool2
I0313 11:41:39.373061  9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.373064  9249 net.cpp:137] Memory required for data: 45379584
I0313 11:41:39.373065  9249 layer_factory.hpp:77] Creating layer norm2
I0313 11:41:39.373070  9249 net.cpp:84] Creating Layer norm2
I0313 11:41:39.373071  9249 net.cpp:406] norm2 <- pool2
I0313 11:41:39.373075  9249 net.cpp:380] norm2 -> norm2
I0313 11:41:39.373080  9249 net.cpp:122] Setting up norm2
I0313 11:41:39.373082  9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.373085  9249 net.cpp:137] Memory required for data: 47216640
I0313 11:41:39.373087  9249 layer_factory.hpp:77] Creating layer conv3
I0313 11:41:39.373091  9249 net.cpp:84] Creating Layer conv3
I0313 11:41:39.373093  9249 net.cpp:406] conv3 <- norm2
I0313 11:41:39.373096  9249 net.cpp:380] conv3 -> conv3
I0313 11:41:39.373900  9249 net.cpp:122] Setting up conv3
I0313 11:41:39.373906  9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.373909  9249 net.cpp:137] Memory required for data: 49972224
I0313 11:41:39.373914  9249 layer_factory.hpp:77] Creating layer relu3
I0313 11:41:39.373919  9249 net.cpp:84] Creating Layer relu3
I0313 11:41:39.373921  9249 net.cpp:406] relu3 <- conv3
I0313 11:41:39.373924  9249 net.cpp:367] relu3 -> conv3 (in-place)
I0313 11:41:39.373929  9249 net.cpp:122] Setting up relu3
I0313 11:41:39.373931  9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.373934  9249 net.cpp:137] Memory required for data: 52727808
I0313 11:41:39.373936  9249 layer_factory.hpp:77] Creating layer conv4
I0313 11:41:39.373941  9249 net.cpp:84] Creating Layer conv4
I0313 11:41:39.373944  9249 net.cpp:406] conv4 <- conv3
I0313 11:41:39.373947  9249 net.cpp:380] conv4 -> conv4
I0313 11:41:39.374778  9249 net.cpp:122] Setting up conv4
I0313 11:41:39.374783  9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.374785  9249 net.cpp:137] Memory required for data: 55483392
I0313 11:41:39.374789  9249 layer_factory.hpp:77] Creating layer relu4
I0313 11:41:39.374794  9249 net.cpp:84] Creating Layer relu4
I0313 11:41:39.374795  9249 net.cpp:406] relu4 <- conv4
I0313 11:41:39.374800  9249 net.cpp:367] relu4 -> conv4 (in-place)
I0313 11:41:39.374804  9249 net.cpp:122] Setting up relu4
I0313 11:41:39.374807  9249 net.cpp:129] Top shape: 1 384 13 138 (688896)
I0313 11:41:39.374809  9249 net.cpp:137] Memory required for data: 58238976
I0313 11:41:39.374811  9249 layer_factory.hpp:77] Creating layer conv5
I0313 11:41:39.374816  9249 net.cpp:84] Creating Layer conv5
I0313 11:41:39.374819  9249 net.cpp:406] conv5 <- conv4
I0313 11:41:39.374824  9249 net.cpp:380] conv5 -> conv5
I0313 11:41:39.375376  9249 net.cpp:122] Setting up conv5
I0313 11:41:39.375382  9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.375385  9249 net.cpp:137] Memory required for data: 60076032
I0313 11:41:39.375392  9249 layer_factory.hpp:77] Creating layer relu5
I0313 11:41:39.375397  9249 net.cpp:84] Creating Layer relu5
I0313 11:41:39.375399  9249 net.cpp:406] relu5 <- conv5
I0313 11:41:39.375402  9249 net.cpp:367] relu5 -> conv5 (in-place)
I0313 11:41:39.375406  9249 net.cpp:122] Setting up relu5
I0313 11:41:39.375409  9249 net.cpp:129] Top shape: 1 256 13 138 (459264)
I0313 11:41:39.375412  9249 net.cpp:137] Memory required for data: 61913088
I0313 11:41:39.375414  9249 layer_factory.hpp:77] Creating layer pool5
I0313 11:41:39.375421  9249 net.cpp:84] Creating Layer pool5
I0313 11:41:39.375422  9249 net.cpp:406] pool5 <- conv5
I0313 11:41:39.375425  9249 net.cpp:380] pool5 -> pool5
I0313 11:41:39.375432  9249 net.cpp:122] Setting up pool5
I0313 11:41:39.375434  9249 net.cpp:129] Top shape: 1 256 6 69 (105984)
I0313 11:41:39.375437  9249 net.cpp:137] Memory required for data: 62337024
I0313 11:41:39.375439  9249 layer_factory.hpp:77] Creating layer fc6
I0313 11:41:39.375444  9249 net.cpp:84] Creating Layer fc6
I0313 11:41:39.375447  9249 net.cpp:406] fc6 <- pool5
I0313 11:41:39.375452  9249 net.cpp:380] fc6 -> fc6
I0313 11:41:39.404399  9249 net.cpp:122] Setting up fc6
I0313 11:41:39.404426  9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.404430  9249 net.cpp:137] Memory required for data: 63385600
I0313 11:41:39.404438  9249 layer_factory.hpp:77] Creating layer relu6
I0313 11:41:39.404445  9249 net.cpp:84] Creating Layer relu6
I0313 11:41:39.404449  9249 net.cpp:406] relu6 <- fc6
I0313 11:41:39.404453  9249 net.cpp:367] relu6 -> fc6 (in-place)
I0313 11:41:39.404460  9249 net.cpp:122] Setting up relu6
I0313 11:41:39.404464  9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.404466  9249 net.cpp:137] Memory required for data: 64434176
I0313 11:41:39.404469  9249 layer_factory.hpp:77] Creating layer drop6
I0313 11:41:39.404474  9249 net.cpp:84] Creating Layer drop6
I0313 11:41:39.404476  9249 net.cpp:406] drop6 <- fc6
I0313 11:41:39.404481  9249 net.cpp:367] drop6 -> fc6 (in-place)
I0313 11:41:39.404486  9249 net.cpp:122] Setting up drop6
I0313 11:41:39.404489  9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.404492  9249 net.cpp:137] Memory required for data: 65482752
I0313 11:41:39.404495  9249 layer_factory.hpp:77] Creating layer fc7
I0313 11:41:39.404500  9249 net.cpp:84] Creating Layer fc7
I0313 11:41:39.404503  9249 net.cpp:406] fc7 <- fc6
I0313 11:41:39.404506  9249 net.cpp:380] fc7 -> fc7
I0313 11:41:39.417629  9249 net.cpp:122] Setting up fc7
I0313 11:41:39.417654  9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.417657  9249 net.cpp:137] Memory required for data: 66531328
I0313 11:41:39.417665  9249 layer_factory.hpp:77] Creating layer relu7
I0313 11:41:39.417672  9249 net.cpp:84] Creating Layer relu7
I0313 11:41:39.417676  9249 net.cpp:406] relu7 <- fc7
I0313 11:41:39.417680  9249 net.cpp:367] relu7 -> fc7 (in-place)
I0313 11:41:39.417687  9249 net.cpp:122] Setting up relu7
I0313 11:41:39.417690  9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.417693  9249 net.cpp:137] Memory required for data: 67579904
I0313 11:41:39.417696  9249 layer_factory.hpp:77] Creating layer drop7
I0313 11:41:39.417703  9249 net.cpp:84] Creating Layer drop7
I0313 11:41:39.417706  9249 net.cpp:406] drop7 <- fc7
I0313 11:41:39.417709  9249 net.cpp:367] drop7 -> fc7 (in-place)
I0313 11:41:39.417713  9249 net.cpp:122] Setting up drop7
I0313 11:41:39.417716  9249 net.cpp:129] Top shape: 1 4096 1 64 (262144)
I0313 11:41:39.417719  9249 net.cpp:137] Memory required for data: 68628480
I0313 11:41:39.417721  9249 layer_factory.hpp:77] Creating layer score_fr
I0313 11:41:39.417727  9249 net.cpp:84] Creating Layer score_fr
I0313 11:41:39.417729  9249 net.cpp:406] score_fr <- fc7
I0313 11:41:39.417734  9249 net.cpp:380] score_fr -> score_fr
I0313 11:41:39.417858  9249 net.cpp:122] Setting up score_fr
I0313 11:41:39.417865  9249 net.cpp:129] Top shape: 1 21 1 64 (1344)
I0313 11:41:39.417867  9249 net.cpp:137] Memory required for data: 68633856
I0313 11:41:39.417871  9249 layer_factory.hpp:77] Creating layer upscore
I0313 11:41:39.417877  9249 net.cpp:84] Creating Layer upscore
I0313 11:41:39.417881  9249 net.cpp:406] upscore <- score_fr
I0313 11:41:39.417884  9249 net.cpp:380] upscore -> upscore
I0313 11:41:39.419461  9249 net.cpp:122] Setting up upscore
I0313 11:41:39.419472  9249 net.cpp:129] Top shape: 1 21 63 2079 (2750517)
I0313 11:41:39.419476  9249 net.cpp:137] Memory required for data: 79635924
I0313 11:41:39.419484  9249 layer_factory.hpp:77] Creating layer score
I0313 11:41:39.419497  9249 net.cpp:84] Creating Layer score
I0313 11:41:39.419499  9249 net.cpp:406] score <- upscore
I0313 11:41:39.419503  9249 net.cpp:406] score <- data_data_0_split_1
I0313 11:41:39.419507  9249 net.cpp:380] score -> score
I0313 11:41:39.419517  9249 net.cpp:122] Setting up score
I0313 11:41:39.419539  9249 net.cpp:129] Top shape: 1 21 16 2016 (677376)
I0313 11:41:39.419543  9249 net.cpp:137] Memory required for data: 82345428
I0313 11:41:39.419544  9249 layer_factory.hpp:77] Creating layer loss
I0313 11:41:39.419554  9249 net.cpp:84] Creating Layer loss
I0313 11:41:39.419558  9249 net.cpp:406] loss <- score
I0313 11:41:39.419560  9249 net.cpp:406] loss <- label
I0313 11:41:39.419564  9249 net.cpp:380] loss -> loss
I0313 11:41:39.419572  9249 layer_factory.hpp:77] Creating layer loss
I0313 11:41:39.420116  9249 net.cpp:122] Setting up loss
I0313 11:41:39.420122  9249 net.cpp:129] Top shape: (1)
I0313 11:41:39.420125  9249 net.cpp:132]     with loss weight 1
I0313 11:41:39.420135  9249 net.cpp:137] Memory required for data: 82345432
I0313 11:41:39.420137  9249 net.cpp:198] loss needs backward computation.
I0313 11:41:39.420140  9249 net.cpp:198] score needs backward computation.
I0313 11:41:39.420143  9249 net.cpp:198] upscore needs backward computation.
I0313 11:41:39.420145  9249 net.cpp:198] score_fr needs backward computation.
I0313 11:41:39.420148  9249 net.cpp:198] drop7 needs backward computation.
I0313 11:41:39.420151  9249 net.cpp:198] relu7 needs backward computation.
I0313 11:41:39.420155  9249 net.cpp:198] fc7 needs backward computation.
I0313 11:41:39.420156  9249 net.cpp:198] drop6 needs backward computation.
I0313 11:41:39.420159  9249 net.cpp:198] relu6 needs backward computation.
I0313 11:41:39.420161  9249 net.cpp:198] fc6 needs backward computation.
I0313 11:41:39.420164  9249 net.cpp:198] pool5 needs backward computation.
I0313 11:41:39.420167  9249 net.cpp:198] relu5 needs backward computation.
I0313 11:41:39.420169  9249 net.cpp:198] conv5 needs backward computation.
I0313 11:41:39.420172  9249 net.cpp:198] relu4 needs backward computation.
I0313 11:41:39.420176  9249 net.cpp:198] conv4 needs backward computation.
I0313 11:41:39.420177  9249 net.cpp:198] relu3 needs backward computation.
I0313 11:41:39.420181  9249 net.cpp:198] conv3 needs backward computation.
I0313 11:41:39.420183  9249 net.cpp:198] norm2 needs backward computation.
I0313 11:41:39.420186  9249 net.cpp:198] pool2 needs backward computation.
I0313 11:41:39.420189  9249 net.cpp:198] relu2 needs backward computation.
I0313 11:41:39.420192  9249 net.cpp:198] conv2 needs backward computation.
I0313 11:41:39.420194  9249 net.cpp:198] norm1 needs backward computation.
I0313 11:41:39.420197  9249 net.cpp:198] pool1 needs backward computation.
I0313 11:41:39.420200  9249 net.cpp:198] relu1 needs backward computation.
I0313 11:41:39.420203  9249 net.cpp:198] conv1 needs backward computation.
I0313 11:41:39.420207  9249 net.cpp:200] data_data_0_split does not need backward computation.
I0313 11:41:39.420210  9249 net.cpp:200] data does not need backward computation.
I0313 11:41:39.420212  9249 net.cpp:242] This network produces output loss
I0313 11:41:39.420224  9249 net.cpp:255] Network initialization done.
I0313 11:41:39.420586  9249 solver.cpp:190] Creating test net (#0) specified by test_net file: val.prototxt
I0313 11:41:39.420764  9249 net.cpp:51] Initializing net from parameters: 
state {
  phase: TEST
}
[... the layer definitions and setup output for the test net are identical to the training net above (apart from timestamps) and are omitted here ...]
I0313 11:41:39.470055  9249 solver.cpp:57] Solver scaffolding done.
I0313 11:42:40.745103  9249 solver.cpp:239] Iteration 0 (-1.4013e-45 iter/s, 61.136s/20 iters), loss = 4.54161
I0313 11:42:40.745129  9249 solver.cpp:258]     Train net output #0: loss = 4.00278 (* 1 = 4.00278 loss)
I0313 11:42:40.745136  9249 sgd_solver.cpp:112] Iteration 0, lr = 0.0001
I0313 12:02:52.273387  9249 solver.cpp:239] Iteration 20 (0.0165081 iter/s, 1211.53s/20 iters), loss = 17.0233
I0313 12:02:52.273416  9249 solver.cpp:258]     Train net output #0: loss = 19.2508 (* 1 = 19.2508 loss)
I0313 12:02:52.273422  9249 sgd_solver.cpp:112] Iteration 20, lr = 0.0001
I0313 12:23:09.810516  9249 solver.cpp:239] Iteration 40 (0.0164266 iter/s, 1217.54s/20 iters), loss = 26.7316
I0313 12:23:09.810544  9249 solver.cpp:258]     Train net output #0: loss = 30.1355 (* 1 = 30.1355 loss)
I0313 12:23:09.810550  9249 sgd_solver.cpp:112] Iteration 40, lr = 0.0001
I0313 12:43:32.716285  9249 solver.cpp:239] Iteration 60 (0.0163545 iter/s, 1222.91s/20 iters), loss = 30.2106
I0313 12:43:32.716313  9249 solver.cpp:258]     Train net output #0: loss = 22.8696 (* 1 = 22.8696 loss)
I0313 12:43:32.716320  9249 sgd_solver.cpp:112] Iteration 60, lr = 0.0001
I0313 13:03:49.434516  9249 solver.cpp:239] Iteration 80 (0.0164377 iter/s, 1216.72s/20 iters), loss = 31.0818
I0313 13:03:49.434543  9249 solver.cpp:258]     Train net output #0: loss = 23.1428 (* 1 = 23.1428 loss)
I0313 13:03:49.434551  9249 sgd_solver.cpp:112] Iteration 80, lr = 0.0001
I0313 13:23:51.860294  9249 solver.cpp:239] Iteration 100 (0.0166331 iter/s, 1202.43s/20 iters), loss = 32.5238
I0313 13:23:51.860322  9249 solver.cpp:258]     Train net output #0: loss = 35.1909 (* 1 = 35.1909 loss)
I0313 13:23:51.860328  9249 sgd_solver.cpp:112] Iteration 100, lr = 0.0001
I0313 13:43:38.481149  9249 solver.cpp:239] Iteration 120 (0.0168546 iter/s, 1186.62s/20 iters), loss = 33.0024
I0313 13:43:38.481176  9249 solver.cpp:258]     Train net output #0: loss = 40.9104 (* 1 = 40.9104 loss)
I0313 13:43:38.481182  9249 sgd_solver.cpp:112] Iteration 120, lr = 0.0001
I0313 14:03:27.667078  9249 solver.cpp:239] Iteration 140 (0.0168182 iter/s, 1189.19s/20 iters), loss = 36.4908
I0313 14:03:27.667104  9249 solver.cpp:258]     Train net output #0: loss = 53.9975 (* 1 = 53.9975 loss)
I0313 14:03:27.667111  9249 sgd_solver.cpp:112] Iteration 140, lr = 0.0001
I0313 14:23:25.009404  9249 solver.cpp:239] Iteration 160 (0.0167037 iter/s, 1197.34s/20 iters), loss = 52.2285
I0313 14:23:25.009431  9249 solver.cpp:258]     Train net output #0: loss = 26.9314 (* 1 = 26.9314 loss)
I0313 14:23:25.009438  9249 sgd_solver.cpp:112] Iteration 160, lr = 0.0001
I0313 14:43:35.026921  9249 solver.cpp:239] Iteration 180 (0.0165287 iter/s, 1210.02s/20 iters), loss = 33.087
I0313 14:43:35.026950  9249 solver.cpp:258]     Train net output #0: loss = 44.6887 (* 1 = 44.6887 loss)
I0313 14:43:35.026957  9249 sgd_solver.cpp:112] Iteration 180, lr = 0.0001
I0313 15:03:45.718956  9249 solver.cpp:239] Iteration 200 (0.0165195 iter/s, 1210.69s/20 iters), loss = 33.0793
I0313 15:03:45.718984  9249 solver.cpp:258]     Train net output #0: loss = 34.2235 (* 1 = 34.2235 loss)
I0313 15:03:45.718991  9249 sgd_solver.cpp:112] Iteration 200, lr = 0.0001
I0313 15:24:27.503715  9249 solver.cpp:239] Iteration 220 (0.0161059 iter/s, 1241.78s/20 iters), loss = 33.1698
I0313 15:24:27.503741  9249 solver.cpp:258]     Train net output #0: loss = 45.0323 (* 1 = 45.0323 loss)
I0313 15:24:27.503748  9249 sgd_solver.cpp:112] Iteration 220, lr = 0.0001
I0313 15:44:53.585564  9249 solver.cpp:239] Iteration 240 (0.0163121 iter/s, 1226.08s/20 iters), loss = 35.7697
I0313 15:44:53.585592  9249 solver.cpp:258]     Train net output #0: loss = 45.5302 (* 1 = 45.5302 loss)
I0313 15:44:53.585598  9249 sgd_solver.cpp:112] Iteration 240, lr = 0.0001
I0313 16:04:44.744935  9249 solver.cpp:239] Iteration 260 (0.0167904 iter/s, 1191.16s/20 iters), loss = 29.4003
I0313 16:04:44.744963  9249 solver.cpp:258]     Train net output #0: loss = 19.9242 (* 1 = 19.9242 loss)
I0313 16:04:44.744969  9249 sgd_solver.cpp:112] Iteration 260, lr = 0.0001
I0313 16:24:00.216655  9249 solver.cpp:239] Iteration 280 (0.017309 iter/s, 1155.47s/20 iters), loss = 24.0391
I0313 16:24:00.216681  9249 solver.cpp:258]     Train net output #0: loss = 35.6398 (* 1 = 35.6398 loss)
I0313 16:24:00.216687  9249 sgd_solver.cpp:112] Iteration 280, lr = 0.0001
I0313 16:43:22.672458  9249 solver.cpp:239] Iteration 300 (0.017205 iter/s, 1162.45s/20 iters), loss = 33.2369
I0313 16:43:22.672485  9249 solver.cpp:258]     Train net output #0: loss = 38.8301 (* 1 = 38.8301 loss)
I0313 16:43:22.672492  9249 sgd_solver.cpp:112] Iteration 300, lr = 0.0001
I0313 17:02:56.876072  9249 solver.cpp:239] Iteration 320 (0.0170328 iter/s, 1174.2s/20 iters), loss = 33.7243
I0313 17:02:56.876101  9249 solver.cpp:258]     Train net output #0: loss = 33.6585 (* 1 = 33.6585 loss)
I0313 17:02:56.876106  9249 sgd_solver.cpp:112] Iteration 320, lr = 0.0001

As the log shows, with no other network parameters changed, the loss stays stubbornly high. This article has focused on understanding the data layer; the next article will optimize the internal network structure so that the loss on point cloud data reaches the expected target.


That's all.


References:
1. FCN學習:Semantic Segmentation
2. AlexNet
