GreenPlum 6.7.1 CentOS 7 Deployment Guide

1. Deployment Planning

1.1 Versions

OS version: CentOS Linux release 7.7.1908
Database version: GreenPlum 6.7.1 (PostgreSQL 9.4.24)
Kernel version: 3.10.0-1062.el7.x86_64
CPU: 2 cores
Memory: 4 GB

1.2 Node Layout

IP              host   roles
192.168.188.87  node1  master
192.168.188.88  node2  seg1, seg2, mirror3, mirror4
192.168.188.89  node3  seg3, seg4, mirror1, mirror2

2. System Configuration

The system parameters below must be changed as root, and the system must be rebooted for them to take effect; you can also make all of the changes first and reboot once at the end.
It is recommended to configure the master host first. Once the GP software is installed on the master and SSH trust is established, use gpscp and gpssh to apply the same system parameters to the remaining nodes in bulk (see 3.5.4).

2.1 Disable SELinux

setenforce 0 && sed -i 's/SELINUX=enforcing/SELINUX=disabled/g' /etc/selinux/config
sestatus

2.2 Disable the Firewall

systemctl stop firewalld.service
systemctl disable firewalld.service
systemctl list-unit-files firewalld.service

2.3 Edit /etc/sysctl.conf

# kernel.shmall = _PHYS_PAGES / 2, i.e. the output of: expr $(getconf _PHYS_PAGES) / 2
kernel.shmall = 483888
# kernel.shmmax = kernel.shmall * PAGE_SIZE, i.e. the output of: expr $(getconf _PHYS_PAGES) / 2 \* $(getconf PAGE_SIZE)
kernel.shmmax = 1982005248
kernel.shmmni = 4096
vm.overcommit_memory = 2
vm.overcommit_ratio = 95 
net.ipv4.ip_local_port_range = 10000 65535
kernel.sem = 500 2048000 200 40960
kernel.sysrq = 1
kernel.core_uses_pid = 1
kernel.msgmnb = 65536
kernel.msgmax = 65536
kernel.msgmni = 2048
net.ipv4.tcp_syncookies = 1
net.ipv4.conf.default.accept_source_route = 0
net.ipv4.tcp_max_syn_backlog = 4096
net.ipv4.conf.all.arp_filter = 1
net.core.netdev_max_backlog = 10000
net.core.rmem_max = 2097152
net.core.wmem_max = 2097152
vm.swappiness = 10
vm.zone_reclaim_mode = 0
vm.dirty_expire_centisecs = 500
vm.dirty_writeback_centisecs = 100
#vm.dirty_background_ratio = 0
#vm.dirty_ratio = 0
#vm.dirty_background_bytes = 1610612736
#vm.dirty_bytes = 4294967296
vm.dirty_background_ratio = 3 
vm.dirty_ratio = 10
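kernel.shmall and kernel.shmmax must be recalculated for each host's RAM (the values in the listing above are for a 4 GB machine). A quick sketch using the formulas from the comments:

```shell
# Half of physical pages, and the same amount expressed in bytes (pages * page size)
SHMALL=$(expr $(getconf _PHYS_PAGES) / 2)
SHMMAX=$(expr $(getconf _PHYS_PAGES) / 2 \* $(getconf PAGE_SIZE))
echo "kernel.shmall = $SHMALL"
echo "kernel.shmmax = $SHMMAX"
```

Paste the two printed lines into /etc/sysctl.conf in place of the values above.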

For systems with more than 64 GB of RAM, the following settings are recommended:

vm.dirty_background_ratio = 0
vm.dirty_ratio = 0
vm.dirty_background_bytes = 1610612736 # 1.5GB
vm.dirty_bytes = 4294967296 # 4GB
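The two byte values correspond exactly to 1.5 GiB and 4 GiB; verifying the arithmetic:

```shell
# 1.5 GiB and 4 GiB expressed in bytes
echo $((1536 * 1024 * 1024))       # 1610612736
echo $((4 * 1024 * 1024 * 1024))   # 4294967296
```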

For systems with 64 GB of RAM or less, remove the vm.dirty_background_bytes setting and use the following instead:

vm.dirty_background_ratio = 3
vm.dirty_ratio = 10

Increase vm.min_free_kbytes to ensure that PF_MEMALLOC allocations from network and storage drivers succeed. This is especially important on systems with large amounts of memory, where the default is generally too low. The awk command below sets vm.min_free_kbytes to the recommended 3% of physical memory:

awk 'BEGIN {OFMT = "%.0f";} /MemTotal/ {print "vm.min_free_kbytes =", $2 * .03;}' /proc/meminfo >> /etc/sysctl.conf 
sysctl -p

2.4 Resource Limits

/etc/security/limits.conf 
* soft nofile 524288
* hard nofile 524288
* soft nproc 131072
* hard nproc 131072

/etc/security/limits.d/20-nproc.conf
*          soft    nproc     131072
root       soft    nproc     unlimited
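The limits apply to new sessions only; after logging in again as gpadmin, a quick check (values should match the nofile and nproc settings above):

```shell
# Print the effective per-session limits for the current user
echo "nofile: $(ulimit -n)"
echo "nproc:  $(ulimit -u)"
```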

2.5 Mount Options

XFS has several advantages over ext4:
XFS scales far better; ext4 performance degrades noticeably once a single directory holds more than about 2 million files.
ext4 is a very stable traditional filesystem, but it is an increasingly poor fit as storage requirements keep growing.
For historical reasons ext4 uses 32-bit inode numbers, limiting it to roughly 4 billion files, with a maximum file size of 16 TB.
XFS uses 64-bit space management, scales to EB-class filesystems, and manages its metadata with B+trees.
GP requires the XFS filesystem. RHEL/CentOS 7 and Oracle Linux use XFS as the default filesystem, and SUSE/openSUSE provides long-term support for it.

/dev/sda3 /data xfs nodev,noatime,nobarrier,inode64 0 0

2.6 Set the Read-Ahead Size

/sbin/blockdev --setra 16384 /dev/sda3
/sbin/blockdev --getra /dev/sda3
# add the blockdev --setra command above to /etc/rc.d/rc.local so it persists across reboots
chmod +x /etc/rc.d/rc.local

2.7 Set the I/O Scheduler

grubby --update-kernel=ALL --args="elevator=deadline"
grubby --info=ALL

2.8 Disable Transparent Huge Pages

grubby --update-kernel=ALL --args="transparent_hugepage=never"
cat /sys/kernel/mm/*transparent_hugepage/enabled

2.9 Disable systemd IPC Removal

Set RemoveIPC=no so that systemd does not remove the gpadmin user's IPC objects (shared memory segments, semaphores) when its last session ends:

/etc/systemd/logind.conf
RemoveIPC=no
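A sketch of applying the change; it is shown here against a scratch copy in /tmp so it can be run safely, but in practice you edit /etc/systemd/logind.conf directly as root and then restart systemd-logind:

```shell
# Work on a copy; fall back to a stub if the real file is absent (e.g. in a container)
cp /etc/systemd/logind.conf /tmp/logind.conf 2>/dev/null || printf '#RemoveIPC=no\n' > /tmp/logind.conf
# Uncomment/replace an existing RemoveIPC line, or append one if none exists
sed -i 's/^#\?RemoveIPC=.*/RemoveIPC=no/' /tmp/logind.conf
grep -q '^RemoveIPC=no' /tmp/logind.conf || echo 'RemoveIPC=no' >> /tmp/logind.conf
grep '^RemoveIPC' /tmp/logind.conf
# systemctl restart systemd-logind   # apply on the real host
```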

2.10 Edit /etc/ssh/sshd_config

/etc/ssh/sshd_config
MaxStartups 10:30:100
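MaxStartups start:rate:full controls concurrent unauthenticated SSH connections: sshd starts randomly dropping 30% of new connections once 10 are pending, and drops all of them at 100. This matters when gpssh fans out to many hosts at once. A sketch of the edit, shown against a scratch copy so it is safe to run (in practice edit /etc/ssh/sshd_config as root and restart sshd):

```shell
# Work on a copy; fall back to an empty stub if sshd_config is absent
cp /etc/ssh/sshd_config /tmp/sshd_config 2>/dev/null || : > /tmp/sshd_config
# Replace an existing MaxStartups line (commented or not), or append one
if grep -q '^#\?MaxStartups' /tmp/sshd_config; then
    sed -i 's/^#\?MaxStartups.*/MaxStartups 10:30:100/' /tmp/sshd_config
else
    echo 'MaxStartups 10:30:100' >> /tmp/sshd_config
fi
grep '^MaxStartups' /tmp/sshd_config
# systemctl restart sshd   # apply on the real host
```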

2.11 Install Dependencies

yum install -y apr apr-util bash bzip2 curl krb5 libcurl libevent libxml2 libyaml zlib openldap openssh openssl openssl-libs perl readline rsync R sed tar zip krb5-devel

Reboot the hosts and verify that the modified parameters are in effect.

3. GreenPlum Installation

3.1 Create the gpadmin User

Run on each node (3.5.1 below shows how to do this in bulk on the segment hosts):

groupadd gpadmin
useradd gpadmin -r -m -g gpadmin
passwd gpadmin

3.2 Run the Installer

yum install -y greenplum-db-6.7.1-rhel7-x86_64.rpm

3.3 Create hostfile_exkeys Host Files

In the $GPHOME directory, create two host files (all_host and seg_host) to use as the host-file argument for gpssh, gpscp, and related utilities:
all_host: the hostnames or IPs of every machine in the cluster (master, segments, standby, etc.).
seg_host: the hostnames or IPs of the segment hosts only.
If a machine has multiple NICs that are not bonded (e.g. into bond0), list the IP or hostname of every NIC.

cd /usr/local/greenplum-db
[root@node1 greenplum-db]# cat all_host 
node1
node2
node3
[root@node1 greenplum-db]# cat seg_host 
node2
node3
# fix ownership
chown -R gpadmin:gpadmin /usr/local/greenplum*

3.4 Set Up Cluster SSH Trust

ssh-keygen
ssh-copy-id node2
ssh-copy-id node3
source /usr/local/greenplum-db/greenplum_path.sh
gpssh-exkeys -f all_host
[STEP 1 of 5] create local ID and authorize on local host
  ... /root/.ssh/id_rsa file exists ... key generation skipped

[STEP 2 of 5] keyscan all hosts and update known_hosts file

[STEP 3 of 5] retrieving credentials from remote hosts
  ... send to node2
  ... send to node3

[STEP 4 of 5] determine common authentication file content

[STEP 5 of 5] copy authentication files to all remote hosts
  ... finished key exchange with node2
  ... finished key exchange with node3

[INFO] completed successfully
# verify gpssh
gpssh -f /usr/local/greenplum-db/all_host -e 'ls /usr/local/'
[node1] ls /usr/local/
[node1] bin  games	   greenplum-db-6.7.1  lib    libexec  share
[node1] etc  greenplum-db  include	       lib64  sbin     src
[node2] ls /usr/local/
[node2] bin  etc  games  include  lib  lib64  libexec  sbin  share  src
[node3] ls /usr/local/
[node3] bin  etc  games  include  lib  lib64  libexec  sbin  share  src

3.5 Sync Master Configuration to the Other Hosts

3.5.1 Add the gpadmin User in Bulk

gpssh -f seg_host -e 'groupadd gpadmin;useradd gpadmin -r -m -g gpadmin;echo "gpadmin" | passwd --stdin gpadmin;'
gpssh -f seg_host -e 'ls /home/'

3.5.2 Set Up Passwordless Login for gpadmin

su - gpadmin
source /usr/local/greenplum-db/greenplum_path.sh
ssh-keygen
ssh-copy-id node2
ssh-copy-id node3
gpssh-exkeys -f /usr/local/greenplum-db/all_host
[STEP 1 of 5] create local ID and authorize on local host
  ... /home/gpadmin/.ssh/id_rsa file exists ... key generation skipped

[STEP 2 of 5] keyscan all hosts and update known_hosts file

[STEP 3 of 5] retrieving credentials from remote hosts
  ... send to node2
  ... send to node3

[STEP 4 of 5] determine common authentication file content

[STEP 5 of 5] copy authentication files to all remote hosts
  ... finished key exchange with node2
  ... finished key exchange with node3

[INFO] completed successfully

3.5.3 Configure gpadmin's Environment Variables

cat >> /home/gpadmin/.bash_profile << EOF
source /usr/local/greenplum-db/greenplum_path.sh
EOF
cat >> /home/gpadmin/.bashrc << EOF
source /usr/local/greenplum-db/greenplum_path.sh
EOF

gpscp -f /usr/local/greenplum-db/seg_host /home/gpadmin/.bash_profile gpadmin@=:/home/gpadmin/.bash_profile 
gpscp -f /usr/local/greenplum-db/seg_host /home/gpadmin/.bashrc gpadmin@=:/home/gpadmin/.bashrc

3.5.4 Copy System Settings to the Other Nodes

gpscp -f seg_host /etc/hosts root@=:/etc/hosts
gpscp -f seg_host /etc/security/limits.conf root@=:/etc/security/limits.conf 
gpscp -f seg_host /etc/sysctl.conf  root@=:/etc/sysctl.conf
gpscp -f seg_host /etc/security/limits.d/20-nproc.conf root@=:/etc/security/limits.d/20-nproc.conf
gpssh -f seg_host -e '/sbin/blockdev --setra 16384 /dev/sda3'
gpssh -f seg_host -e 'echo deadline > /sys/block/sda/queue/scheduler'
gpssh -f seg_host -e 'sysctl -p'
gpssh -f seg_host -e 'reboot'

3.6 Segment Deployment

3.6.1 Distribute the Binaries

# variables
link_name='greenplum-db'                    # symlink name
binary_dir_location='/usr/local'            # install prefix
binary_dir_name='greenplum-db-6.7.1'        # install directory
binary_path='/usr/local/greenplum-db-6.7.1' # full path

chown -R gpadmin:gpadmin $binary_path
rm -f ${binary_path}.tar; rm -f ${binary_path}.tar.gz
cd $binary_dir_location; tar cf ${binary_dir_name}.tar ${binary_dir_name}
gzip ${binary_path}.tar
gpssh -f ${binary_path}/seg_host -e "mkdir -p ${binary_dir_location};rm -rf ${binary_path};rm -rf ${binary_path}.tar;rm -rf ${binary_path}.tar.gz"
gpscp -f ${binary_path}/seg_host ${binary_path}.tar.gz root@=:${binary_path}.tar.gz
gpssh -f ${binary_path}/seg_host -e "cd ${binary_dir_location};gzip -f -d ${binary_path}.tar.gz;tar xf ${binary_path}.tar"
gpssh -f ${binary_path}/seg_host -e "rm -rf ${binary_path}.tar;rm -rf ${binary_path}.tar.gz;rm -f ${binary_dir_location}/${link_name}"
gpssh -f ${binary_path}/seg_host -e "ln -fs ${binary_dir_location}/${binary_dir_name} ${binary_dir_location}/${link_name}"
gpssh -f ${binary_path}/seg_host -e "chown -R gpadmin:gpadmin ${binary_dir_location}/${link_name};chown -R gpadmin:gpadmin ${binary_dir_location}/${binary_dir_name}"
gpssh -f ${binary_path}/seg_host -e "source ${binary_path}/greenplum_path.sh"
gpssh -f ${binary_path}/seg_host -e "cd ${binary_dir_location};ls -l"

3.6.2 Create the Cluster Data Directories

# create the master data directory
mkdir -p /data/greenplum/data/master
chown gpadmin:gpadmin /data/greenplum/data/master

# create the segment data directories
source /usr/local/greenplum-db/greenplum_path.sh
gpssh -f /usr/local/greenplum-db/seg_host -e 'mkdir -p /data/greenplum/data1/primary'
gpssh -f /usr/local/greenplum-db/seg_host -e 'mkdir -p /data/greenplum/data1/mirror'
gpssh -f /usr/local/greenplum-db/seg_host -e 'mkdir -p /data/greenplum/data2/primary'
gpssh -f /usr/local/greenplum-db/seg_host -e 'mkdir -p /data/greenplum/data2/mirror'
gpssh -f /usr/local/greenplum-db/seg_host -e 'chown -R gpadmin /data/greenplum/data*'

3.7 Cluster Performance Tests

Recommended baselines:

Disk throughput of at least 2000 MB/s
Network throughput of at least 1000 MB/s

3.7.1 Network Test

gpcheckperf -f /usr/local/greenplum-db/seg_host -r N -d /tmp

-------------------
--  NETPERF TEST
-------------------
NOTICE: -t is deprecated, and has no effect
NOTICE: -f is deprecated, and has no effect
NOTICE: -t is deprecated, and has no effect
NOTICE: -f is deprecated, and has no effect

====================
==  RESULT 2020-05-04T13:38:25.039440
====================
Netperf bisection bandwidth test
node2 -> node3 = 708.600000
node3 -> node2 = 758.750000

Summary:
sum = 1467.35 MB/sec
min = 708.60 MB/sec
max = 758.75 MB/sec
avg = 733.67 MB/sec
median = 758.75 MB/sec

3.7.2 Disk Test

gpcheckperf -f /usr/local/greenplum-db/seg_host -r ds -D -d /data/greenplum/data1/primary

--------------------
--  DISK WRITE TEST
--------------------

--------------------
--  DISK READ TEST
--------------------

--------------------
--  STREAM TEST
--------------------

====================
==  RESULT 2020-05-04T13:40:48.565801
====================

 disk write avg time (sec): 39.45
 disk write tot bytes: 15836053504
 disk write tot bandwidth (MB/s): 385.66
 disk write min bandwidth (MB/s): 176.04 [node3]
 disk write max bandwidth (MB/s): 209.61 [node2]
 -- per host bandwidth --
    disk write bandwidth (MB/s): 176.04 [node3]
    disk write bandwidth (MB/s): 209.61 [node2]


 disk read avg time (sec): 9.57
 disk read tot bytes: 15836053504
 disk read tot bandwidth (MB/s): 1581.69
 disk read min bandwidth (MB/s): 752.67 [node3]
 disk read max bandwidth (MB/s): 829.03 [node2]
 -- per host bandwidth --
    disk read bandwidth (MB/s): 752.67 [node3]
    disk read bandwidth (MB/s): 829.03 [node2]


 stream tot bandwidth (MB/s): 40897.60
 stream min bandwidth (MB/s): 18901.20 [node2]
 stream max bandwidth (MB/s): 21996.40 [node3]
 -- per host bandwidth --
    stream bandwidth (MB/s): 21996.40 [node3]
    stream bandwidth (MB/s): 18901.20 [node2]

3.7.3 Clock Verification

gpssh -f /usr/local/greenplum-db/all_host -e 'date'
[node1] date
[node1] Mon May  4 13:45:16 CST 2020
[node3] date
[node3] Mon May  4 13:45:16 CST 2020
[node2] date
[node2] Mon May  4 13:45:16 CST 2020

3.8 Edit the Initialization Config File

su - gpadmin
mkdir -p /home/gpadmin/gpconfigs
cp $GPHOME/docs/cli_help/gpconfigs/gpinitsystem_config /home/gpadmin/gpconfigs/gpinitsystem_config
# settings to modify
ARRAY_NAME="Greenplum Data Platform"
SEG_PREFIX=gpseg
PORT_BASE=6000
declare -a DATA_DIRECTORY=(/data/greenplum/data1/primary /data/greenplum/data2/primary)
MASTER_HOSTNAME=node1
MASTER_DIRECTORY=/data/greenplum/data/master
MASTER_PORT=5432
TRUSTED_SHELL=ssh
CHECK_POINT_SEGMENTS=8
ENCODING=UNICODE
MIRROR_PORT_BASE=7000
declare -a MIRROR_DATA_DIRECTORY=(/data/greenplum/data1/mirror /data/greenplum/data2/mirror)
DATABASE_NAME=gpdw

3.9 Initialize the Cluster

3.9.1 Run gpinitsystem

gpinitsystem -c /home/gpadmin/gpconfigs/gpinitsystem_config -h /usr/local/greenplum-db/seg_host -D
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Main
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Command line options passed to utility = -c /home/gpadmin/gpconfigs/gpinitsystem_config -h /usr/local/greenplum-db/seg_host -D
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_GPDB_ID
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Current user id of gpadmin, matches initdb id of gpadmin
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_GPDB_ID
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_PARAMS
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking configuration parameters, please wait...
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_FILE
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking file /home/gpadmin/gpconfigs/gpinitsystem_config
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_FILE
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Dumping /home/gpadmin/gpconfigs/gpinitsystem_config to logfile for reference
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed /home/gpadmin/gpconfigs/gpinitsystem_config dump to logfile
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Reading Greenplum configuration file /home/gpadmin/gpconfigs/gpinitsystem_config
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Could not find HEAP_CHECKSUM in cluster config, defaulting to on.
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Could not find HBA_HOSTNAMES in cluster config, defaulting to 0.
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale has not been set in /home/gpadmin/gpconfigs/gpinitsystem_config, will set to default value
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_LOCALE_KNOWN
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale check passed on this host
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_LOCALE_KNOWN
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale set to en_US.utf8
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Dump current system locale to log file
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End of system locale dump
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_FILE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking file /usr/local/greenplum-db/seg_host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_FILE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed check of file /usr/local/greenplum-db/seg_host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DUPLICATES
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-No duplicate segment instance hostnames found, will proceed
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DUPLICATES
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting up segment instance list array
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data/master on master host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed /data/greenplum/data/master directory on master host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Will create database gpdw
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-MASTER_MAX_CONNECT not set, will set to default value 250
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting segment instance MAX_CONNECTIONS to 
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Number of segment instance hosts = 2
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain IP address of Master host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master IP address array = ::1
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking configuration parameters, Completed
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_PARAMS
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_MULTI_HOME
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing multi-home checks, please wait...
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Configuring build for standard array
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing multi-home checks, Completed
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_MULTI_HOME
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_QE_ARRAY
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Building primary segment instance array, please wait...
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.6000.lock found for port=6000
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.6001.lock found for port=6001
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.6000.lock found for port=6000
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.6001.lock found for port=6001
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_GROUP_MIRROR_ARRAY
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Building group mirror array type , please wait...
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.7000.lock found for port=7000
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.7001.lock found for port=7001
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.7000.lock found for port=7000
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.7001.lock found for port=7001
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_GROUP_MIRROR_ARRAY
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_QE_ARRAY
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_QES
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking Master host
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file in /tmp found for port=5432
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking new segment hosts, please wait...
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_LOCALE_KNOWN
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale check passed on node2
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_LOCALE_KNOWN
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_OPEN_FILES
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node1
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_OPEN_FILES
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_VERSION_CHK
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:- Current postgres version = postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:- postgres version on node1 = postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_VERSION_CHK
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Segment instance node2 /usr/local/greenplum-db/./lib checked
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_LOCALE_KNOWN
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale check passed on node3
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_LOCALE_KNOWN
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_OPEN_FILES
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node1
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_OPEN_FILES
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_VERSION_CHK
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:- Current postgres version = postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:- postgres version on node1 = postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_VERSION_CHK
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Segment instance node3 /usr/local/greenplum-db/./lib checked
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Primary segment instance directory check
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 for dir /data/greenplum/data1/primary/gpseg0
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 /etc/hosts for localhost set as node2
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data1/primary on node2
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node2 /data/greenplum/data1/primary directory
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 for dir /data/greenplum/data2/primary/gpseg1
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 /etc/hosts for localhost set as node2
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data2/primary on node2
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node2 /data/greenplum/data2/primary directory
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 for dir /data/greenplum/data1/primary/gpseg2
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 /etc/hosts for localhost set as node3
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data1/primary on node3
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node3 /data/greenplum/data1/primary directory
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 for dir /data/greenplum/data2/primary/gpseg3
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 /etc/hosts for localhost set as node3
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data2/primary on node3
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node3 /data/greenplum/data2/primary directory
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Mirror segment instance directory check
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 for dir /data/greenplum/data1/mirror/gpseg0
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data1/mirror on node3
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node3 /data/greenplum/data1/mirror directory
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 for dir /data/greenplum/data2/mirror/gpseg1
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data2/mirror on node3
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node3 /data/greenplum/data2/mirror directory
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 for dir /data/greenplum/data1/mirror/gpseg2
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data1/mirror on node2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node2 /data/greenplum/data1/mirror directory
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 for dir /data/greenplum/data2/mirror/gpseg3
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data2/mirror on node2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node2 /data/greenplum/data2/mirror directory
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking new segment hosts, Completed
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_QES
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function DISPLAY_CONFIG
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum Database Creation Parameters
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:---------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master Configuration
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:---------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master instance name       = Greenplum Data Platform
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master hostname            = node1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master port                = 5432
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master instance dir        = /data/greenplum/data/master/gpseg-1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master LOCALE              = en_US.utf8
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum segment prefix   = gpseg
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master Database            = gpdw
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master connections         = 250
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master buffers             = 128000kB
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Segment connections        = 750
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Segment buffers            = 128000kB
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checkpoint segments        = 8
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Encoding                   = UNICODE
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Postgres param file        = Off
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Initdb to be used          = /usr/local/greenplum-db/./bin/initdb
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-GP_LIBRARY_PATH is         = /usr/local/greenplum-db/./lib
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-HEAP_CHECKSUM is           = on
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-HBA_HOSTNAMES is           = 0
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Ulimit check               = Passed
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Array host connect type    = Single hostname per node
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master IP address [1]      = ::1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master IP address [2]      = 192.168.188.87
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master IP address [3]      = fe80::3157:3eb8:c80e:8f40
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Standby Master             = Not Configured
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Number of primary segments = 2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total Database segments    = 4
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Trusted shell              = ssh
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Number segment hosts       = 2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Mirror port base           = 7000
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Number of mirror segments  = 2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Mirroring config           = ON
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Mirroring type             = Group
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:----------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum Primary Segment Configuration
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:----------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 	/data/greenplum/data1/primary/gpseg0 	6000 	2 	0
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 	/data/greenplum/data2/primary/gpseg1 	6001 	3 	1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 	/data/greenplum/data1/primary/gpseg2 	6000 	4 	2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 	/data/greenplum/data2/primary/gpseg3 	6001 	5 	3
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:---------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum Mirror Segment Configuration
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:---------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 	/data/greenplum/data1/mirror/gpseg0 	7000 	6 	0
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 	/data/greenplum/data2/mirror/gpseg1 	7001 	7 	1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 	/data/greenplum/data1/mirror/gpseg2 	7000 	8 	2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 	/data/greenplum/data2/mirror/gpseg3 	7001 	9 	3

Continue with Greenplum creation Yy|Nn (default=N):
> y
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function DISPLAY_CONFIG
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ARRAY_REORDER
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ARRAY_REORDER
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_QD_DB
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Building the Master instance database, please wait...
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Initializing Master Postgres instance /data/greenplum/data/master/gpseg-1
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing local /usr/local/greenplum-db/./bin/initdb -E UNICODE -D /data/greenplum/data/master/gpseg-1 --locale=en_US.utf8        --max_connections=250 --shared_buffers=128000kB --data-checksums --backend_output=/data/greenplum/data/master/gpseg-1.initdb
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed Master instance initialization
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting the Master port to 5432
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set Master port=5432 in postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed setting the Master port to 5432
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting the Master listen addresses to '*'
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set Master listen addresses to '*' in postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed setting the listen addresses to '*'
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting Master logging option
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Appended line log_statement=all to /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set log_statement=all in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting Master instance check point segments
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set checkpoint_segments=8 in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting Master instance content id
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Appended line gp_contentid=-1 to /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set gp_contentid=-1 in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting Master instance db id
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Appended line gp_dbid=1 to /data/greenplum/data/master/gpseg-1/internal.auto.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set gp_dbid=1 in /data/greenplum/data/master/gpseg-1/internal.auto.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding gp_dumpall access to pg_hba.conf for master host
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BUILD_MASTER_PG_HBA_FILE
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Clearing values in Master pg_hba.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting local access
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting local host access
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Complete Master pg_hba.conf configuration
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BUILD_MASTER_PG_HBA_FILE
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Creating gpssh configuration file
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BUILD_GPSSH_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BUILD_GPSSH_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Creating perfmon directories and configuration file
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BUILD_PERFMON
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed create perfmon directories and configuration file
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Starting the Master in admin mode
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed starting the Master in admin mode
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-node1 contact established
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding -1 on node1, path /data/greenplum/data/master/gpseg-1 to Master gp_segment_configuration
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add -1 on node1 in dir /data/greenplum/data/master/gpseg-1 to Master gp_segment_configuration
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function LOAD_QE_SYSTEM_DATA
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding segment node2 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding 0 on node2, path /data/greenplum/data1/primary/gpseg0 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add 0 on node2 in dir /data/greenplum/data1/primary/gpseg0 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully added segment node2 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding segment node3 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding 2 on node3, path /data/greenplum/data1/primary/gpseg2 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add 2 on node3 in dir /data/greenplum/data1/primary/gpseg2 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully added segment node3 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding segment node2 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding 1 on node2, path /data/greenplum/data2/primary/gpseg1 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add 1 on node2 in dir /data/greenplum/data2/primary/gpseg1 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully added segment node2 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding segment node3 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding 3 on node3, path /data/greenplum/data2/primary/gpseg3 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add 3 on node3 in dir /data/greenplum/data2/primary/gpseg3 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully added segment node3 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function LOAD_QE_SYSTEM_DATA
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_QD_DB
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_ARRAY_SORTED_ON_CONTENT_ID
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_ARRAY_SORTED_ON_CONTENT_ID
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_SEGMENT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing parallel build of primary segment instances
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_SETUP
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Spawning parallel processes    batch [1], please wait...
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_SETUP
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_WAIT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Waiting for parallel processes batch [1], please wait...
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_WAIT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_SUMMARY_STATUS_REPORT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Parallel process exit status
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as completed           = 4
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as killed              = 0
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as failed              = 0
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_SUMMARY_STATUS_REPORT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_SEGMENT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Deleting distributed backout files
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Removing back out file
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-No errors generated from parallel processes
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function STOP_QD_PRODUCTION
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Restarting the Greenplum instance in production mode
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Starting gpstop with args: -a -l /home/gpadmin/gpAdminLogs -m -d /data/greenplum/data/master/gpseg-1 -v
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Setting level of parallelism to: 64
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Gathering information and validating the environment...
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:---Checking that current user can use GP binaries
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Obtaining master's port from master data directory
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Read from postgresql.conf port=5432
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Obtaining Greenplum Master catalog information
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Obtaining Segment details from master...
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Connecting to dbname='template1'
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Greenplum Version: 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7'
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Commencing Master instance shutdown with mode='smart'
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Master segment instance directory=/data/greenplum/data/master/gpseg-1
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-/data/greenplum/data/master/gpseg-1/pg_log/gp_era - end_gp_era
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Stopping master segment and waiting for user connections to finish ...
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Connecting to dbname='template1'
server shutting down
20200504:18:29:07:019448 gpstop:node1:gpadmin-[INFO]:-Attempting forceful termination of any leftover master process
20200504:18:29:07:019448 gpstop:node1:gpadmin-[DEBUG]:-Running Command: cat /tmp/.s.PGSQL.5432.lock
20200504:18:29:07:019448 gpstop:node1:gpadmin-[INFO]:-Terminating processes for segment /data/greenplum/data/master/gpseg-1
20200504:18:29:07:019448 gpstop:node1:gpadmin-[DEBUG]:-Running Command: ps ux | grep "[p]ostgres:\s*port\s*5432" | awk '{print $2}'
20200504:18:29:07:019448 gpstop:node1:gpadmin-[ERROR]:-Failed to kill processes for segment /data/greenplum/data/master/gpseg-1: ([Errno 3] No such process)
20200504:18:29:07:019448 gpstop:node1:gpadmin-[DEBUG]:-Successfully shutdown the Master instance in admin mode
20200504:18:29:07:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully shutdown the new Greenplum instance
20200504:18:29:07:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function STOP_QD_PRODUCTION
20200504:18:29:07:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function START_QD_PRODUCTION
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Starting gpstart with args: -a -l /home/gpadmin/gpAdminLogs -d /data/greenplum/data/master/gpseg-1 -v
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Setting level of parallelism to: 64
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Gathering information and validating the environment...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:---Checking that current user can use GP binaries
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Obtaining master's port from master data directory
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Read from postgresql.conf port=5432
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Read from postgresql.conf max_connections=250
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Greenplum Binary Version: 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --catalog-version
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Greenplum Catalog Version: '301908232'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data/master/gpseg-1
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Check if Master is already running...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Starting Master instance in admin mode
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=None $GPHOME/bin/pg_ctl -D /data/greenplum/data/master/gpseg-1 -l /data/greenplum/data/master/gpseg-1/pg_log/startup.log -w -t 600 -o " -p 5432 -c gp_role=utility " start
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Obtaining Greenplum Master catalog information
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Obtaining Segment details from master...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Connecting to dbname='template1'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Setting new master era
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-/data/greenplum/data/master/gpseg-1/pg_log/gp_era - write_gp_era
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-opening new file
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-wrote era: 9c0a65a67950ab11_200504182907
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-setting read only
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-verifying file
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Master Started...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data/master/gpseg-1
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Checking if standby has been activated...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Shutting down master
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/gpstop -a -m -f -d /data/greenplum/data/master/gpseg-1 -v -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-results of forcing master shutdown: Shutting down master cmdStr='$GPHOME/bin/gpstop -a -m -f -d /data/greenplum/data/master/gpseg-1 -v -B 64 -l '/home/gpadmin/gpAdminLogs''  had result: cmd had rc=0 completed=True halted=False
  stdout='20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Starting gpstop with args: -a -m -f -d /data/greenplum/data/master/gpseg-1 -v -B 64 -l /home/gpadmin/gpAdminLogs
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Setting level of parallelism to: 64
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Gathering information and validating the environment...
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:---Checking that current user can use GP binaries
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Obtaining master's port from master data directory
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Read from postgresql.conf port=5432
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Obtaining Greenplum Master catalog information
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Obtaining Segment details from master...
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Connecting to dbname='template1'
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Greenplum Version: 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7'
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Commencing Master instance shutdown with mode='fast'
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Master segment instance directory=/data/greenplum/data/master/gpseg-1
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-/data/greenplum/data/master/gpseg-1/pg_log/gp_era - end_gp_era
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-found existing file
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-removed existing file
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_ctl -D /data/greenplum/data/master/gpseg-1 -m fast -w -t 120 stop
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Attempting forceful termination of any leftover master process
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Running Command: cat /tmp/.s.PGSQL.5432.lock
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Terminating processes for segment /data/greenplum/data/master/gpseg-1
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Running Command: ps ux | grep "[p]ostgres:\s*port\s*5432" | awk '{print $2}'
20200504:18:29:07:019496 gpstop:node1:gpadmin-[ERROR]:-Failed to kill processes for segment /data/greenplum/data/master/gpseg-1: ([Errno 3] No such process)
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Successfully shutdown the Master instance in admin mode
'
  stderr=''
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-gp_segment_configuration indicates following valid segments
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-SegDB: node2:/data/greenplum/data1/primary/gpseg0:content=0:dbid=2:role=p:preferred_role=p:mode=n:status=u
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-SegDB: node2:/data/greenplum/data2/primary/gpseg1:content=1:dbid=3:role=p:preferred_role=p:mode=n:status=u
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-SegDB: node3:/data/greenplum/data1/primary/gpseg2:content=2:dbid=4:role=p:preferred_role=p:mode=n:status=u
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-SegDB: node3:/data/greenplum/data2/primary/gpseg3:content=3:dbid=5:role=p:preferred_role=p:mode=n:status=u
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-dbIdsToNotStart has 0 entries
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-gparray does not have mirrors
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Adding cmd to work_queue: /bin/ping -c 1 node3
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Adding cmd to work_queue: /bin/ping -c 1 node2
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker1] got cmd: /bin/ping -c 1 node3
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker0] got cmd: /bin/ping -c 1 node2
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: /bin/ping -c 1 node3
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: /bin/ping -c 1 node2
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker1] finished cmd: ping cmdStr='/bin/ping -c 1 node3'  had result: cmd had rc=0 completed=True halted=False
  stdout='PING node3 (192.168.188.89) 56(84) bytes of data.
64 bytes from node3 (192.168.188.89): icmp_seq=1 ttl=64 time=0.298 ms

--- node3 ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.298/0.298/0.298/0.000 ms
'
  stderr=''
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker0] finished cmd: ping cmdStr='/bin/ping -c 1 node2'  had result: cmd had rc=0 completed=True halted=False
  stdout='PING node2 (192.168.188.88) 56(84) bytes of data.
64 bytes from node2 (192.168.188.88): icmp_seq=1 ttl=64 time=0.265 ms

--- node2 ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.265/0.265/0.265/0.000 ms
'
  stderr=''
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Construct host-->datadirs mapping:
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Host node3 has Datadirs: [/data/greenplum/data1/primary/gpseg2,/data/greenplum/data2/primary/gpseg3]
Host node2 has Datadirs: [/data/greenplum/data1/primary/gpseg0,/data/greenplum/data2/primary/gpseg1]
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Commencing parallel segment instance startup, please wait...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Dispatching command to start segments on host: node3, with 4 contents in cluster
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-$GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Adding cmd to work_queue: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Dispatching command to start segments on host: node2, with 4 contents in cluster
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker2] got cmd: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-$GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Adding cmd to work_queue: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker3] got cmd: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker3] finished cmd: remote segment starts on host 'node2' cmdStr='ssh -o StrictHostKeyChecking=no -o ServerAliveInterval=60 node2 ". /usr/local/greenplum-db/./greenplum_path.sh; $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64 -l '/home/gpadmin/gpAdminLogs'"'  had result: cmd had rc=0 completed=True halted=False
  stdout='20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Starting gpsegstart.py with args: -M mirrorless -V postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7 -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D 2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0 -D 3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1 -B 64 -l /home/gpadmin/gpAdminLogs
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Validating directories...
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Validating directory: /data/greenplum/data1/primary/gpseg0
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Validating directory: /data/greenplum/data2/primary/gpseg1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Starting segments... (mirroringMode mirrorless)
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data1/primary/gpseg0
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Adding cmd to work_queue: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg0 -l /data/greenplum/data1/primary/gpseg0/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data2/primary/gpseg1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker0] got cmd: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg0 -l /data/greenplum/data1/primary/gpseg0/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg0 -l /data/greenplum/data1/primary/gpseg0/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Adding cmd to work_queue: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg1 -l /data/greenplum/data2/primary/gpseg1/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker1] got cmd: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg1 -l /data/greenplum/data2/primary/gpseg1/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg1 -l /data/greenplum/data2/primary/gpseg1/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker1] finished cmd: Starting seg at dir /data/greenplum/data2/primary/gpseg1 cmdStr='env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg1 -l /data/greenplum/data2/primary/gpseg1/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1'  had result: cmd had rc=0 completed=True halted=False
  stdout='waiting for server to start.... done
server started
'
  stderr=''
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker0] finished cmd: Starting seg at dir /data/greenplum/data1/primary/gpseg0 cmdStr='env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg0 -l /data/greenplum/data1/primary/gpseg0/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1'  had result: cmd had rc=0 completed=True halted=False
  stdout='waiting for server to start.... done
server started
'
  stderr=''
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Checking segment postmasters... (must_be_running True)
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Postmaster /data/greenplum/data1/primary/gpseg0 is running (pid 6437)
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Postmaster /data/greenplum/data2/primary/gpseg1 is running (pid 6438)
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-
COMMAND RESULTS
STATUS--DIR:/data/greenplum/data1/primary/gpseg0--STARTED:True--REASONCODE:0--REASON:Start Succeeded
STATUS--DIR:/data/greenplum/data2/primary/gpseg1--STARTED:True--REASONCODE:0--REASON:Start Succeeded
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-WorkerPool haltWork()
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker0] haltWork
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker1] haltWork
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker0] got a halt cmd
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker1] got a halt cmd
'
  stderr=''
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker2] finished cmd: remote segment starts on host 'node3' cmdStr='ssh -o StrictHostKeyChecking=no -o ServerAliveInterval=60 node3 ". /usr/local/greenplum-db/./greenplum_path.sh; $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64 -l '/home/gpadmin/gpAdminLogs'"'  had result: cmd had rc=0 completed=True halted=False
  stdout='20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Starting gpsegstart.py with args: -M mirrorless -V postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7 -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D 4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2 -D 5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3 -B 64 -l /home/gpadmin/gpAdminLogs
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Validating directories...
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Validating directory: /data/greenplum/data2/primary/gpseg3
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Validating directory: /data/greenplum/data1/primary/gpseg2
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Starting segments... (mirroringMode mirrorless)
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data2/primary/gpseg3
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Adding cmd to work_queue: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg3 -l /data/greenplum/data2/primary/gpseg3/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data1/primary/gpseg2
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker0] got cmd: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg3 -l /data/greenplum/data2/primary/gpseg3/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg3 -l /data/greenplum/data2/primary/gpseg3/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Adding cmd to work_queue: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg2 -l /data/greenplum/data1/primary/gpseg2/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker1] got cmd: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg2 -l /data/greenplum/data1/primary/gpseg2/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg2 -l /data/greenplum/data1/primary/gpseg2/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker0] finished cmd: Starting seg at dir /data/greenplum/data2/primary/gpseg3 cmdStr='env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg3 -l /data/greenplum/data2/primary/gpseg3/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1'  had result: cmd had rc=0 completed=True halted=False
  stdout='waiting for server to start.... done
server started
'
  stderr=''
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker1] finished cmd: Starting seg at dir /data/greenplum/data1/primary/gpseg2 cmdStr='env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg2 -l /data/greenplum/data1/primary/gpseg2/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1'  had result: cmd had rc=0 completed=True halted=False
  stdout='waiting for server to start.... done
server started
'
  stderr=''
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Checking segment postmasters... (must_be_running True)
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Postmaster /data/greenplum/data2/primary/gpseg3 is running (pid 6421)
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Postmaster /data/greenplum/data1/primary/gpseg2 is running (pid 6422)
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-
COMMAND RESULTS
STATUS--DIR:/data/greenplum/data2/primary/gpseg3--STARTED:True--REASONCODE:0--REASON:Start Succeeded
STATUS--DIR:/data/greenplum/data1/primary/gpseg2--STARTED:True--REASONCODE:0--REASON:Start Succeeded
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-WorkerPool haltWork()
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker0] haltWork
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker1] haltWork
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker1] got a halt cmd
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker0] got a halt cmd
'
  stderr=''
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Process results...
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-DBID:2  STARTED
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-DBID:3  STARTED
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-DBID:5  STARTED
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-DBID:4  STARTED
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------


20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-   Successful segment starts                                            = 4
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-   Failed segment starts                                                = 0
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-   Skipped segment starts (segments are marked down in configuration)   = 0
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Successfully started 4 of 4 segment instances 
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Starting Master instance node1 directory /data/greenplum/data/master/gpseg-1 
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data/master/gpseg-1 -l /data/greenplum/data/master/gpseg-1/pg_log/startup.log -w -t 600 -o " -p 5432 -E " start
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_ctl -D /data/greenplum/data/master/gpseg-1 status
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Command pg_ctl reports Master node1 instance active
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Connecting to dbname='template1' connect_timeout=15
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-/data/greenplum/data/master/gpseg-1/pg_log/gp_era - write_gp_era
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-opening new file
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-wrote era: 9c0a65a67950ab11_200504182907
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-setting read only
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-verifying file
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-No standby master configured.  skipping...
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Database successfully started
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-WorkerPool haltWork()
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker0] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker1] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker2] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker3] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker4] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker0] got a halt cmd
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker2] got a halt cmd
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker3] got a halt cmd
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker1] got a halt cmd
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker4] got a halt cmd
20200504:18:29:08:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully started new Greenplum instance
20200504:18:29:08:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed restart of Greenplum instance in production mode
20200504:18:29:08:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function START_QD_PRODUCTION
20200504:18:29:08:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_DATABASE
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed create database gpdw
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_DATABASE
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SET_GP_USER_PW
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed update Greenplum superuser password
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SET_GP_USER_PW
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function REGISTER_MIRRORS
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed failed to register mirror for contentid=0
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed failed to register mirror for contentid=1
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed failed to register mirror for contentid=2
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed failed to register mirror for contentid=3
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function REGISTER_MIRRORS
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_SEGMENT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing parallel build of mirror segment instances
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_SETUP
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Spawning parallel processes    batch [1], please wait...
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_SETUP
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_WAIT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Waiting for parallel processes batch [1], please wait...
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_WAIT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_SUMMARY_STATUS_REPORT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Parallel process exit status
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as completed           = 4
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as killed              = 0
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as failed              = 0
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_SUMMARY_STATUS_REPORT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_SEGMENT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function FORCE_FTS_PROBE
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function FORCE_FTS_PROBE
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SCAN_LOG
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Scanning utility log file for any warning messages
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Log file scan check passed
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SCAN_LOG
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum Database instance successfully created
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-------------------------------------------------------
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-To complete the environment configuration, please 
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-update gpadmin .bashrc file with the following
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-1. Ensure that the greenplum_path.sh file is sourced
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-2. Add "export MASTER_DATA_DIRECTORY=/data/greenplum/data/master/gpseg-1"
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-   to access the Greenplum scripts for this instance:
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-   or, use -d /data/greenplum/data/master/gpseg-1 option for the Greenplum scripts
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-   Example gpstate -d /data/greenplum/data/master/gpseg-1
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Script log file = /home/gpadmin/gpAdminLogs/gpinitsystem_20200504.log
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-To remove instance, run gpdeletesystem utility
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-To initialize a Standby Master Segment for this Greenplum instance
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Review options for gpinitstandby
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-------------------------------------------------------
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-The Master /data/greenplum/data/master/gpseg-1/pg_hba.conf post gpinitsystem
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-has been configured to allow all hosts within this new
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-array to intercommunicate. Any hosts external to this
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-new array must be explicitly added to this file
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-Refer to the Greenplum Admin support guide which is
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-located in the /usr/local/greenplum-db/./docs directory
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-------------------------------------------------------
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Main

3.9.2 Add a standby master

gpinitstandby -s node2
20200504:19:23:41:023459 gpinitstandby:node1:gpadmin-[INFO]:-Validating environment and parameters for standby initialization...
20200504:19:23:41:023459 gpinitstandby:node1:gpadmin-[INFO]:-Checking for data directory /data/greenplum/data/master/gpseg-1 on node2
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:------------------------------------------------------
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum standby master initialization parameters
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:------------------------------------------------------
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum master hostname               = node1
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum master data directory         = /data/greenplum/data/master/gpseg-1
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum master port                   = 5432
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum standby master hostname       = node2
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum standby master port           = 5432
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum standby master data directory = /data/greenplum/data/master/gpseg-1
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum update system catalog         = On
Do you want to continue with standby master initialization? Yy|Nn (default=N):
> y
20200504:19:23:43:023459 gpinitstandby:node1:gpadmin-[INFO]:-Syncing Greenplum Database extensions to standby
20200504:19:23:44:023459 gpinitstandby:node1:gpadmin-[INFO]:-The packages on node2 are consistent.
20200504:19:23:44:023459 gpinitstandby:node1:gpadmin-[INFO]:-Adding standby master to catalog...
20200504:19:23:44:023459 gpinitstandby:node1:gpadmin-[INFO]:-Database catalog updated successfully.
20200504:19:23:44:023459 gpinitstandby:node1:gpadmin-[INFO]:-Updating pg_hba.conf file...
20200504:19:23:45:023459 gpinitstandby:node1:gpadmin-[INFO]:-pg_hba.conf files updated successfully.
20200504:19:23:46:023459 gpinitstandby:node1:gpadmin-[INFO]:-Starting standby master
20200504:19:23:46:023459 gpinitstandby:node1:gpadmin-[INFO]:-Checking if standby master is running on host: node2  in directory: /data/greenplum/data/master/gpseg-1
20200504:19:23:48:023459 gpinitstandby:node1:gpadmin-[INFO]:-Cleaning up pg_hba.conf backup files...
20200504:19:23:49:023459 gpinitstandby:node1:gpadmin-[INFO]:-Backup files of pg_hba.conf cleaned up successfully.
20200504:19:23:49:023459 gpinitstandby:node1:gpadmin-[INFO]:-Successfully created standby master on node2

#Check the installation log and adjust based on any warnings or errors
cat /home/gpadmin/gpAdminLogs/gpinitsystem_20200504.log | grep -E -i 'WARN|ERROR'
#Add the environment variables recommended by the log
#PGPORT, PGUSER and PGDATABASE need to be added as well
cat >> /home/gpadmin/.bash_profile << EOF
export MASTER_DATA_DIRECTORY=/data/greenplum/data/master/gpseg-1
export PGPORT=5432
export PGUSER=gpadmin
export PGDATABASE=gpdw
EOF

cat >> /home/gpadmin/.bashrc << EOF
export MASTER_DATA_DIRECTORY=/data/greenplum/data/master/gpseg-1
export PGPORT=5432
export PGUSER=gpadmin
export PGDATABASE=gpdw
EOF
#Sync the environment variables to all hosts
gpscp -f /usr/local/greenplum-db/seg_host /home/gpadmin/.bash_profile gpadmin@=:/home/gpadmin/.bash_profile
gpscp -f /usr/local/greenplum-db/seg_host /home/gpadmin/.bashrc gpadmin@=:/home/gpadmin/.bashrc
gpssh -f /usr/local/greenplum-db/all_host -e 'source /home/gpadmin/.bash_profile;source /home/gpadmin/.bashrc;'
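After syncing, it is worth confirming that the variables are actually visible on every host. A minimal check, assuming the `all_host` file created earlier in this guide:

```shell
# Print the key environment variables on every host in the cluster;
# each line of output should show the same directory and port
gpssh -f /usr/local/greenplum-db/all_host -e 'source ~/.bashrc; echo $MASTER_DATA_DIRECTORY $PGPORT'
```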

3.9.3 Delete and reinstall

#Delete the cluster before reinstalling
gpdeletesystem -d /data/greenplum/data/master/gpseg-1 -f
-d takes MASTER_DATA_DIRECTORY (the master data directory) and removes all master and segment data directories.
-f force: terminates all processes and forces the deletion.
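Once gpdeletesystem completes, the cluster can be re-created by re-running gpinitsystem. A sketch, assuming the cluster configuration file and host file names used during the original install (`gpinitsystem_config` and `seg_host` here are assumed names):

```shell
# Re-initialize the cluster after deletion; gpinitsystem_config is the
# cluster configuration file from the original install (assumed name/path)
gpinitsystem -c /home/gpadmin/gpinitsystem_config -h /usr/local/greenplum-db/seg_host
```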

3.10 Post-installation configuration

3.10.1 Configure pg_hba.conf

/data/greenplum/data/master/gpseg-1/pg_hba.conf
local    all         gpadmin         ident
host     all         gpadmin         127.0.0.1/28    trust
host     all         gpadmin         192.168.188.87/32       trust
host     all         gpadmin         ::1/128       trust
host     all         gpadmin         fe80::3157:3eb8:c80e:8f40/128       trust
#Allow password (md5) login from any IP
host     all         gpadmin         0.0.0.0/0   md5
local    replication gpadmin         ident
host     replication gpadmin         samehost       trust
host     replication gpadmin         192.168.188.87/32       trust
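After editing pg_hba.conf, reload the configuration and verify that remote password login works. A sketch, assuming gpadmin has not yet been given a password ('changeme' is a placeholder to replace):

```shell
# Reload pg_hba.conf without restarting the cluster
gpstop -u
# md5 authentication needs a password set for the role (run once, on the master)
psql -d postgres -c "ALTER ROLE gpadmin PASSWORD 'changeme';"
# From any remote client, password login should now succeed
PGPASSWORD=changeme psql -h 192.168.188.87 -p 5432 -U gpadmin -d gpdw -c 'SELECT version();'
```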

3.10.2 Configure postgresql.conf

/data/greenplum/data/master/gpseg-1/postgresql.conf
#Listen on all IP addresses
listen_addresses = '*'
#Reload the configuration files without restarting the database
gpstop -u

3.10.3 Log in to the database

#Log in to the master node
PGOPTIONS='-c gp_session_role=utility' psql -h node1 -p5432 -d postgres

#To log in to a segment, the segment port must be specified.
PGOPTIONS='-c gp_session_role=utility' psql -h node2 -p6000 -d postgres

postgres=# select * from gp_segment_configuration;
 dbid | content | role | preferred_role | mode | status | port | hostname | address |               datadir                
------+---------+------+----------------+------+--------+------+----------+---------+--------------------------------------
    1 |      -1 | p    | p              | n    | u      | 5432 | node1    | node1   | /data/greenplum/data/master/gpseg-1
    2 |       0 | p    | p              | s    | u      | 6000 | node2    | node2   | /data/greenplum/data1/primary/gpseg0
    6 |       0 | m    | m              | s    | u      | 7000 | node3    | node3   | /data/greenplum/data1/mirror/gpseg0
    4 |       2 | p    | p              | s    | u      | 6000 | node3    | node3   | /data/greenplum/data1/primary/gpseg2
    8 |       2 | m    | m              | s    | u      | 7000 | node2    | node2   | /data/greenplum/data1/mirror/gpseg2
    3 |       1 | p    | p              | s    | u      | 6001 | node2    | node2   | /data/greenplum/data2/primary/gpseg1
    7 |       1 | m    | m              | s    | u      | 7001 | node3    | node3   | /data/greenplum/data2/mirror/gpseg1
    5 |       3 | p    | p              | s    | u      | 6001 | node3    | node3   | /data/greenplum/data2/primary/gpseg3
    9 |       3 | m    | m              | s    | u      | 7001 | node2    | node2   | /data/greenplum/data2/mirror/gpseg3
   10 |      -1 | m    | m              | s    | u      | 5432 | node2    | node2   | /data/greenplum/data/master/gpseg-1
(10 rows)
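In the output above, status 'u' means up and mode 's' means synchronized. A quick health check is to query for anything abnormal; any row returned by the following sketch indicates a segment that is down ('d') or running outside its preferred role:

```shell
# List segments that are down or not in their preferred role (healthy cluster: 0 rows)
psql -d postgres -c "SELECT dbid, content, role, preferred_role, status, hostname
                     FROM gp_segment_configuration
                     WHERE status = 'd' OR role <> preferred_role;"
```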

4. Appendix

4.1 Common commands

gpstate -b => show a brief status summary
gpstate -c => show the primary-to-mirror mapping
gpstate -d => specify the data directory (default: $MASTER_DATA_DIRECTORY)
gpstate -e => show segments with mirror status issues
gpstate -f => show standby master details
gpstate -i => show the Greenplum Database version
gpstate -m => show mirror instance synchronization status
gpstate -p => show the ports in use
gpstate -Q => quick check of segment status
gpstate -s => show detailed cluster state information
gpstate -v => show verbose output
gpconfig -c => --change param_name  Changes a configuration parameter setting by adding the new setting to the bottom of the postgresql.conf files.
gpconfig -v => --value value  The value to use for the configuration parameter specified by -c. By default this value is applied to all segments and their mirrors, the master, and the standby master.
gpconfig -m => --mastervalue master_value  The master value for the configuration parameter specified by -c. If specified, this value applies only to the master and standby master. Can only be used together with -v.
gpconfig --masteronly => When specified, gpconfig edits only the master's postgresql.conf file.
gpconfig -r => --remove param_name  Removes a configuration parameter by commenting out its entry in the postgresql.conf files.
gpconfig -l => --list  Lists all configuration parameters supported by the gpconfig utility.
gpconfig -s => --show param_name  Shows the value of a configuration parameter used on all instances (master and segments) in the Greenplum Database system. If the value differs among instances, the utility displays an error message. Running gpconfig with -s reads parameter values directly from the database rather than from postgresql.conf. If you use gpconfig to set a parameter on all segments and then run gpconfig -s to verify the change, you may still see the previous (old) value. You must reload the configuration files (gpstop -u) or restart the system (gpstop -r) for the change to take effect.
gpconfig --file => For a configuration parameter, shows the value in the postgresql.conf files on all instances (master and segments) in the Greenplum Database system. If the value differs among instances, the utility displays a message. Must be specified together with the -s option.
gpconfig --file-compare => For a configuration parameter, compares the current Greenplum Database value with the value in the postgresql.conf files on the hosts (master and segments).
gpconfig --skipvalidation => Overrides gpconfig's system validation checks and allows operating on any server configuration parameter, including hidden parameters and restricted parameters that gpconfig cannot otherwise change. When used with the -l (list) option, it shows the list of restricted parameters. Warning: use extreme caution when setting configuration parameters with this option.
gpconfig --verbose => Displays additional log information during gpconfig command execution.
gpconfig --debug => Sets the log output level to debug.
gpconfig -? | -h | --help => Displays online help.
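Putting the gpconfig options together, a typical change-and-verify workflow looks like the following sketch (the parameter values here are illustrative, not recommendations):

```shell
# Set a segment-wide value with a smaller value on the master (-m)
gpconfig -c max_connections -v 750 -m 250
# Verify what was actually written into the postgresql.conf files
gpconfig -s max_connections --file
# max_connections is not reloadable; restart the cluster for it to take effect
gpstop -r
```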

gpstart -a => quick start (do not prompt for confirmation)
gpstart -d => specify the data directory (default: $MASTER_DATA_DIRECTORY)
gpstart -q => run in quiet mode; command output is not shown on screen but is still written to the log file.
gpstart -m => start only the master, in maintenance mode, for catalog maintenance. Example: $ PGOPTIONS='-c gp_session_role=utility' psql postgres
gpstart -R => start in restricted mode (admin connections only)
gpstart -v => show verbose startup output

gpstop -a => quick stop (do not prompt for confirmation)
gpstop -d => specify the data directory (default: $MASTER_DATA_DIRECTORY)
gpstop -m => stop only the master (maintenance mode)
gpstop -q => run in quiet mode; command output is not shown on screen but is still written to the log file.
gpstop -r => stop all instances, then restart the system
gpstop -u => reload the postgresql.conf and pg_hba.conf configuration files
gpstop -v => show verbose shutdown output
gpstop -M fast      => fast shutdown. Any transactions in progress are interrupted and rolled back.
gpstop -M immediate => immediate shutdown. Any transactions in progress are aborted. This shutdown mode is not recommended and in some cases can cause database corruption requiring manual recovery.
gpstop -M smart     => smart shutdown. If active connections exist, the command fails with a warning. This is the default shutdown mode.
gpstop --host hostname => stop the segments on the given host; cannot be combined with -m, -r, -u, or -y

gprecoverseg -a => recover without prompting
gprecoverseg -i => specify a recovery configuration file
gprecoverseg -d => specify the data directory
gprecoverseg -l => specify the log file directory
gprecoverseg -r => rebalance: return segments to their preferred roles
gprecoverseg -s => specify the filespace configuration file
gprecoverseg -o => output a recovery configuration file
gprecoverseg -p => specify additional spare hosts
gprecoverseg -S => output the filespace configuration to a file
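A typical recovery flow after a segment failure, sketched with the options above:

```shell
# Recover failed segments in place, without prompting
gprecoverseg -a
# Watch resynchronization progress until all mirrors report synchronized
gpstate -m
# Once synchronized, return segments to their preferred roles
# (causes a brief interruption to in-flight queries)
gprecoverseg -r
```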

gpactivatestandby -d path => use the given data directory absolute path (default: $MASTER_DATA_DIRECTORY)
gpactivatestandby -f => force activation of the standby master
gpactivatestandby -v => show version information

gpinitstandby -s standby_hostname => specify the new standby host
gpinitstandby -D => debug mode
gpinitstandby -r => remove the standby master
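For this deployment (master on node1, standby on node2), a master failover would combine the two utilities above. A sketch, assuming node1 has failed:

```shell
# Run ON node2: promote the standby to become the new active master
gpactivatestandby -d /data/greenplum/data/master/gpseg-1

# Later, after node1 is repaired, run on the new master (node2):
# add node1 back as the new standby
gpinitstandby -s node1
```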

References:
https://blog.csdn.net/zutsoft/article/details/103645796
https://www.cnblogs.com/pl-boke/p/9852383.html
