Indexer Process Blocks at Startup

I am running a Couchbase 4.0 cluster on CentOS.

I am building a secondary index on a bucket with ~300 million documents.
The build was about 90% complete when the indexer froze completely.
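For reference, the index is id_list_2 on the bugsentry bucket (its definition shows up in the IndexDefinitionId entries in the log below). Roughly, it was created with a N1QL statement along the lines of this sketch, issued against the query service REST endpoint; the host address here is a placeholder, and the exact statement and options I ran may have differed slightly:

import requests

# Placeholder query-node address; 8093 is the default N1QL query service port.
QUERY_URL = "http://127.0.0.1:8093/query/service"

# Index name, bucket, and key expression are taken from the
# IndexDefinitionId metadata in the indexer log further down.
statement = "CREATE INDEX id_list_2 ON bugsentry(META().id) USING GSI;"

resp = requests.post(QUERY_URL, data={"statement": statement})
print(resp.json().get("status"))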

I tried to restart the indexer process, but now it will not start properly (it doesn't even create a listener on port 9102).
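To double-check the missing listener, I probe the indexer ports from the index node itself; a rough sketch, assuming the default localhost ports from the indexer command line shown in the log:

import socket

# Ports taken from the indexer command line in the log below.
PORTS = {
    9100: "adminPort",
    9101: "scanPort",
    9102: "httpPort",
    9103: "streamInitPort",
    9104: "streamCatchupPort",
    9105: "streamMaintPort",
}

for port in sorted(PORTS):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(2)
    try:
        s.connect(("127.0.0.1", port))
        print("%s (%d): listening" % (PORTS[port], port))
    except socket.error:
        print("%s (%d): not listening" % (PORTS[port], port))
    finally:
        s.close()

From the log below, the admin port (9100) and scan port (9101) do come up, but the HTTP port (9102) never does.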

Please let me know if you have any advice; the logs are posted below:

2016-03-07T19:43:06.442Z+00:00 [Info] Indexer started with command line: [/opt/couchbase/bin/indexer -vbuckets=1024 -cluster=127.0.0.1:8091 -adminPort=9100 -scanPort=9101 -httpPort=9102 -streamInitPort=9103 -streamCatchupPort=9104 -streamMaintPort=9105 -storageDir=/cb_indexes/@2i]
2016-03-07T19:43:06.46Z+00:00 [Info] Indexer::NewIndexer Status INIT
2016-03-07T19:43:06.462Z+00:00 [Info] Setting maxcpus = 4
2016-03-07T19:43:06.462Z+00:00 [Info] Setting buffer block size to 16384 bytes
2016-03-07T19:43:06.462Z+00:00 [Info] Setting log level to Debug
2016-03-07T19:43:06.462Z+00:00 [Info] Indexer::NewIndexer Starting with Vbuckets 1024
2016-03-07T19:43:06.504Z+00:00 [Info] [Queryport ":9101"] started ...
2016-03-07T19:43:06.507Z+00:00 [Info] New settings received:
{"indexer.settings.compaction.interval":"00:00,00:00","indexer.settings.persisted_snapshot.interval":5000,"indexer.settings.log_level":"debug","indexer.settings.compaction.min_frag":30,"indexer.settings.inmemory_snapshot.interval":200,"indexer.settings.max_cpu_percent":400,"indexer.settings.recovery.max_rollbacks":5,"indexer.settings.memory_quota":536870912}
2016-03-07T19:43:06.507Z+00:00 [Info] Setting maxcpus = 4
2016-03-07T19:43:06.507Z+00:00 [Info] Setting log level to Debug
2016-03-07T19:43:06.537Z+00:00 [Debug] Repo.OpenRepositoryWithName(): open repo with name /cb_indexes/@2i/MetadataStore, buffer cache size 268435456
2016-03-07T19:43:06.598Z+00:00 [Debug] Repo.CreateSnapshot(): txnid 4294967301, forestdb seqnum 5
2016-03-07T19:43:06.598Z+00:00 [Debug] EmbeddedServer.runOnce() : Start Running Server
2016-03-07T19:43:06.598Z+00:00 [Debug] LeaderServer.RunLeaderServer(): start leader server :9100
2016-03-07T19:43:06.599Z+00:00 [Debug] Leader.listen(): start listening to message for leader
2016-03-07T19:43:06.599Z+00:00 [Debug] LeaderServer.incrementEpoch(): new epoch 20
2016-03-07T19:43:06.6Z+00:00 [Debug] Repo.Set(): key AcceptedEpoch, len(content) 2
2016-03-07T19:43:06.6Z+00:00 [Debug] LifecycleMgr.processRequest(): LifecycleMgr is ready to proces request
2016-03-07T19:43:06.6Z+00:00 [Info] ClustMgr:handleGetLocalValue Key IndexerId
2016-03-07T19:43:06.602Z+00:00 [Debug] Repo.Set(): key CurrentEpoch, len(content) 2
2016-03-07T19:43:06.604Z+00:00 [Debug] LeaderServer.processRequest(): Leader Server is ready to proces request
2016-03-07T19:43:06.604Z+00:00 [Info] Indexer Id 89:ab:58:fe:65:96:35:f0
2016-03-07T19:43:06.605Z+00:00 [Debug] KVSender::handleCloseStream MAINT_STREAM
Message: MsgStreamUpdate
Type: CLOSE_STREAM
Stream: MAINT_STREAM
Bucket:
BuildTS: []
IndexList: []
RestartTs:
2016-03-07T19:43:06.636Z+00:00 [Info] KVSender::sendShutdownTopic Projector node28-18-123.eadpdata.ddns.ea.com:9999 Topic MAINT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0
2016-03-07T19:43:06.637Z+00:00 [Fatal] KVSender::sendShutdownTopic Unexpected Error During Shutdown Projector node28-18-123.eadpdata.ddns.ea.com:9999 Topic MAINT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0. Err projector.topicMissing
2016-03-07T19:43:06.637Z+00:00 [Error] KVSender::closeMutationStream MAINT_STREAM Error Received projector.topicMissing from node28-18-123.eadpdata.ddns.ea.com:9999
2016-03-07T19:43:06.637Z+00:00 [Info] KVSender::closeMutationStream MAINT_STREAM Treating projector.topicMissing As Success
2016-03-07T19:43:06.638Z+00:00 [Info] KVSender::sendShutdownTopic Projector node28-18-174.eadpdata.ddns.ea.com:9999 Topic MAINT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0
2016-03-07T19:43:06.639Z+00:00 [Fatal] KVSender::sendShutdownTopic Unexpected Error During Shutdown Projector node28-18-174.eadpdata.ddns.ea.com:9999 Topic MAINT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0. Err projector.topicMissing
2016-03-07T19:43:06.639Z+00:00 [Error] KVSender::closeMutationStream MAINT_STREAM Error Received projector.topicMissing from node28-18-174.eadpdata.ddns.ea.com:9999
2016-03-07T19:43:06.639Z+00:00 [Info] KVSender::closeMutationStream MAINT_STREAM Treating projector.topicMissing As Success
2016-03-07T19:43:06.639Z+00:00 [Info] KVSender::sendShutdownTopic Projector node28-18-195.eadpdata.ddns.ea.com:9999 Topic MAINT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0
2016-03-07T19:43:06.64Z+00:00 [Fatal] KVSender::sendShutdownTopic Unexpected Error During Shutdown Projector node28-18-195.eadpdata.ddns.ea.com:9999 Topic MAINT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0. Err projector.topicMissing
2016-03-07T19:43:06.64Z+00:00 [Error] KVSender::closeMutationStream MAINT_STREAM Error Received projector.topicMissing from node28-18-195.eadpdata.ddns.ea.com:9999
2016-03-07T19:43:06.64Z+00:00 [Info] KVSender::closeMutationStream MAINT_STREAM Treating projector.topicMissing As Success
2016-03-07T19:43:06.64Z+00:00 [Debug] KVSender::handleCloseStream INIT_STREAM
Message: MsgStreamUpdate
Type: CLOSE_STREAM
Stream: INIT_STREAM
Bucket:
BuildTS: []
IndexList: []
RestartTs:
2016-03-07T19:43:06.673Z+00:00 [Info] KVSender::sendShutdownTopic Projector node28-18-123.eadpdata.ddns.ea.com:9999 Topic INIT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0
2016-03-07T19:43:06.674Z+00:00 [Fatal] KVSender::sendShutdownTopic Unexpected Error During Shutdown Projector node28-18-123.eadpdata.ddns.ea.com:9999 Topic INIT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0. Err projector.topicMissing
2016-03-07T19:43:06.674Z+00:00 [Error] KVSender::closeMutationStream INIT_STREAM Error Received projector.topicMissing from node28-18-123.eadpdata.ddns.ea.com:9999
2016-03-07T19:43:06.674Z+00:00 [Info] KVSender::closeMutationStream INIT_STREAM Treating projector.topicMissing As Success
2016-03-07T19:43:06.674Z+00:00 [Info] KVSender::sendShutdownTopic Projector node28-18-174.eadpdata.ddns.ea.com:9999 Topic INIT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0
2016-03-07T19:43:06.675Z+00:00 [Fatal] KVSender::sendShutdownTopic Unexpected Error During Shutdown Projector node28-18-174.eadpdata.ddns.ea.com:9999 Topic INIT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0. Err projector.topicMissing
2016-03-07T19:43:06.675Z+00:00 [Error] KVSender::closeMutationStream INIT_STREAM Error Received projector.topicMissing from node28-18-174.eadpdata.ddns.ea.com:9999
2016-03-07T19:43:06.675Z+00:00 [Info] KVSender::closeMutationStream INIT_STREAM Treating projector.topicMissing As Success
2016-03-07T19:43:06.675Z+00:00 [Info] KVSender::sendShutdownTopic Projector node28-18-195.eadpdata.ddns.ea.com:9999 Topic INIT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0
2016-03-07T19:43:06.675Z+00:00 [Fatal] KVSender::sendShutdownTopic Unexpected Error During Shutdown Projector node28-18-195.eadpdata.ddns.ea.com:9999 Topic INIT_STREAM_TOPIC_89:ab:58:fe:65:96:35:f0. Err projector.topicMissing
2016-03-07T19:43:06.675Z+00:00 [Error] KVSender::closeMutationStream INIT_STREAM Error Received projector.topicMissing from node28-18-195.eadpdata.ddns.ea.com:9999
2016-03-07T19:43:06.675Z+00:00 [Info] KVSender::closeMutationStream INIT_STREAM Treating projector.topicMissing As Success
2016-03-07T19:43:06.675Z+00:00 [Info] ClustMgr:handleGetGlobalTopology &{map[]}
2016-03-07T19:43:06.675Z+00:00 [Info] Indexer::initFromPersistedState Recovered IndexInstMap
InstanceId: 2007323185282145784 Name: id_list_2 Bucket: bugsentry State: INDEX_STATE_INITIAL Stream: MAINT_STREAM
2016-03-07T19:43:06.752Z+00:00 [Info] DCPT[secidx:getseqnos-8b:1c:35:a8:1f:bd:55:44] ##abba feed started ...
2016-03-07T19:43:06.756Z+00:00 [Info] DCPT[secidx:getseqnos-58:8:8b:82:e4:83:e7:df] ##abba feed started ...
2016-03-07T19:43:06.759Z+00:00 [Info] DCPT[secidx:getseqnos-73:ac:50:b5:7:f2:36:4c] ##abba feed started ...
2016-03-07T19:43:06.759Z+00:00 [Info] {bucket,feeds} "bugsentry" created for dcp_seqno cache...
2016-03-07T19:43:06.762Z+00:00 [Info] Indexer::initPartnInstance Initialized Partition:
Index: 2007323185282145784 Partition: PartitionId: 0 Endpoints: [:9105]
2016-03-07T19:43:06.762Z+00:00 [Debug] NewForestDBSlice(): buffer cache size 536870912
2016-03-07T19:43:06.762Z+00:00 [Debug] NewForestDBSlice(): buffer cache size 536870912
2016-03-07T19:43:06.762Z+00:00 [Verbose] NewForestDBSlice(): max writer lock prob 20
2016-03-07T19:43:06.762Z+00:00 [Verbose] NewForestDBSlice(): wal size 4096
2016-03-07T19:43:36.321Z+00:00 [Debug] LeaderServer.listenFollower(): Receive connection request from follower 10.28.18.123:47934
2016-03-07T19:43:36.321Z+00:00 [Debug] LeaderServer.listenFollower(): Receive connection request from follower 10.28.18.174:60239
2016-03-07T19:43:36.321Z+00:00 [Debug] PeerPipe.doRecieve() : Message decoded. Packet = FollowerInfo
2016-03-07T19:43:36.321Z+00:00 [Debug] PeerPipe.doRecieve() : Message decoded. Packet = FollowerInfo
2016-03-07T19:43:36.321Z+00:00 [Debug] version:1 acceptedEpoch:0 fid:"10.28.18.123/24:indexer:MetadataProvider:bc:6a:3f:b9:bf:ad:36:3a" voting:false
2016-03-07T19:43:36.321Z+00:00 [Debug] version:1 acceptedEpoch:0 fid:"10.28.18.174/24:indexer:MetadataProvider:78:57:3d:54:cc:ce:3c:b4" voting:false
2016-03-07T19:43:36.321Z+00:00 [Debug] LeaderServer.listenFollower(): Receive connection request from follower 10.28.18.195:42442
2016-03-07T19:43:36.322Z+00:00 [Debug] LeaderServer.startProxy(): Start synchronization with follower. Peer TCP connection (10.28.18.174:60239)
2016-03-07T19:43:36.322Z+00:00 [Debug] LeaderSyncProxy.updateAcceptedEpochAfterQuroum()
2016-03-07T19:43:36.322Z+00:00 [Debug] LeaderServer.startProxy(): Start synchronization with follower. Peer TCP connection (10.28.18.123:47934)
2016-03-07T19:43:36.322Z+00:00 [Debug] LeaderSyncProxy.updateAcceptedEpochAfterQuroum()
2016-03-07T19:43:36.322Z+00:00 [Debug] LeaderServer.startProxy(): Start synchronization with follower. Peer TCP connection (10.28.18.195:42442)
2016-03-07T19:43:36.322Z+00:00 [Debug] LeaderSyncProxy.updateAcceptedEpochAfterQuroum()
2016-03-07T19:43:36.323Z+00:00 [Debug] LeaderSyncProxy.notifyNewEpoch()
2016-03-07T19:43:36.323Z+00:00 [Debug] LeaderSyncProxy.notifyNewEpoch()
2016-03-07T19:43:36.323Z+00:00 [Debug] LeaderSyncProxy.updateCurrentEpochAfterQuorum()
2016-03-07T19:43:36.323Z+00:00 [Debug] LeaderSyncProxy.updateCurrentEpochAfterQuorum()
2016-03-07T19:43:36.323Z+00:00 [Debug] PeerPipe.doSend() : Sending message LeaderInfo (len 30) to Peer 10.28.18.174:60239
2016-03-07T19:43:36.323Z+00:00 [Debug] PeerPipe.doSend() : Sending message LeaderInfo (len 30) to Peer 10.28.18.123:47934
2016-03-07T19:43:36.323Z+00:00 [Debug] version:1 acceptedEpoch:20
2016-03-07T19:43:36.323Z+00:00 [Debug] version:1 acceptedEpoch:20
2016-03-07T19:43:36.323Z+00:00 [Debug] PeerPipe.doRecieve() : Message decoded. Packet = EpochAck
2016-03-07T19:43:36.323Z+00:00 [Debug] version:1 lastLoggedTxid:0 currentEpoch:0
2016-03-07T19:43:36.323Z+00:00 [Debug] PeerPipe.doRecieve() : Message decoded. Packet = EpochAck
2016-03-07T19:43:36.323Z+00:00 [Debug] version:1 lastLoggedTxid:0 currentEpoch:0
2016-03-07T19:43:36.324Z+00:00 [Debug] PeerPipe.doRecieve() : Message decoded. Packet = FollowerInfo
2016-03-07T19:43:36.324Z+00:00 [Debug] version:1 acceptedEpoch:0 fid:"10.28.18.195/24:indexer:MetadataProvider:44:16:af:34:91:c6:58:c7" voting:false
2016-03-07T19:43:36.324Z+00:00 [Debug] LeaderSyncProxy.syncWithLeader()
2016-03-07T19:43:36.324Z+00:00 [Debug] LeaderSyncProxy.syncWithLeader()
2016-03-07T19:43:36.324Z+00:00 [Debug] LeaderSyncProxy.sendEntriesInCommittedLog(): startTxid 0 endTxid 0 observer first txid 0
2016-03-07T19:43:36.324Z+00:00 [Debug] LeaderSyncProxy.sendEntriesInCommittedLog(): startTxid 0 endTxid 0 observer first txid 0
2016-03-07T19:43:36.324Z+00:00 [Debug] LeaderSyncProxy.notifyNewEpoch()
2016-03-07T19:43:36.324Z+00:00 [Debug] LeaderSyncProxy.updateCurrentEpochAfterQuorum()
2016-03-07T19:43:36.324Z+00:00 [Debug] PeerPipe.doSend() : Sending message LeaderInfo (len 30) to Peer 10.28.18.195:42442
2016-03-07T19:43:36.324Z+00:00 [Debug] version:1 acceptedEpoch:20
2016-03-07T19:43:36.325Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 60) to Peer 10.28.18.174:60239
2016-03-07T19:43:36.325Z+00:00 [Debug] version:1 txnid:4294967301 opCode:5 key:"StreamBegin" content:"StreamBegin"
2016-03-07T19:43:36.325Z+00:00 [Debug] PeerPipe.doRecieve() : Message decoded. Packet = EpochAck
2016-03-07T19:43:36.325Z+00:00 [Debug] version:1 lastLoggedTxid:0 currentEpoch:0
2016-03-07T19:43:36.325Z+00:00 [Debug] LeaderSyncProxy.syncWithLeader()
2016-03-07T19:43:36.325Z+00:00 [Debug] LeaderSyncProxy.sendEntriesInCommittedLog(): startTxid 0 endTxid 0 observer first txid 0
2016-03-07T19:43:36.326Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 60) to Peer 10.28.18.195:42442
2016-03-07T19:43:36.326Z+00:00 [Debug] version:1 txnid:4294967301 opCode:5 key:"StreamBegin" content:"StreamBegin"
2016-03-07T19:43:36.329Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 60) to Peer 10.28.18.123:47934
2016-03-07T19:43:36.329Z+00:00 [Debug] version:1 txnid:4294967301 opCode:5 key:"StreamBegin" content:"StreamBegin"
2016-03-07T19:43:36.329Z+00:00 [Debug] LeaderSyncProxy.declareNewLeaderAfterQuorum()
2016-03-07T19:43:36.329Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 97) to Peer 10.28.18.195:42442
2016-03-07T19:43:36.329Z+00:00 [Debug] version:1 txnid:1 opCode:2 key:"GlobalIndexTopology" content:"{"topologyKeys":["IndexTopology/bugsentry"]}"
2016-03-07T19:43:36.329Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 319) to Peer 10.28.18.195:42442
2016-03-07T19:43:36.329Z+00:00 [Debug] version:1 txnid:2 opCode:2 key:"IndexDefinitionId/2007323185282145784" content:"{"defnId":2007323185282145784,"name":"id_list_2","using":"forestdb","bucket":"bugsentry","bucketUUID":"7a9ef591c41d68fa2470811faa86eaa7","secExprs":["(meta().id)"],"exprType":"N1QL","partitionScheme":"SINGLE","nodes":["2c:50:19:62:ee:12:70:63"]}"
2016-03-07T19:43:36.329Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 356) to Peer 10.28.18.195:42442
2016-03-07T19:43:36.329Z+00:00 [Debug] version:1 txnid:4294967301 opCode:2 key:"IndexTopology/bugsentry" content:"{"version":3,"bucket":"bugsentry","definitions":[{"bucket":"bugsentry","name":"id_list_2","defnId":2007323185282145784,"instances":[{"instId":2007323185282145784,"state":2,"steamId":1,"partitions":[{"singlePartition":{"slices":[{"indexerId":"89:ab:58:fe:65:96:35:f0"}]},"keyPartition":{}}]}]}]}"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 56) to Peer 10.28.18.195:42442
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 txnid:4294967301 opCode:6 key:"StreamEnd" content:"StreamEnd"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message NewLeader (len 29) to Peer 10.28.18.195:42442
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 currentEpoch:20
2016-03-07T19:43:36.33Z+00:00 [Debug] LeaderSyncProxy.declareNewLeaderAfterQuorum()
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 97) to Peer 10.28.18.174:60239
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 txnid:1 opCode:2 key:"GlobalIndexTopology" content:"{"topologyKeys":["IndexTopology/bugsentry"]}"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 319) to Peer 10.28.18.174:60239
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 txnid:2 opCode:2 key:"IndexDefinitionId/2007323185282145784" content:"{"defnId":2007323185282145784,"name":"id_list_2","using":"forestdb","bucket":"bugsentry","bucketUUID":"7a9ef591c41d68fa2470811faa86eaa7","secExprs":["(meta().id)"],"exprType":"N1QL","partitionScheme":"SINGLE","nodes":["2c:50:19:62:ee:12:70:63"]}"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 356) to Peer 10.28.18.174:60239
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 txnid:4294967301 opCode:2 key:"IndexTopology/bugsentry" content:"{"version":3,"bucket":"bugsentry","definitions":[{"bucket":"bugsentry","name":"id_list_2","defnId":2007323185282145784,"instances":[{"instId":2007323185282145784,"state":2,"steamId":1,"partitions":[{"singlePartition":{"slices":[{"indexerId":"89:ab:58:fe:65:96:35:f0"}]},"keyPartition":{}}]}]}]}"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 56) to Peer 10.28.18.174:60239
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 txnid:4294967301 opCode:6 key:"StreamEnd" content:"StreamEnd"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message NewLeader (len 29) to Peer 10.28.18.174:60239
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 currentEpoch:20
2016-03-07T19:43:36.33Z+00:00 [Debug] LeaderSyncProxy.declareNewLeaderAfterQuorum()
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 97) to Peer 10.28.18.123:47934
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 txnid:1 opCode:2 key:"GlobalIndexTopology" content:"{"topologyKeys":["IndexTopology/bugsentry"]}"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 319) to Peer 10.28.18.123:47934
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 txnid:2 opCode:2 key:"IndexDefinitionId/2007323185282145784" content:"{"defnId":2007323185282145784,"name":"id_list_2","using":"forestdb","bucket":"bugsentry","bucketUUID":"7a9ef591c41d68fa2470811faa86eaa7","secExprs":["(meta().id)"],"exprType":"N1QL","partitionScheme":"SINGLE","nodes":["2c:50:19:62:ee:12:70:63"]}"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 356) to Peer 10.28.18.123:47934
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 txnid:4294967301 opCode:2 key:"IndexTopology/bugsentry" content:"{"version":3,"bucket":"bugsentry","definitions":[{"bucket":"bugsentry","name":"id_list_2","defnId":2007323185282145784,"instances":[{"instId":2007323185282145784,"state":2,"steamId":1,"partitions":[{"singlePartition":{"slices":[{"indexerId":"89:ab:58:fe:65:96:35:f0"}]},"keyPartition":{}}]}]}]}"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message LogEntry (len 56) to Peer 10.28.18.123:47934
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 txnid:4294967301 opCode:6 key:"StreamEnd" content:"StreamEnd"
2016-03-07T19:43:36.33Z+00:00 [Debug] PeerPipe.doSend() : Sending message NewLeader (len 29) to Peer 10.28.18.123:47934
2016-03-07T19:43:36.33Z+00:00 [Debug] version:1 currentEpoch:20
2016-03-07T19:43:36.331Z+00:00 [Debug] PeerPipe.doRecieve() : Message decoded. Packet = NewLeaderAck
2016-03-07T19:43:36.331Z+00:00 [Debug] version:1
2016-03-07T19:43:36.331Z+00:00 [Debug] LeaderServer.startProxy(): Sync with watcher done. Add Watcher 10.28.18.195/24:indexer:MetadataProvider:44:16:af:34:91:c6:58:c7 (TCP conn = 10.28.18.195:42442)
2016-03-07T19:43:36.331Z+00:00 [Debug] LeaderServer.startProxy() : Terminates.
2016-03-07T19:43:36.331Z+00:00 [Debug] messageListener.start(): start listening to message from peer 10.28.18.195/24:indexer:MetadataProvider:44:16:af:34:91:c6:58:c7
2016-03-07T19:43:36.331Z+00:00 [Debug] PeerPipe.doRecieve() : Message decoded. Packet = NewLeaderAck
2016-03-07T19:43:36.331Z+00:00 [Debug] version:1
2016-03-07T19:43:36.331Z+00:00 [Debug] LeaderServer.startProxy(): Sync with watcher done. Add Watcher 10.28.18.123/24:indexer:MetadataProvider:bc:6a:3f:b9:bf:ad:36:3a (TCP conn = 10.28.18.123:47934)
2016-03-07T19:43:36.331Z+00:00 [Debug] LeaderServer.startProxy() : Terminates.
2016-03-07T19:43:36.331Z+00:00 [Debug] messageListener.start(): start listening to message from peer 10.28.18.123/24:indexer:MetadataProvider:bc:6a:3f:b9:bf:ad:36:3a
2016-03-07T19:43:36.331Z+00:00 [Debug] PeerPipe.doRecieve() : Message decoded. Packet = NewLeaderAck
2016-03-07T19:43:36.331Z+00:00 [Debug] version:1
2016-03-07T19:43:36.331Z+00:00 [Debug] LeaderServer.startProxy(): Sync with watcher done. Add Watcher 10.28.18.174/24:indexer:MetadataProvider:78:57:3d:54:cc:ce:3c:b4 (TCP conn = 10.28.18.174:60239)
2016-03-07T19:43:36.331Z+00:00 [Debug] LeaderServer.startProxy() : Terminates.
2016-03-07T19:43:36.331Z+00:00 [Debug] messageListener.start(): start listening to message from peer 10.28.18.174/24:indexer:MetadataProvider:78:57:3d:54:cc:ce:3c:b4