Raspbian Package Auto-Building

Build log for consul (1.4.4~dfsg3-5) on armhf

consul 1.4.4~dfsg3-5 armhf → 2019-11-27 02:32:44

sbuild (Debian sbuild) 0.71.0 (24 Aug 2016) on testwandboard

+==============================================================================+
| consul 1.4.4~dfsg3-5 (armhf)                 Wed, 27 Nov 2019 01:32:21 +0000 |
+==============================================================================+

Package: consul
Version: 1.4.4~dfsg3-5
Source Version: 1.4.4~dfsg3-5
Distribution: bullseye-staging
Machine Architecture: armhf
Host Architecture: armhf
Build Architecture: armhf

I: NOTICE: Log filtering will replace 'var/lib/schroot/mount/bullseye-staging-armhf-sbuild-b7f9e9db-7fc6-4ae5-94b0-e4f33a957447' with '<<CHROOT>>'

+------------------------------------------------------------------------------+
| Update chroot                                                                |
+------------------------------------------------------------------------------+

Get:1 http://172.17.0.1/private bullseye-staging InRelease [11.3 kB]
Get:2 http://172.17.0.1/private bullseye-staging/main Sources [11.5 MB]
Get:3 http://172.17.0.1/private bullseye-staging/main armhf Packages [12.9 MB]
Fetched 24.4 MB in 31s (793 kB/s)
Reading package lists...
W: No sandbox user '_apt' on the system, can not drop privileges

+------------------------------------------------------------------------------+
| Fetch source files                                                           |
+------------------------------------------------------------------------------+


Check APT
---------

Checking available source versions...

Download source files with APT
------------------------------

Reading package lists...
NOTICE: 'consul' packaging is maintained in the 'Git' version control system at:
https://salsa.debian.org/go-team/packages/consul.git
Please use:
git clone https://salsa.debian.org/go-team/packages/consul.git
to retrieve the latest (possibly unreleased) updates to the package.
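(For reference, a minimal way to reproduce this fetch locally and rebuild the package might look like the sketch below; the chroot name, architecture flag and deb-src setup are assumptions about the local environment, not taken from this log.)

    # clone the packaging repository named in the notice above
    git clone https://salsa.debian.org/go-team/packages/consul.git

    # or fetch exactly the uploaded source, assuming deb-src entries are configured
    apt-get source consul=1.4.4~dfsg3-5

    # rebuild in a clean chroot, assuming a bullseye-staging armhf sbuild chroot exists
    sbuild -d bullseye-staging --arch=armhf consul_1.4.4~dfsg3-5.dsc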
Need to get 4979 kB of source archives.
Get:1 http://172.17.0.1/private bullseye-staging/main consul 1.4.4~dfsg3-5 (dsc) [5400 B]
Get:2 http://172.17.0.1/private bullseye-staging/main consul 1.4.4~dfsg3-5 (tar) [4954 kB]
Get:3 http://172.17.0.1/private bullseye-staging/main consul 1.4.4~dfsg3-5 (diff) [19.4 kB]
Fetched 4979 kB in 2s (2139 kB/s)
Download complete and in download only mode
I: NOTICE: Log filtering will replace 'build/consul-OXjaLL/consul-1.4.4~dfsg3' with '<<PKGBUILDDIR>>'
I: NOTICE: Log filtering will replace 'build/consul-OXjaLL' with '<<BUILDDIR>>'

+------------------------------------------------------------------------------+
| Install build-essential                                                      |
+------------------------------------------------------------------------------+


Setup apt archive
-----------------

Merged Build-Depends: build-essential, fakeroot
Filtered Build-Depends: build-essential, fakeroot
dpkg-deb: building package 'sbuild-build-depends-core-dummy' in '/<<BUILDDIR>>/resolver-OaQGMc/apt_archive/sbuild-build-depends-core-dummy.deb'.
dpkg-scanpackages: warning: Packages in archive but missing from override file:
dpkg-scanpackages: warning:   sbuild-build-depends-core-dummy
dpkg-scanpackages: info: Wrote 1 entries to output Packages file.
gpg: keybox '/<<BUILDDIR>>/resolver-OaQGMc/gpg/pubring.kbx' created
gpg: /<<BUILDDIR>>/resolver-OaQGMc/gpg/trustdb.gpg: trustdb created
gpg: key 35506D9A48F77B2E: public key "Sbuild Signer (Sbuild Build Dependency Archive Key) <buildd-tools-devel@lists.alioth.debian.org>" imported
gpg: Total number processed: 1
gpg:               imported: 1
gpg: key 35506D9A48F77B2E: "Sbuild Signer (Sbuild Build Dependency Archive Key) <buildd-tools-devel@lists.alioth.debian.org>" not changed
gpg: key 35506D9A48F77B2E: secret key imported
gpg: Total number processed: 1
gpg:              unchanged: 1
gpg:       secret keys read: 1
gpg:   secret keys imported: 1
gpg: using "Sbuild Signer" as default secret key for signing
Ign:1 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ InRelease
Get:2 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ Release [957 B]
Get:3 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ Release.gpg [370 B]
Get:4 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ Sources [349 B]
Get:5 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ Packages [431 B]
Fetched 2107 B in 1s (2492 B/s)
Reading package lists...
W: No sandbox user '_apt' on the system, can not drop privileges
Reading package lists...
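(Aside: the "Setup apt archive" step above is how sbuild feeds its dummy dependency package to apt: it builds the .deb, indexes it, signs a local Release file, and adds the directory as an apt source. A rough manual equivalent, using standard Debian tooling and an illustrative path, might be:)

    # index the .deb files in the archive directory (/dev/null = no override file)
    dpkg-scanpackages . /dev/null > Packages
    # generate a Release file and sign it so apt will trust the local archive
    apt-ftparchive release . > Release
    gpg --armor --detach-sign --output Release.gpg Release
    # point apt at the local archive via the copy: transport and refresh
    echo 'deb copy:/path/to/apt_archive ./' > /etc/apt/sources.list.d/sbuild-local.list
    apt-get update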

Install core build dependencies (apt-based resolver)
----------------------------------------------------

Installing build dependencies
Reading package lists...
Building dependency tree...
Reading state information...
The following packages were automatically installed and are no longer required:
  libpam-cap netbase
Use 'apt autoremove' to remove them.
The following NEW packages will be installed:
  sbuild-build-depends-core-dummy
0 upgraded, 1 newly installed, 0 to remove and 32 not upgraded.
Need to get 848 B of archives.
After this operation, 0 B of additional disk space will be used.
Get:1 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ sbuild-build-depends-core-dummy 0.invalid.0 [848 B]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 848 B in 0s (0 B/s)
Selecting previously unselected package sbuild-build-depends-core-dummy.
(Reading database ... 12234 files and directories currently installed.)
Preparing to unpack .../sbuild-build-depends-core-dummy_0.invalid.0_armhf.deb ...
Unpacking sbuild-build-depends-core-dummy (0.invalid.0) ...
Setting up sbuild-build-depends-core-dummy (0.invalid.0) ...
W: No sandbox user '_apt' on the system, can not drop privileges

+------------------------------------------------------------------------------+
| Check architectures                                                          |
+------------------------------------------------------------------------------+

Arch check ok (armhf included in any all)

+------------------------------------------------------------------------------+
| Install package build dependencies                                           |
+------------------------------------------------------------------------------+


Setup apt archive
-----------------

Merged Build-Depends: debhelper (>= 11~), bash-completion, dh-golang (>= 1.42~), golang-any (>= 2:1.13~), golang-github-asaskevich-govalidator-dev, golang-github-armon-circbuf-dev, golang-github-armon-go-metrics-dev (>= 0.0~git20171117~), golang-github-armon-go-radix-dev, golang-github-azure-go-autorest-dev (>= 10.15.5~), golang-github-bgentry-speakeasy-dev, golang-github-circonus-labs-circonus-gometrics-dev (>= 2.3.1~), golang-github-circonus-labs-circonusllhist-dev, golang-github-datadog-datadog-go-dev, golang-github-davecgh-go-spew-dev, golang-github-denverdino-aliyungo-dev, golang-github-digitalocean-godo-dev, golang-github-docker-go-connections-dev, golang-github-elazarl-go-bindata-assetfs-dev (>= 0.0~git20151224~), golang-github-ghodss-yaml-dev, golang-github-gogo-googleapis-dev, golang-github-gogo-protobuf-dev (>= 1.2.1~), golang-github-golang-snappy-dev, golang-github-googleapis-gnostic-dev, golang-github-google-gofuzz-dev, golang-github-gophercloud-gophercloud-dev, golang-github-gregjones-httpcache-dev, golang-github-hashicorp-go-checkpoint-dev, golang-github-hashicorp-go-cleanhttp-dev (>= 0.5.1~), golang-github-hashicorp-go-discover-dev, golang-github-hashicorp-go-hclog-dev (>= 0.9.2~), golang-github-hashicorp-go-immutable-radix-dev (>= 1.1.0~), golang-github-hashicorp-golang-lru-dev (>= 0.0~git20160207~), golang-github-hashicorp-go-memdb-dev (>= 0.0~git20180224~), golang-github-hashicorp-go-msgpack-dev (>= 0.5.5~), golang-github-hashicorp-go-multierror-dev, golang-github-hashicorp-go-raftchunking-dev, golang-github-hashicorp-go-reap-dev, golang-github-hashicorp-go-retryablehttp-dev, golang-github-hashicorp-go-rootcerts-dev, golang-github-hashicorp-go-sockaddr-dev, golang-github-hashicorp-go-syslog-dev, golang-github-hashicorp-go-uuid-dev, golang-github-hashicorp-go-version-dev, golang-github-hashicorp-hcl-dev, golang-github-hashicorp-hil-dev (>= 0.0~git20160711~), golang-github-hashicorp-logutils-dev, golang-github-hashicorp-memberlist-dev (>= 0.1.5~), golang-github-hashicorp-net-rpc-msgpackrpc-dev, golang-github-hashicorp-raft-boltdb-dev, golang-github-hashicorp-raft-dev (>= 1.1.1~), golang-github-hashicorp-scada-client-dev, golang-github-hashicorp-serf-dev (>= 0.8.4~), golang-github-hashicorp-yamux-dev (>= 0.0~git20151129~), golang-github-inconshreveable-muxado-dev, golang-github-imdario-mergo-dev, golang-github-jefferai-jsonx-dev, golang-github-json-iterator-go-dev, golang-github-kr-text-dev, golang-github-mattn-go-isatty-dev, golang-github-miekg-dns-dev, golang-github-mitchellh-cli-dev (>= 1.0.0~), golang-github-mitchellh-go-testing-interface-dev, golang-github-mitchellh-copystructure-dev, golang-github-mitchellh-hashstructure-dev, golang-github-mitchellh-mapstructure-dev, golang-github-mitchellh-reflectwalk-dev, golang-github-nytimes-gziphandler-dev, golang-github-packethost-packngo-dev, golang-github-pascaldekloe-goe-dev, golang-github-peterbourgon-diskv-dev, golang-github-pmezard-go-difflib-dev, golang-github-ryanuber-columnize-dev, golang-github-ryanuber-go-glob-dev, golang-github-shirou-gopsutil-dev, golang-github-spf13-pflag-dev, golang-golang-x-sys-dev (>= 0.0~git20161012~), golang-gopkg-inf.v0-dev, mockery, golang-github-sap-go-hdb-dev
Filtered Build-Depends: debhelper (>= 11~), bash-completion, dh-golang (>= 1.42~), golang-any (>= 2:1.13~), golang-github-asaskevich-govalidator-dev, golang-github-armon-circbuf-dev, golang-github-armon-go-metrics-dev (>= 0.0~git20171117~), golang-github-armon-go-radix-dev, golang-github-azure-go-autorest-dev (>= 10.15.5~), golang-github-bgentry-speakeasy-dev, golang-github-circonus-labs-circonus-gometrics-dev (>= 2.3.1~), golang-github-circonus-labs-circonusllhist-dev, golang-github-datadog-datadog-go-dev, golang-github-davecgh-go-spew-dev, golang-github-denverdino-aliyungo-dev, golang-github-digitalocean-godo-dev, golang-github-docker-go-connections-dev, golang-github-elazarl-go-bindata-assetfs-dev (>= 0.0~git20151224~), golang-github-ghodss-yaml-dev, golang-github-gogo-googleapis-dev, golang-github-gogo-protobuf-dev (>= 1.2.1~), golang-github-golang-snappy-dev, golang-github-googleapis-gnostic-dev, golang-github-google-gofuzz-dev, golang-github-gophercloud-gophercloud-dev, golang-github-gregjones-httpcache-dev, golang-github-hashicorp-go-checkpoint-dev, golang-github-hashicorp-go-cleanhttp-dev (>= 0.5.1~), golang-github-hashicorp-go-discover-dev, golang-github-hashicorp-go-hclog-dev (>= 0.9.2~), golang-github-hashicorp-go-immutable-radix-dev (>= 1.1.0~), golang-github-hashicorp-golang-lru-dev (>= 0.0~git20160207~), golang-github-hashicorp-go-memdb-dev (>= 0.0~git20180224~), golang-github-hashicorp-go-msgpack-dev (>= 0.5.5~), golang-github-hashicorp-go-multierror-dev, golang-github-hashicorp-go-raftchunking-dev, golang-github-hashicorp-go-reap-dev, golang-github-hashicorp-go-retryablehttp-dev, golang-github-hashicorp-go-rootcerts-dev, golang-github-hashicorp-go-sockaddr-dev, golang-github-hashicorp-go-syslog-dev, golang-github-hashicorp-go-uuid-dev, golang-github-hashicorp-go-version-dev, golang-github-hashicorp-hcl-dev, golang-github-hashicorp-hil-dev (>= 0.0~git20160711~), golang-github-hashicorp-logutils-dev, golang-github-hashicorp-memberlist-dev (>= 0.1.5~), golang-github-hashicorp-net-rpc-msgpackrpc-dev, golang-github-hashicorp-raft-boltdb-dev, golang-github-hashicorp-raft-dev (>= 1.1.1~), golang-github-hashicorp-scada-client-dev, golang-github-hashicorp-serf-dev (>= 0.8.4~), golang-github-hashicorp-yamux-dev (>= 0.0~git20151129~), golang-github-inconshreveable-muxado-dev, golang-github-imdario-mergo-dev, golang-github-jefferai-jsonx-dev, golang-github-json-iterator-go-dev, golang-github-kr-text-dev, golang-github-mattn-go-isatty-dev, golang-github-miekg-dns-dev, golang-github-mitchellh-cli-dev (>= 1.0.0~), golang-github-mitchellh-go-testing-interface-dev, golang-github-mitchellh-copystructure-dev, golang-github-mitchellh-hashstructure-dev, golang-github-mitchellh-mapstructure-dev, golang-github-mitchellh-reflectwalk-dev, golang-github-nytimes-gziphandler-dev, golang-github-packethost-packngo-dev, golang-github-pascaldekloe-goe-dev, golang-github-peterbourgon-diskv-dev, golang-github-pmezard-go-difflib-dev, golang-github-ryanuber-columnize-dev, golang-github-ryanuber-go-glob-dev, golang-github-shirou-gopsutil-dev, golang-github-spf13-pflag-dev, golang-golang-x-sys-dev (>= 0.0~git20161012~), golang-gopkg-inf.v0-dev, mockery, golang-github-sap-go-hdb-dev
dpkg-deb: building package 'sbuild-build-depends-consul-dummy' in '/<<BUILDDIR>>/resolver-OaQGMc/apt_archive/sbuild-build-depends-consul-dummy.deb'.
dpkg-scanpackages: warning: Packages in archive but missing from override file:
dpkg-scanpackages: warning:   sbuild-build-depends-consul-dummy sbuild-build-depends-core-dummy
dpkg-scanpackages: info: Wrote 2 entries to output Packages file.
gpg: using "Sbuild Signer" as default secret key for signing
Ign:1 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ InRelease
Get:2 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ Release [969 B]
Get:3 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ Release.gpg [370 B]
Get:4 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ Sources [1353 B]
Get:5 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ Packages [1439 B]
Fetched 4131 B in 1s (5052 B/s)
Reading package lists...
W: No sandbox user '_apt' on the system, can not drop privileges
Reading package lists...

Install consul build dependencies (apt-based resolver)
------------------------------------------------------

Installing build dependencies
Reading package lists...
Building dependency tree...
Reading state information...
The following packages were automatically installed and are no longer required:
  libpam-cap netbase
Use 'apt autoremove' to remove them.
The following additional packages will be installed:
  autoconf automake autopoint autotools-dev bash-completion bsdmainutils
  ca-certificates debhelper dh-autoreconf dh-golang dh-strip-nondeterminism
  dwz file gettext gettext-base gogoprotobuf golang-1.13-go golang-1.13-src
  golang-any golang-dbus-dev golang-dns-dev golang-ginkgo-dev
  golang-github-alecthomas-units-dev golang-github-armon-circbuf-dev
  golang-github-armon-go-metrics-dev golang-github-armon-go-radix-dev
  golang-github-asaskevich-govalidator-dev golang-github-aws-aws-sdk-go-dev
  golang-github-azure-go-autorest-dev golang-github-beorn7-perks-dev
  golang-github-bgentry-speakeasy-dev golang-github-boltdb-bolt-dev
  golang-github-bradfitz-gomemcache-dev golang-github-cespare-xxhash-dev
  golang-github-circonus-labs-circonus-gometrics-dev
  golang-github-circonus-labs-circonusllhist-dev
  golang-github-coreos-go-systemd-dev golang-github-coreos-pkg-dev
  golang-github-cyphar-filepath-securejoin-dev
  golang-github-datadog-datadog-go-dev golang-github-davecgh-go-spew-dev
  golang-github-denverdino-aliyungo-dev golang-github-dgrijalva-jwt-go-dev
  golang-github-dgrijalva-jwt-go-v3-dev golang-github-digitalocean-godo-dev
  golang-github-dimchansky-utfbom-dev golang-github-docker-go-connections-dev
  golang-github-docker-go-units-dev golang-github-docopt-docopt-go-dev
  golang-github-elazarl-go-bindata-assetfs-dev golang-github-fatih-color-dev
  golang-github-garyburd-redigo-dev golang-github-ghodss-yaml-dev
  golang-github-go-ini-ini-dev golang-github-go-kit-kit-dev
  golang-github-go-logfmt-logfmt-dev golang-github-go-stack-stack-dev
  golang-github-go-test-deep-dev golang-github-gogo-googleapis-dev
  golang-github-gogo-protobuf-dev golang-github-golang-mock-dev
  golang-github-golang-snappy-dev golang-github-google-btree-dev
  golang-github-google-go-cmp-dev golang-github-google-go-querystring-dev
  golang-github-google-gofuzz-dev golang-github-googleapis-gnostic-dev
  golang-github-gophercloud-gophercloud-dev
  golang-github-gregjones-httpcache-dev golang-github-hashicorp-errwrap-dev
  golang-github-hashicorp-go-checkpoint-dev
  golang-github-hashicorp-go-cleanhttp-dev
  golang-github-hashicorp-go-discover-dev golang-github-hashicorp-go-hclog-dev
  golang-github-hashicorp-go-immutable-radix-dev
  golang-github-hashicorp-go-memdb-dev golang-github-hashicorp-go-msgpack-dev
  golang-github-hashicorp-go-multierror-dev
  golang-github-hashicorp-go-raftchunking-dev
  golang-github-hashicorp-go-reap-dev
  golang-github-hashicorp-go-retryablehttp-dev
  golang-github-hashicorp-go-rootcerts-dev
  golang-github-hashicorp-go-sockaddr-dev
  golang-github-hashicorp-go-syslog-dev golang-github-hashicorp-go-uuid-dev
  golang-github-hashicorp-go-version-dev
  golang-github-hashicorp-golang-lru-dev golang-github-hashicorp-hcl-dev
  golang-github-hashicorp-hil-dev golang-github-hashicorp-logutils-dev
  golang-github-hashicorp-mdns-dev golang-github-hashicorp-memberlist-dev
  golang-github-hashicorp-net-rpc-msgpackrpc-dev
  golang-github-hashicorp-raft-boltdb-dev golang-github-hashicorp-raft-dev
  golang-github-hashicorp-scada-client-dev golang-github-hashicorp-serf-dev
  golang-github-hashicorp-yamux-dev golang-github-imdario-mergo-dev
  golang-github-inconshreveable-muxado-dev golang-github-jeffail-gabs-dev
  golang-github-jefferai-jsonx-dev golang-github-jmespath-go-jmespath-dev
  golang-github-jpillora-backoff-dev golang-github-json-iterator-go-dev
  golang-github-julienschmidt-httprouter-dev golang-github-kr-pretty-dev
  golang-github-kr-pty-dev golang-github-kr-text-dev
  golang-github-mattn-go-colorable-dev golang-github-mattn-go-isatty-dev
  golang-github-miekg-dns-dev golang-github-mitchellh-cli-dev
  golang-github-mitchellh-copystructure-dev
  golang-github-mitchellh-go-homedir-dev
  golang-github-mitchellh-go-testing-interface-dev
  golang-github-mitchellh-hashstructure-dev
  golang-github-mitchellh-mapstructure-dev
  golang-github-mitchellh-reflectwalk-dev
  golang-github-modern-go-concurrent-dev golang-github-modern-go-reflect2-dev
  golang-github-mwitkow-go-conntrack-dev golang-github-nytimes-gziphandler-dev
  golang-github-opencontainers-runc-dev
  golang-github-opencontainers-selinux-dev
  golang-github-opencontainers-specs-dev
  golang-github-opentracing-opentracing-go-dev
  golang-github-packethost-packngo-dev golang-github-pascaldekloe-goe-dev
  golang-github-peterbourgon-diskv-dev golang-github-pkg-errors-dev
  golang-github-pmezard-go-difflib-dev golang-github-posener-complete-dev
  golang-github-prometheus-client-golang-dev
  golang-github-prometheus-client-model-dev
  golang-github-prometheus-common-dev golang-github-ryanuber-columnize-dev
  golang-github-ryanuber-go-glob-dev golang-github-sap-go-hdb-dev
  golang-github-seccomp-libseccomp-golang-dev
  golang-github-shirou-gopsutil-dev golang-github-sirupsen-logrus-dev
  golang-github-spf13-pflag-dev golang-github-stretchr-objx-dev
  golang-github-stretchr-testify-dev golang-github-syndtr-goleveldb-dev
  golang-github-tent-http-link-go-dev golang-github-tv42-httpunix-dev
  golang-github-ugorji-go-codec-dev golang-github-ugorji-go-msgpack-dev
  golang-github-urfave-cli-dev golang-github-vishvananda-netlink-dev
  golang-github-vishvananda-netns-dev golang-github-vmware-govmomi-dev
  golang-github-xeipuuv-gojsonpointer-dev
  golang-github-xeipuuv-gojsonreference-dev
  golang-github-xeipuuv-gojsonschema-dev golang-glog-dev golang-go
  golang-go.opencensus-dev golang-gocapability-dev golang-gogoprotobuf-dev
  golang-golang-x-crypto-dev golang-golang-x-net-dev
  golang-golang-x-oauth2-dev golang-golang-x-oauth2-google-dev
  golang-golang-x-sync-dev golang-golang-x-sys-dev golang-golang-x-text-dev
  golang-golang-x-time-dev golang-golang-x-tools golang-golang-x-tools-dev
  golang-golang-x-xerrors-dev golang-gomega-dev golang-google-api-dev
  golang-google-cloud-compute-metadata-dev golang-google-genproto-dev
  golang-google-grpc-dev golang-gopkg-alecthomas-kingpin.v2-dev
  golang-gopkg-check.v1-dev golang-gopkg-inf.v0-dev golang-gopkg-mgo.v2-dev
  golang-gopkg-tomb.v2-dev golang-gopkg-vmihailenco-msgpack.v2-dev
  golang-gopkg-yaml.v2-dev golang-goprotobuf-dev golang-procfs-dev
  golang-protobuf-extensions-dev golang-src groff-base intltool-debian
  iproute2 libarchive-zip-perl libbsd0 libcroco3 libdebhelper-perl libelf1
  libfile-stripnondeterminism-perl libglib2.0-0 libicu63 libjs-jquery
  libjs-jquery-ui libmagic-mgc libmagic1 libmnl0 libncurses6 libpipeline1
  libprocps7 libprotobuf-dev libprotobuf-lite17 libprotobuf17 libprotoc17
  libsasl2-dev libseccomp-dev libsigsegv2 libssl1.1 libsub-override-perl
  libsystemd-dev libsystemd0 libtinfo5 libtool libuchardet0 libxml2
  libxtables12 lsof m4 man-db mockery openssl pkg-config po-debconf procps
  protobuf-compiler sensible-utils zlib1g-dev
Suggested packages:
  autoconf-archive gnu-standards autoconf-doc wamerican | wordlist whois
  vacation dh-make gettext-doc libasprintf-dev libgettextpo-dev bzr | brz git
  mercurial subversion mockgen golang-google-appengine-dev groff iproute2-doc
  libjs-jquery-ui-docs seccomp libtool-doc gfortran | fortran95-compiler
  gcj-jdk m4-doc apparmor less www-browser libmail-box-perl
Recommended packages:
  curl | wget | lynx golang-doc libatm1 libarchive-cpio-perl libglib2.0-data
  shared-mime-info xdg-user-dirs javascript-common libgpm2 libltdl-dev
  libmail-sendmail-perl psmisc
The following NEW packages will be installed:
  autoconf automake autopoint autotools-dev bash-completion bsdmainutils
  ca-certificates debhelper dh-autoreconf dh-golang dh-strip-nondeterminism
  dwz file gettext gettext-base gogoprotobuf golang-1.13-go golang-1.13-src
  golang-any golang-dbus-dev golang-dns-dev golang-ginkgo-dev
  golang-github-alecthomas-units-dev golang-github-armon-circbuf-dev
  golang-github-armon-go-metrics-dev golang-github-armon-go-radix-dev
  golang-github-asaskevich-govalidator-dev golang-github-aws-aws-sdk-go-dev
  golang-github-azure-go-autorest-dev golang-github-beorn7-perks-dev
  golang-github-bgentry-speakeasy-dev golang-github-boltdb-bolt-dev
  golang-github-bradfitz-gomemcache-dev golang-github-cespare-xxhash-dev
  golang-github-circonus-labs-circonus-gometrics-dev
  golang-github-circonus-labs-circonusllhist-dev
  golang-github-coreos-go-systemd-dev golang-github-coreos-pkg-dev
  golang-github-cyphar-filepath-securejoin-dev
  golang-github-datadog-datadog-go-dev golang-github-davecgh-go-spew-dev
  golang-github-denverdino-aliyungo-dev golang-github-dgrijalva-jwt-go-dev
  golang-github-dgrijalva-jwt-go-v3-dev golang-github-digitalocean-godo-dev
  golang-github-dimchansky-utfbom-dev golang-github-docker-go-connections-dev
  golang-github-docker-go-units-dev golang-github-docopt-docopt-go-dev
  golang-github-elazarl-go-bindata-assetfs-dev golang-github-fatih-color-dev
  golang-github-garyburd-redigo-dev golang-github-ghodss-yaml-dev
  golang-github-go-ini-ini-dev golang-github-go-kit-kit-dev
  golang-github-go-logfmt-logfmt-dev golang-github-go-stack-stack-dev
  golang-github-go-test-deep-dev golang-github-gogo-googleapis-dev
  golang-github-gogo-protobuf-dev golang-github-golang-mock-dev
  golang-github-golang-snappy-dev golang-github-google-btree-dev
  golang-github-google-go-cmp-dev golang-github-google-go-querystring-dev
  golang-github-google-gofuzz-dev golang-github-googleapis-gnostic-dev
  golang-github-gophercloud-gophercloud-dev
  golang-github-gregjones-httpcache-dev golang-github-hashicorp-errwrap-dev
  golang-github-hashicorp-go-checkpoint-dev
  golang-github-hashicorp-go-cleanhttp-dev
  golang-github-hashicorp-go-discover-dev golang-github-hashicorp-go-hclog-dev
  golang-github-hashicorp-go-immutable-radix-dev
  golang-github-hashicorp-go-memdb-dev golang-github-hashicorp-go-msgpack-dev
  golang-github-hashicorp-go-multierror-dev
  golang-github-hashicorp-go-raftchunking-dev
  golang-github-hashicorp-go-reap-dev
  golang-github-hashicorp-go-retryablehttp-dev
  golang-github-hashicorp-go-rootcerts-dev
  golang-github-hashicorp-go-sockaddr-dev
  golang-github-hashicorp-go-syslog-dev golang-github-hashicorp-go-uuid-dev
  golang-github-hashicorp-go-version-dev
  golang-github-hashicorp-golang-lru-dev golang-github-hashicorp-hcl-dev
  golang-github-hashicorp-hil-dev golang-github-hashicorp-logutils-dev
  golang-github-hashicorp-mdns-dev golang-github-hashicorp-memberlist-dev
  golang-github-hashicorp-net-rpc-msgpackrpc-dev
  golang-github-hashicorp-raft-boltdb-dev golang-github-hashicorp-raft-dev
  golang-github-hashicorp-scada-client-dev golang-github-hashicorp-serf-dev
  golang-github-hashicorp-yamux-dev golang-github-imdario-mergo-dev
  golang-github-inconshreveable-muxado-dev golang-github-jeffail-gabs-dev
  golang-github-jefferai-jsonx-dev golang-github-jmespath-go-jmespath-dev
  golang-github-jpillora-backoff-dev golang-github-json-iterator-go-dev
  golang-github-julienschmidt-httprouter-dev golang-github-kr-pretty-dev
  golang-github-kr-pty-dev golang-github-kr-text-dev
  golang-github-mattn-go-colorable-dev golang-github-mattn-go-isatty-dev
  golang-github-miekg-dns-dev golang-github-mitchellh-cli-dev
  golang-github-mitchellh-copystructure-dev
  golang-github-mitchellh-go-homedir-dev
  golang-github-mitchellh-go-testing-interface-dev
  golang-github-mitchellh-hashstructure-dev
  golang-github-mitchellh-mapstructure-dev
  golang-github-mitchellh-reflectwalk-dev
  golang-github-modern-go-concurrent-dev golang-github-modern-go-reflect2-dev
  golang-github-mwitkow-go-conntrack-dev golang-github-nytimes-gziphandler-dev
  golang-github-opencontainers-runc-dev
  golang-github-opencontainers-selinux-dev
  golang-github-opencontainers-specs-dev
  golang-github-opentracing-opentracing-go-dev
  golang-github-packethost-packngo-dev golang-github-pascaldekloe-goe-dev
  golang-github-peterbourgon-diskv-dev golang-github-pkg-errors-dev
  golang-github-pmezard-go-difflib-dev golang-github-posener-complete-dev
  golang-github-prometheus-client-golang-dev
  golang-github-prometheus-client-model-dev
  golang-github-prometheus-common-dev golang-github-ryanuber-columnize-dev
  golang-github-ryanuber-go-glob-dev golang-github-sap-go-hdb-dev
  golang-github-seccomp-libseccomp-golang-dev
  golang-github-shirou-gopsutil-dev golang-github-sirupsen-logrus-dev
  golang-github-spf13-pflag-dev golang-github-stretchr-objx-dev
  golang-github-stretchr-testify-dev golang-github-syndtr-goleveldb-dev
  golang-github-tent-http-link-go-dev golang-github-tv42-httpunix-dev
  golang-github-ugorji-go-codec-dev golang-github-ugorji-go-msgpack-dev
  golang-github-urfave-cli-dev golang-github-vishvananda-netlink-dev
  golang-github-vishvananda-netns-dev golang-github-vmware-govmomi-dev
  golang-github-xeipuuv-gojsonpointer-dev
  golang-github-xeipuuv-gojsonreference-dev
  golang-github-xeipuuv-gojsonschema-dev golang-glog-dev golang-go
  golang-go.opencensus-dev golang-gocapability-dev golang-gogoprotobuf-dev
  golang-golang-x-crypto-dev golang-golang-x-net-dev
  golang-golang-x-oauth2-dev golang-golang-x-oauth2-google-dev
  golang-golang-x-sync-dev golang-golang-x-sys-dev golang-golang-x-text-dev
  golang-golang-x-time-dev golang-golang-x-tools golang-golang-x-tools-dev
  golang-golang-x-xerrors-dev golang-gomega-dev golang-google-api-dev
  golang-google-cloud-compute-metadata-dev golang-google-genproto-dev
  golang-google-grpc-dev golang-gopkg-alecthomas-kingpin.v2-dev
  golang-gopkg-check.v1-dev golang-gopkg-inf.v0-dev golang-gopkg-mgo.v2-dev
  golang-gopkg-tomb.v2-dev golang-gopkg-vmihailenco-msgpack.v2-dev
  golang-gopkg-yaml.v2-dev golang-goprotobuf-dev golang-procfs-dev
  golang-protobuf-extensions-dev golang-src groff-base intltool-debian
  iproute2 libarchive-zip-perl libbsd0 libcroco3 libdebhelper-perl libelf1
  libfile-stripnondeterminism-perl libglib2.0-0 libicu63 libjs-jquery
  libjs-jquery-ui libmagic-mgc libmagic1 libmnl0 libncurses6 libpipeline1
  libprocps7 libprotobuf-dev libprotobuf-lite17 libprotobuf17 libprotoc17
  libsasl2-dev libseccomp-dev libsigsegv2 libssl1.1 libsub-override-perl
  libsystemd-dev libtinfo5 libtool libuchardet0 libxml2 libxtables12 lsof m4
  man-db mockery openssl pkg-config po-debconf procps protobuf-compiler
  sbuild-build-depends-consul-dummy sensible-utils zlib1g-dev
The following packages will be upgraded:
  libsystemd0
1 upgraded, 235 newly installed, 0 to remove and 31 not upgraded.
Need to get 157 MB of archives.
After this operation, 984 MB of additional disk space will be used.
Get:1 copy:/<<BUILDDIR>>/resolver-OaQGMc/apt_archive ./ sbuild-build-depends-consul-dummy 0.invalid.0 [1692 B]
Get:2 http://172.17.0.1/private bullseye-staging/main armhf libsystemd0 armhf 243-8+rpi1 [310 kB]
Get:3 http://172.17.0.1/private bullseye-staging/main armhf libbsd0 armhf 0.10.0-1 [112 kB]
Get:4 http://172.17.0.1/private bullseye-staging/main armhf libtinfo5 armhf 6.1+20191019-1 [316 kB]
Get:5 http://172.17.0.1/private bullseye-staging/main armhf bsdmainutils armhf 11.1.2 [182 kB]
Get:6 http://172.17.0.1/private bullseye-staging/main armhf libuchardet0 armhf 0.0.6-3 [62.2 kB]
Get:7 http://172.17.0.1/private bullseye-staging/main armhf groff-base armhf 1.22.4-3 [782 kB]
Get:8 http://172.17.0.1/private bullseye-staging/main armhf libpipeline1 armhf 1.5.1-2 [26.6 kB]
Get:9 http://172.17.0.1/private bullseye-staging/main armhf man-db armhf 2.9.0-1 [1261 kB]
Get:10 http://172.17.0.1/private bullseye-staging/main armhf golang-github-davecgh-go-spew-dev all 1.1.1-2 [29.7 kB]
Get:11 http://172.17.0.1/private bullseye-staging/main armhf golang-github-pmezard-go-difflib-dev all 1.0.0-2 [12.0 kB]
Get:12 http://172.17.0.1/private bullseye-staging/main armhf golang-github-stretchr-objx-dev all 0.1.1+git20180825.ef50b0d-1 [23.4 kB]
Get:13 http://172.17.0.1/private bullseye-staging/main armhf golang-github-kr-pty-dev all 1.1.6-1 [10.6 kB]
Get:14 http://172.17.0.1/private bullseye-staging/main armhf golang-github-kr-text-dev all 0.1.0-1 [10.8 kB]
Get:15 http://172.17.0.1/private bullseye-staging/main armhf golang-github-kr-pretty-dev all 0.1.0-1 [10.2 kB]
Get:16 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-check.v1-dev all 0.0+git20180628.788fd78-1 [31.6 kB]
Get:17 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-yaml.v2-dev all 2.2.2-1 [58.9 kB]
Get:18 http://172.17.0.1/private bullseye-staging/main armhf golang-github-stretchr-testify-dev all 1.4.0+ds-1 [53.5 kB]
Get:19 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-sys-dev all 0.0~git20190726.fc99dfb-1 [395 kB]
Get:20 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-sync-dev all 0.0~git20190423.1122301-1 [17.1 kB]
Get:21 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-xerrors-dev all 0.0~git20190717.a985d34-1 [12.8 kB]
Get:22 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-tools-dev all 1:0.0~git20191118.07fc4c7+ds-1 [1396 kB]
Get:23 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-text-dev all 0.3.2-1 [3689 kB]
Get:24 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-net-dev all 1:0.0+git20191112.2180aed+dfsg-1 [637 kB]
Get:25 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-crypto-dev all 1:0.0~git20190701.4def268-2 [1505 kB]
Get:26 http://172.17.0.1/private bullseye-staging/main armhf golang-github-sirupsen-logrus-dev all 1.3.0-1 [38.9 kB]
Get:27 http://172.17.0.1/private bullseye-staging/main armhf libelf1 armhf 0.176-1.1 [158 kB]
Get:28 http://172.17.0.1/private bullseye-staging/main armhf libmnl0 armhf 1.0.4-2 [11.3 kB]
Get:29 http://172.17.0.1/private bullseye-staging/main armhf libxtables12 armhf 1.8.3-2 [77.3 kB]
Get:30 http://172.17.0.1/private bullseye-staging/main armhf iproute2 armhf 5.3.0-1 [752 kB]
Get:31 http://172.17.0.1/private bullseye-staging/main armhf libncurses6 armhf 6.1+20191019-1 [79.5 kB]
Get:32 http://172.17.0.1/private bullseye-staging/main armhf libprocps7 armhf 2:3.3.15-2 [58.9 kB]
Get:33 http://172.17.0.1/private bullseye-staging/main armhf procps armhf 2:3.3.15-2 [235 kB]
Get:34 http://172.17.0.1/private bullseye-staging/main armhf sensible-utils all 0.0.12 [15.8 kB]
Get:35 http://172.17.0.1/private bullseye-staging/main armhf bash-completion all 1:2.8-6 [208 kB]
Get:36 http://172.17.0.1/private bullseye-staging/main armhf libmagic-mgc armhf 1:5.37-6 [253 kB]
Get:37 http://172.17.0.1/private bullseye-staging/main armhf libmagic1 armhf 1:5.37-6 [111 kB]
Get:38 http://172.17.0.1/private bullseye-staging/main armhf file armhf 1:5.37-6 [66.2 kB]
Get:39 http://172.17.0.1/private bullseye-staging/main armhf gettext-base armhf 0.19.8.1-10 [117 kB]
Get:40 http://172.17.0.1/private bullseye-staging/main armhf lsof armhf 4.93.2+dfsg-1 [307 kB]
Get:41 http://172.17.0.1/private bullseye-staging/main armhf libsigsegv2 armhf 2.12-2 [32.3 kB]
Get:42 http://172.17.0.1/private bullseye-staging/main armhf m4 armhf 1.4.18-4 [185 kB]
Get:43 http://172.17.0.1/private bullseye-staging/main armhf autoconf all 2.69-11 [341 kB]
Get:44 http://172.17.0.1/private bullseye-staging/main armhf autotools-dev all 20180224.1 [77.0 kB]
Get:45 http://172.17.0.1/private bullseye-staging/main armhf automake all 1:1.16.1-4 [771 kB]
Get:46 http://172.17.0.1/private bullseye-staging/main armhf autopoint all 0.19.8.1-10 [435 kB]
Get:47 http://172.17.0.1/private bullseye-staging/main armhf libssl1.1 armhf 1.1.1d-2 [1268 kB]
Get:48 http://172.17.0.1/private bullseye-staging/main armhf openssl armhf 1.1.1d-2 [806 kB]
Get:49 http://172.17.0.1/private bullseye-staging/main armhf ca-certificates all 20190110 [157 kB]
Get:50 http://172.17.0.1/private bullseye-staging/main armhf libtool all 2.4.6-11 [547 kB]
Get:51 http://172.17.0.1/private bullseye-staging/main armhf dh-autoreconf all 19 [16.9 kB]
Get:52 http://172.17.0.1/private bullseye-staging/main armhf libdebhelper-perl all 12.7.1 [173 kB]
Get:53 http://172.17.0.1/private bullseye-staging/main armhf libarchive-zip-perl all 1.67-1 [104 kB]
Get:54 http://172.17.0.1/private bullseye-staging/main armhf libsub-override-perl all 0.09-2 [10.2 kB]
Get:55 http://172.17.0.1/private bullseye-staging/main armhf libfile-stripnondeterminism-perl all 1.6.3-1 [23.6 kB]
Get:56 http://172.17.0.1/private bullseye-staging/main armhf dh-strip-nondeterminism all 1.6.3-1 [14.6 kB]
Get:57 http://172.17.0.1/private bullseye-staging/main armhf dwz armhf 0.13-2 [136 kB]
Get:58 http://172.17.0.1/private bullseye-staging/main armhf libglib2.0-0 armhf 2.62.2-3 [1137 kB]
Get:59 http://172.17.0.1/private bullseye-staging/main armhf libicu63 armhf 63.2-2 [7974 kB]
Get:60 http://172.17.0.1/private bullseye-staging/main armhf libxml2 armhf 2.9.4+dfsg1-8 [593 kB]
Get:61 http://172.17.0.1/private bullseye-staging/main armhf libcroco3 armhf 0.6.13-1 [133 kB]
Get:62 http://172.17.0.1/private bullseye-staging/main armhf gettext armhf 0.19.8.1-10 [1219 kB]
Get:63 http://172.17.0.1/private bullseye-staging/main armhf intltool-debian all 0.35.0+20060710.5 [26.8 kB]
Get:64 http://172.17.0.1/private bullseye-staging/main armhf po-debconf all 1.0.21 [248 kB]
Get:65 http://172.17.0.1/private bullseye-staging/main armhf debhelper all 12.7.1 [997 kB]
Get:66 http://172.17.0.1/private bullseye-staging/main armhf dh-golang all 1.42 [21.7 kB]
Get:67 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gogo-protobuf-dev all 1.2.1+git20190611.dadb6258-1 [863 kB]
Get:68 http://172.17.0.1/private bullseye-staging/main armhf libprotobuf17 armhf 3.6.1.3-2+rpi1 [665 kB]
Get:69 http://172.17.0.1/private bullseye-staging/main armhf libprotoc17 armhf 3.6.1.3-2+rpi1 [546 kB]
Get:70 http://172.17.0.1/private bullseye-staging/main armhf protobuf-compiler armhf 3.6.1.3-2+rpi1 [64.5 kB]
Get:71 http://172.17.0.1/private bullseye-staging/main armhf gogoprotobuf armhf 1.2.1+git20190611.dadb6258-1 [5285 kB]
Get:72 http://172.17.0.1/private bullseye-staging/main armhf golang-1.13-src armhf 1.13.4-1+rpi1 [12.7 MB]
Get:73 http://172.17.0.1/private bullseye-staging/main armhf golang-1.13-go armhf 1.13.4-1+rpi1 [43.5 MB]
Get:74 http://172.17.0.1/private bullseye-staging/main armhf golang-src armhf 2:1.13~1+b11 [4892 B]
Get:75 http://172.17.0.1/private bullseye-staging/main armhf golang-go armhf 2:1.13~1+b11 [23.9 kB]
Get:76 http://172.17.0.1/private bullseye-staging/main armhf golang-any armhf 2:1.13~1+b11 [5012 B]
Get:77 http://172.17.0.1/private bullseye-staging/main armhf golang-dbus-dev all 5.0.2-1 [54.7 kB]
Get:78 http://172.17.0.1/private bullseye-staging/main armhf golang-github-miekg-dns-dev all 1.0.4+ds-1 [126 kB]
Get:79 http://172.17.0.1/private bullseye-staging/main armhf golang-dns-dev all 1.0.4+ds-1 [3464 B]
Get:80 http://172.17.0.1/private bullseye-staging/main armhf golang-github-alecthomas-units-dev all 0.0~git20151022.0.2efee85-4 [5816 B]
Get:81 http://172.17.0.1/private bullseye-staging/main armhf golang-github-armon-circbuf-dev all 0.0~git20150827.0.bbbad09-2 [3952 B]
Get:82 http://172.17.0.1/private bullseye-staging/main armhf golang-github-pkg-errors-dev all 0.8.1-1 [11.2 kB]
Get:83 http://172.17.0.1/private bullseye-staging/main armhf golang-github-circonus-labs-circonusllhist-dev all 0.0~git20160526.0.d724266-2 [6974 B]
Get:84 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-cleanhttp-dev all 0.5.1-1 [10.4 kB]
Get:85 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-hclog-dev all 0.9.2-1 [14.1 kB]
Get:86 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-retryablehttp-dev all 0.6.3-1 [17.2 kB]
Get:87 http://172.17.0.1/private bullseye-staging/main armhf golang-github-tv42-httpunix-dev all 0.0~git20150427.b75d861-2 [3744 B]
Get:88 http://172.17.0.1/private bullseye-staging/main armhf golang-github-circonus-labs-circonus-gometrics-dev all 2.3.1-2 [64.4 kB]
Get:89 http://172.17.0.1/private bullseye-staging/main armhf golang-github-datadog-datadog-go-dev all 2.1.0-2 [14.7 kB]
Get:90 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-uuid-dev all 1.0.1-1 [8476 B]
Get:91 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-golang-lru-dev all 0.5.0-1 [14.0 kB]
Get:92 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-immutable-radix-dev all 1.1.0-1 [22.8 kB]
Get:93 http://172.17.0.1/private bullseye-staging/main armhf golang-github-pascaldekloe-goe-dev all 0.1.0-2 [21.7 kB]
Get:94 http://172.17.0.1/private bullseye-staging/main armhf golang-github-beorn7-perks-dev all 0.0~git20160804.0.4c0e845-1 [11.6 kB]
Get:95 http://172.17.0.1/private bullseye-staging/main armhf golang-github-cespare-xxhash-dev all 2.1.0-1 [8696 B]
Get:96 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-gofuzz-dev all 0.0~git20170612.24818f7-1 [9108 B]
Get:97 http://172.17.0.1/private bullseye-staging/main armhf golang-github-modern-go-concurrent-dev all 1.0.3-1 [4520 B]
Get:98 http://172.17.0.1/private bullseye-staging/main armhf golang-github-modern-go-reflect2-dev all 1.0.0-1 [14.4 kB]
Get:99 http://172.17.0.1/private bullseye-staging/main armhf golang-github-json-iterator-go-dev all 1.1.4-1 [62.6 kB]
Get:100 http://172.17.0.1/private bullseye-staging/main armhf zlib1g-dev armhf 1:1.2.11.dfsg-1 [206 kB]
Get:101 http://172.17.0.1/private bullseye-staging/main armhf libprotobuf-lite17 armhf 3.6.1.3-2+rpi1 [147 kB]
Get:102 http://172.17.0.1/private bullseye-staging/main armhf libprotobuf-dev armhf 3.6.1.3-2+rpi1 [1001 kB]
Get:103 http://172.17.0.1/private bullseye-staging/main armhf golang-goprotobuf-dev armhf 1.3.2-2 [1369 kB]
Get:104 http://172.17.0.1/private bullseye-staging/main armhf golang-github-prometheus-client-model-dev all 0.0.2+git20171117.99fa1f4-1 [19.3 kB]
Get:105 http://172.17.0.1/private bullseye-staging/main armhf golang-github-dgrijalva-jwt-go-v3-dev all 3.2.0-2 [32.4 kB]
Get:106 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-logfmt-logfmt-dev all 0.3.0-1 [12.5 kB]
Get:107 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-stack-stack-dev all 1.5.2-2 [6956 B]
Get:108 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opentracing-opentracing-go-dev all 1.0.2-1 [21.8 kB]
Get:109 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-time-dev all 0.0~git20161028.0.f51c127-2 [9396 B]
Get:110 http://172.17.0.1/private bullseye-staging/main armhf golang-github-golang-mock-dev all 1.3.1-2 [35.1 kB]
Get:111 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-go-cmp-dev all 0.3.1-1 [65.2 kB]
Get:112 http://172.17.0.1/private bullseye-staging/main armhf golang-glog-dev all 0.0~git20160126.23def4e-3 [17.3 kB]
Get:113 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-oauth2-dev all 0.0~git20190604.0f29369-2 [31.9 kB]
Get:114 http://172.17.0.1/private bullseye-staging/main armhf golang-google-cloud-compute-metadata-dev all 0.43.0-1 [31.1 kB]
Get:115 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-oauth2-google-dev all 0.0~git20190604.0f29369-2 [13.2 kB]
Get:116 http://172.17.0.1/private bullseye-staging/main armhf golang-google-genproto-dev all 0.0~git20190801.fa694d8-2 [2897 kB]
Get:117 http://172.17.0.1/private bullseye-staging/main armhf golang-google-grpc-dev all 1.22.1-1 [493 kB]
Get:118 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-kit-kit-dev all 0.6.0-2 [103 kB]
Get:119 http://172.17.0.1/private bullseye-staging/main armhf golang-github-julienschmidt-httprouter-dev all 1.1-5 [16.0 kB]
Get:120 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jpillora-backoff-dev all 1.0.0-1 [3580 B]
Get:121 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mwitkow-go-conntrack-dev all 0.0~git20190716.2f06839-1 [14.4 kB]
Get:122 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-alecthomas-kingpin.v2-dev all 2.2.6-1 [42.2 kB]
Get:123 http://172.17.0.1/private bullseye-staging/main armhf golang-protobuf-extensions-dev all 1.0.1-1 [29.6 kB]
Get:124 http://172.17.0.1/private bullseye-staging/main armhf golang-github-prometheus-common-dev all 0.7.0-1 [83.8 kB]
Get:125 http://172.17.0.1/private bullseye-staging/main armhf golang-procfs-dev all 0.0.3-1 [78.0 kB]
Get:126 http://172.17.0.1/private bullseye-staging/main armhf golang-github-prometheus-client-golang-dev all 1.2.1-3 [106 kB]
Get:127 http://172.17.0.1/private bullseye-staging/main armhf golang-github-armon-go-metrics-dev all 0.0~git20190430.ec5e00d-1 [25.9 kB]
Get:128 http://172.17.0.1/private bullseye-staging/main armhf golang-github-armon-go-radix-dev all 1.0.0-1 [7420 B]
Get:129 http://172.17.0.1/private bullseye-staging/main armhf golang-github-asaskevich-govalidator-dev all 9+git20180720.0.f9ffefc3-1 [41.2 kB]
Get:130 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-ini-ini-dev all 1.32.0-2 [32.7 kB]
Get:131 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jmespath-go-jmespath-dev all 0.2.2-3 [18.7 kB]
Get:132 http://172.17.0.1/private bullseye-staging/main armhf golang-github-aws-aws-sdk-go-dev all 1.21.6+dfsg-2 [4969 kB]
Get:133 http://172.17.0.1/private bullseye-staging/main armhf golang-github-dgrijalva-jwt-go-dev all 3.2.0-1 [32.5 kB]
Get:134 http://172.17.0.1/private bullseye-staging/main armhf golang-github-dimchansky-utfbom-dev all 0.0~git20170328.6c6132f-1 [4712 B]
Get:135 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-go-homedir-dev all 1.1.0-1 [5168 B]
Get:136 http://172.17.0.1/private bullseye-staging/main armhf golang-github-azure-go-autorest-dev all 10.15.5-1 [99.2 kB]
Get:137 http://172.17.0.1/private bullseye-staging/main armhf golang-github-bgentry-speakeasy-dev all 0.1.0-1 [5110 B]
Get:138 http://172.17.0.1/private bullseye-staging/main armhf golang-github-boltdb-bolt-dev all 1.3.1-6 [60.6 kB]
Get:139 http://172.17.0.1/private bullseye-staging/main armhf golang-github-bradfitz-gomemcache-dev all 0.0~git20141109-3 [10.3 kB]
Get:140 http://172.17.0.1/private bullseye-staging/main armhf golang-github-coreos-pkg-dev all 4-2 [25.1 kB]
Get:141 http://172.17.0.1/private bullseye-staging/main armhf libsystemd-dev armhf 243-8+rpi1 [331 kB]
Get:142 http://172.17.0.1/private bullseye-staging/main armhf pkg-config armhf 0.29-6 [59.8 kB]
Get:143 http://172.17.0.1/private bullseye-staging/main armhf golang-github-coreos-go-systemd-dev all 20-1 [50.7 kB]
Get:144 http://172.17.0.1/private bullseye-staging/main armhf golang-github-cyphar-filepath-securejoin-dev all 0.2.2-1 [7196 B]
Get:145 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-go-querystring-dev all 1.0.0-1 [7456 B]
Get:146 http://172.17.0.1/private bullseye-staging/main armhf golang-github-tent-http-link-go-dev all 0.0~git20130702.0.ac974c6-6 [5016 B]
Get:147 http://172.17.0.1/private bullseye-staging/main armhf golang-github-digitalocean-godo-dev all 1.1.0-1 [42.6 kB]
Get:148 http://172.17.0.1/private bullseye-staging/main armhf golang-github-docker-go-units-dev all 0.4.0-1 [7536 B]
Get:149 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opencontainers-selinux-dev all 1.3.0-2 [13.3 kB]
Get:150 http://172.17.0.1/private bullseye-staging/main armhf golang-github-xeipuuv-gojsonpointer-dev all 0.0~git20151027.0.e0fe6f6-2 [4620 B]
Get:151 http://172.17.0.1/private bullseye-staging/main armhf golang-github-xeipuuv-gojsonreference-dev all 0.0~git20150808.0.e02fc20-2 [4592 B]
Get:152 http://172.17.0.1/private bullseye-staging/main armhf golang-github-xeipuuv-gojsonschema-dev all 0.0~git20170210.0.6b67b3f-2 [25.3 kB]
Get:153 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opencontainers-specs-dev all 1.0.1+git20190408.a1b50f6-1 [27.7 kB]
Get:154 http://172.17.0.1/private bullseye-staging/main armhf libseccomp-dev armhf 2.4.1-2+rpi1 [64.1 kB]
Get:155 http://172.17.0.1/private bullseye-staging/main armhf golang-github-seccomp-libseccomp-golang-dev all 0.9.1-1 [16.1 kB]
Get:156 http://172.17.0.1/private bullseye-staging/main armhf golang-github-urfave-cli-dev all 1.20.0-1 [51.0 kB]
Get:157 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vishvananda-netns-dev all 0.0~git20170707.0.86bef33-1 [5646 B]
Get:158 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vishvananda-netlink-dev all 1.0.0+git20181030.023a6da-1 [106 kB]
Get:159 http://172.17.0.1/private bullseye-staging/main armhf golang-gocapability-dev all 0.0+git20180916.d983527-1 [11.8 kB]
Get:160 http://172.17.0.1/private bullseye-staging/main armhf golang-github-opencontainers-runc-dev all 1.0.0~rc9+dfsg1-1+rpi1 [178 kB]
Get:161 http://172.17.0.1/private bullseye-staging/main armhf golang-github-docker-go-connections-dev all 0.4.0-1 [26.3 kB]
Get:162 http://172.17.0.1/private bullseye-staging/main armhf golang-github-elazarl-go-bindata-assetfs-dev all 1.0.0-1 [5460 B]
Get:163 http://172.17.0.1/private bullseye-staging/main armhf golang-github-garyburd-redigo-dev all 0.0~git20150901.0.d8dbe4d-2 [28.0 kB]
Get:164 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ghodss-yaml-dev all 1.0.0-1 [12.9 kB]
Get:165 http://172.17.0.1/private bullseye-staging/main armhf golang-github-go-test-deep-dev all 1.0.3-1 [9876 B]
Get:166 http://172.17.0.1/private bullseye-staging/main armhf golang-gogoprotobuf-dev all 1.2.1+git20190611.dadb6258-1 [5340 B]
Get:167 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gogo-googleapis-dev all 1.2.0-1 [30.4 kB]
Get:168 http://172.17.0.1/private bullseye-staging/main armhf golang-github-golang-snappy-dev all 0.0+git20160529.d9eb7a3-3 [51.2 kB]
Get:169 http://172.17.0.1/private bullseye-staging/main armhf golang-github-google-btree-dev all 0.0~git20161217.0.316fb6d-1 [12.0 kB]
Get:170 http://172.17.0.1/private bullseye-staging/main armhf golang-github-docopt-docopt-go-dev all 0.6.2+git20160216.0.784ddc5-1 [9434 B]
Get:171 http://172.17.0.1/private bullseye-staging/main armhf golang-github-googleapis-gnostic-dev all 0.2.0-1 [74.4 kB]
Get:172 http://172.17.0.1/private bullseye-staging/main armhf golang-github-peterbourgon-diskv-dev all 2.0.1-1 [17.5 kB]
Get:173 http://172.17.0.1/private bullseye-staging/main armhf golang-gomega-dev all 1.0+git20160910.d59fa0a-1 [63.7 kB]
Get:174 http://172.17.0.1/private bullseye-staging/main armhf golang-ginkgo-dev armhf 1.2.0+git20161006.acfa16a-1 [1535 kB]
Get:175 http://172.17.0.1/private bullseye-staging/main armhf golang-github-syndtr-goleveldb-dev all 0.0~git20170725.0.b89cc31-2 [116 kB]
Get:176 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gregjones-httpcache-dev all 0.0~git20180305.9cad4c3-1 [13.6 kB]
Get:177 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-errwrap-dev all 1.0.0-1 [10.3 kB]
Get:178 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-checkpoint-dev all 0.0~git20171009.1545e56-2 [8184 B]
Get:179 http://172.17.0.1/private bullseye-staging/main armhf golang-github-denverdino-aliyungo-dev all 0.0~git20180921.13fa8aa-2 [125 kB]
Get:180 http://172.17.0.1/private bullseye-staging/main armhf golang-github-gophercloud-gophercloud-dev all 0.6.0-1 [570 kB]
Get:181 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-multierror-dev all 1.0.0-1 [10.6 kB]
Get:182 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-mdns-dev all 0.0~git20150317.0.2b439d3-2 [10.8 kB]
Get:183 http://172.17.0.1/private bullseye-staging/main armhf golang-github-packethost-packngo-dev all 0.2.0-2 [40.7 kB]
Get:184 http://172.17.0.1/private bullseye-staging/main armhf golang-github-vmware-govmomi-dev all 0.15.0-1 [10.2 MB]
Get:185 http://172.17.0.1/private bullseye-staging/main armhf golang-go.opencensus-dev all 0.22.0-1 [120 kB]
Get:186 http://172.17.0.1/private bullseye-staging/main armhf golang-google-api-dev all 0.7.0-2 [2971 kB]
Get:187 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-discover-dev all 0.0+git20190905.34a6505-2 [26.7 kB]
Get:188 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-memdb-dev all 0.0~git20180224.1289e7ff-1 [27.1 kB]
Get:189 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ugorji-go-msgpack-dev all 0.0~git20130605.792643-5 [20.7 kB]
Get:190 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ugorji-go-codec-dev all 1.1.7-1 [201 kB]
Get:191 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-vmihailenco-msgpack.v2-dev all 3.3.3-1 [24.4 kB]
Get:192 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-tomb.v2-dev all 0.0~git20161208.d5d1b58-3 [6840 B]
Get:193 http://172.17.0.1/private bullseye-staging/main armhf libsasl2-dev armhf 2.1.27+dfsg-1+b1 [255 kB]
Get:194 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-mgo.v2-dev all 2016.08.01-6 [316 kB]
Get:195 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-msgpack-dev all 0.5.5-1 [43.3 kB]
Get:196 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-raft-dev all 1.1.1-2 [88.5 kB]
Get:197 http://172.17.0.1/private bullseye-staging/main armhf libjs-jquery all 3.3.1~dfsg-3 [332 kB]
Get:198 http://172.17.0.1/private bullseye-staging/main armhf libjs-jquery-ui all 1.12.1+dfsg-5 [232 kB]
Get:199 http://172.17.0.1/private bullseye-staging/main armhf golang-golang-x-tools armhf 1:0.0~git20191118.07fc4c7+ds-1 [28.9 MB]
Get:200 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-reflectwalk-dev all 0.0~git20170726.63d60e9-4 [7868 B]
Get:201 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-copystructure-dev all 0.0~git20161013.0.5af94ae-2 [8704 B]
Get:202 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-raftchunking-dev all 0.6.2-2 [12.3 kB]
Get:203 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-reap-dev all 0.0~git20160113.0.2d85522-3 [9334 B]
Get:204 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-sockaddr-dev all 0.0~git20170627.41949a1+ds-2 [62.7 kB]
Get:205 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-version-dev all 1.2.0-1 [13.8 kB]
Get:206 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-hcl-dev all 1.0.0-1 [58.5 kB]
Get:207 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-mapstructure-dev all 1.1.2-1 [21.1 kB]
Get:208 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-hil-dev all 0.0~git20160711.1e86c6b-1 [32.6 kB]
Get:209 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-memberlist-dev all 0.1.5-2 [74.8 kB]
Get:210 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-raft-boltdb-dev all 0.0~git20171010.6e5ba93-3 [11.1 kB]
Get:211 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-net-rpc-msgpackrpc-dev all 0.0~git20151116.0.a14192a-1 [4168 B]
Get:212 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-yamux-dev all 0.0+git20190923.df201c7-1 [22.0 kB]
Get:213 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-scada-client-dev all 0.0~git20160601.0.6e89678-2 [19.3 kB]
Get:214 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mattn-go-isatty-dev all 0.0.8-2 [5864 B]
Get:215 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mattn-go-colorable-dev all 0.0.9-3 [7960 B]
Get:216 http://172.17.0.1/private bullseye-staging/main armhf golang-github-fatih-color-dev all 1.5.0-1 [11.1 kB]
Get:217 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-syslog-dev all 0.0~git20150218.0.42a2b57-1 [5336 B]
Get:218 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-logutils-dev all 0.0~git20150609.0.0dc08b1-1 [8150 B]
Get:219 http://172.17.0.1/private bullseye-staging/main armhf golang-github-posener-complete-dev all 1.1+git20180108.57878c9-3 [17.6 kB]
Get:220 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-cli-dev all 1.0.0-1 [23.8 kB]
Get:221 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ryanuber-columnize-dev all 2.1.1-1 [6600 B]
Get:222 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-serf-dev all 0.8.5~ds1-1 [127 kB]
Get:223 http://172.17.0.1/private bullseye-staging/main armhf golang-github-imdario-mergo-dev all 0.3.5-1 [16.4 kB]
Get:224 http://172.17.0.1/private bullseye-staging/main armhf golang-github-inconshreveable-muxado-dev all 0.0~git20140312.0.f693c7e-2 [26.5 kB]
Get:225 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jeffail-gabs-dev all 2.1.0-2 [16.6 kB]
Get:226 http://172.17.0.1/private bullseye-staging/main armhf golang-github-jefferai-jsonx-dev all 1.0.1-2 [4552 B]
Get:227 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-go-testing-interface-dev all 1.0.0-1 [4268 B]
Get:228 http://172.17.0.1/private bullseye-staging/main armhf golang-github-mitchellh-hashstructure-dev all 1.0.0-1 [7400 B]
Get:229 http://172.17.0.1/private bullseye-staging/main armhf golang-github-nytimes-gziphandler-dev all 1.0.1-1 [37.9 kB]
Get:230 http://172.17.0.1/private bullseye-staging/main armhf golang-github-ryanuber-go-glob-dev all 1.0.0-2 [4588 B]
Get:231 http://172.17.0.1/private bullseye-staging/main armhf golang-github-sap-go-hdb-dev all 0.14.1-2 [61.9 kB]
Get:232 http://172.17.0.1/private bullseye-staging/main armhf golang-github-shirou-gopsutil-dev all 2.18.06-1 [89.3 kB]
Get:233 http://172.17.0.1/private bullseye-staging/main armhf golang-github-spf13-pflag-dev all 1.0.3-1 [38.0 kB]
Get:234 http://172.17.0.1/private bullseye-staging/main armhf golang-gopkg-inf.v0-dev all 0.9.0-3 [14.0 kB]
Get:235 http://172.17.0.1/private bullseye-staging/main armhf mockery armhf 0.0~git20181123.e78b021-2 [1598 kB]
Get:236 http://172.17.0.1/private bullseye-staging/main armhf golang-github-hashicorp-go-rootcerts-dev all 0.0~git20160503.0.6bb64b3-1 [7336 B]
debconf: delaying package configuration, since apt-utils is not installed
Fetched 157 MB in 50s (3135 kB/s)
(Reading database ... 12234 files and directories currently installed.)
Preparing to unpack .../libsystemd0_243-8+rpi1_armhf.deb ...
Unpacking libsystemd0:armhf (243-8+rpi1) over (242-7+rpi1) ...
Setting up libsystemd0:armhf (243-8+rpi1) ...
Selecting previously unselected package libbsd0:armhf.
(Reading database ... 12234 files and directories currently installed.)
Preparing to unpack .../000-libbsd0_0.10.0-1_armhf.deb ...
Unpacking libbsd0:armhf (0.10.0-1) ...
Selecting previously unselected package libtinfo5:armhf.
Preparing to unpack .../001-libtinfo5_6.1+20191019-1_armhf.deb ...
Unpacking libtinfo5:armhf (6.1+20191019-1) ...
Selecting previously unselected package bsdmainutils.
Preparing to unpack .../002-bsdmainutils_11.1.2_armhf.deb ...
Unpacking bsdmainutils (11.1.2) ...
Selecting previously unselected package libuchardet0:armhf.
Preparing to unpack .../003-libuchardet0_0.0.6-3_armhf.deb ...
Unpacking libuchardet0:armhf (0.0.6-3) ...
Selecting previously unselected package groff-base.
Preparing to unpack .../004-groff-base_1.22.4-3_armhf.deb ...
Unpacking groff-base (1.22.4-3) ...
Selecting previously unselected package libpipeline1:armhf.
Preparing to unpack .../005-libpipeline1_1.5.1-2_armhf.deb ...
Unpacking libpipeline1:armhf (1.5.1-2) ...
Selecting previously unselected package man-db.
Preparing to unpack .../006-man-db_2.9.0-1_armhf.deb ...
Unpacking man-db (2.9.0-1) ...
Selecting previously unselected package golang-github-davecgh-go-spew-dev.
Preparing to unpack .../007-golang-github-davecgh-go-spew-dev_1.1.1-2_all.deb ...
Unpacking golang-github-davecgh-go-spew-dev (1.1.1-2) ...
Selecting previously unselected package golang-github-pmezard-go-difflib-dev.
Preparing to unpack .../008-golang-github-pmezard-go-difflib-dev_1.0.0-2_all.deb ...
Unpacking golang-github-pmezard-go-difflib-dev (1.0.0-2) ...
Selecting previously unselected package golang-github-stretchr-objx-dev.
Preparing to unpack .../009-golang-github-stretchr-objx-dev_0.1.1+git20180825.ef50b0d-1_all.deb ...
Unpacking golang-github-stretchr-objx-dev (0.1.1+git20180825.ef50b0d-1) ...
Selecting previously unselected package golang-github-kr-pty-dev.
Preparing to unpack .../010-golang-github-kr-pty-dev_1.1.6-1_all.deb ...
Unpacking golang-github-kr-pty-dev (1.1.6-1) ...
Selecting previously unselected package golang-github-kr-text-dev.
Preparing to unpack .../011-golang-github-kr-text-dev_0.1.0-1_all.deb ...
Unpacking golang-github-kr-text-dev (0.1.0-1) ...
Selecting previously unselected package golang-github-kr-pretty-dev.
Preparing to unpack .../012-golang-github-kr-pretty-dev_0.1.0-1_all.deb ...
Unpacking golang-github-kr-pretty-dev (0.1.0-1) ...
Selecting previously unselected package golang-gopkg-check.v1-dev.
Preparing to unpack .../013-golang-gopkg-check.v1-dev_0.0+git20180628.788fd78-1_all.deb ...
Unpacking golang-gopkg-check.v1-dev (0.0+git20180628.788fd78-1) ...
Selecting previously unselected package golang-gopkg-yaml.v2-dev.
Preparing to unpack .../014-golang-gopkg-yaml.v2-dev_2.2.2-1_all.deb ...
Unpacking golang-gopkg-yaml.v2-dev (2.2.2-1) ...
Selecting previously unselected package golang-github-stretchr-testify-dev.
Preparing to unpack .../015-golang-github-stretchr-testify-dev_1.4.0+ds-1_all.deb ...
Unpacking golang-github-stretchr-testify-dev (1.4.0+ds-1) ...
Selecting previously unselected package golang-golang-x-sys-dev.
Preparing to unpack .../016-golang-golang-x-sys-dev_0.0~git20190726.fc99dfb-1_all.deb ...
Unpacking golang-golang-x-sys-dev (0.0~git20190726.fc99dfb-1) ...
Selecting previously unselected package golang-golang-x-sync-dev.
Preparing to unpack .../017-golang-golang-x-sync-dev_0.0~git20190423.1122301-1_all.deb ...
Unpacking golang-golang-x-sync-dev (0.0~git20190423.1122301-1) ...
Selecting previously unselected package golang-golang-x-xerrors-dev.
Preparing to unpack .../018-golang-golang-x-xerrors-dev_0.0~git20190717.a985d34-1_all.deb ...
Unpacking golang-golang-x-xerrors-dev (0.0~git20190717.a985d34-1) ...
Selecting previously unselected package golang-golang-x-tools-dev.
Preparing to unpack .../019-golang-golang-x-tools-dev_1%3a0.0~git20191118.07fc4c7+ds-1_all.deb ...
Unpacking golang-golang-x-tools-dev (1:0.0~git20191118.07fc4c7+ds-1) ...
Selecting previously unselected package golang-golang-x-text-dev.
Preparing to unpack .../020-golang-golang-x-text-dev_0.3.2-1_all.deb ...
Unpacking golang-golang-x-text-dev (0.3.2-1) ...
Selecting previously unselected package golang-golang-x-net-dev.
Preparing to unpack .../021-golang-golang-x-net-dev_1%3a0.0+git20191112.2180aed+dfsg-1_all.deb ...
Unpacking golang-golang-x-net-dev (1:0.0+git20191112.2180aed+dfsg-1) ...
Selecting previously unselected package golang-golang-x-crypto-dev.
Preparing to unpack .../022-golang-golang-x-crypto-dev_1%3a0.0~git20190701.4def268-2_all.deb ...
Unpacking golang-golang-x-crypto-dev (1:0.0~git20190701.4def268-2) ...
Selecting previously unselected package golang-github-sirupsen-logrus-dev.
Preparing to unpack .../023-golang-github-sirupsen-logrus-dev_1.3.0-1_all.deb ...
Unpacking golang-github-sirupsen-logrus-dev (1.3.0-1) ...
Selecting previously unselected package libelf1:armhf.
Preparing to unpack .../024-libelf1_0.176-1.1_armhf.deb ...
Unpacking libelf1:armhf (0.176-1.1) ...
Selecting previously unselected package libmnl0:armhf.
Preparing to unpack .../025-libmnl0_1.0.4-2_armhf.deb ...
Unpacking libmnl0:armhf (1.0.4-2) ...
Selecting previously unselected package libxtables12:armhf.
Preparing to unpack .../026-libxtables12_1.8.3-2_armhf.deb ...
Unpacking libxtables12:armhf (1.8.3-2) ...
Selecting previously unselected package iproute2.
Preparing to unpack .../027-iproute2_5.3.0-1_armhf.deb ...
Unpacking iproute2 (5.3.0-1) ...
Selecting previously unselected package libncurses6:armhf.
Preparing to unpack .../028-libncurses6_6.1+20191019-1_armhf.deb ...
Unpacking libncurses6:armhf (6.1+20191019-1) ...
Selecting previously unselected package libprocps7:armhf.
Preparing to unpack .../029-libprocps7_2%3a3.3.15-2_armhf.deb ...
Unpacking libprocps7:armhf (2:3.3.15-2) ...
Selecting previously unselected package procps.
Preparing to unpack .../030-procps_2%3a3.3.15-2_armhf.deb ...
Unpacking procps (2:3.3.15-2) ...
Selecting previously unselected package sensible-utils.
Preparing to unpack .../031-sensible-utils_0.0.12_all.deb ...
Unpacking sensible-utils (0.0.12) ...
Selecting previously unselected package bash-completion.
Preparing to unpack .../032-bash-completion_1%3a2.8-6_all.deb ...
Unpacking bash-completion (1:2.8-6) ...
Selecting previously unselected package libmagic-mgc.
Preparing to unpack .../033-libmagic-mgc_1%3a5.37-6_armhf.deb ...
Unpacking libmagic-mgc (1:5.37-6) ...
Selecting previously unselected package libmagic1:armhf.
Preparing to unpack .../034-libmagic1_1%3a5.37-6_armhf.deb ...
Unpacking libmagic1:armhf (1:5.37-6) ...
Selecting previously unselected package file.
Preparing to unpack .../035-file_1%3a5.37-6_armhf.deb ...
Unpacking file (1:5.37-6) ...
Selecting previously unselected package gettext-base.
Preparing to unpack .../036-gettext-base_0.19.8.1-10_armhf.deb ...
Unpacking gettext-base (0.19.8.1-10) ...
Selecting previously unselected package lsof.
Preparing to unpack .../037-lsof_4.93.2+dfsg-1_armhf.deb ...
Unpacking lsof (4.93.2+dfsg-1) ...
Selecting previously unselected package libsigsegv2:armhf.
Preparing to unpack .../038-libsigsegv2_2.12-2_armhf.deb ...
Unpacking libsigsegv2:armhf (2.12-2) ...
Selecting previously unselected package m4.
Preparing to unpack .../039-m4_1.4.18-4_armhf.deb ...
Unpacking m4 (1.4.18-4) ...
Selecting previously unselected package autoconf.
Preparing to unpack .../040-autoconf_2.69-11_all.deb ...
Unpacking autoconf (2.69-11) ...
Selecting previously unselected package autotools-dev.
Preparing to unpack .../041-autotools-dev_20180224.1_all.deb ...
Unpacking autotools-dev (20180224.1) ...
Selecting previously unselected package automake.
Preparing to unpack .../042-automake_1%3a1.16.1-4_all.deb ...
Unpacking automake (1:1.16.1-4) ...
Selecting previously unselected package autopoint.
Preparing to unpack .../043-autopoint_0.19.8.1-10_all.deb ...
Unpacking autopoint (0.19.8.1-10) ...
Selecting previously unselected package libssl1.1:armhf.
Preparing to unpack .../044-libssl1.1_1.1.1d-2_armhf.deb ...
Unpacking libssl1.1:armhf (1.1.1d-2) ...
Selecting previously unselected package openssl.
Preparing to unpack .../045-openssl_1.1.1d-2_armhf.deb ...
Unpacking openssl (1.1.1d-2) ...
Selecting previously unselected package ca-certificates.
Preparing to unpack .../046-ca-certificates_20190110_all.deb ...
Unpacking ca-certificates (20190110) ...
Selecting previously unselected package libtool.
Preparing to unpack .../047-libtool_2.4.6-11_all.deb ...
Unpacking libtool (2.4.6-11) ...
Selecting previously unselected package dh-autoreconf.
Preparing to unpack .../048-dh-autoreconf_19_all.deb ...
Unpacking dh-autoreconf (19) ...
Selecting previously unselected package libdebhelper-perl.
Preparing to unpack .../049-libdebhelper-perl_12.7.1_all.deb ...
Unpacking libdebhelper-perl (12.7.1) ...
Selecting previously unselected package libarchive-zip-perl.
Preparing to unpack .../050-libarchive-zip-perl_1.67-1_all.deb ...
Unpacking libarchive-zip-perl (1.67-1) ...
Selecting previously unselected package libsub-override-perl.
Preparing to unpack .../051-libsub-override-perl_0.09-2_all.deb ...
Unpacking libsub-override-perl (0.09-2) ...
Selecting previously unselected package libfile-stripnondeterminism-perl.
Preparing to unpack .../052-libfile-stripnondeterminism-perl_1.6.3-1_all.deb ...
Unpacking libfile-stripnondeterminism-perl (1.6.3-1) ...
Selecting previously unselected package dh-strip-nondeterminism.
Preparing to unpack .../053-dh-strip-nondeterminism_1.6.3-1_all.deb ...
Unpacking dh-strip-nondeterminism (1.6.3-1) ...
Selecting previously unselected package dwz.
Preparing to unpack .../054-dwz_0.13-2_armhf.deb ...
Unpacking dwz (0.13-2) ...
Selecting previously unselected package libglib2.0-0:armhf.
Preparing to unpack .../055-libglib2.0-0_2.62.2-3_armhf.deb ...
Unpacking libglib2.0-0:armhf (2.62.2-3) ...
Selecting previously unselected package libicu63:armhf.
Preparing to unpack .../056-libicu63_63.2-2_armhf.deb ...
Unpacking libicu63:armhf (63.2-2) ...
Selecting previously unselected package libxml2:armhf.
Preparing to unpack .../057-libxml2_2.9.4+dfsg1-8_armhf.deb ...
Unpacking libxml2:armhf (2.9.4+dfsg1-8) ...
Selecting previously unselected package libcroco3:armhf.
Preparing to unpack .../058-libcroco3_0.6.13-1_armhf.deb ...
Unpacking libcroco3:armhf (0.6.13-1) ...
Selecting previously unselected package gettext.
Preparing to unpack .../059-gettext_0.19.8.1-10_armhf.deb ...
Unpacking gettext (0.19.8.1-10) ...
Selecting previously unselected package intltool-debian.
Preparing to unpack .../060-intltool-debian_0.35.0+20060710.5_all.deb ...
Unpacking intltool-debian (0.35.0+20060710.5) ...
Selecting previously unselected package po-debconf.
Preparing to unpack .../061-po-debconf_1.0.21_all.deb ...
Unpacking po-debconf (1.0.21) ...
Selecting previously unselected package debhelper.
Preparing to unpack .../062-debhelper_12.7.1_all.deb ...
Unpacking debhelper (12.7.1) ...
Selecting previously unselected package dh-golang.
Preparing to unpack .../063-dh-golang_1.42_all.deb ...
Unpacking dh-golang (1.42) ...
Selecting previously unselected package golang-github-gogo-protobuf-dev.
Preparing to unpack .../064-golang-github-gogo-protobuf-dev_1.2.1+git20190611.dadb6258-1_all.deb ...
Unpacking golang-github-gogo-protobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Selecting previously unselected package libprotobuf17:armhf.
Preparing to unpack .../065-libprotobuf17_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotobuf17:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package libprotoc17:armhf.
Preparing to unpack .../066-libprotoc17_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotoc17:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package protobuf-compiler.
Preparing to unpack .../067-protobuf-compiler_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking protobuf-compiler (3.6.1.3-2+rpi1) ...
Selecting previously unselected package gogoprotobuf.
Preparing to unpack .../068-gogoprotobuf_1.2.1+git20190611.dadb6258-1_armhf.deb ...
Unpacking gogoprotobuf (1.2.1+git20190611.dadb6258-1) ...
Selecting previously unselected package golang-1.13-src.
Preparing to unpack .../069-golang-1.13-src_1.13.4-1+rpi1_armhf.deb ...
Unpacking golang-1.13-src (1.13.4-1+rpi1) ...
Selecting previously unselected package golang-1.13-go.
Preparing to unpack .../070-golang-1.13-go_1.13.4-1+rpi1_armhf.deb ...
Unpacking golang-1.13-go (1.13.4-1+rpi1) ...
Selecting previously unselected package golang-src.
Preparing to unpack .../071-golang-src_2%3a1.13~1+b11_armhf.deb ...
Unpacking golang-src (2:1.13~1+b11) ...
Selecting previously unselected package golang-go.
Preparing to unpack .../072-golang-go_2%3a1.13~1+b11_armhf.deb ...
Unpacking golang-go (2:1.13~1+b11) ...
Selecting previously unselected package golang-any.
Preparing to unpack .../073-golang-any_2%3a1.13~1+b11_armhf.deb ...
Unpacking golang-any (2:1.13~1+b11) ...
Selecting previously unselected package golang-dbus-dev.
Preparing to unpack .../074-golang-dbus-dev_5.0.2-1_all.deb ...
Unpacking golang-dbus-dev (5.0.2-1) ...
Selecting previously unselected package golang-github-miekg-dns-dev.
Preparing to unpack .../075-golang-github-miekg-dns-dev_1.0.4+ds-1_all.deb ...
Unpacking golang-github-miekg-dns-dev (1.0.4+ds-1) ...
Selecting previously unselected package golang-dns-dev.
Preparing to unpack .../076-golang-dns-dev_1.0.4+ds-1_all.deb ...
Unpacking golang-dns-dev (1.0.4+ds-1) ...
Selecting previously unselected package golang-github-alecthomas-units-dev.
Preparing to unpack .../077-golang-github-alecthomas-units-dev_0.0~git20151022.0.2efee85-4_all.deb ...
Unpacking golang-github-alecthomas-units-dev (0.0~git20151022.0.2efee85-4) ...
Selecting previously unselected package golang-github-armon-circbuf-dev.
Preparing to unpack .../078-golang-github-armon-circbuf-dev_0.0~git20150827.0.bbbad09-2_all.deb ...
Unpacking golang-github-armon-circbuf-dev (0.0~git20150827.0.bbbad09-2) ...
Selecting previously unselected package golang-github-pkg-errors-dev.
Preparing to unpack .../079-golang-github-pkg-errors-dev_0.8.1-1_all.deb ...
Unpacking golang-github-pkg-errors-dev (0.8.1-1) ...
Selecting previously unselected package golang-github-circonus-labs-circonusllhist-dev.
Preparing to unpack .../080-golang-github-circonus-labs-circonusllhist-dev_0.0~git20160526.0.d724266-2_all.deb ...
Unpacking golang-github-circonus-labs-circonusllhist-dev (0.0~git20160526.0.d724266-2) ...
Selecting previously unselected package golang-github-hashicorp-go-cleanhttp-dev.
Preparing to unpack .../081-golang-github-hashicorp-go-cleanhttp-dev_0.5.1-1_all.deb ...
Unpacking golang-github-hashicorp-go-cleanhttp-dev (0.5.1-1) ...
Selecting previously unselected package golang-github-hashicorp-go-hclog-dev.
Preparing to unpack .../082-golang-github-hashicorp-go-hclog-dev_0.9.2-1_all.deb ...
Unpacking golang-github-hashicorp-go-hclog-dev (0.9.2-1) ...
Selecting previously unselected package golang-github-hashicorp-go-retryablehttp-dev.
Preparing to unpack .../083-golang-github-hashicorp-go-retryablehttp-dev_0.6.3-1_all.deb ...
Unpacking golang-github-hashicorp-go-retryablehttp-dev (0.6.3-1) ...
Selecting previously unselected package golang-github-tv42-httpunix-dev.
Preparing to unpack .../084-golang-github-tv42-httpunix-dev_0.0~git20150427.b75d861-2_all.deb ...
Unpacking golang-github-tv42-httpunix-dev (0.0~git20150427.b75d861-2) ...
Selecting previously unselected package golang-github-circonus-labs-circonus-gometrics-dev.
Preparing to unpack .../085-golang-github-circonus-labs-circonus-gometrics-dev_2.3.1-2_all.deb ...
Unpacking golang-github-circonus-labs-circonus-gometrics-dev (2.3.1-2) ...
Selecting previously unselected package golang-github-datadog-datadog-go-dev.
Preparing to unpack .../086-golang-github-datadog-datadog-go-dev_2.1.0-2_all.deb ...
Unpacking golang-github-datadog-datadog-go-dev (2.1.0-2) ...
Selecting previously unselected package golang-github-hashicorp-go-uuid-dev.
Preparing to unpack .../087-golang-github-hashicorp-go-uuid-dev_1.0.1-1_all.deb ...
Unpacking golang-github-hashicorp-go-uuid-dev (1.0.1-1) ...
Selecting previously unselected package golang-github-hashicorp-golang-lru-dev.
Preparing to unpack .../088-golang-github-hashicorp-golang-lru-dev_0.5.0-1_all.deb ...
Unpacking golang-github-hashicorp-golang-lru-dev (0.5.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-immutable-radix-dev.
Preparing to unpack .../089-golang-github-hashicorp-go-immutable-radix-dev_1.1.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-immutable-radix-dev (1.1.0-1) ...
Selecting previously unselected package golang-github-pascaldekloe-goe-dev.
Preparing to unpack .../090-golang-github-pascaldekloe-goe-dev_0.1.0-2_all.deb ...
Unpacking golang-github-pascaldekloe-goe-dev (0.1.0-2) ...
Selecting previously unselected package golang-github-beorn7-perks-dev.
Preparing to unpack .../091-golang-github-beorn7-perks-dev_0.0~git20160804.0.4c0e845-1_all.deb ...
Unpacking golang-github-beorn7-perks-dev (0.0~git20160804.0.4c0e845-1) ...
Selecting previously unselected package golang-github-cespare-xxhash-dev.
Preparing to unpack .../092-golang-github-cespare-xxhash-dev_2.1.0-1_all.deb ...
Unpacking golang-github-cespare-xxhash-dev (2.1.0-1) ...
Selecting previously unselected package golang-github-google-gofuzz-dev.
Preparing to unpack .../093-golang-github-google-gofuzz-dev_0.0~git20170612.24818f7-1_all.deb ...
Unpacking golang-github-google-gofuzz-dev (0.0~git20170612.24818f7-1) ...
Selecting previously unselected package golang-github-modern-go-concurrent-dev.
Preparing to unpack .../094-golang-github-modern-go-concurrent-dev_1.0.3-1_all.deb ...
Unpacking golang-github-modern-go-concurrent-dev (1.0.3-1) ...
Selecting previously unselected package golang-github-modern-go-reflect2-dev.
Preparing to unpack .../095-golang-github-modern-go-reflect2-dev_1.0.0-1_all.deb ...
Unpacking golang-github-modern-go-reflect2-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-json-iterator-go-dev.
Preparing to unpack .../096-golang-github-json-iterator-go-dev_1.1.4-1_all.deb ...
Unpacking golang-github-json-iterator-go-dev (1.1.4-1) ...
Selecting previously unselected package zlib1g-dev:armhf.
Preparing to unpack .../097-zlib1g-dev_1%3a1.2.11.dfsg-1_armhf.deb ...
Unpacking zlib1g-dev:armhf (1:1.2.11.dfsg-1) ...
Selecting previously unselected package libprotobuf-lite17:armhf.
Preparing to unpack .../098-libprotobuf-lite17_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotobuf-lite17:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package libprotobuf-dev:armhf.
Preparing to unpack .../099-libprotobuf-dev_3.6.1.3-2+rpi1_armhf.deb ...
Unpacking libprotobuf-dev:armhf (3.6.1.3-2+rpi1) ...
Selecting previously unselected package golang-goprotobuf-dev.
Preparing to unpack .../100-golang-goprotobuf-dev_1.3.2-2_armhf.deb ...
Unpacking golang-goprotobuf-dev (1.3.2-2) ...
Selecting previously unselected package golang-github-prometheus-client-model-dev.
Preparing to unpack .../101-golang-github-prometheus-client-model-dev_0.0.2+git20171117.99fa1f4-1_all.deb ...
Unpacking golang-github-prometheus-client-model-dev (0.0.2+git20171117.99fa1f4-1) ...
Selecting previously unselected package golang-github-dgrijalva-jwt-go-v3-dev.
Preparing to unpack .../102-golang-github-dgrijalva-jwt-go-v3-dev_3.2.0-2_all.deb ...
Unpacking golang-github-dgrijalva-jwt-go-v3-dev (3.2.0-2) ...
Selecting previously unselected package golang-github-go-logfmt-logfmt-dev.
Preparing to unpack .../103-golang-github-go-logfmt-logfmt-dev_0.3.0-1_all.deb ...
Unpacking golang-github-go-logfmt-logfmt-dev (0.3.0-1) ...
Selecting previously unselected package golang-github-go-stack-stack-dev.
Preparing to unpack .../104-golang-github-go-stack-stack-dev_1.5.2-2_all.deb ...
Unpacking golang-github-go-stack-stack-dev (1.5.2-2) ...
Selecting previously unselected package golang-github-opentracing-opentracing-go-dev.
Preparing to unpack .../105-golang-github-opentracing-opentracing-go-dev_1.0.2-1_all.deb ...
Unpacking golang-github-opentracing-opentracing-go-dev (1.0.2-1) ...
Selecting previously unselected package golang-golang-x-time-dev.
Preparing to unpack .../106-golang-golang-x-time-dev_0.0~git20161028.0.f51c127-2_all.deb ...
Unpacking golang-golang-x-time-dev (0.0~git20161028.0.f51c127-2) ...
Selecting previously unselected package golang-github-golang-mock-dev.
Preparing to unpack .../107-golang-github-golang-mock-dev_1.3.1-2_all.deb ...
Unpacking golang-github-golang-mock-dev (1.3.1-2) ...
Selecting previously unselected package golang-github-google-go-cmp-dev.
Preparing to unpack .../108-golang-github-google-go-cmp-dev_0.3.1-1_all.deb ...
Unpacking golang-github-google-go-cmp-dev (0.3.1-1) ...
Selecting previously unselected package golang-glog-dev.
Preparing to unpack .../109-golang-glog-dev_0.0~git20160126.23def4e-3_all.deb ...
Unpacking golang-glog-dev (0.0~git20160126.23def4e-3) ...
Selecting previously unselected package golang-golang-x-oauth2-dev.
Preparing to unpack .../110-golang-golang-x-oauth2-dev_0.0~git20190604.0f29369-2_all.deb ...
Unpacking golang-golang-x-oauth2-dev (0.0~git20190604.0f29369-2) ...
Selecting previously unselected package golang-google-cloud-compute-metadata-dev.
Preparing to unpack .../111-golang-google-cloud-compute-metadata-dev_0.43.0-1_all.deb ...
Unpacking golang-google-cloud-compute-metadata-dev (0.43.0-1) ...
Selecting previously unselected package golang-golang-x-oauth2-google-dev.
Preparing to unpack .../112-golang-golang-x-oauth2-google-dev_0.0~git20190604.0f29369-2_all.deb ...
Unpacking golang-golang-x-oauth2-google-dev (0.0~git20190604.0f29369-2) ...
Selecting previously unselected package golang-google-genproto-dev.
Preparing to unpack .../113-golang-google-genproto-dev_0.0~git20190801.fa694d8-2_all.deb ...
Unpacking golang-google-genproto-dev (0.0~git20190801.fa694d8-2) ...
Selecting previously unselected package golang-google-grpc-dev.
Preparing to unpack .../114-golang-google-grpc-dev_1.22.1-1_all.deb ...
Unpacking golang-google-grpc-dev (1.22.1-1) ...
Selecting previously unselected package golang-github-go-kit-kit-dev.
Preparing to unpack .../115-golang-github-go-kit-kit-dev_0.6.0-2_all.deb ...
Unpacking golang-github-go-kit-kit-dev (0.6.0-2) ...
Selecting previously unselected package golang-github-julienschmidt-httprouter-dev.
Preparing to unpack .../116-golang-github-julienschmidt-httprouter-dev_1.1-5_all.deb ...
Unpacking golang-github-julienschmidt-httprouter-dev (1.1-5) ...
Selecting previously unselected package golang-github-jpillora-backoff-dev.
Preparing to unpack .../117-golang-github-jpillora-backoff-dev_1.0.0-1_all.deb ...
Unpacking golang-github-jpillora-backoff-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-mwitkow-go-conntrack-dev.
Preparing to unpack .../118-golang-github-mwitkow-go-conntrack-dev_0.0~git20190716.2f06839-1_all.deb ...
Unpacking golang-github-mwitkow-go-conntrack-dev (0.0~git20190716.2f06839-1) ...
Selecting previously unselected package golang-gopkg-alecthomas-kingpin.v2-dev.
Preparing to unpack .../119-golang-gopkg-alecthomas-kingpin.v2-dev_2.2.6-1_all.deb ...
Unpacking golang-gopkg-alecthomas-kingpin.v2-dev (2.2.6-1) ...
Selecting previously unselected package golang-protobuf-extensions-dev.
Preparing to unpack .../120-golang-protobuf-extensions-dev_1.0.1-1_all.deb ...
Unpacking golang-protobuf-extensions-dev (1.0.1-1) ...
Selecting previously unselected package golang-github-prometheus-common-dev.
Preparing to unpack .../121-golang-github-prometheus-common-dev_0.7.0-1_all.deb ...
Unpacking golang-github-prometheus-common-dev (0.7.0-1) ...
Selecting previously unselected package golang-procfs-dev.
Preparing to unpack .../122-golang-procfs-dev_0.0.3-1_all.deb ...
Unpacking golang-procfs-dev (0.0.3-1) ...
Selecting previously unselected package golang-github-prometheus-client-golang-dev.
Preparing to unpack .../123-golang-github-prometheus-client-golang-dev_1.2.1-3_all.deb ...
Unpacking golang-github-prometheus-client-golang-dev (1.2.1-3) ...
Selecting previously unselected package golang-github-armon-go-metrics-dev.
Preparing to unpack .../124-golang-github-armon-go-metrics-dev_0.0~git20190430.ec5e00d-1_all.deb ...
Unpacking golang-github-armon-go-metrics-dev (0.0~git20190430.ec5e00d-1) ...
Selecting previously unselected package golang-github-armon-go-radix-dev.
Preparing to unpack .../125-golang-github-armon-go-radix-dev_1.0.0-1_all.deb ...
Unpacking golang-github-armon-go-radix-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-asaskevich-govalidator-dev.
Preparing to unpack .../126-golang-github-asaskevich-govalidator-dev_9+git20180720.0.f9ffefc3-1_all.deb ...
Unpacking golang-github-asaskevich-govalidator-dev (9+git20180720.0.f9ffefc3-1) ...
Selecting previously unselected package golang-github-go-ini-ini-dev.
Preparing to unpack .../127-golang-github-go-ini-ini-dev_1.32.0-2_all.deb ...
Unpacking golang-github-go-ini-ini-dev (1.32.0-2) ...
Selecting previously unselected package golang-github-jmespath-go-jmespath-dev.
Preparing to unpack .../128-golang-github-jmespath-go-jmespath-dev_0.2.2-3_all.deb ...
Unpacking golang-github-jmespath-go-jmespath-dev (0.2.2-3) ...
Selecting previously unselected package golang-github-aws-aws-sdk-go-dev.
Preparing to unpack .../129-golang-github-aws-aws-sdk-go-dev_1.21.6+dfsg-2_all.deb ...
Unpacking golang-github-aws-aws-sdk-go-dev (1.21.6+dfsg-2) ...
Selecting previously unselected package golang-github-dgrijalva-jwt-go-dev.
Preparing to unpack .../130-golang-github-dgrijalva-jwt-go-dev_3.2.0-1_all.deb ...
Unpacking golang-github-dgrijalva-jwt-go-dev (3.2.0-1) ...
Selecting previously unselected package golang-github-dimchansky-utfbom-dev.
Preparing to unpack .../131-golang-github-dimchansky-utfbom-dev_0.0~git20170328.6c6132f-1_all.deb ...
Unpacking golang-github-dimchansky-utfbom-dev (0.0~git20170328.6c6132f-1) ...
Selecting previously unselected package golang-github-mitchellh-go-homedir-dev.
Preparing to unpack .../132-golang-github-mitchellh-go-homedir-dev_1.1.0-1_all.deb ...
Unpacking golang-github-mitchellh-go-homedir-dev (1.1.0-1) ...
Selecting previously unselected package golang-github-azure-go-autorest-dev.
Preparing to unpack .../133-golang-github-azure-go-autorest-dev_10.15.5-1_all.deb ...
Unpacking golang-github-azure-go-autorest-dev (10.15.5-1) ...
Selecting previously unselected package golang-github-bgentry-speakeasy-dev.
Preparing to unpack .../134-golang-github-bgentry-speakeasy-dev_0.1.0-1_all.deb ...
Unpacking golang-github-bgentry-speakeasy-dev (0.1.0-1) ...
Selecting previously unselected package golang-github-boltdb-bolt-dev.
Preparing to unpack .../135-golang-github-boltdb-bolt-dev_1.3.1-6_all.deb ...
Unpacking golang-github-boltdb-bolt-dev (1.3.1-6) ...
Selecting previously unselected package golang-github-bradfitz-gomemcache-dev.
Preparing to unpack .../136-golang-github-bradfitz-gomemcache-dev_0.0~git20141109-3_all.deb ...
Unpacking golang-github-bradfitz-gomemcache-dev (0.0~git20141109-3) ...
Selecting previously unselected package golang-github-coreos-pkg-dev.
Preparing to unpack .../137-golang-github-coreos-pkg-dev_4-2_all.deb ...
Unpacking golang-github-coreos-pkg-dev (4-2) ...
Selecting previously unselected package libsystemd-dev:armhf.
Preparing to unpack .../138-libsystemd-dev_243-8+rpi1_armhf.deb ...
Unpacking libsystemd-dev:armhf (243-8+rpi1) ...
Selecting previously unselected package pkg-config.
Preparing to unpack .../139-pkg-config_0.29-6_armhf.deb ...
Unpacking pkg-config (0.29-6) ...
Selecting previously unselected package golang-github-coreos-go-systemd-dev.
Preparing to unpack .../140-golang-github-coreos-go-systemd-dev_20-1_all.deb ...
Unpacking golang-github-coreos-go-systemd-dev (20-1) ...
Selecting previously unselected package golang-github-cyphar-filepath-securejoin-dev.
Preparing to unpack .../141-golang-github-cyphar-filepath-securejoin-dev_0.2.2-1_all.deb ...
Unpacking golang-github-cyphar-filepath-securejoin-dev (0.2.2-1) ...
Selecting previously unselected package golang-github-google-go-querystring-dev.
Preparing to unpack .../142-golang-github-google-go-querystring-dev_1.0.0-1_all.deb ...
Unpacking golang-github-google-go-querystring-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-tent-http-link-go-dev.
Preparing to unpack .../143-golang-github-tent-http-link-go-dev_0.0~git20130702.0.ac974c6-6_all.deb ...
Unpacking golang-github-tent-http-link-go-dev (0.0~git20130702.0.ac974c6-6) ...
Selecting previously unselected package golang-github-digitalocean-godo-dev.
Preparing to unpack .../144-golang-github-digitalocean-godo-dev_1.1.0-1_all.deb ...
Unpacking golang-github-digitalocean-godo-dev (1.1.0-1) ...
Selecting previously unselected package golang-github-docker-go-units-dev.
Preparing to unpack .../145-golang-github-docker-go-units-dev_0.4.0-1_all.deb ...
Unpacking golang-github-docker-go-units-dev (0.4.0-1) ...
Selecting previously unselected package golang-github-opencontainers-selinux-dev.
Preparing to unpack .../146-golang-github-opencontainers-selinux-dev_1.3.0-2_all.deb ...
Unpacking golang-github-opencontainers-selinux-dev (1.3.0-2) ...
Selecting previously unselected package golang-github-xeipuuv-gojsonpointer-dev.
Preparing to unpack .../147-golang-github-xeipuuv-gojsonpointer-dev_0.0~git20151027.0.e0fe6f6-2_all.deb ...
Unpacking golang-github-xeipuuv-gojsonpointer-dev (0.0~git20151027.0.e0fe6f6-2) ...
Selecting previously unselected package golang-github-xeipuuv-gojsonreference-dev.
Preparing to unpack .../148-golang-github-xeipuuv-gojsonreference-dev_0.0~git20150808.0.e02fc20-2_all.deb ...
Unpacking golang-github-xeipuuv-gojsonreference-dev (0.0~git20150808.0.e02fc20-2) ...
Selecting previously unselected package golang-github-xeipuuv-gojsonschema-dev.
Preparing to unpack .../149-golang-github-xeipuuv-gojsonschema-dev_0.0~git20170210.0.6b67b3f-2_all.deb ...
Unpacking golang-github-xeipuuv-gojsonschema-dev (0.0~git20170210.0.6b67b3f-2) ...
Selecting previously unselected package golang-github-opencontainers-specs-dev.
Preparing to unpack .../150-golang-github-opencontainers-specs-dev_1.0.1+git20190408.a1b50f6-1_all.deb ...
Unpacking golang-github-opencontainers-specs-dev (1.0.1+git20190408.a1b50f6-1) ...
Selecting previously unselected package libseccomp-dev:armhf.
Preparing to unpack .../151-libseccomp-dev_2.4.1-2+rpi1_armhf.deb ...
Unpacking libseccomp-dev:armhf (2.4.1-2+rpi1) ...
Selecting previously unselected package golang-github-seccomp-libseccomp-golang-dev.
Preparing to unpack .../152-golang-github-seccomp-libseccomp-golang-dev_0.9.1-1_all.deb ...
Unpacking golang-github-seccomp-libseccomp-golang-dev (0.9.1-1) ...
Selecting previously unselected package golang-github-urfave-cli-dev.
Preparing to unpack .../153-golang-github-urfave-cli-dev_1.20.0-1_all.deb ...
Unpacking golang-github-urfave-cli-dev (1.20.0-1) ...
Selecting previously unselected package golang-github-vishvananda-netns-dev.
Preparing to unpack .../154-golang-github-vishvananda-netns-dev_0.0~git20170707.0.86bef33-1_all.deb ...
Unpacking golang-github-vishvananda-netns-dev (0.0~git20170707.0.86bef33-1) ...
Selecting previously unselected package golang-github-vishvananda-netlink-dev.
Preparing to unpack .../155-golang-github-vishvananda-netlink-dev_1.0.0+git20181030.023a6da-1_all.deb ...
Unpacking golang-github-vishvananda-netlink-dev (1.0.0+git20181030.023a6da-1) ...
Selecting previously unselected package golang-gocapability-dev.
Preparing to unpack .../156-golang-gocapability-dev_0.0+git20180916.d983527-1_all.deb ...
Unpacking golang-gocapability-dev (0.0+git20180916.d983527-1) ...
Selecting previously unselected package golang-github-opencontainers-runc-dev.
Preparing to unpack .../157-golang-github-opencontainers-runc-dev_1.0.0~rc9+dfsg1-1+rpi1_all.deb ...
Unpacking golang-github-opencontainers-runc-dev (1.0.0~rc9+dfsg1-1+rpi1) ...
Selecting previously unselected package golang-github-docker-go-connections-dev.
Preparing to unpack .../158-golang-github-docker-go-connections-dev_0.4.0-1_all.deb ...
Unpacking golang-github-docker-go-connections-dev (0.4.0-1) ...
Selecting previously unselected package golang-github-elazarl-go-bindata-assetfs-dev.
Preparing to unpack .../159-golang-github-elazarl-go-bindata-assetfs-dev_1.0.0-1_all.deb ...
Unpacking golang-github-elazarl-go-bindata-assetfs-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-garyburd-redigo-dev.
Preparing to unpack .../160-golang-github-garyburd-redigo-dev_0.0~git20150901.0.d8dbe4d-2_all.deb ...
Unpacking golang-github-garyburd-redigo-dev (0.0~git20150901.0.d8dbe4d-2) ...
Selecting previously unselected package golang-github-ghodss-yaml-dev.
Preparing to unpack .../161-golang-github-ghodss-yaml-dev_1.0.0-1_all.deb ...
Unpacking golang-github-ghodss-yaml-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-go-test-deep-dev.
Preparing to unpack .../162-golang-github-go-test-deep-dev_1.0.3-1_all.deb ...
Unpacking golang-github-go-test-deep-dev (1.0.3-1) ...
Selecting previously unselected package golang-gogoprotobuf-dev.
Preparing to unpack .../163-golang-gogoprotobuf-dev_1.2.1+git20190611.dadb6258-1_all.deb ...
Unpacking golang-gogoprotobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Selecting previously unselected package golang-github-gogo-googleapis-dev.
Preparing to unpack .../164-golang-github-gogo-googleapis-dev_1.2.0-1_all.deb ...
Unpacking golang-github-gogo-googleapis-dev (1.2.0-1) ...
Selecting previously unselected package golang-github-golang-snappy-dev.
Preparing to unpack .../165-golang-github-golang-snappy-dev_0.0+git20160529.d9eb7a3-3_all.deb ...
Unpacking golang-github-golang-snappy-dev (0.0+git20160529.d9eb7a3-3) ...
Selecting previously unselected package golang-github-google-btree-dev.
Preparing to unpack .../166-golang-github-google-btree-dev_0.0~git20161217.0.316fb6d-1_all.deb ...
Unpacking golang-github-google-btree-dev (0.0~git20161217.0.316fb6d-1) ...
Selecting previously unselected package golang-github-docopt-docopt-go-dev.
Preparing to unpack .../167-golang-github-docopt-docopt-go-dev_0.6.2+git20160216.0.784ddc5-1_all.deb ...
Unpacking golang-github-docopt-docopt-go-dev (0.6.2+git20160216.0.784ddc5-1) ...
Selecting previously unselected package golang-github-googleapis-gnostic-dev.
Preparing to unpack .../168-golang-github-googleapis-gnostic-dev_0.2.0-1_all.deb ...
Unpacking golang-github-googleapis-gnostic-dev (0.2.0-1) ...
Selecting previously unselected package golang-github-peterbourgon-diskv-dev.
Preparing to unpack .../169-golang-github-peterbourgon-diskv-dev_2.0.1-1_all.deb ...
Unpacking golang-github-peterbourgon-diskv-dev (2.0.1-1) ...
Selecting previously unselected package golang-gomega-dev.
Preparing to unpack .../170-golang-gomega-dev_1.0+git20160910.d59fa0a-1_all.deb ...
Unpacking golang-gomega-dev (1.0+git20160910.d59fa0a-1) ...
Selecting previously unselected package golang-ginkgo-dev.
Preparing to unpack .../171-golang-ginkgo-dev_1.2.0+git20161006.acfa16a-1_armhf.deb ...
Unpacking golang-ginkgo-dev (1.2.0+git20161006.acfa16a-1) ...
Selecting previously unselected package golang-github-syndtr-goleveldb-dev.
Preparing to unpack .../172-golang-github-syndtr-goleveldb-dev_0.0~git20170725.0.b89cc31-2_all.deb ...
Unpacking golang-github-syndtr-goleveldb-dev (0.0~git20170725.0.b89cc31-2) ...
Selecting previously unselected package golang-github-gregjones-httpcache-dev.
Preparing to unpack .../173-golang-github-gregjones-httpcache-dev_0.0~git20180305.9cad4c3-1_all.deb ...
Unpacking golang-github-gregjones-httpcache-dev (0.0~git20180305.9cad4c3-1) ...
Selecting previously unselected package golang-github-hashicorp-errwrap-dev.
Preparing to unpack .../174-golang-github-hashicorp-errwrap-dev_1.0.0-1_all.deb ...
Unpacking golang-github-hashicorp-errwrap-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-checkpoint-dev.
Preparing to unpack .../175-golang-github-hashicorp-go-checkpoint-dev_0.0~git20171009.1545e56-2_all.deb ...
Unpacking golang-github-hashicorp-go-checkpoint-dev (0.0~git20171009.1545e56-2) ...
Selecting previously unselected package golang-github-denverdino-aliyungo-dev.
Preparing to unpack .../176-golang-github-denverdino-aliyungo-dev_0.0~git20180921.13fa8aa-2_all.deb ...
Unpacking golang-github-denverdino-aliyungo-dev (0.0~git20180921.13fa8aa-2) ...
Selecting previously unselected package golang-github-gophercloud-gophercloud-dev.
Preparing to unpack .../177-golang-github-gophercloud-gophercloud-dev_0.6.0-1_all.deb ...
Unpacking golang-github-gophercloud-gophercloud-dev (0.6.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-multierror-dev.
Preparing to unpack .../178-golang-github-hashicorp-go-multierror-dev_1.0.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-multierror-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-hashicorp-mdns-dev.
Preparing to unpack .../179-golang-github-hashicorp-mdns-dev_0.0~git20150317.0.2b439d3-2_all.deb ...
Unpacking golang-github-hashicorp-mdns-dev (0.0~git20150317.0.2b439d3-2) ...
Selecting previously unselected package golang-github-packethost-packngo-dev.
Preparing to unpack .../180-golang-github-packethost-packngo-dev_0.2.0-2_all.deb ...
Unpacking golang-github-packethost-packngo-dev (0.2.0-2) ...
Selecting previously unselected package golang-github-vmware-govmomi-dev.
Preparing to unpack .../181-golang-github-vmware-govmomi-dev_0.15.0-1_all.deb ...
Unpacking golang-github-vmware-govmomi-dev (0.15.0-1) ...
Selecting previously unselected package golang-go.opencensus-dev.
Preparing to unpack .../182-golang-go.opencensus-dev_0.22.0-1_all.deb ...
Unpacking golang-go.opencensus-dev (0.22.0-1) ...
Selecting previously unselected package golang-google-api-dev.
Preparing to unpack .../183-golang-google-api-dev_0.7.0-2_all.deb ...
Unpacking golang-google-api-dev (0.7.0-2) ...
Selecting previously unselected package golang-github-hashicorp-go-discover-dev.
Preparing to unpack .../184-golang-github-hashicorp-go-discover-dev_0.0+git20190905.34a6505-2_all.deb ...
Unpacking golang-github-hashicorp-go-discover-dev (0.0+git20190905.34a6505-2) ...
Selecting previously unselected package golang-github-hashicorp-go-memdb-dev.
Preparing to unpack .../185-golang-github-hashicorp-go-memdb-dev_0.0~git20180224.1289e7ff-1_all.deb ...
Unpacking golang-github-hashicorp-go-memdb-dev (0.0~git20180224.1289e7ff-1) ...
Selecting previously unselected package golang-github-ugorji-go-msgpack-dev.
Preparing to unpack .../186-golang-github-ugorji-go-msgpack-dev_0.0~git20130605.792643-5_all.deb ...
Unpacking golang-github-ugorji-go-msgpack-dev (0.0~git20130605.792643-5) ...
Selecting previously unselected package golang-github-ugorji-go-codec-dev.
Preparing to unpack .../187-golang-github-ugorji-go-codec-dev_1.1.7-1_all.deb ...
Unpacking golang-github-ugorji-go-codec-dev (1.1.7-1) ...
Selecting previously unselected package golang-gopkg-vmihailenco-msgpack.v2-dev.
Preparing to unpack .../188-golang-gopkg-vmihailenco-msgpack.v2-dev_3.3.3-1_all.deb ...
Unpacking golang-gopkg-vmihailenco-msgpack.v2-dev (3.3.3-1) ...
Selecting previously unselected package golang-gopkg-tomb.v2-dev.
Preparing to unpack .../189-golang-gopkg-tomb.v2-dev_0.0~git20161208.d5d1b58-3_all.deb ...
Unpacking golang-gopkg-tomb.v2-dev (0.0~git20161208.d5d1b58-3) ...
Selecting previously unselected package libsasl2-dev.
Preparing to unpack .../190-libsasl2-dev_2.1.27+dfsg-1+b1_armhf.deb ...
Unpacking libsasl2-dev (2.1.27+dfsg-1+b1) ...
Selecting previously unselected package golang-gopkg-mgo.v2-dev.
Preparing to unpack .../191-golang-gopkg-mgo.v2-dev_2016.08.01-6_all.deb ...
Unpacking golang-gopkg-mgo.v2-dev (2016.08.01-6) ...
Selecting previously unselected package golang-github-hashicorp-go-msgpack-dev.
Preparing to unpack .../192-golang-github-hashicorp-go-msgpack-dev_0.5.5-1_all.deb ...
Unpacking golang-github-hashicorp-go-msgpack-dev (0.5.5-1) ...
Selecting previously unselected package golang-github-hashicorp-raft-dev.
Preparing to unpack .../193-golang-github-hashicorp-raft-dev_1.1.1-2_all.deb ...
Unpacking golang-github-hashicorp-raft-dev (1.1.1-2) ...
Selecting previously unselected package libjs-jquery.
Preparing to unpack .../194-libjs-jquery_3.3.1~dfsg-3_all.deb ...
Unpacking libjs-jquery (3.3.1~dfsg-3) ...
Selecting previously unselected package libjs-jquery-ui.
Preparing to unpack .../195-libjs-jquery-ui_1.12.1+dfsg-5_all.deb ...
Unpacking libjs-jquery-ui (1.12.1+dfsg-5) ...
Selecting previously unselected package golang-golang-x-tools.
Preparing to unpack .../196-golang-golang-x-tools_1%3a0.0~git20191118.07fc4c7+ds-1_armhf.deb ...
Unpacking golang-golang-x-tools (1:0.0~git20191118.07fc4c7+ds-1) ...
Selecting previously unselected package golang-github-mitchellh-reflectwalk-dev.
Preparing to unpack .../197-golang-github-mitchellh-reflectwalk-dev_0.0~git20170726.63d60e9-4_all.deb ...
Unpacking golang-github-mitchellh-reflectwalk-dev (0.0~git20170726.63d60e9-4) ...
Selecting previously unselected package golang-github-mitchellh-copystructure-dev.
Preparing to unpack .../198-golang-github-mitchellh-copystructure-dev_0.0~git20161013.0.5af94ae-2_all.deb ...
Unpacking golang-github-mitchellh-copystructure-dev (0.0~git20161013.0.5af94ae-2) ...
Selecting previously unselected package golang-github-hashicorp-go-raftchunking-dev.
Preparing to unpack .../199-golang-github-hashicorp-go-raftchunking-dev_0.6.2-2_all.deb ...
Unpacking golang-github-hashicorp-go-raftchunking-dev (0.6.2-2) ...
Selecting previously unselected package golang-github-hashicorp-go-reap-dev.
Preparing to unpack .../200-golang-github-hashicorp-go-reap-dev_0.0~git20160113.0.2d85522-3_all.deb ...
Unpacking golang-github-hashicorp-go-reap-dev (0.0~git20160113.0.2d85522-3) ...
Selecting previously unselected package golang-github-hashicorp-go-sockaddr-dev.
Preparing to unpack .../201-golang-github-hashicorp-go-sockaddr-dev_0.0~git20170627.41949a1+ds-2_all.deb ...
Unpacking golang-github-hashicorp-go-sockaddr-dev (0.0~git20170627.41949a1+ds-2) ...
Selecting previously unselected package golang-github-hashicorp-go-version-dev.
Preparing to unpack .../202-golang-github-hashicorp-go-version-dev_1.2.0-1_all.deb ...
Unpacking golang-github-hashicorp-go-version-dev (1.2.0-1) ...
Selecting previously unselected package golang-github-hashicorp-hcl-dev.
Preparing to unpack .../203-golang-github-hashicorp-hcl-dev_1.0.0-1_all.deb ...
Unpacking golang-github-hashicorp-hcl-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-mitchellh-mapstructure-dev.
Preparing to unpack .../204-golang-github-mitchellh-mapstructure-dev_1.1.2-1_all.deb ...
Unpacking golang-github-mitchellh-mapstructure-dev (1.1.2-1) ...
Selecting previously unselected package golang-github-hashicorp-hil-dev.
Preparing to unpack .../205-golang-github-hashicorp-hil-dev_0.0~git20160711.1e86c6b-1_all.deb ...
Unpacking golang-github-hashicorp-hil-dev (0.0~git20160711.1e86c6b-1) ...
Selecting previously unselected package golang-github-hashicorp-memberlist-dev.
Preparing to unpack .../206-golang-github-hashicorp-memberlist-dev_0.1.5-2_all.deb ...
Unpacking golang-github-hashicorp-memberlist-dev (0.1.5-2) ...
Selecting previously unselected package golang-github-hashicorp-raft-boltdb-dev.
Preparing to unpack .../207-golang-github-hashicorp-raft-boltdb-dev_0.0~git20171010.6e5ba93-3_all.deb ...
Unpacking golang-github-hashicorp-raft-boltdb-dev (0.0~git20171010.6e5ba93-3) ...
Selecting previously unselected package golang-github-hashicorp-net-rpc-msgpackrpc-dev.
Preparing to unpack .../208-golang-github-hashicorp-net-rpc-msgpackrpc-dev_0.0~git20151116.0.a14192a-1_all.deb ...
Unpacking golang-github-hashicorp-net-rpc-msgpackrpc-dev (0.0~git20151116.0.a14192a-1) ...
Selecting previously unselected package golang-github-hashicorp-yamux-dev.
Preparing to unpack .../209-golang-github-hashicorp-yamux-dev_0.0+git20190923.df201c7-1_all.deb ...
Unpacking golang-github-hashicorp-yamux-dev (0.0+git20190923.df201c7-1) ...
Selecting previously unselected package golang-github-hashicorp-scada-client-dev.
Preparing to unpack .../210-golang-github-hashicorp-scada-client-dev_0.0~git20160601.0.6e89678-2_all.deb ...
Unpacking golang-github-hashicorp-scada-client-dev (0.0~git20160601.0.6e89678-2) ...
Selecting previously unselected package golang-github-mattn-go-isatty-dev.
Preparing to unpack .../211-golang-github-mattn-go-isatty-dev_0.0.8-2_all.deb ...
Unpacking golang-github-mattn-go-isatty-dev (0.0.8-2) ...
Selecting previously unselected package golang-github-mattn-go-colorable-dev.
Preparing to unpack .../212-golang-github-mattn-go-colorable-dev_0.0.9-3_all.deb ...
Unpacking golang-github-mattn-go-colorable-dev (0.0.9-3) ...
Selecting previously unselected package golang-github-fatih-color-dev.
Preparing to unpack .../213-golang-github-fatih-color-dev_1.5.0-1_all.deb ...
Unpacking golang-github-fatih-color-dev (1.5.0-1) ...
Selecting previously unselected package golang-github-hashicorp-go-syslog-dev.
Preparing to unpack .../214-golang-github-hashicorp-go-syslog-dev_0.0~git20150218.0.42a2b57-1_all.deb ...
Unpacking golang-github-hashicorp-go-syslog-dev (0.0~git20150218.0.42a2b57-1) ...
Selecting previously unselected package golang-github-hashicorp-logutils-dev.
Preparing to unpack .../215-golang-github-hashicorp-logutils-dev_0.0~git20150609.0.0dc08b1-1_all.deb ...
Unpacking golang-github-hashicorp-logutils-dev (0.0~git20150609.0.0dc08b1-1) ...
Selecting previously unselected package golang-github-posener-complete-dev.
Preparing to unpack .../216-golang-github-posener-complete-dev_1.1+git20180108.57878c9-3_all.deb ...
Unpacking golang-github-posener-complete-dev (1.1+git20180108.57878c9-3) ...
Selecting previously unselected package golang-github-mitchellh-cli-dev.
Preparing to unpack .../217-golang-github-mitchellh-cli-dev_1.0.0-1_all.deb ...
Unpacking golang-github-mitchellh-cli-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-ryanuber-columnize-dev.
Preparing to unpack .../218-golang-github-ryanuber-columnize-dev_2.1.1-1_all.deb ...
Unpacking golang-github-ryanuber-columnize-dev (2.1.1-1) ...
Selecting previously unselected package golang-github-hashicorp-serf-dev.
Preparing to unpack .../219-golang-github-hashicorp-serf-dev_0.8.5~ds1-1_all.deb ...
Unpacking golang-github-hashicorp-serf-dev (0.8.5~ds1-1) ...
Selecting previously unselected package golang-github-imdario-mergo-dev.
Preparing to unpack .../220-golang-github-imdario-mergo-dev_0.3.5-1_all.deb ...
Unpacking golang-github-imdario-mergo-dev (0.3.5-1) ...
Selecting previously unselected package golang-github-inconshreveable-muxado-dev.
Preparing to unpack .../221-golang-github-inconshreveable-muxado-dev_0.0~git20140312.0.f693c7e-2_all.deb ...
Unpacking golang-github-inconshreveable-muxado-dev (0.0~git20140312.0.f693c7e-2) ...
Selecting previously unselected package golang-github-jeffail-gabs-dev.
Preparing to unpack .../222-golang-github-jeffail-gabs-dev_2.1.0-2_all.deb ...
Unpacking golang-github-jeffail-gabs-dev (2.1.0-2) ...
Selecting previously unselected package golang-github-jefferai-jsonx-dev.
Preparing to unpack .../223-golang-github-jefferai-jsonx-dev_1.0.1-2_all.deb ...
Unpacking golang-github-jefferai-jsonx-dev (1.0.1-2) ...
Selecting previously unselected package golang-github-mitchellh-go-testing-interface-dev.
Preparing to unpack .../224-golang-github-mitchellh-go-testing-interface-dev_1.0.0-1_all.deb ...
Unpacking golang-github-mitchellh-go-testing-interface-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-mitchellh-hashstructure-dev.
Preparing to unpack .../225-golang-github-mitchellh-hashstructure-dev_1.0.0-1_all.deb ...
Unpacking golang-github-mitchellh-hashstructure-dev (1.0.0-1) ...
Selecting previously unselected package golang-github-nytimes-gziphandler-dev.
Preparing to unpack .../226-golang-github-nytimes-gziphandler-dev_1.0.1-1_all.deb ...
Unpacking golang-github-nytimes-gziphandler-dev (1.0.1-1) ...
Selecting previously unselected package golang-github-ryanuber-go-glob-dev.
Preparing to unpack .../227-golang-github-ryanuber-go-glob-dev_1.0.0-2_all.deb ...
Unpacking golang-github-ryanuber-go-glob-dev (1.0.0-2) ...
Selecting previously unselected package golang-github-sap-go-hdb-dev.
Preparing to unpack .../228-golang-github-sap-go-hdb-dev_0.14.1-2_all.deb ...
Unpacking golang-github-sap-go-hdb-dev (0.14.1-2) ...
Selecting previously unselected package golang-github-shirou-gopsutil-dev.
Preparing to unpack .../229-golang-github-shirou-gopsutil-dev_2.18.06-1_all.deb ...
Unpacking golang-github-shirou-gopsutil-dev (2.18.06-1) ...
Selecting previously unselected package golang-github-spf13-pflag-dev.
Preparing to unpack .../230-golang-github-spf13-pflag-dev_1.0.3-1_all.deb ...
Unpacking golang-github-spf13-pflag-dev (1.0.3-1) ...
Selecting previously unselected package golang-gopkg-inf.v0-dev.
Preparing to unpack .../231-golang-gopkg-inf.v0-dev_0.9.0-3_all.deb ...
Unpacking golang-gopkg-inf.v0-dev (0.9.0-3) ...
Selecting previously unselected package mockery.
Preparing to unpack .../232-mockery_0.0~git20181123.e78b021-2_armhf.deb ...
Unpacking mockery (0.0~git20181123.e78b021-2) ...
Selecting previously unselected package golang-github-hashicorp-go-rootcerts-dev.
Preparing to unpack .../233-golang-github-hashicorp-go-rootcerts-dev_0.0~git20160503.0.6bb64b3-1_all.deb ...
Unpacking golang-github-hashicorp-go-rootcerts-dev (0.0~git20160503.0.6bb64b3-1) ...
Selecting previously unselected package sbuild-build-depends-consul-dummy.
Preparing to unpack .../234-sbuild-build-depends-consul-dummy_0.invalid.0_armhf.deb ...
Unpacking sbuild-build-depends-consul-dummy (0.invalid.0) ...
Setting up golang-github-xeipuuv-gojsonpointer-dev (0.0~git20151027.0.e0fe6f6-2) ...
Setting up golang-github-dimchansky-utfbom-dev (0.0~git20170328.6c6132f-1) ...
Setting up golang-github-dgrijalva-jwt-go-v3-dev (3.2.0-2) ...
Setting up libpipeline1:armhf (1.5.1-2) ...
Setting up golang-github-google-go-cmp-dev (0.3.1-1) ...
Setting up golang-github-ryanuber-go-glob-dev (1.0.0-2) ...
Setting up golang-github-go-ini-ini-dev (1.32.0-2) ...
Setting up golang-github-hashicorp-go-uuid-dev (1.0.1-1) ...
Setting up golang-1.13-src (1.13.4-1+rpi1) ...
Setting up libseccomp-dev:armhf (2.4.1-2+rpi1) ...
Setting up golang-github-mitchellh-go-homedir-dev (1.1.0-1) ...
Setting up golang-github-google-go-querystring-dev (1.0.0-1) ...
Setting up golang-github-mitchellh-mapstructure-dev (1.1.2-1) ...
Setting up golang-dbus-dev (5.0.2-1) ...
Setting up golang-github-gogo-protobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Setting up golang-github-golang-mock-dev (1.3.1-2) ...
Setting up golang-github-stretchr-objx-dev (0.1.1+git20180825.ef50b0d-1) ...
Setting up golang-github-mitchellh-hashstructure-dev (1.0.0-1) ...
Setting up libmagic-mgc (1:5.37-6) ...
Setting up golang-github-pkg-errors-dev (0.8.1-1) ...
Setting up golang-github-hashicorp-golang-lru-dev (0.5.0-1) ...
Setting up golang-github-google-gofuzz-dev (0.0~git20170612.24818f7-1) ...
Setting up golang-github-inconshreveable-muxado-dev (0.0~git20140312.0.f693c7e-2) ...
Setting up libarchive-zip-perl (1.67-1) ...
Setting up libglib2.0-0:armhf (2.62.2-3) ...
No schema files found: doing nothing.
Setting up libprotobuf-lite17:armhf (3.6.1.3-2+rpi1) ...
Setting up libssl1.1:armhf (1.1.1d-2) ...
Setting up golang-github-ryanuber-columnize-dev (2.1.1-1) ...
Setting up libprocps7:armhf (2:3.3.15-2) ...
Setting up libdebhelper-perl (12.7.1) ...
Setting up golang-golang-x-sys-dev (0.0~git20190726.fc99dfb-1) ...
Setting up golang-github-tent-http-link-go-dev (0.0~git20130702.0.ac974c6-6) ...
Setting up libmagic1:armhf (1:5.37-6) ...
Setting up golang-github-hashicorp-go-syslog-dev (0.0~git20150218.0.42a2b57-1) ...
Setting up golang-github-golang-snappy-dev (0.0+git20160529.d9eb7a3-3) ...
Setting up golang-github-pmezard-go-difflib-dev (1.0.0-2) ...
Setting up golang-github-modern-go-concurrent-dev (1.0.3-1) ...
Setting up gettext-base (0.19.8.1-10) ...
Setting up golang-github-circonus-labs-circonusllhist-dev (0.0~git20160526.0.d724266-2) ...
Setting up golang-github-bradfitz-gomemcache-dev (0.0~git20141109-3) ...
Setting up mockery (0.0~git20181123.e78b021-2) ...
Setting up golang-github-mitchellh-go-testing-interface-dev (1.0.0-1) ...
Setting up file (1:5.37-6) ...
Setting up golang-github-seccomp-libseccomp-golang-dev (0.9.1-1) ...
Setting up golang-github-asaskevich-govalidator-dev (9+git20180720.0.f9ffefc3-1) ...
Setting up golang-github-google-btree-dev (0.0~git20161217.0.316fb6d-1) ...
Setting up golang-github-go-stack-stack-dev (1.5.2-2) ...
Setting up golang-github-beorn7-perks-dev (0.0~git20160804.0.4c0e845-1) ...
Setting up libicu63:armhf (63.2-2) ...
Setting up golang-github-hashicorp-go-cleanhttp-dev (0.5.1-1) ...
Setting up golang-github-hashicorp-errwrap-dev (1.0.0-1) ...
Setting up golang-github-cespare-xxhash-dev (2.1.0-1) ...
Setting up golang-github-spf13-pflag-dev (1.0.3-1) ...
Setting up golang-gopkg-tomb.v2-dev (0.0~git20161208.d5d1b58-3) ...
Setting up golang-github-bgentry-speakeasy-dev (0.1.0-1) ...
Setting up golang-github-jpillora-backoff-dev (1.0.0-1) ...
Setting up golang-github-davecgh-go-spew-dev (1.1.1-2) ...
Setting up autotools-dev (20180224.1) ...
Setting up libsasl2-dev (2.1.27+dfsg-1+b1) ...
Setting up golang-github-pascaldekloe-goe-dev (0.1.0-2) ...
Setting up golang-github-go-logfmt-logfmt-dev (0.3.0-1) ...
Setting up golang-github-ugorji-go-msgpack-dev (0.0~git20130605.792643-5) ...
Setting up golang-github-go-test-deep-dev (1.0.3-1) ...
Setting up bash-completion (1:2.8-6) ...
Setting up golang-github-hashicorp-go-immutable-radix-dev (1.1.0-1) ...
Setting up golang-github-boltdb-bolt-dev (1.3.1-6) ...
Setting up libncurses6:armhf (6.1+20191019-1) ...
Setting up libsigsegv2:armhf (2.12-2) ...
Setting up golang-github-xeipuuv-gojsonreference-dev (0.0~git20150808.0.e02fc20-2) ...
Setting up libmnl0:armhf (1.0.4-2) ...
Setting up golang-golang-x-sync-dev (0.0~git20190423.1122301-1) ...
Setting up autopoint (0.19.8.1-10) ...
Setting up golang-github-kr-pty-dev (1.1.6-1) ...
Setting up golang-github-opencontainers-selinux-dev (1.3.0-2) ...
Setting up pkg-config (0.29-6) ...
Setting up golang-github-hashicorp-hcl-dev (1.0.0-1) ...
Setting up golang-github-vishvananda-netns-dev (0.0~git20170707.0.86bef33-1) ...
Setting up golang-1.13-go (1.13.4-1+rpi1) ...
Setting up libxtables12:armhf (1.8.3-2) ...
Setting up golang-gocapability-dev (0.0+git20180916.d983527-1) ...
Setting up golang-glog-dev (0.0~git20160126.23def4e-3) ...
Setting up golang-github-julienschmidt-httprouter-dev (1.1-5) ...
Setting up golang-github-hashicorp-go-multierror-dev (1.0.0-1) ...
Setting up lsof (4.93.2+dfsg-1) ...
Setting up zlib1g-dev:armhf (1:1.2.11.dfsg-1) ...
Setting up golang-github-tv42-httpunix-dev (0.0~git20150427.b75d861-2) ...
Setting up golang-github-hashicorp-go-version-dev (1.2.0-1) ...
Setting up golang-gopkg-inf.v0-dev (0.9.0-3) ...
Setting up sensible-utils (0.0.12) ...
Setting up libuchardet0:armhf (0.0.6-3) ...
Setting up golang-github-vishvananda-netlink-dev (1.0.0+git20181030.023a6da-1) ...
Setting up procps (2:3.3.15-2) ...
update-alternatives: using /usr/bin/w.procps to provide /usr/bin/w (w) in auto mode
Setting up golang-github-cyphar-filepath-securejoin-dev (0.2.2-1) ...
Setting up golang-github-modern-go-reflect2-dev (1.0.0-1) ...
Setting up libsub-override-perl (0.09-2) ...
Setting up golang-github-dgrijalva-jwt-go-dev (3.2.0-1) ...
Setting up golang-github-armon-go-radix-dev (1.0.0-1) ...
Setting up libprotobuf17:armhf (3.6.1.3-2+rpi1) ...
Setting up golang-github-datadog-datadog-go-dev (2.1.0-2) ...
Setting up libjs-jquery (3.3.1~dfsg-3) ...
Setting up golang-golang-x-xerrors-dev (0.0~git20190717.a985d34-1) ...
Setting up golang-procfs-dev (0.0.3-1) ...
Setting up golang-src (2:1.13~1+b11) ...
Setting up openssl (1.1.1d-2) ...
Setting up libbsd0:armhf (0.10.0-1) ...
Setting up libtinfo5:armhf (6.1+20191019-1) ...
Setting up libelf1:armhf (0.176-1.1) ...
Setting up golang-github-armon-circbuf-dev (0.0~git20150827.0.bbbad09-2) ...
Setting up golang-github-jeffail-gabs-dev (2.1.0-2) ...
Setting up libxml2:armhf (2.9.4+dfsg1-8) ...
Setting up golang-github-jefferai-jsonx-dev (1.0.1-2) ...
Setting up libsystemd-dev:armhf (243-8+rpi1) ...
Setting up golang-github-hashicorp-yamux-dev (0.0+git20190923.df201c7-1) ...
Setting up golang-github-hashicorp-go-rootcerts-dev (0.0~git20160503.0.6bb64b3-1) ...
Setting up golang-github-hashicorp-logutils-dev (0.0~git20150609.0.0dc08b1-1) ...
Setting up libfile-stripnondeterminism-perl (1.6.3-1) ...
Setting up golang-github-mattn-go-isatty-dev (0.0.8-2) ...
Setting up golang-github-hashicorp-go-reap-dev (0.0~git20160113.0.2d85522-3) ...
Setting up golang-github-digitalocean-godo-dev (1.1.0-1) ...
Setting up golang-github-hashicorp-go-memdb-dev (0.0~git20180224.1289e7ff-1) ...
Setting up libprotoc17:armhf (3.6.1.3-2+rpi1) ...
Setting up protobuf-compiler (3.6.1.3-2+rpi1) ...
Setting up libtool (2.4.6-11) ...
Setting up golang-go (2:1.13~1+b11) ...
Setting up golang-github-mattn-go-colorable-dev (0.0.9-3) ...
Setting up iproute2 (5.3.0-1) ...
Setting up golang-github-posener-complete-dev (1.1+git20180108.57878c9-3) ...
Setting up golang-github-docker-go-units-dev (0.4.0-1) ...
Setting up m4 (1.4.18-4) ...
Setting up golang-any (2:1.13~1+b11) ...
Setting up libprotobuf-dev:armhf (3.6.1.3-2+rpi1) ...
Setting up ca-certificates (20190110) ...
Updating certificates in /etc/ssl/certs...
128 added, 0 removed; done.
Setting up golang-goprotobuf-dev (1.3.2-2) ...
Setting up libjs-jquery-ui (1.12.1+dfsg-5) ...
Setting up golang-github-kr-text-dev (0.1.0-1) ...
Setting up golang-github-elazarl-go-bindata-assetfs-dev (1.0.0-1) ...
Setting up bsdmainutils (11.1.2) ...
update-alternatives: using /usr/bin/bsd-write to provide /usr/bin/write (write) in auto mode
update-alternatives: using /usr/bin/bsd-from to provide /usr/bin/from (from) in auto mode
Setting up libcroco3:armhf (0.6.13-1) ...
Setting up gogoprotobuf (1.2.1+git20190611.dadb6258-1) ...
Setting up autoconf (2.69-11) ...
Setting up dh-strip-nondeterminism (1.6.3-1) ...
Setting up dwz (0.13-2) ...
Setting up groff-base (1.22.4-3) ...
Setting up golang-github-prometheus-client-model-dev (0.0.2+git20171117.99fa1f4-1) ...
Setting up golang-github-docopt-docopt-go-dev (0.6.2+git20160216.0.784ddc5-1) ...
Setting up golang-github-hashicorp-go-checkpoint-dev (0.0~git20171009.1545e56-2) ...
Setting up automake (1:1.16.1-4) ...
update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode
Setting up golang-github-kr-pretty-dev (0.1.0-1) ...
Setting up gettext (0.19.8.1-10) ...
Setting up golang-github-peterbourgon-diskv-dev (2.0.1-1) ...
Setting up golang-github-fatih-color-dev (1.5.0-1) ...
Setting up golang-github-hashicorp-go-sockaddr-dev (0.0~git20170627.41949a1+ds-2) ...
Setting up golang-github-garyburd-redigo-dev (0.0~git20150901.0.d8dbe4d-2) ...
Setting up golang-protobuf-extensions-dev (1.0.1-1) ...
Setting up golang-gogoprotobuf-dev (1.2.1+git20190611.dadb6258-1) ...
Setting up golang-gopkg-check.v1-dev (0.0+git20180628.788fd78-1) ...
Setting up man-db (2.9.0-1) ...
Not building database; man-db/auto-update is not 'true'.
Setting up golang-golang-x-tools (1:0.0~git20191118.07fc4c7+ds-1) ...
Setting up golang-github-mitchellh-reflectwalk-dev (0.0~git20170726.63d60e9-4) ...
Setting up golang-github-denverdino-aliyungo-dev (0.0~git20180921.13fa8aa-2) ...
Setting up intltool-debian (0.35.0+20060710.5) ...
Setting up golang-gopkg-mgo.v2-dev (2016.08.01-6) ...
Setting up golang-github-mitchellh-cli-dev (1.0.0-1) ...
Setting up golang-github-hashicorp-hil-dev (0.0~git20160711.1e86c6b-1) ...
Setting up golang-github-gogo-googleapis-dev (1.2.0-1) ...
Setting up golang-gopkg-yaml.v2-dev (2.2.2-1) ...
Setting up golang-github-imdario-mergo-dev (0.3.5-1) ...
Setting up po-debconf (1.0.21) ...
Setting up golang-gomega-dev (1.0+git20160910.d59fa0a-1) ...
Setting up golang-github-mitchellh-copystructure-dev (0.0~git20161013.0.5af94ae-2) ...
Setting up golang-github-stretchr-testify-dev (1.4.0+ds-1) ...
Setting up golang-github-shirou-gopsutil-dev (2.18.06-1) ...
Setting up golang-github-alecthomas-units-dev (0.0~git20151022.0.2efee85-4) ...
Setting up golang-github-ghodss-yaml-dev (1.0.0-1) ...
Setting up golang-github-jmespath-go-jmespath-dev (0.2.2-3) ...
Setting up golang-github-hashicorp-go-hclog-dev (0.9.2-1) ...
Setting up golang-github-urfave-cli-dev (1.20.0-1) ...
Setting up golang-ginkgo-dev (1.2.0+git20161006.acfa16a-1) ...
Setting up golang-gopkg-alecthomas-kingpin.v2-dev (2.2.6-1) ...
Setting up golang-github-xeipuuv-gojsonschema-dev (0.0~git20170210.0.6b67b3f-2) ...
Setting up golang-github-nytimes-gziphandler-dev (1.0.1-1) ...
Setting up golang-github-json-iterator-go-dev (1.1.4-1) ...
Setting up golang-github-hashicorp-go-retryablehttp-dev (0.6.3-1) ...
Setting up golang-github-aws-aws-sdk-go-dev (1.21.6+dfsg-2) ...
Setting up golang-github-opencontainers-specs-dev (1.0.1+git20190408.a1b50f6-1) ...
Setting up golang-github-syndtr-goleveldb-dev (0.0~git20170725.0.b89cc31-2) ...
Setting up golang-github-circonus-labs-circonus-gometrics-dev (2.3.1-2) ...
Setting up golang-github-gregjones-httpcache-dev (0.0~git20180305.9cad4c3-1) ...
Setting up golang-google-genproto-dev (0.0~git20190801.fa694d8-2) ...
Setting up dh-autoreconf (19) ...
Setting up golang-github-coreos-go-systemd-dev (20-1) ...
Setting up golang-golang-x-text-dev (0.3.2-1) ...
Setting up debhelper (12.7.1) ...
Setting up golang-github-sap-go-hdb-dev (0.14.1-2) ...
Setting up golang-golang-x-net-dev (1:0.0+git20191112.2180aed+dfsg-1) ...
Setting up golang-github-vmware-govmomi-dev (0.15.0-1) ...
Setting up golang-golang-x-crypto-dev (1:0.0~git20190701.4def268-2) ...
Setting up golang-golang-x-oauth2-dev (0.0~git20190604.0f29369-2) ...
Setting up golang-golang-x-time-dev (0.0~git20161028.0.f51c127-2) ...
Setting up golang-github-sirupsen-logrus-dev (1.3.0-1) ...
Setting up golang-github-opentracing-opentracing-go-dev (1.0.2-1) ...
Setting up dh-golang (1.42) ...
Setting up golang-github-gophercloud-gophercloud-dev (0.6.0-1) ...
Setting up golang-github-miekg-dns-dev (1.0.4+ds-1) ...
Setting up golang-github-coreos-pkg-dev (4-2) ...
Setting up golang-github-mwitkow-go-conntrack-dev (0.0~git20190716.2f06839-1) ...
Setting up golang-google-cloud-compute-metadata-dev (0.43.0-1) ...
Setting up golang-golang-x-tools-dev (1:0.0~git20191118.07fc4c7+ds-1) ...
Setting up golang-github-packethost-packngo-dev (0.2.0-2) ...
Setting up golang-golang-x-oauth2-google-dev (0.0~git20190604.0f29369-2) ...
Setting up golang-dns-dev (1.0.4+ds-1) ...
Setting up golang-github-azure-go-autorest-dev (10.15.5-1) ...
Setting up golang-github-opencontainers-runc-dev (1.0.0~rc9+dfsg1-1+rpi1) ...
Setting up golang-github-googleapis-gnostic-dev (0.2.0-1) ...
Setting up golang-google-grpc-dev (1.22.1-1) ...
Setting up golang-github-ugorji-go-codec-dev (1.1.7-1) ...
Setting up golang-gopkg-vmihailenco-msgpack.v2-dev (3.3.3-1) ...
Setting up golang-go.opencensus-dev (0.22.0-1) ...
Setting up golang-github-hashicorp-mdns-dev (0.0~git20150317.0.2b439d3-2) ...
Setting up golang-github-go-kit-kit-dev (0.6.0-2) ...
Setting up golang-github-hashicorp-go-msgpack-dev (0.5.5-1) ...
Setting up golang-github-docker-go-connections-dev (0.4.0-1) ...
Setting up golang-github-hashicorp-net-rpc-msgpackrpc-dev (0.0~git20151116.0.a14192a-1) ...
Setting up golang-github-prometheus-common-dev (0.7.0-1) ...
Setting up golang-google-api-dev (0.7.0-2) ...
Setting up golang-github-prometheus-client-golang-dev (1.2.1-3) ...
Setting up golang-github-hashicorp-go-discover-dev (0.0+git20190905.34a6505-2) ...
Setting up golang-github-armon-go-metrics-dev (0.0~git20190430.ec5e00d-1) ...
Setting up golang-github-hashicorp-raft-dev (1.1.1-2) ...
Setting up golang-github-hashicorp-scada-client-dev (0.0~git20160601.0.6e89678-2) ...
Setting up golang-github-hashicorp-memberlist-dev (0.1.5-2) ...
Setting up golang-github-hashicorp-go-raftchunking-dev (0.6.2-2) ...
Setting up golang-github-hashicorp-raft-boltdb-dev (0.0~git20171010.6e5ba93-3) ...
Setting up golang-github-hashicorp-serf-dev (0.8.5~ds1-1) ...
Setting up sbuild-build-depends-consul-dummy (0.invalid.0) ...
Processing triggers for libc-bin (2.29-2+rpi1) ...
Processing triggers for ca-certificates (20190110) ...
Updating certificates in /etc/ssl/certs...
0 added, 0 removed; done.
Running hooks in /etc/ca-certificates/update.d...
done.
W: No sandbox user '_apt' on the system, can not drop privileges

+------------------------------------------------------------------------------+
| Build environment                                                            |
+------------------------------------------------------------------------------+

Kernel: Linux 4.9.0-0.bpo.6-armmp armhf (armv7l)
Toolchain package versions: binutils_2.33.1-2+rpi1 dpkg-dev_1.19.7 g++-9_9.2.1-17+rpi1 gcc-9_9.2.1-17+rpi1 libc6-dev_2.29-2+rpi1 libstdc++-9-dev_9.2.1-17+rpi1 libstdc++6_9.2.1-17+rpi1 linux-libc-dev_5.2.17-1+rpi1+b2
Package versions: adduser_3.118 apt_1.8.4 autoconf_2.69-11 automake_1:1.16.1-4 autopoint_0.19.8.1-10 autotools-dev_20180224.1 base-files_11+rpi1 base-passwd_3.5.46 bash_5.0-5 bash-completion_1:2.8-6 binutils_2.33.1-2+rpi1 binutils-arm-linux-gnueabihf_2.33.1-2+rpi1 binutils-common_2.33.1-2+rpi1 bsdmainutils_11.1.2 bsdutils_1:2.34-0.1 build-essential_12.8 bzip2_1.0.8-2 ca-certificates_20190110 coreutils_8.30-3 cpp_4:9.2.1-3+rpi1 cpp-9_9.2.1-17+rpi1 dash_0.5.10.2-6 debconf_1.5.73 debhelper_12.7.1 debianutils_4.9 dh-autoreconf_19 dh-golang_1.42 dh-strip-nondeterminism_1.6.3-1 diffutils_1:3.7-3 dirmngr_2.2.17-3+b1 dpkg_1.19.7 dpkg-dev_1.19.7 dwz_0.13-2 e2fsprogs_1.45.4-1 fakeroot_1.24-1 fdisk_2.34-0.1 file_1:5.37-6 findutils_4.7.0-1 g++_4:9.2.1-3+rpi1 g++-9_9.2.1-17+rpi1 gcc_4:9.2.1-3+rpi1 gcc-4.9-base_4.9.4-2+rpi1+b19 gcc-5-base_5.5.0-8 gcc-6-base_6.5.0-1+rpi1+b3 gcc-7-base_7.4.0-15 gcc-9_9.2.1-17+rpi1 gcc-9-base_9.2.1-17+rpi1 gettext_0.19.8.1-10 gettext-base_0.19.8.1-10 gnupg_2.2.17-3 gnupg-l10n_2.2.17-3 gnupg-utils_2.2.17-3+b1 gogoprotobuf_1.2.1+git20190611.dadb6258-1 golang-1.13-go_1.13.4-1+rpi1 golang-1.13-src_1.13.4-1+rpi1 golang-any_2:1.13~1+b11 golang-dbus-dev_5.0.2-1 golang-dns-dev_1.0.4+ds-1 golang-ginkgo-dev_1.2.0+git20161006.acfa16a-1 golang-github-alecthomas-units-dev_0.0~git20151022.0.2efee85-4 golang-github-armon-circbuf-dev_0.0~git20150827.0.bbbad09-2 golang-github-armon-go-metrics-dev_0.0~git20190430.ec5e00d-1 golang-github-armon-go-radix-dev_1.0.0-1 golang-github-asaskevich-govalidator-dev_9+git20180720.0.f9ffefc3-1 golang-github-aws-aws-sdk-go-dev_1.21.6+dfsg-2 golang-github-azure-go-autorest-dev_10.15.5-1 golang-github-beorn7-perks-dev_0.0~git20160804.0.4c0e845-1 golang-github-bgentry-speakeasy-dev_0.1.0-1 golang-github-boltdb-bolt-dev_1.3.1-6 golang-github-bradfitz-gomemcache-dev_0.0~git20141109-3 golang-github-cespare-xxhash-dev_2.1.0-1 golang-github-circonus-labs-circonus-gometrics-dev_2.3.1-2 golang-github-circonus-labs-circonusllhist-dev_0.0~git20160526.0.d724266-2 golang-github-coreos-go-systemd-dev_20-1 golang-github-coreos-pkg-dev_4-2 golang-github-cyphar-filepath-securejoin-dev_0.2.2-1 golang-github-datadog-datadog-go-dev_2.1.0-2 golang-github-davecgh-go-spew-dev_1.1.1-2 golang-github-denverdino-aliyungo-dev_0.0~git20180921.13fa8aa-2 golang-github-dgrijalva-jwt-go-dev_3.2.0-1 golang-github-dgrijalva-jwt-go-v3-dev_3.2.0-2 golang-github-digitalocean-godo-dev_1.1.0-1 golang-github-dimchansky-utfbom-dev_0.0~git20170328.6c6132f-1 golang-github-docker-go-connections-dev_0.4.0-1 golang-github-docker-go-units-dev_0.4.0-1 golang-github-docopt-docopt-go-dev_0.6.2+git20160216.0.784ddc5-1 golang-github-elazarl-go-bindata-assetfs-dev_1.0.0-1 golang-github-fatih-color-dev_1.5.0-1 golang-github-garyburd-redigo-dev_0.0~git20150901.0.d8dbe4d-2 golang-github-ghodss-yaml-dev_1.0.0-1 golang-github-go-ini-ini-dev_1.32.0-2 golang-github-go-kit-kit-dev_0.6.0-2 golang-github-go-logfmt-logfmt-dev_0.3.0-1 golang-github-go-stack-stack-dev_1.5.2-2 golang-github-go-test-deep-dev_1.0.3-1 golang-github-gogo-googleapis-dev_1.2.0-1 golang-github-gogo-protobuf-dev_1.2.1+git20190611.dadb6258-1 golang-github-golang-mock-dev_1.3.1-2 golang-github-golang-snappy-dev_0.0+git20160529.d9eb7a3-3 golang-github-google-btree-dev_0.0~git20161217.0.316fb6d-1 golang-github-google-go-cmp-dev_0.3.1-1 golang-github-google-go-querystring-dev_1.0.0-1 golang-github-google-gofuzz-dev_0.0~git20170612.24818f7-1 golang-github-googleapis-gnostic-dev_0.2.0-1 golang-github-gophercloud-gophercloud-dev_0.6.0-1 
golang-github-gregjones-httpcache-dev_0.0~git20180305.9cad4c3-1 golang-github-hashicorp-errwrap-dev_1.0.0-1 golang-github-hashicorp-go-checkpoint-dev_0.0~git20171009.1545e56-2 golang-github-hashicorp-go-cleanhttp-dev_0.5.1-1 golang-github-hashicorp-go-discover-dev_0.0+git20190905.34a6505-2 golang-github-hashicorp-go-hclog-dev_0.9.2-1 golang-github-hashicorp-go-immutable-radix-dev_1.1.0-1 golang-github-hashicorp-go-memdb-dev_0.0~git20180224.1289e7ff-1 golang-github-hashicorp-go-msgpack-dev_0.5.5-1 golang-github-hashicorp-go-multierror-dev_1.0.0-1 golang-github-hashicorp-go-raftchunking-dev_0.6.2-2 golang-github-hashicorp-go-reap-dev_0.0~git20160113.0.2d85522-3 golang-github-hashicorp-go-retryablehttp-dev_0.6.3-1 golang-github-hashicorp-go-rootcerts-dev_0.0~git20160503.0.6bb64b3-1 golang-github-hashicorp-go-sockaddr-dev_0.0~git20170627.41949a1+ds-2 golang-github-hashicorp-go-syslog-dev_0.0~git20150218.0.42a2b57-1 golang-github-hashicorp-go-uuid-dev_1.0.1-1 golang-github-hashicorp-go-version-dev_1.2.0-1 golang-github-hashicorp-golang-lru-dev_0.5.0-1 golang-github-hashicorp-hcl-dev_1.0.0-1 golang-github-hashicorp-hil-dev_0.0~git20160711.1e86c6b-1 golang-github-hashicorp-logutils-dev_0.0~git20150609.0.0dc08b1-1 golang-github-hashicorp-mdns-dev_0.0~git20150317.0.2b439d3-2 golang-github-hashicorp-memberlist-dev_0.1.5-2 golang-github-hashicorp-net-rpc-msgpackrpc-dev_0.0~git20151116.0.a14192a-1 golang-github-hashicorp-raft-boltdb-dev_0.0~git20171010.6e5ba93-3 golang-github-hashicorp-raft-dev_1.1.1-2 golang-github-hashicorp-scada-client-dev_0.0~git20160601.0.6e89678-2 golang-github-hashicorp-serf-dev_0.8.5~ds1-1 golang-github-hashicorp-yamux-dev_0.0+git20190923.df201c7-1 golang-github-imdario-mergo-dev_0.3.5-1 golang-github-inconshreveable-muxado-dev_0.0~git20140312.0.f693c7e-2 golang-github-jeffail-gabs-dev_2.1.0-2 golang-github-jefferai-jsonx-dev_1.0.1-2 golang-github-jmespath-go-jmespath-dev_0.2.2-3 golang-github-jpillora-backoff-dev_1.0.0-1 golang-github-json-iterator-go-dev_1.1.4-1 golang-github-julienschmidt-httprouter-dev_1.1-5 golang-github-kr-pretty-dev_0.1.0-1 golang-github-kr-pty-dev_1.1.6-1 golang-github-kr-text-dev_0.1.0-1 golang-github-mattn-go-colorable-dev_0.0.9-3 golang-github-mattn-go-isatty-dev_0.0.8-2 golang-github-miekg-dns-dev_1.0.4+ds-1 golang-github-mitchellh-cli-dev_1.0.0-1 golang-github-mitchellh-copystructure-dev_0.0~git20161013.0.5af94ae-2 golang-github-mitchellh-go-homedir-dev_1.1.0-1 golang-github-mitchellh-go-testing-interface-dev_1.0.0-1 golang-github-mitchellh-hashstructure-dev_1.0.0-1 golang-github-mitchellh-mapstructure-dev_1.1.2-1 golang-github-mitchellh-reflectwalk-dev_0.0~git20170726.63d60e9-4 golang-github-modern-go-concurrent-dev_1.0.3-1 golang-github-modern-go-reflect2-dev_1.0.0-1 golang-github-mwitkow-go-conntrack-dev_0.0~git20190716.2f06839-1 golang-github-nytimes-gziphandler-dev_1.0.1-1 golang-github-opencontainers-runc-dev_1.0.0~rc9+dfsg1-1+rpi1 golang-github-opencontainers-selinux-dev_1.3.0-2 golang-github-opencontainers-specs-dev_1.0.1+git20190408.a1b50f6-1 golang-github-opentracing-opentracing-go-dev_1.0.2-1 golang-github-packethost-packngo-dev_0.2.0-2 golang-github-pascaldekloe-goe-dev_0.1.0-2 golang-github-peterbourgon-diskv-dev_2.0.1-1 golang-github-pkg-errors-dev_0.8.1-1 golang-github-pmezard-go-difflib-dev_1.0.0-2 golang-github-posener-complete-dev_1.1+git20180108.57878c9-3 golang-github-prometheus-client-golang-dev_1.2.1-3 golang-github-prometheus-client-model-dev_0.0.2+git20171117.99fa1f4-1 golang-github-prometheus-common-dev_0.7.0-1 
golang-github-ryanuber-columnize-dev_2.1.1-1 golang-github-ryanuber-go-glob-dev_1.0.0-2 golang-github-sap-go-hdb-dev_0.14.1-2 golang-github-seccomp-libseccomp-golang-dev_0.9.1-1 golang-github-shirou-gopsutil-dev_2.18.06-1 golang-github-sirupsen-logrus-dev_1.3.0-1 golang-github-spf13-pflag-dev_1.0.3-1 golang-github-stretchr-objx-dev_0.1.1+git20180825.ef50b0d-1 golang-github-stretchr-testify-dev_1.4.0+ds-1 golang-github-syndtr-goleveldb-dev_0.0~git20170725.0.b89cc31-2 golang-github-tent-http-link-go-dev_0.0~git20130702.0.ac974c6-6 golang-github-tv42-httpunix-dev_0.0~git20150427.b75d861-2 golang-github-ugorji-go-codec-dev_1.1.7-1 golang-github-ugorji-go-msgpack-dev_0.0~git20130605.792643-5 golang-github-urfave-cli-dev_1.20.0-1 golang-github-vishvananda-netlink-dev_1.0.0+git20181030.023a6da-1 golang-github-vishvananda-netns-dev_0.0~git20170707.0.86bef33-1 golang-github-vmware-govmomi-dev_0.15.0-1 golang-github-xeipuuv-gojsonpointer-dev_0.0~git20151027.0.e0fe6f6-2 golang-github-xeipuuv-gojsonreference-dev_0.0~git20150808.0.e02fc20-2 golang-github-xeipuuv-gojsonschema-dev_0.0~git20170210.0.6b67b3f-2 golang-glog-dev_0.0~git20160126.23def4e-3 golang-go_2:1.13~1+b11 golang-go.opencensus-dev_0.22.0-1 golang-gocapability-dev_0.0+git20180916.d983527-1 golang-gogoprotobuf-dev_1.2.1+git20190611.dadb6258-1 golang-golang-x-crypto-dev_1:0.0~git20190701.4def268-2 golang-golang-x-net-dev_1:0.0+git20191112.2180aed+dfsg-1 golang-golang-x-oauth2-dev_0.0~git20190604.0f29369-2 golang-golang-x-oauth2-google-dev_0.0~git20190604.0f29369-2 golang-golang-x-sync-dev_0.0~git20190423.1122301-1 golang-golang-x-sys-dev_0.0~git20190726.fc99dfb-1 golang-golang-x-text-dev_0.3.2-1 golang-golang-x-time-dev_0.0~git20161028.0.f51c127-2 golang-golang-x-tools_1:0.0~git20191118.07fc4c7+ds-1 golang-golang-x-tools-dev_1:0.0~git20191118.07fc4c7+ds-1 golang-golang-x-xerrors-dev_0.0~git20190717.a985d34-1 golang-gomega-dev_1.0+git20160910.d59fa0a-1 golang-google-api-dev_0.7.0-2 golang-google-cloud-compute-metadata-dev_0.43.0-1 golang-google-genproto-dev_0.0~git20190801.fa694d8-2 golang-google-grpc-dev_1.22.1-1 golang-gopkg-alecthomas-kingpin.v2-dev_2.2.6-1 golang-gopkg-check.v1-dev_0.0+git20180628.788fd78-1 golang-gopkg-inf.v0-dev_0.9.0-3 golang-gopkg-mgo.v2-dev_2016.08.01-6 golang-gopkg-tomb.v2-dev_0.0~git20161208.d5d1b58-3 golang-gopkg-vmihailenco-msgpack.v2-dev_3.3.3-1 golang-gopkg-yaml.v2-dev_2.2.2-1 golang-goprotobuf-dev_1.3.2-2 golang-procfs-dev_0.0.3-1 golang-protobuf-extensions-dev_1.0.1-1 golang-src_2:1.13~1+b11 gpg_2.2.17-3+b1 gpg-agent_2.2.17-3+b1 gpg-wks-client_2.2.17-3+b1 gpg-wks-server_2.2.17-3+b1 gpgconf_2.2.17-3+b1 gpgsm_2.2.17-3+b1 gpgv_2.2.17-3+b1 grep_3.3-1 groff-base_1.22.4-3 gzip_1.9-3 hostname_3.23 init-system-helpers_1.57 intltool-debian_0.35.0+20060710.5 iproute2_5.3.0-1 iputils-ping_3:20190709-2 libacl1_2.2.53-5 libapt-pkg5.0_1.8.4 libarchive-zip-perl_1.67-1 libasan5_9.2.1-17+rpi1 libassuan0_2.5.3-7 libatomic1_9.2.1-17+rpi1 libattr1_1:2.4.48-5 libaudit-common_1:2.8.5-2 libaudit1_1:2.8.5-2 libbinutils_2.33.1-2+rpi1 libblkid1_2.34-0.1 libbsd0_0.10.0-1 libbz2-1.0_1.0.8-2 libc-bin_2.29-2+rpi1 libc-dev-bin_2.29-2+rpi1 libc6_2.29-2+rpi1 libc6-dev_2.29-2+rpi1 libcap-ng0_0.7.9-2.1 libcap2_1:2.27-1 libcap2-bin_1:2.27-1 libcc1-0_9.2.1-17+rpi1 libcom-err2_1.45.4-1 libcroco3_0.6.13-1 libdb5.3_5.3.28+dfsg1-0.6 libdebconfclient0_0.250 libdebhelper-perl_12.7.1 libdpkg-perl_1.19.7 libelf1_0.176-1.1 libext2fs2_1.45.4-1 libfakeroot_1.24-1 libfdisk1_2.34-0.1 libffi6_3.2.1-9 libfile-stripnondeterminism-perl_1.6.3-1 
libgcc-9-dev_9.2.1-17+rpi1 libgcc1_1:9.2.1-17+rpi1 libgcrypt20_1.8.5-3 libgdbm-compat4_1.18.1-5 libgdbm6_1.18.1-5 libglib2.0-0_2.62.2-3 libgmp10_2:6.1.2+dfsg-4 libgnutls30_3.6.10-4 libgomp1_9.2.1-17+rpi1 libgpg-error0_1.36-7 libhogweed5_3.5.1+really3.5.1-2 libicu63_63.2-2 libidn2-0_2.2.0-2 libisl19_0.20-2 libisl21_0.21-2 libjs-jquery_3.3.1~dfsg-3 libjs-jquery-ui_1.12.1+dfsg-5 libksba8_1.3.5-2 libldap-2.4-2_2.4.48+dfsg-1+b2 libldap-common_2.4.48+dfsg-1 liblz4-1_1.9.2-1 liblzma5_5.2.4-1 libmagic-mgc_1:5.37-6 libmagic1_1:5.37-6 libmnl0_1.0.4-2 libmount1_2.34-0.1 libmpc3_1.1.0-1 libmpfr6_4.0.2-1 libncurses6_6.1+20191019-1 libncursesw6_6.1+20191019-1 libnettle7_3.5.1+really3.5.1-2 libnpth0_1.6-1 libp11-kit0_0.23.18.1-2 libpam-cap_1:2.27-1 libpam-modules_1.3.1-5 libpam-modules-bin_1.3.1-5 libpam-runtime_1.3.1-5 libpam0g_1.3.1-5 libpcre2-8-0_10.32-5 libpcre3_2:8.39-12 libperl5.30_5.30.0-9 libpipeline1_1.5.1-2 libprocps7_2:3.3.15-2 libprotobuf-dev_3.6.1.3-2+rpi1 libprotobuf-lite17_3.6.1.3-2+rpi1 libprotobuf17_3.6.1.3-2+rpi1 libprotoc17_3.6.1.3-2+rpi1 libreadline7_7.0-5 libreadline8_8.0-3 libsasl2-2_2.1.27+dfsg-1+b1 libsasl2-dev_2.1.27+dfsg-1+b1 libsasl2-modules-db_2.1.27+dfsg-1+b1 libseccomp-dev_2.4.1-2+rpi1 libseccomp2_2.4.1-2+rpi1 libselinux1_2.9-2 libsemanage-common_2.9-3 libsemanage1_2.9-3 libsepol1_2.9-2 libsigsegv2_2.12-2 libsmartcols1_2.34-0.1 libsqlite3-0_3.30.1-1 libss2_1.45.4-1 libssl1.1_1.1.1d-2 libstdc++-9-dev_9.2.1-17+rpi1 libstdc++6_9.2.1-17+rpi1 libsub-override-perl_0.09-2 libsystemd-dev_243-8+rpi1 libsystemd0_243-8+rpi1 libtasn1-6_4.14-3 libtinfo5_6.1+20191019-1 libtinfo6_6.1+20191019-1 libtool_2.4.6-11 libubsan1_9.2.1-17+rpi1 libuchardet0_0.0.6-3 libudev1_242-7+rpi1 libunistring2_0.9.10-2 libuuid1_2.34-0.1 libxml2_2.9.4+dfsg1-8 libxtables12_1.8.3-2 libzstd1_1.4.3+dfsg-1+rpi1 linux-libc-dev_5.2.17-1+rpi1+b2 login_1:4.7-2 logsave_1.45.4-1 lsb-base_11.1.0+rpi1 lsof_4.93.2+dfsg-1 m4_1.4.18-4 make_4.2.1-1.2 man-db_2.9.0-1 mawk_1.3.3-17 mockery_0.0~git20181123.e78b021-2 mount_2.34-0.1 ncurses-base_6.1+20191019-1 ncurses-bin_6.1+20191019-1 netbase_5.6 openssl_1.1.1d-2 passwd_1:4.7-2 patch_2.7.6-6 perl_5.30.0-9 perl-base_5.30.0-9 perl-modules-5.30_5.30.0-9 pinentry-curses_1.1.0-3 pkg-config_0.29-6 po-debconf_1.0.21 procps_2:3.3.15-2 protobuf-compiler_3.6.1.3-2+rpi1 raspbian-archive-keyring_20120528.2 readline-common_8.0-3 sbuild-build-depends-consul-dummy_0.invalid.0 sbuild-build-depends-core-dummy_0.invalid.0 sed_4.7-1 sensible-utils_0.0.12 sysvinit-utils_2.96-1 tar_1.30+dfsg-6 tzdata_2019c-3 util-linux_2.34-0.1 xz-utils_5.2.4-1 zlib1g_1:1.2.11.dfsg-1 zlib1g-dev_1:1.2.11.dfsg-1

+------------------------------------------------------------------------------+
| Build                                                                        |
+------------------------------------------------------------------------------+


Unpack source
-------------

gpgv: unknown type of key resource 'trustedkeys.kbx'
gpgv: keyblock resource '/sbuild-nonexistent/.gnupg/trustedkeys.kbx': General error
gpgv: Signature made Mon Nov 18 23:52:18 2019 UTC
gpgv:                using RSA key 50BC7CF939D20C272A6B065652B6BBD953968D1B
gpgv: Can't check signature: No public key
dpkg-source: warning: failed to verify signature on ./consul_1.4.4~dfsg3-5.dsc
dpkg-source: info: extracting consul in /<<PKGBUILDDIR>>
dpkg-source: info: unpacking consul_1.4.4~dfsg3.orig.tar.xz
dpkg-source: info: unpacking consul_1.4.4~dfsg3-5.debian.tar.xz
dpkg-source: info: using patch list from debian/patches/series
dpkg-source: info: applying hclog.patch
dpkg-source: info: applying provider-no-k8s.patch
dpkg-source: info: applying t-fix--agent-session_endpoint.patch
dpkg-source: info: applying t-skip-unreliable-tests.patch

Check disc space
----------------

Sufficient free space for build

User Environment
----------------

APT_CONFIG=/var/lib/sbuild/apt.conf
DEB_BUILD_OPTIONS=parallel=4
HOME=/sbuild-nonexistent
LC_ALL=POSIX
LOGNAME=buildd
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games
SCHROOT_ALIAS_NAME=bullseye-staging-armhf-sbuild
SCHROOT_CHROOT_NAME=bullseye-staging-armhf-sbuild
SCHROOT_COMMAND=env
SCHROOT_GID=109
SCHROOT_GROUP=buildd
SCHROOT_SESSION_ID=bullseye-staging-armhf-sbuild-b7f9e9db-7fc6-4ae5-94b0-e4f33a957447
SCHROOT_UID=104
SCHROOT_USER=buildd
SHELL=/bin/sh
TERM=linux
USER=buildd
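
Note: DEB_BUILD_OPTIONS=parallel=4 in the environment above is what requests a parallel build; for this package the dh/dh-golang sequence consumes it automatically. As a point of reference only, the conventional debian/rules pattern from Debian Policy for honouring it by hand looks roughly like the following sketch (not taken from this package's actual rules file):

    ifneq (,$(filter parallel=%,$(DEB_BUILD_OPTIONS)))
        NUMJOBS = $(patsubst parallel=%,%,$(filter parallel=%,$(DEB_BUILD_OPTIONS)))
        MAKEFLAGS += -j$(NUMJOBS)
    endif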

dpkg-buildpackage
-----------------

dpkg-buildpackage: info: source package consul
dpkg-buildpackage: info: source version 1.4.4~dfsg3-5
dpkg-buildpackage: info: source distribution unstable
 dpkg-source --before-build .
dpkg-buildpackage: info: host architecture armhf
 fakeroot debian/rules clean
dh clean --buildsystem=golang --with=golang,bash-completion --builddirectory=_build
   dh_auto_clean -O--buildsystem=golang -O--builddirectory=_build
   dh_autoreconf_clean -O--buildsystem=golang -O--builddirectory=_build
   debian/rules override_dh_clean
make[1]: Entering directory '/<<PKGBUILDDIR>>'
dh_clean
## Remove Files-Excluded (when built from checkout or non-DFSG tarball):
rm -f -rv `perl -0nE 'say $1 if m{^Files\-Excluded\:\s*(.*?)(?:\n\n|Files:|Comment:)}sm;' debian/copyright`
find vendor -type d -empty -delete -print
vendor/github.com/Azure
vendor/github.com/DataDog
vendor/github.com/Jeffail
vendor/github.com/Microsoft
vendor/github.com/NYTimes
vendor/github.com/SAP
vendor/github.com/SermoDigital
vendor/github.com/Sirupsen
vendor/github.com/StackExchange
vendor/github.com/armon
vendor/github.com/asaskevich
vendor/github.com/aws
vendor/github.com/beorn7
vendor/github.com/bgentry
vendor/github.com/boltdb
vendor/github.com/circonus-labs
vendor/github.com/davecgh
vendor/github.com/denisenkom
vendor/github.com/denverdino
vendor/github.com/dgrijalva
vendor/github.com/digitalocean
vendor/github.com/docker
vendor/github.com/elazarl
vendor/github.com/fatih
vendor/github.com/ghodss
vendor/github.com/go-ole
vendor/github.com/go-sql-driver
vendor/github.com/gocql
vendor/github.com/gogo
vendor/github.com/golang
vendor/github.com/google
vendor/github.com/googleapis
vendor/github.com/gophercloud
vendor/github.com/gregjones
vendor/github.com/hailocab
vendor/github.com/imdario
vendor/github.com/jefferai
vendor/github.com/joyent
vendor/github.com/json-iterator
vendor/github.com/keybase
vendor/github.com/kr
vendor/github.com/lib
vendor/github.com/mattn
vendor/github.com/matttproud
vendor/github.com/miekg
vendor/github.com/mitchellh
vendor/github.com/modern-go
vendor/github.com/nicolai86
vendor/github.com/packethost
vendor/github.com/pascaldekloe
vendor/github.com/patrickmn
vendor/github.com/peterbourgon
vendor/github.com/pkg
vendor/github.com/pmezard
vendor/github.com/posener
vendor/github.com/prometheus
vendor/github.com/ryanuber
vendor/github.com/shirou
vendor/github.com/softlayer
vendor/github.com/spf13
vendor/github.com/stretchr
vendor/github.com/vmware
vendor/golang.org
vendor/gopkg.in
make[1]: Leaving directory '/<<PKGBUILDDIR>>'
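
Note: the override_dh_clean output above (dh_clean, the Files-Excluded removal, and the vendor/ cleanup) suggests the clean override in debian/rules reads roughly like the sketch below. It is reconstructed from the commands echoed in this log, not copied from the actual packaging; recipe lines are tab-indented in the real file, and $$ is the make escape for the shell's $.

    override_dh_clean:
    	dh_clean
    	## Remove Files-Excluded (when built from checkout or non-DFSG tarball):
    	rm -f -rv `perl -0nE 'say $$1 if m{^Files\-Excluded\:\s*(.*?)(?:\n\n|Files:|Comment:)}sm;' debian/copyright`
    	find vendor -type d -empty -delete -print
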
 debian/rules build-arch
dh build-arch --buildsystem=golang --with=golang,bash-completion --builddirectory=_build
   dh_update_autotools_config -a -O--buildsystem=golang -O--builddirectory=_build
   dh_autoreconf -a -O--buildsystem=golang -O--builddirectory=_build
   debian/rules override_dh_auto_configure
make[1]: Entering directory '/<<PKGBUILDDIR>>'
dh_auto_configure
mkdir -v -p _build/src/github.com/keybase/
mkdir: created directory '_build/src/github.com/keybase/'
ln -sv /usr/share/gocode/src/golang.org/x/crypto  _build/src/github.com/keybase/go-crypto
'_build/src/github.com/keybase/go-crypto' -> '/usr/share/gocode/src/golang.org/x/crypto'
mkdir -v -p _build/src/github.com/SermoDigital/
mkdir: created directory '_build/src/github.com/SermoDigital/'
ln -sv /usr/share/gocode/src/gopkg.in/square/go-jose.v1  _build/src/github.com/SermoDigital/jose
'_build/src/github.com/SermoDigital/jose' -> '/usr/share/gocode/src/gopkg.in/square/go-jose.v1'
make[1]: Leaving directory '/<<PKGBUILDDIR>>'
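
Note: the dh_auto_configure override above wires up GOPATH symlinks so the vendored import paths github.com/keybase/go-crypto and github.com/SermoDigital/jose resolve to the packaged golang.org/x/crypto and go-jose.v1 sources. Reconstructed from the echoed commands, the override presumably looks roughly like this sketch (not copied from the actual rules file):

    override_dh_auto_configure:
    	dh_auto_configure
    	mkdir -v -p _build/src/github.com/keybase/
    	ln -sv /usr/share/gocode/src/golang.org/x/crypto _build/src/github.com/keybase/go-crypto
    	mkdir -v -p _build/src/github.com/SermoDigital/
    	ln -sv /usr/share/gocode/src/gopkg.in/square/go-jose.v1 _build/src/github.com/SermoDigital/jose
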
   debian/rules override_dh_auto_build
make[1]: Entering directory '/<<PKGBUILDDIR>>'
export GOPATH=/<<PKGBUILDDIR>>/_build \
        && /usr/bin/make -C _build/src/github.com/hashicorp/consul --makefile=/<<PKGBUILDDIR>>/GNUmakefile proto
make[2]: Entering directory '/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul'
protoc agent/connect/ca/plugin/*.proto --gofast_out=plugins=grpc:../../..
bash: git: command not found
bash: git: command not found
bash: git: command not found
bash: git: command not found
bash: git: command not found
bash: git: command not found
make[2]: Leaving directory '/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul'
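
Note: the proto step above regenerates the gRPC bindings with protoc/gofast before dh_auto_build compiles the tree. The repeated "bash: git: command not found" lines are most likely harmless: consul's GNUmakefile appears to shell out to git for version metadata, and git is not installed in the build chroot, so those variables simply end up empty. Reconstructed from the commands echoed above, the build override presumably reads roughly like this sketch (not copied from the actual rules file):

    override_dh_auto_build:
    	export GOPATH=$(CURDIR)/_build \
    		&& $(MAKE) -C _build/src/github.com/hashicorp/consul --makefile=$(CURDIR)/GNUmakefile proto
    	dh_auto_build -v
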
dh_auto_build -v
	cd _build && go generate -v github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/cache github.com/hashicorp/consul/agent/cache-types github.com/hashicorp/consul/agent/checks github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/connect github.com/hashicorp/consul/agent/connect/ca github.com/hashicorp/consul/agent/connect/ca/plugin github.com/hashicorp/consul/agent/consul github.com/hashicorp/consul/agent/consul/autopilot github.com/hashicorp/consul/agent/consul/fsm github.com/hashicorp/consul/agent/consul/prepared_query github.com/hashicorp/consul/agent/consul/state github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/api github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv 
github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/tls github.com/hashicorp/consul/command/tls/ca github.com/hashicorp/consul/command/tls/ca/create github.com/hashicorp/consul/command/tls/cert github.com/hashicorp/consul/command/tls/cert/create github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/freeport github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/testutil github.com/hashicorp/consul/testutil/retry github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version github.com/hashicorp/consul/watch
src/github.com/hashicorp/consul/main.go
src/github.com/hashicorp/consul/main_test.go
src/github.com/hashicorp/consul/acl/acl.go
src/github.com/hashicorp/consul/acl/acl_test.go
src/github.com/hashicorp/consul/acl/errors.go
src/github.com/hashicorp/consul/acl/policy.go
src/github.com/hashicorp/consul/acl/policy_test.go
src/github.com/hashicorp/consul/agent/acl.go
src/github.com/hashicorp/consul/agent/acl_endpoint.go
src/github.com/hashicorp/consul/agent/acl_endpoint_legacy.go
src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go
src/github.com/hashicorp/consul/agent/acl_endpoint_test.go
src/github.com/hashicorp/consul/agent/acl_test.go
src/github.com/hashicorp/consul/agent/agent.go
src/github.com/hashicorp/consul/agent/agent_endpoint.go
src/github.com/hashicorp/consul/agent/agent_endpoint_test.go
src/github.com/hashicorp/consul/agent/agent_test.go
src/github.com/hashicorp/consul/agent/bindata_assetfs.go
src/github.com/hashicorp/consul/agent/blacklist.go
src/github.com/hashicorp/consul/agent/blacklist_test.go
src/github.com/hashicorp/consul/agent/catalog_endpoint.go
src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go
src/github.com/hashicorp/consul/agent/check.go
src/github.com/hashicorp/consul/agent/config.go
src/github.com/hashicorp/consul/agent/connect_auth.go
src/github.com/hashicorp/consul/agent/connect_ca_endpoint.go
src/github.com/hashicorp/consul/agent/connect_ca_endpoint_test.go
src/github.com/hashicorp/consul/agent/coordinate_endpoint.go
src/github.com/hashicorp/consul/agent/coordinate_endpoint_test.go
src/github.com/hashicorp/consul/agent/dns.go
src/github.com/hashicorp/consul/agent/dns_test.go
src/github.com/hashicorp/consul/agent/enterprise_delegate_oss.go
src/github.com/hashicorp/consul/agent/event_endpoint.go
src/github.com/hashicorp/consul/agent/event_endpoint_test.go
src/github.com/hashicorp/consul/agent/health_endpoint.go
src/github.com/hashicorp/consul/agent/health_endpoint_test.go
src/github.com/hashicorp/consul/agent/http.go
src/github.com/hashicorp/consul/agent/http_oss.go
src/github.com/hashicorp/consul/agent/http_oss_test.go
src/github.com/hashicorp/consul/agent/http_test.go
src/github.com/hashicorp/consul/agent/intentions_endpoint.go
src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go
src/github.com/hashicorp/consul/agent/keyring.go
src/github.com/hashicorp/consul/agent/keyring_test.go
src/github.com/hashicorp/consul/agent/kvs_endpoint.go
src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go
src/github.com/hashicorp/consul/agent/notify.go
src/github.com/hashicorp/consul/agent/notify_test.go
src/github.com/hashicorp/consul/agent/operator_endpoint.go
src/github.com/hashicorp/consul/agent/operator_endpoint_test.go
src/github.com/hashicorp/consul/agent/prepared_query_endpoint.go
src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go
src/github.com/hashicorp/consul/agent/remote_exec.go
src/github.com/hashicorp/consul/agent/remote_exec_test.go
src/github.com/hashicorp/consul/agent/retry_join.go
src/github.com/hashicorp/consul/agent/session_endpoint.go
src/github.com/hashicorp/consul/agent/session_endpoint_test.go
src/github.com/hashicorp/consul/agent/sidecar_service.go
src/github.com/hashicorp/consul/agent/sidecar_service_test.go
src/github.com/hashicorp/consul/agent/signal_unix.go
src/github.com/hashicorp/consul/agent/snapshot_endpoint.go
src/github.com/hashicorp/consul/agent/snapshot_endpoint_test.go
src/github.com/hashicorp/consul/agent/status_endpoint.go
src/github.com/hashicorp/consul/agent/status_endpoint_test.go
src/github.com/hashicorp/consul/agent/testagent.go
src/github.com/hashicorp/consul/agent/testagent_test.go
src/github.com/hashicorp/consul/agent/translate_addr.go
src/github.com/hashicorp/consul/agent/txn_endpoint.go
src/github.com/hashicorp/consul/agent/txn_endpoint_test.go
src/github.com/hashicorp/consul/agent/ui_endpoint.go
src/github.com/hashicorp/consul/agent/ui_endpoint_test.go
src/github.com/hashicorp/consul/agent/user_event.go
src/github.com/hashicorp/consul/agent/user_event_test.go
src/github.com/hashicorp/consul/agent/util.go
src/github.com/hashicorp/consul/agent/util_test.go
src/github.com/hashicorp/consul/agent/watch_handler.go
src/github.com/hashicorp/consul/agent/watch_handler_test.go
src/github.com/hashicorp/consul/agent/ae/ae.go
src/github.com/hashicorp/consul/agent/ae/ae_test.go
src/github.com/hashicorp/consul/agent/ae/trigger.go
src/github.com/hashicorp/consul/agent/cache/cache.go
Generating mock for: Request in file: /<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/mock_Request.go
Generating mock for: Type in file: /<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/mock_Type.go
src/github.com/hashicorp/consul/agent/cache/cache_test.go
src/github.com/hashicorp/consul/agent/cache/entry.go
src/github.com/hashicorp/consul/agent/cache/entry_test.go
src/github.com/hashicorp/consul/agent/cache/mock_Request.go
src/github.com/hashicorp/consul/agent/cache/mock_Type.go
src/github.com/hashicorp/consul/agent/cache/request.go
src/github.com/hashicorp/consul/agent/cache/testing.go
src/github.com/hashicorp/consul/agent/cache/type.go
src/github.com/hashicorp/consul/agent/cache/watch.go
src/github.com/hashicorp/consul/agent/cache/watch_test.go
src/github.com/hashicorp/consul/agent/cache-types/catalog_services.go
src/github.com/hashicorp/consul/agent/cache-types/catalog_services_test.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_leaf.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_leaf_test.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go
src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root_test.go
src/github.com/hashicorp/consul/agent/cache-types/health_services.go
src/github.com/hashicorp/consul/agent/cache-types/health_services_test.go
src/github.com/hashicorp/consul/agent/cache-types/intention_match.go
src/github.com/hashicorp/consul/agent/cache-types/intention_match_test.go
src/github.com/hashicorp/consul/agent/cache-types/mock_RPC.go
src/github.com/hashicorp/consul/agent/cache-types/node_services.go
src/github.com/hashicorp/consul/agent/cache-types/node_services_test.go
src/github.com/hashicorp/consul/agent/cache-types/prepared_query.go
src/github.com/hashicorp/consul/agent/cache-types/prepared_query_test.go
src/github.com/hashicorp/consul/agent/cache-types/rpc.go
Generating mock for: RPC in file: /<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache-types/mock_RPC.go
src/github.com/hashicorp/consul/agent/cache-types/testing.go
src/github.com/hashicorp/consul/agent/checks/alias.go
src/github.com/hashicorp/consul/agent/checks/alias_test.go
src/github.com/hashicorp/consul/agent/checks/check.go
src/github.com/hashicorp/consul/agent/checks/check_test.go
src/github.com/hashicorp/consul/agent/checks/docker.go
src/github.com/hashicorp/consul/agent/checks/docker_unix.go
src/github.com/hashicorp/consul/agent/checks/grpc.go
src/github.com/hashicorp/consul/agent/checks/grpc_test.go
src/github.com/hashicorp/consul/agent/config/builder.go
src/github.com/hashicorp/consul/agent/config/config.go
src/github.com/hashicorp/consul/agent/config/default.go
src/github.com/hashicorp/consul/agent/config/default_oss.go
src/github.com/hashicorp/consul/agent/config/doc.go
src/github.com/hashicorp/consul/agent/config/flags.go
src/github.com/hashicorp/consul/agent/config/flags_test.go
src/github.com/hashicorp/consul/agent/config/flagset.go
src/github.com/hashicorp/consul/agent/config/merge.go
src/github.com/hashicorp/consul/agent/config/merge_test.go
src/github.com/hashicorp/consul/agent/config/patch_hcl.go
src/github.com/hashicorp/consul/agent/config/patch_hcl_test.go
src/github.com/hashicorp/consul/agent/config/runtime.go
src/github.com/hashicorp/consul/agent/config/runtime_test.go
src/github.com/hashicorp/consul/agent/config/segment_oss.go
src/github.com/hashicorp/consul/agent/config/segment_oss_test.go
src/github.com/hashicorp/consul/agent/config/translate.go
src/github.com/hashicorp/consul/agent/config/translate_test.go
src/github.com/hashicorp/consul/agent/connect/csr.go
src/github.com/hashicorp/consul/agent/connect/generate.go
src/github.com/hashicorp/consul/agent/connect/parsing.go
src/github.com/hashicorp/consul/agent/connect/testing_ca.go
src/github.com/hashicorp/consul/agent/connect/testing_ca_test.go
src/github.com/hashicorp/consul/agent/connect/testing_spiffe.go
src/github.com/hashicorp/consul/agent/connect/uri.go
src/github.com/hashicorp/consul/agent/connect/uri_service.go
src/github.com/hashicorp/consul/agent/connect/uri_service_test.go
src/github.com/hashicorp/consul/agent/connect/uri_signing.go
src/github.com/hashicorp/consul/agent/connect/uri_signing_test.go
src/github.com/hashicorp/consul/agent/connect/uri_test.go
src/github.com/hashicorp/consul/agent/connect/ca/mock_Provider.go
src/github.com/hashicorp/consul/agent/connect/ca/provider.go
Generating mock for: Provider in file: /<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/connect/ca/mock_Provider.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_consul.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_consul_config.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_consul_test.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_vault.go
src/github.com/hashicorp/consul/agent/connect/ca/provider_vault_test.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/client.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/plugin.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/plugin_test.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/provider.pb.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/serve.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/transport_grpc.go
src/github.com/hashicorp/consul/agent/connect/ca/plugin/transport_netrpc.go
src/github.com/hashicorp/consul/agent/consul/acl.go
src/github.com/hashicorp/consul/agent/consul/acl_client.go
src/github.com/hashicorp/consul/agent/consul/acl_endpoint.go
src/github.com/hashicorp/consul/agent/consul/acl_endpoint_legacy.go
src/github.com/hashicorp/consul/agent/consul/acl_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/acl_replication.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_legacy.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_legacy_test.go
src/github.com/hashicorp/consul/agent/consul/acl_replication_test.go
src/github.com/hashicorp/consul/agent/consul/acl_server.go
src/github.com/hashicorp/consul/agent/consul/acl_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot.go
src/github.com/hashicorp/consul/agent/consul/autopilot_oss.go
src/github.com/hashicorp/consul/agent/consul/autopilot_test.go
src/github.com/hashicorp/consul/agent/consul/catalog_endpoint.go
src/github.com/hashicorp/consul/agent/consul/catalog_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/client.go
src/github.com/hashicorp/consul/agent/consul/client_serf.go
src/github.com/hashicorp/consul/agent/consul/client_test.go
src/github.com/hashicorp/consul/agent/consul/config.go
src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go
src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/consul_ca_delegate.go
src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go
src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/enterprise_client_oss.go
src/github.com/hashicorp/consul/agent/consul/enterprise_server_oss.go
src/github.com/hashicorp/consul/agent/consul/filter.go
src/github.com/hashicorp/consul/agent/consul/filter_test.go
src/github.com/hashicorp/consul/agent/consul/flood.go
src/github.com/hashicorp/consul/agent/consul/health_endpoint.go
src/github.com/hashicorp/consul/agent/consul/health_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/helper_test.go
src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go
src/github.com/hashicorp/consul/agent/consul/intention_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/internal_endpoint.go
src/github.com/hashicorp/consul/agent/consul/internal_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/issue_test.go
src/github.com/hashicorp/consul/agent/consul/kvs_endpoint.go
src/github.com/hashicorp/consul/agent/consul/kvs_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/leader.go
src/github.com/hashicorp/consul/agent/consul/leader_oss.go
src/github.com/hashicorp/consul/agent/consul/leader_test.go
src/github.com/hashicorp/consul/agent/consul/merge.go
src/github.com/hashicorp/consul/agent/consul/merge_test.go
src/github.com/hashicorp/consul/agent/consul/operator_autopilot_endpoint.go
src/github.com/hashicorp/consul/agent/consul/operator_autopilot_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/operator_endpoint.go
src/github.com/hashicorp/consul/agent/consul/operator_raft_endpoint.go
src/github.com/hashicorp/consul/agent/consul/operator_raft_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/prepared_query_endpoint.go
src/github.com/hashicorp/consul/agent/consul/prepared_query_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/raft_rpc.go
src/github.com/hashicorp/consul/agent/consul/rpc.go
src/github.com/hashicorp/consul/agent/consul/rpc_test.go
src/github.com/hashicorp/consul/agent/consul/rtt.go
src/github.com/hashicorp/consul/agent/consul/rtt_test.go
src/github.com/hashicorp/consul/agent/consul/segment_oss.go
src/github.com/hashicorp/consul/agent/consul/serf_test.go
src/github.com/hashicorp/consul/agent/consul/server.go
src/github.com/hashicorp/consul/agent/consul/server_lookup.go
src/github.com/hashicorp/consul/agent/consul/server_lookup_test.go
src/github.com/hashicorp/consul/agent/consul/server_oss.go
src/github.com/hashicorp/consul/agent/consul/server_serf.go
src/github.com/hashicorp/consul/agent/consul/server_test.go
src/github.com/hashicorp/consul/agent/consul/session_endpoint.go
src/github.com/hashicorp/consul/agent/consul/session_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/session_timers.go
src/github.com/hashicorp/consul/agent/consul/session_timers_test.go
src/github.com/hashicorp/consul/agent/consul/session_ttl.go
src/github.com/hashicorp/consul/agent/consul/session_ttl_test.go
src/github.com/hashicorp/consul/agent/consul/snapshot_endpoint.go
src/github.com/hashicorp/consul/agent/consul/snapshot_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/stats_fetcher.go
src/github.com/hashicorp/consul/agent/consul/stats_fetcher_test.go
src/github.com/hashicorp/consul/agent/consul/status_endpoint.go
src/github.com/hashicorp/consul/agent/consul/status_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/txn_endpoint.go
src/github.com/hashicorp/consul/agent/consul/txn_endpoint_test.go
src/github.com/hashicorp/consul/agent/consul/util.go
src/github.com/hashicorp/consul/agent/consul/util_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go
src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot/promotion.go
src/github.com/hashicorp/consul/agent/consul/autopilot/promotion_test.go
src/github.com/hashicorp/consul/agent/consul/autopilot/structs.go
src/github.com/hashicorp/consul/agent/consul/autopilot/structs_test.go
src/github.com/hashicorp/consul/agent/consul/fsm/commands_oss.go
src/github.com/hashicorp/consul/agent/consul/fsm/commands_oss_test.go
src/github.com/hashicorp/consul/agent/consul/fsm/fsm.go
src/github.com/hashicorp/consul/agent/consul/fsm/fsm_test.go
src/github.com/hashicorp/consul/agent/consul/fsm/snapshot.go
src/github.com/hashicorp/consul/agent/consul/fsm/snapshot_oss.go
src/github.com/hashicorp/consul/agent/consul/fsm/snapshot_oss_test.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/template.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/template_test.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/walk.go
src/github.com/hashicorp/consul/agent/consul/prepared_query/walk_test.go
src/github.com/hashicorp/consul/agent/consul/state/acl.go
src/github.com/hashicorp/consul/agent/consul/state/acl_test.go
src/github.com/hashicorp/consul/agent/consul/state/autopilot.go
src/github.com/hashicorp/consul/agent/consul/state/autopilot_test.go
src/github.com/hashicorp/consul/agent/consul/state/catalog.go
src/github.com/hashicorp/consul/agent/consul/state/catalog_test.go
src/github.com/hashicorp/consul/agent/consul/state/connect_ca.go
src/github.com/hashicorp/consul/agent/consul/state/connect_ca_test.go
src/github.com/hashicorp/consul/agent/consul/state/coordinate.go
src/github.com/hashicorp/consul/agent/consul/state/coordinate_test.go
src/github.com/hashicorp/consul/agent/consul/state/delay.go
src/github.com/hashicorp/consul/agent/consul/state/delay_test.go
src/github.com/hashicorp/consul/agent/consul/state/graveyard.go
src/github.com/hashicorp/consul/agent/consul/state/graveyard_test.go
src/github.com/hashicorp/consul/agent/consul/state/index_connect.go
src/github.com/hashicorp/consul/agent/consul/state/index_connect_test.go
src/github.com/hashicorp/consul/agent/consul/state/intention.go
src/github.com/hashicorp/consul/agent/consul/state/intention_test.go
src/github.com/hashicorp/consul/agent/consul/state/kvs.go
src/github.com/hashicorp/consul/agent/consul/state/kvs_test.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query_index.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query_index_test.go
src/github.com/hashicorp/consul/agent/consul/state/prepared_query_test.go
src/github.com/hashicorp/consul/agent/consul/state/schema.go
src/github.com/hashicorp/consul/agent/consul/state/schema_test.go
src/github.com/hashicorp/consul/agent/consul/state/session.go
src/github.com/hashicorp/consul/agent/consul/state/session_test.go
src/github.com/hashicorp/consul/agent/consul/state/state_store.go
src/github.com/hashicorp/consul/agent/consul/state/state_store_test.go
src/github.com/hashicorp/consul/agent/consul/state/tombstone_gc.go
src/github.com/hashicorp/consul/agent/consul/state/tombstone_gc_test.go
src/github.com/hashicorp/consul/agent/consul/state/txn.go
src/github.com/hashicorp/consul/agent/consul/state/txn_test.go
src/github.com/hashicorp/consul/agent/debug/host.go
src/github.com/hashicorp/consul/agent/debug/host_test.go
src/github.com/hashicorp/consul/agent/exec/exec.go
src/github.com/hashicorp/consul/agent/exec/exec_unix.go
src/github.com/hashicorp/consul/agent/local/state.go
src/github.com/hashicorp/consul/agent/local/testing.go
src/github.com/hashicorp/consul/agent/local/state_test.go
src/github.com/hashicorp/consul/agent/metadata/build.go
src/github.com/hashicorp/consul/agent/metadata/build_test.go
src/github.com/hashicorp/consul/agent/metadata/server.go
src/github.com/hashicorp/consul/agent/metadata/server_internal_test.go
src/github.com/hashicorp/consul/agent/metadata/server_test.go
src/github.com/hashicorp/consul/agent/mock/notify.go
src/github.com/hashicorp/consul/agent/pool/conn.go
src/github.com/hashicorp/consul/agent/pool/pool.go
src/github.com/hashicorp/consul/agent/proxycfg/manager.go
src/github.com/hashicorp/consul/agent/proxycfg/manager_test.go
src/github.com/hashicorp/consul/agent/proxycfg/proxycfg.go
src/github.com/hashicorp/consul/agent/proxycfg/snapshot.go
src/github.com/hashicorp/consul/agent/proxycfg/state.go
src/github.com/hashicorp/consul/agent/proxycfg/state_test.go
src/github.com/hashicorp/consul/agent/proxycfg/testing.go
src/github.com/hashicorp/consul/agent/proxyprocess/daemon.go
src/github.com/hashicorp/consul/agent/proxyprocess/daemon_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/exitstatus_syscall.go
src/github.com/hashicorp/consul/agent/proxyprocess/manager.go
src/github.com/hashicorp/consul/agent/proxyprocess/manager_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/noop.go
src/github.com/hashicorp/consul/agent/proxyprocess/noop_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/process.go
src/github.com/hashicorp/consul/agent/proxyprocess/process_unix.go
src/github.com/hashicorp/consul/agent/proxyprocess/proxy.go
src/github.com/hashicorp/consul/agent/proxyprocess/proxy_test.go
src/github.com/hashicorp/consul/agent/proxyprocess/root.go
src/github.com/hashicorp/consul/agent/proxyprocess/snapshot.go
src/github.com/hashicorp/consul/agent/proxyprocess/test.go
src/github.com/hashicorp/consul/agent/router/manager.go
src/github.com/hashicorp/consul/agent/router/manager_internal_test.go
src/github.com/hashicorp/consul/agent/router/router.go
src/github.com/hashicorp/consul/agent/router/router_test.go
src/github.com/hashicorp/consul/agent/router/serf_adapter.go
src/github.com/hashicorp/consul/agent/router/serf_flooder.go
src/github.com/hashicorp/consul/agent/router/manager_test.go
src/github.com/hashicorp/consul/agent/structs/acl.go
src/github.com/hashicorp/consul/agent/structs/acl_cache.go
src/github.com/hashicorp/consul/agent/structs/acl_cache_test.go
src/github.com/hashicorp/consul/agent/structs/acl_legacy.go
src/github.com/hashicorp/consul/agent/structs/acl_legacy_test.go
src/github.com/hashicorp/consul/agent/structs/acl_test.go
src/github.com/hashicorp/consul/agent/structs/catalog.go
src/github.com/hashicorp/consul/agent/structs/check_definition.go
src/github.com/hashicorp/consul/agent/structs/check_definition_test.go
src/github.com/hashicorp/consul/agent/structs/check_type.go
src/github.com/hashicorp/consul/agent/structs/connect.go
src/github.com/hashicorp/consul/agent/structs/connect_ca.go
src/github.com/hashicorp/consul/agent/structs/connect_ca_test.go
src/github.com/hashicorp/consul/agent/structs/connect_proxy_config.go
src/github.com/hashicorp/consul/agent/structs/connect_proxy_config_test.go
src/github.com/hashicorp/consul/agent/structs/connect_test.go
src/github.com/hashicorp/consul/agent/structs/errors.go
src/github.com/hashicorp/consul/agent/structs/intention.go
src/github.com/hashicorp/consul/agent/structs/intention_test.go
src/github.com/hashicorp/consul/agent/structs/operator.go
src/github.com/hashicorp/consul/agent/structs/prepared_query.go
src/github.com/hashicorp/consul/agent/structs/prepared_query_test.go
src/github.com/hashicorp/consul/agent/structs/sanitize_oss.go
src/github.com/hashicorp/consul/agent/structs/service_definition.go
src/github.com/hashicorp/consul/agent/structs/service_definition_test.go
src/github.com/hashicorp/consul/agent/structs/snapshot.go
src/github.com/hashicorp/consul/agent/structs/structs.go
src/github.com/hashicorp/consul/agent/structs/structs_test.go
src/github.com/hashicorp/consul/agent/structs/testing_catalog.go
src/github.com/hashicorp/consul/agent/structs/testing_connect_proxy_config.go
src/github.com/hashicorp/consul/agent/structs/testing_intention.go
src/github.com/hashicorp/consul/agent/structs/testing_service_definition.go
src/github.com/hashicorp/consul/agent/structs/txn.go
src/github.com/hashicorp/consul/agent/systemd/notify.go
src/github.com/hashicorp/consul/agent/token/store.go
src/github.com/hashicorp/consul/agent/token/store_test.go
src/github.com/hashicorp/consul/agent/xds/clusters.go
src/github.com/hashicorp/consul/agent/xds/endpoints.go
src/github.com/hashicorp/consul/agent/xds/listeners.go
src/github.com/hashicorp/consul/agent/xds/response.go
src/github.com/hashicorp/consul/agent/xds/routes.go
src/github.com/hashicorp/consul/agent/xds/server.go
src/github.com/hashicorp/consul/agent/xds/server_test.go
src/github.com/hashicorp/consul/agent/xds/testing.go
src/github.com/hashicorp/consul/agent/xds/xds.go
src/github.com/hashicorp/consul/api/acl.go
src/github.com/hashicorp/consul/api/acl_test.go
src/github.com/hashicorp/consul/api/agent.go
src/github.com/hashicorp/consul/api/agent_test.go
src/github.com/hashicorp/consul/api/api.go
src/github.com/hashicorp/consul/api/api_test.go
src/github.com/hashicorp/consul/api/catalog.go
src/github.com/hashicorp/consul/api/catalog_test.go
src/github.com/hashicorp/consul/api/connect.go
src/github.com/hashicorp/consul/api/connect_ca.go
src/github.com/hashicorp/consul/api/connect_ca_test.go
src/github.com/hashicorp/consul/api/connect_intention.go
src/github.com/hashicorp/consul/api/connect_intention_test.go
src/github.com/hashicorp/consul/api/coordinate.go
src/github.com/hashicorp/consul/api/coordinate_test.go
src/github.com/hashicorp/consul/api/debug.go
src/github.com/hashicorp/consul/api/debug_test.go
src/github.com/hashicorp/consul/api/event.go
src/github.com/hashicorp/consul/api/event_test.go
src/github.com/hashicorp/consul/api/health.go
src/github.com/hashicorp/consul/api/health_test.go
src/github.com/hashicorp/consul/api/kv.go
src/github.com/hashicorp/consul/api/kv_test.go
src/github.com/hashicorp/consul/api/lock.go
src/github.com/hashicorp/consul/api/lock_test.go
src/github.com/hashicorp/consul/api/operator.go
src/github.com/hashicorp/consul/api/operator_area.go
src/github.com/hashicorp/consul/api/operator_autopilot.go
src/github.com/hashicorp/consul/api/operator_autopilot_test.go
src/github.com/hashicorp/consul/api/operator_keyring.go
src/github.com/hashicorp/consul/api/operator_keyring_test.go
src/github.com/hashicorp/consul/api/operator_raft.go
src/github.com/hashicorp/consul/api/operator_raft_test.go
src/github.com/hashicorp/consul/api/operator_segment.go
src/github.com/hashicorp/consul/api/prepared_query.go
src/github.com/hashicorp/consul/api/prepared_query_test.go
src/github.com/hashicorp/consul/api/raw.go
src/github.com/hashicorp/consul/api/semaphore.go
src/github.com/hashicorp/consul/api/semaphore_test.go
src/github.com/hashicorp/consul/api/session.go
src/github.com/hashicorp/consul/api/session_test.go
src/github.com/hashicorp/consul/api/snapshot.go
src/github.com/hashicorp/consul/api/snapshot_test.go
src/github.com/hashicorp/consul/api/status.go
src/github.com/hashicorp/consul/api/status_test.go
src/github.com/hashicorp/consul/api/txn.go
src/github.com/hashicorp/consul/api/txn_test.go
src/github.com/hashicorp/consul/command/commands_oss.go
src/github.com/hashicorp/consul/command/registry.go
src/github.com/hashicorp/consul/command/acl/acl.go
src/github.com/hashicorp/consul/command/acl/acl_helpers.go
src/github.com/hashicorp/consul/command/acl/agenttokens/agent_tokens.go
src/github.com/hashicorp/consul/command/acl/agenttokens/agent_tokens_test.go
src/github.com/hashicorp/consul/command/acl/bootstrap/bootstrap.go
src/github.com/hashicorp/consul/command/acl/bootstrap/bootstrap_test.go
src/github.com/hashicorp/consul/command/acl/policy/policy.go
src/github.com/hashicorp/consul/command/acl/policy/create/policy_create.go
src/github.com/hashicorp/consul/command/acl/policy/create/policy_create_test.go
src/github.com/hashicorp/consul/command/acl/policy/delete/policy_delete.go
src/github.com/hashicorp/consul/command/acl/policy/delete/policy_delete_test.go
src/github.com/hashicorp/consul/command/acl/policy/list/policy_list.go
src/github.com/hashicorp/consul/command/acl/policy/list/policy_list_test.go
src/github.com/hashicorp/consul/command/acl/policy/read/policy_read.go
src/github.com/hashicorp/consul/command/acl/policy/read/policy_read_test.go
src/github.com/hashicorp/consul/command/acl/policy/update/policy_update.go
src/github.com/hashicorp/consul/command/acl/policy/update/policy_update_test.go
src/github.com/hashicorp/consul/command/acl/rules/translate.go
src/github.com/hashicorp/consul/command/acl/rules/translate_test.go
src/github.com/hashicorp/consul/command/acl/token/token.go
src/github.com/hashicorp/consul/command/acl/token/clone/token_clone.go
src/github.com/hashicorp/consul/command/acl/token/clone/token_clone_test.go
src/github.com/hashicorp/consul/command/acl/token/create/token_create.go
src/github.com/hashicorp/consul/command/acl/token/create/token_create_test.go
src/github.com/hashicorp/consul/command/acl/token/delete/token_delete.go
src/github.com/hashicorp/consul/command/acl/token/delete/token_delete_test.go
src/github.com/hashicorp/consul/command/acl/token/list/token_list.go
src/github.com/hashicorp/consul/command/acl/token/list/token_list_test.go
src/github.com/hashicorp/consul/command/acl/token/read/token_read.go
src/github.com/hashicorp/consul/command/acl/token/read/token_read_test.go
src/github.com/hashicorp/consul/command/acl/token/update/token_update.go
src/github.com/hashicorp/consul/command/acl/token/update/token_update_test.go
src/github.com/hashicorp/consul/command/agent/agent.go
src/github.com/hashicorp/consul/command/agent/agent_test.go
src/github.com/hashicorp/consul/command/catalog/catalog.go
src/github.com/hashicorp/consul/command/catalog/catalog_test.go
src/github.com/hashicorp/consul/command/catalog/list/dc/catalog_list_datacenters.go
src/github.com/hashicorp/consul/command/catalog/list/dc/catalog_list_datacenters_test.go
src/github.com/hashicorp/consul/command/catalog/list/nodes/catalog_list_nodes.go
src/github.com/hashicorp/consul/command/catalog/list/nodes/catalog_list_nodes_test.go
src/github.com/hashicorp/consul/command/catalog/list/services/catalog_list_services.go
src/github.com/hashicorp/consul/command/catalog/list/services/catalog_list_services_test.go
src/github.com/hashicorp/consul/command/connect/connect.go
src/github.com/hashicorp/consul/command/connect/connect_test.go
src/github.com/hashicorp/consul/command/connect/ca/ca.go
src/github.com/hashicorp/consul/command/connect/ca/ca_test.go
src/github.com/hashicorp/consul/command/connect/ca/get/connect_ca_get.go
src/github.com/hashicorp/consul/command/connect/ca/get/connect_ca_get_test.go
src/github.com/hashicorp/consul/command/connect/ca/set/connect_ca_set.go
src/github.com/hashicorp/consul/command/connect/ca/set/connect_ca_set_test.go
src/github.com/hashicorp/consul/command/connect/envoy/bootstrap_tpl.go
src/github.com/hashicorp/consul/command/connect/envoy/envoy.go
src/github.com/hashicorp/consul/command/connect/envoy/envoy_test.go
src/github.com/hashicorp/consul/command/connect/envoy/exec_test.go
src/github.com/hashicorp/consul/command/connect/envoy/exec_unix.go
src/github.com/hashicorp/consul/command/connect/proxy/flag_upstreams.go
src/github.com/hashicorp/consul/command/connect/proxy/flag_upstreams_test.go
src/github.com/hashicorp/consul/command/connect/proxy/proxy.go
src/github.com/hashicorp/consul/command/connect/proxy/proxy_test.go
src/github.com/hashicorp/consul/command/connect/proxy/register.go
src/github.com/hashicorp/consul/command/connect/proxy/register_test.go
src/github.com/hashicorp/consul/command/debug/debug.go
src/github.com/hashicorp/consul/command/debug/debug_test.go
src/github.com/hashicorp/consul/command/event/event.go
src/github.com/hashicorp/consul/command/event/event_test.go
src/github.com/hashicorp/consul/command/exec/exec.go
src/github.com/hashicorp/consul/command/exec/exec_test.go
src/github.com/hashicorp/consul/command/flags/config.go
src/github.com/hashicorp/consul/command/flags/config_test.go
src/github.com/hashicorp/consul/command/flags/flag_map_value.go
src/github.com/hashicorp/consul/command/flags/flag_map_value_test.go
src/github.com/hashicorp/consul/command/flags/flag_slice_value.go
src/github.com/hashicorp/consul/command/flags/flag_slice_value_test.go
src/github.com/hashicorp/consul/command/flags/http.go
src/github.com/hashicorp/consul/command/flags/http_test.go
src/github.com/hashicorp/consul/command/flags/merge.go
src/github.com/hashicorp/consul/command/flags/usage.go
src/github.com/hashicorp/consul/command/forceleave/forceleave.go
src/github.com/hashicorp/consul/command/forceleave/forceleave_test.go
src/github.com/hashicorp/consul/command/helpers/helpers.go
src/github.com/hashicorp/consul/command/info/info.go
src/github.com/hashicorp/consul/command/info/info_test.go
src/github.com/hashicorp/consul/command/intention/intention.go
src/github.com/hashicorp/consul/command/intention/intention_test.go
src/github.com/hashicorp/consul/command/intention/check/check.go
src/github.com/hashicorp/consul/command/intention/check/check_test.go
src/github.com/hashicorp/consul/command/intention/create/create.go
src/github.com/hashicorp/consul/command/intention/create/create_test.go
src/github.com/hashicorp/consul/command/intention/delete/delete.go
src/github.com/hashicorp/consul/command/intention/delete/delete_test.go
src/github.com/hashicorp/consul/command/intention/finder/finder.go
src/github.com/hashicorp/consul/command/intention/finder/finder_test.go
src/github.com/hashicorp/consul/command/intention/get/get.go
src/github.com/hashicorp/consul/command/intention/get/get_test.go
src/github.com/hashicorp/consul/command/intention/match/match.go
src/github.com/hashicorp/consul/command/intention/match/match_test.go
src/github.com/hashicorp/consul/command/join/join.go
src/github.com/hashicorp/consul/command/join/join_test.go
src/github.com/hashicorp/consul/command/keygen/keygen.go
src/github.com/hashicorp/consul/command/keygen/keygen_test.go
src/github.com/hashicorp/consul/command/keyring/keyring.go
src/github.com/hashicorp/consul/command/keyring/keyring_test.go
src/github.com/hashicorp/consul/command/kv/kv.go
src/github.com/hashicorp/consul/command/kv/kv_test.go
src/github.com/hashicorp/consul/command/kv/del/kv_delete.go
src/github.com/hashicorp/consul/command/kv/del/kv_delete_test.go
src/github.com/hashicorp/consul/command/kv/exp/kv_export.go
src/github.com/hashicorp/consul/command/kv/exp/kv_export_test.go
src/github.com/hashicorp/consul/command/kv/get/kv_get.go
src/github.com/hashicorp/consul/command/kv/get/kv_get_test.go
src/github.com/hashicorp/consul/command/kv/imp/kv_import.go
src/github.com/hashicorp/consul/command/kv/imp/kv_import_test.go
src/github.com/hashicorp/consul/command/kv/impexp/kvimpexp.go
src/github.com/hashicorp/consul/command/kv/put/kv_put.go
src/github.com/hashicorp/consul/command/kv/put/kv_put_test.go
src/github.com/hashicorp/consul/command/leave/leave.go
src/github.com/hashicorp/consul/command/leave/leave_test.go
src/github.com/hashicorp/consul/command/lock/lock.go
src/github.com/hashicorp/consul/command/lock/lock_test.go
src/github.com/hashicorp/consul/command/lock/util_unix.go
src/github.com/hashicorp/consul/command/maint/maint.go
src/github.com/hashicorp/consul/command/maint/maint_test.go
src/github.com/hashicorp/consul/command/members/members.go
src/github.com/hashicorp/consul/command/members/members_test.go
src/github.com/hashicorp/consul/command/monitor/monitor.go
src/github.com/hashicorp/consul/command/monitor/monitor_test.go
src/github.com/hashicorp/consul/command/operator/operator.go
src/github.com/hashicorp/consul/command/operator/operator_test.go
src/github.com/hashicorp/consul/command/operator/autopilot/operator_autopilot.go
src/github.com/hashicorp/consul/command/operator/autopilot/operator_autopilot_test.go
src/github.com/hashicorp/consul/command/operator/autopilot/get/operator_autopilot_get.go
src/github.com/hashicorp/consul/command/operator/autopilot/get/operator_autopilot_get_test.go
src/github.com/hashicorp/consul/command/operator/autopilot/set/operator_autopilot_set.go
src/github.com/hashicorp/consul/command/operator/autopilot/set/operator_autopilot_set_test.go
src/github.com/hashicorp/consul/command/operator/raft/operator_raft.go
src/github.com/hashicorp/consul/command/operator/raft/operator_raft_test.go
src/github.com/hashicorp/consul/command/operator/raft/listpeers/operator_raft_list.go
src/github.com/hashicorp/consul/command/operator/raft/listpeers/operator_raft_list_test.go
src/github.com/hashicorp/consul/command/operator/raft/removepeer/operator_raft_remove.go
src/github.com/hashicorp/consul/command/operator/raft/removepeer/operator_raft_remove_test.go
src/github.com/hashicorp/consul/command/reload/reload.go
src/github.com/hashicorp/consul/command/reload/reload_test.go
src/github.com/hashicorp/consul/command/rtt/rtt.go
src/github.com/hashicorp/consul/command/rtt/rtt_test.go
src/github.com/hashicorp/consul/command/services/config.go
src/github.com/hashicorp/consul/command/services/config_test.go
src/github.com/hashicorp/consul/command/services/services.go
src/github.com/hashicorp/consul/command/services/services_test.go
src/github.com/hashicorp/consul/command/services/deregister/deregister.go
src/github.com/hashicorp/consul/command/services/deregister/deregister_test.go
src/github.com/hashicorp/consul/command/services/register/register.go
src/github.com/hashicorp/consul/command/services/register/register_test.go
src/github.com/hashicorp/consul/command/snapshot/snapshot_command.go
src/github.com/hashicorp/consul/command/snapshot/snapshot_command_test.go
src/github.com/hashicorp/consul/command/snapshot/inspect/snapshot_inspect.go
src/github.com/hashicorp/consul/command/snapshot/inspect/snapshot_inspect_test.go
src/github.com/hashicorp/consul/command/snapshot/restore/snapshot_restore.go
src/github.com/hashicorp/consul/command/snapshot/restore/snapshot_restore_test.go
src/github.com/hashicorp/consul/command/snapshot/save/snapshot_save.go
src/github.com/hashicorp/consul/command/snapshot/save/snapshot_save_test.go
src/github.com/hashicorp/consul/command/tls/generate.go
src/github.com/hashicorp/consul/command/tls/generate_test.go
src/github.com/hashicorp/consul/command/tls/tls.go
src/github.com/hashicorp/consul/command/tls/tls_test.go
src/github.com/hashicorp/consul/command/tls/ca/tls_ca.go
src/github.com/hashicorp/consul/command/tls/ca/tls_ca_test.go
src/github.com/hashicorp/consul/command/tls/ca/create/tls_ca_create.go
src/github.com/hashicorp/consul/command/tls/ca/create/tls_ca_create_test.go
src/github.com/hashicorp/consul/command/tls/cert/tls_cert.go
src/github.com/hashicorp/consul/command/tls/cert/tls_cert_test.go
src/github.com/hashicorp/consul/command/tls/cert/create/tls_cert_create.go
src/github.com/hashicorp/consul/command/tls/cert/create/tls_cert_create_test.go
src/github.com/hashicorp/consul/command/validate/validate.go
src/github.com/hashicorp/consul/command/validate/validate_test.go
src/github.com/hashicorp/consul/command/version/version.go
src/github.com/hashicorp/consul/command/version/version_test.go
src/github.com/hashicorp/consul/command/watch/watch.go
src/github.com/hashicorp/consul/command/watch/watch_test.go
src/github.com/hashicorp/consul/connect/example_test.go
src/github.com/hashicorp/consul/connect/resolver.go
src/github.com/hashicorp/consul/connect/resolver_test.go
src/github.com/hashicorp/consul/connect/service.go
src/github.com/hashicorp/consul/connect/service_test.go
src/github.com/hashicorp/consul/connect/testing.go
src/github.com/hashicorp/consul/connect/tls.go
src/github.com/hashicorp/consul/connect/tls_test.go
src/github.com/hashicorp/consul/connect/certgen/certgen.go
src/github.com/hashicorp/consul/connect/proxy/config.go
src/github.com/hashicorp/consul/connect/proxy/config_test.go
src/github.com/hashicorp/consul/connect/proxy/conn.go
src/github.com/hashicorp/consul/connect/proxy/conn_test.go
src/github.com/hashicorp/consul/connect/proxy/listener.go
src/github.com/hashicorp/consul/connect/proxy/listener_test.go
src/github.com/hashicorp/consul/connect/proxy/proxy.go
src/github.com/hashicorp/consul/connect/proxy/proxy_test.go
src/github.com/hashicorp/consul/connect/proxy/testing.go
src/github.com/hashicorp/consul/ipaddr/detect.go
src/github.com/hashicorp/consul/ipaddr/detect_test.go
src/github.com/hashicorp/consul/ipaddr/ipaddr.go
src/github.com/hashicorp/consul/lib/cluster.go
src/github.com/hashicorp/consul/lib/cluster_test.go
src/github.com/hashicorp/consul/lib/eof.go
src/github.com/hashicorp/consul/lib/math.go
src/github.com/hashicorp/consul/lib/path.go
src/github.com/hashicorp/consul/lib/rand.go
src/github.com/hashicorp/consul/lib/rtt.go
src/github.com/hashicorp/consul/lib/rtt_test.go
src/github.com/hashicorp/consul/lib/serf.go
src/github.com/hashicorp/consul/lib/stop_context.go
src/github.com/hashicorp/consul/lib/string.go
src/github.com/hashicorp/consul/lib/string_test.go
src/github.com/hashicorp/consul/lib/telemetry.go
src/github.com/hashicorp/consul/lib/telemetry_test.go
src/github.com/hashicorp/consul/lib/useragent.go
src/github.com/hashicorp/consul/lib/useragent_test.go
src/github.com/hashicorp/consul/lib/uuid.go
src/github.com/hashicorp/consul/lib/math_test.go
src/github.com/hashicorp/consul/lib/file/atomic.go
src/github.com/hashicorp/consul/lib/file/atomic_test.go
src/github.com/hashicorp/consul/lib/freeport/freeport.go
src/github.com/hashicorp/consul/lib/semaphore/semaphore.go
src/github.com/hashicorp/consul/lib/semaphore/semaphore_test.go
src/github.com/hashicorp/consul/logger/gated_writer.go
src/github.com/hashicorp/consul/logger/gated_writer_test.go
src/github.com/hashicorp/consul/logger/grpc.go
src/github.com/hashicorp/consul/logger/grpc_test.go
src/github.com/hashicorp/consul/logger/log_levels.go
src/github.com/hashicorp/consul/logger/log_writer.go
src/github.com/hashicorp/consul/logger/log_writer_test.go
src/github.com/hashicorp/consul/logger/logfile.go
src/github.com/hashicorp/consul/logger/logfile_test.go
src/github.com/hashicorp/consul/logger/logger.go
src/github.com/hashicorp/consul/logger/syslog.go
src/github.com/hashicorp/consul/sentinel/evaluator.go
src/github.com/hashicorp/consul/sentinel/scope.go
src/github.com/hashicorp/consul/sentinel/sentinel_oss.go
src/github.com/hashicorp/consul/service_os/service.go
src/github.com/hashicorp/consul/snapshot/archive.go
src/github.com/hashicorp/consul/snapshot/archive_test.go
src/github.com/hashicorp/consul/snapshot/snapshot.go
src/github.com/hashicorp/consul/snapshot/snapshot_test.go
src/github.com/hashicorp/consul/testrpc/wait.go
src/github.com/hashicorp/consul/testutil/io.go
src/github.com/hashicorp/consul/testutil/server.go
src/github.com/hashicorp/consul/testutil/server_methods.go
src/github.com/hashicorp/consul/testutil/server_wrapper.go
src/github.com/hashicorp/consul/testutil/testlog.go
src/github.com/hashicorp/consul/testutil/retry/retry.go
src/github.com/hashicorp/consul/testutil/retry/retry_test.go
src/github.com/hashicorp/consul/tlsutil/config.go
src/github.com/hashicorp/consul/tlsutil/config_test.go
src/github.com/hashicorp/consul/types/area.go
src/github.com/hashicorp/consul/types/checks.go
src/github.com/hashicorp/consul/types/node_id.go
src/github.com/hashicorp/consul/version/version.go
src/github.com/hashicorp/consul/watch/funcs.go
src/github.com/hashicorp/consul/watch/plan.go
src/github.com/hashicorp/consul/watch/plan_test.go
src/github.com/hashicorp/consul/watch/watch.go
src/github.com/hashicorp/consul/watch/watch_test.go
src/github.com/hashicorp/consul/watch/funcs_test.go
	cd _build && go install -gcflags=all=\"-trimpath=/<<BUILDDIR>>/consul-1.4.4\~dfsg3/_build/src\" -asmflags=all=\"-trimpath=/<<BUILDDIR>>/consul-1.4.4\~dfsg3/_build/src\" -v -p 4 github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/cache github.com/hashicorp/consul/agent/cache-types github.com/hashicorp/consul/agent/checks github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/connect github.com/hashicorp/consul/agent/connect/ca github.com/hashicorp/consul/agent/connect/ca/plugin github.com/hashicorp/consul/agent/consul github.com/hashicorp/consul/agent/consul/autopilot github.com/hashicorp/consul/agent/consul/fsm github.com/hashicorp/consul/agent/consul/prepared_query github.com/hashicorp/consul/agent/consul/state github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/api github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match 
github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/tls github.com/hashicorp/consul/command/tls/ca github.com/hashicorp/consul/command/tls/ca/create github.com/hashicorp/consul/command/tls/cert github.com/hashicorp/consul/command/tls/cert/create github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/freeport github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/testutil github.com/hashicorp/consul/testutil/retry github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version github.com/hashicorp/consul/watch
runtime/internal/sys
internal/cpu
math/bits
unicode/utf8
runtime/internal/math
internal/race
internal/bytealg
runtime/internal/atomic
sync/atomic
math
unicode
internal/testlog
encoding
unicode/utf16
runtime
container/list
crypto/internal/subtle
crypto/subtle
vendor/golang.org/x/crypto/cryptobyte/asn1
internal/nettrace
runtime/cgo
vendor/golang.org/x/crypto/internal/subtle
github.com/circonus-labs/circonus-gometrics/api/config
golang.org/x/net/internal/iana
github.com/hashicorp/consul/types
github.com/aws/aws-sdk-go/aws/client/metadata
go.opencensus.io
go.opencensus.io/trace/internal
go.opencensus.io/internal/tagencoding
github.com/hashicorp/consul/service_os
github.com/hashicorp/consul/vendor/github.com/oklog/run
internal/reflectlite
sync
internal/singleflight
math/rand
github.com/hashicorp/consul/agent/token
golang.org/x/sync/singleflight
google.golang.org/grpc/internal/grpcsync
errors
sort
internal/oserror
io
strconv
syscall
vendor/golang.org/x/net/dns/dnsmessage
bytes
strings
reflect
bufio
github.com/armon/go-radix
hash
crypto
crypto/internal/randutil
crypto/hmac
crypto/rc4
vendor/golang.org/x/crypto/hkdf
hash/crc32
vendor/golang.org/x/text/transform
time
internal/syscall/unix
path
github.com/hashicorp/golang-lru/simplelru
github.com/hashicorp/hcl/hcl/strconv
github.com/hashicorp/go-immutable-radix
regexp/syntax
text/tabwriter
container/heap
github.com/beorn7/perks/quantile
github.com/prometheus/common/internal/bitbucket.org/ww/goautoneg
html
encoding/base32
internal/poll
context
regexp
hash/crc64
hash/fnv
os
github.com/kr/text
github.com/posener/complete/match
internal/fmtsort
encoding/binary
github.com/hashicorp/errwrap
github.com/mitchellh/reflectwalk
github.com/ryanuber/go-glob
github.com/mitchellh/copystructure
golang.org/x/text/transform
encoding/base64
crypto/cipher
crypto/sha512
fmt
crypto/ed25519/internal/edwards25519
crypto/md5
crypto/aes
crypto/des
crypto/sha1
crypto/sha256
encoding/pem
path/filepath
net
vendor/golang.org/x/crypto/internal/chacha20
encoding/json
math/big
encoding/hex
io/ioutil
net/url
vendor/golang.org/x/crypto/poly1305
vendor/golang.org/x/crypto/chacha20poly1305
vendor/golang.org/x/crypto/curve25519
compress/flate
log
vendor/golang.org/x/text/unicode/bidi
compress/gzip
vendor/golang.org/x/text/secure/bidirule
vendor/golang.org/x/text/unicode/norm
vendor/golang.org/x/net/http2/hpack
crypto/elliptic
encoding/asn1
crypto/rand
crypto/ed25519
crypto/rsa
crypto/ecdsa
crypto/dsa
crypto/x509/pkix
vendor/golang.org/x/crypto/cryptobyte
vendor/golang.org/x/net/idna
mime
mime/quotedprintable
net/http/internal
os/signal
github.com/hashicorp/hcl/hcl/token
golang.org/x/crypto/blake2b
github.com/hashicorp/go-hclog
github.com/hashicorp/hcl/hcl/ast
github.com/hashicorp/hcl/hcl/scanner
github.com/hashicorp/hcl/json/token
github.com/hashicorp/hcl/hcl/parser
github.com/hashicorp/hcl/json/scanner
github.com/pkg/errors
github.com/hashicorp/hcl/json/parser
github.com/hashicorp/hcl/hcl/printer
github.com/circonus-labs/circonusllhist
github.com/hashicorp/hcl
github.com/cespare/xxhash
github.com/golang/protobuf/proto
github.com/prometheus/common/model
github.com/prometheus/procfs/internal/fs
crypto/x509
net/textproto
vendor/golang.org/x/net/http/httpguts
vendor/golang.org/x/net/http/httpproxy
mime/multipart
github.com/mitchellh/mapstructure
github.com/DataDog/datadog-go/statsd
github.com/prometheus/procfs
runtime/debug
github.com/hashicorp/consul/version
github.com/hashicorp/go-uuid
crypto/tls
encoding/gob
go/token
text/template/parse
compress/lzw
github.com/google/btree
text/template
github.com/hashicorp/go-multierror
os/exec
github.com/prometheus/client_model/go
github.com/matttproud/golang_protobuf_extensions/pbutil
github.com/prometheus/client_golang/prometheus/internal
github.com/hashicorp/go-sockaddr
golang.org/x/crypto/ed25519
golang.org/x/net/bpf
golang.org/x/sys/unix
html/template
net/http/httptrace
net/http
github.com/hashicorp/go-rootcerts
text/scanner
github.com/hashicorp/memberlist/vendor/github.com/sean-/seed
github.com/hashicorp/yamux
github.com/mitchellh/go-testing-interface
github.com/davecgh/go-spew/spew
github.com/pmezard/go-difflib/difflib
golang.org/x/net/internal/socket
github.com/stretchr/objx
gopkg.in/yaml.v2
golang.org/x/net/ipv4
golang.org/x/net/ipv6
flag
github.com/hashicorp/go-version
runtime/trace
testing
github.com/miekg/dns
github.com/hashicorp/golang-lru
github.com/mitchellh/hashstructure
github.com/bgentry/speakeasy
github.com/mattn/go-isatty
github.com/mattn/go-colorable
os/user
github.com/fatih/color
github.com/hashicorp/consul/command/helpers
github.com/armon/circbuf
github.com/hashicorp/hil/ast
github.com/hashicorp/hil
github.com/hashicorp/go-memdb
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/hclutil
github.com/golang/snappy
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/compressutil
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/jsonutil
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/strutil
github.com/posener/complete/cmd/install
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/helper/parseutil
golang.org/x/text/unicode/bidi
github.com/posener/complete/cmd
github.com/posener/complete
github.com/mitchellh/cli
github.com/hashicorp/go-cleanhttp
github.com/armon/go-metrics
github.com/hashicorp/go-retryablehttp
github.com/tv42/httpunix
expvar
github.com/circonus-labs/circonus-gometrics/api
github.com/prometheus/common/expfmt
github.com/hashicorp/serf/coordinate
github.com/hashicorp/consul/api
github.com/armon/go-metrics/datadog
github.com/prometheus/client_golang/prometheus
github.com/circonus-labs/circonus-gometrics/checkmgr
github.com/circonus-labs/circonus-gometrics
net/rpc
github.com/armon/go-metrics/circonus
github.com/hashicorp/go-msgpack/codec
net/http/httptest
github.com/hashicorp/consul/sentinel
github.com/hashicorp/consul/acl
github.com/stretchr/testify/assert
github.com/armon/go-metrics/prometheus
github.com/hashicorp/consul/command/flags
github.com/hashicorp/consul/command/acl/agenttokens
github.com/hashicorp/consul/command/acl/policy
github.com/hashicorp/consul/command/acl/token
github.com/NYTimes/gziphandler
github.com/hashicorp/consul/vendor/github.com/coredns/coredns/plugin/pkg/dnsutil
github.com/elazarl/go-bindata-assetfs
golang.org/x/text/secure/bidirule
github.com/hashicorp/memberlist
github.com/hashicorp/raft
github.com/stretchr/testify/mock
github.com/stretchr/testify/require
golang.org/x/text/unicode/norm
golang.org/x/net/http2/hpack
golang.org/x/net/context
golang.org/x/time/rate
github.com/hashicorp/consul/tlsutil
github.com/hashicorp/serf/serf
github.com/hashicorp/net-rpc-msgpackrpc
github.com/hashicorp/consul/ipaddr
golang.org/x/net/idna
github.com/hashicorp/consul/lib/semaphore
archive/tar
golang.org/x/net/http/httpguts
golang.org/x/net/http2
github.com/boltdb/bolt
github.com/hashicorp/consul/lib
github.com/hashicorp/consul/agent/consul/autopilot
github.com/hashicorp/consul/agent/cache
github.com/hashicorp/consul/agent/ae
github.com/hashicorp/consul/agent/pool
github.com/hashicorp/consul/snapshot
github.com/hashicorp/raft-boltdb
golang.org/x/net/internal/socks
github.com/hashicorp/consul/agent/structs
github.com/hashicorp/consul/agent/exec
golang.org/x/net/proxy
golang.org/x/net/internal/timeseries
github.com/docker/go-connections/sockets
golang.org/x/net/trace
google.golang.org/grpc/grpclog
google.golang.org/grpc/connectivity
google.golang.org/grpc/credentials/internal
google.golang.org/grpc/credentials
google.golang.org/grpc/internal
google.golang.org/grpc/metadata
google.golang.org/grpc/serviceconfig
google.golang.org/grpc/resolver
google.golang.org/grpc/internal/grpcrand
google.golang.org/grpc/balancer
google.golang.org/grpc/codes
google.golang.org/grpc/encoding
google.golang.org/grpc/balancer/base
google.golang.org/grpc/encoding/proto
github.com/hashicorp/consul/command/acl
github.com/hashicorp/consul/agent/connect
github.com/hashicorp/consul/agent/consul/prepared_query
github.com/hashicorp/consul/vendor/github.com/hashicorp/vault/api
github.com/hashicorp/consul/command/acl/bootstrap
github.com/hashicorp/consul/command/acl/policy/create
github.com/hashicorp/consul/command/acl/policy/delete
github.com/hashicorp/consul/command/acl/policy/list
github.com/hashicorp/consul/command/acl/policy/read
github.com/hashicorp/consul/command/acl/policy/update
github.com/hashicorp/consul/command/acl/rules
github.com/hashicorp/consul/command/acl/token/clone
github.com/hashicorp/consul/command/acl/token/create
github.com/hashicorp/consul/command/acl/token/delete
github.com/hashicorp/consul/command/acl/token/list
github.com/hashicorp/consul/command/acl/token/read
github.com/hashicorp/consul/command/acl/token/update
github.com/hashicorp/consul/agent/consul/state
github.com/hashicorp/consul/agent/metadata
google.golang.org/grpc/balancer/roundrobin
google.golang.org/grpc/internal/backoff
github.com/hashicorp/consul/agent/router
google.golang.org/grpc/internal/balancerload
github.com/golang/protobuf/ptypes/any
github.com/golang/protobuf/ptypes/duration
github.com/golang/protobuf/ptypes/timestamp
github.com/golang/protobuf/ptypes
google.golang.org/grpc/binarylog/grpc_binarylog_v1
google.golang.org/genproto/googleapis/rpc/status
google.golang.org/grpc/internal/channelz
google.golang.org/grpc/status
google.golang.org/grpc/internal/envconfig
google.golang.org/grpc/internal/syscall
google.golang.org/grpc/internal/binarylog
google.golang.org/grpc/keepalive
google.golang.org/grpc/peer
google.golang.org/grpc/stats
google.golang.org/grpc/tap
google.golang.org/grpc/naming
google.golang.org/grpc/internal/transport
google.golang.org/grpc/resolver/dns
google.golang.org/grpc/resolver/passthrough
net/http/httputil
github.com/hashicorp/go-sockaddr/template
github.com/shirou/gopsutil/internal/common
github.com/hashicorp/consul/agent/local
github.com/shirou/gopsutil/cpu
github.com/shirou/gopsutil/disk
github.com/shirou/gopsutil/host
github.com/hashicorp/consul/agent/connect/ca
github.com/hashicorp/consul/agent/consul/fsm
github.com/shirou/gopsutil/mem
github.com/hashicorp/consul/agent/debug
github.com/hashicorp/consul/lib/file
github.com/hashicorp/consul/agent/proxyprocess
github.com/hashicorp/consul/agent/systemd
github.com/hashicorp/consul/agent/consul
google.golang.org/grpc
github.com/gogo/protobuf/proto
github.com/gogo/protobuf/sortkeys
github.com/golang/protobuf/protoc-gen-go/descriptor
github.com/hashicorp/consul/vendor/github.com/lyft/protoc-gen-validate/validate
net/mail
google.golang.org/grpc/health/grpc_health_v1
github.com/hashicorp/consul/agent/checks
github.com/hashicorp/consul/lib/freeport
log/syslog
github.com/hashicorp/go-syslog
github.com/hashicorp/logutils
github.com/hashicorp/consul/testutil/retry
github.com/hashicorp/consul/logger
github.com/hashicorp/consul/watch
github.com/denverdino/aliyungo/util
encoding/xml
github.com/aws/aws-sdk-go/aws/awserr
github.com/aws/aws-sdk-go/internal/ini
github.com/hashicorp/consul/agent/cache-types
github.com/hashicorp/consul/agent/config
github.com/hashicorp/consul/agent/proxycfg
github.com/denverdino/aliyungo/common
github.com/aws/aws-sdk-go/internal/shareddefaults
github.com/aws/aws-sdk-go/aws/credentials
github.com/aws/aws-sdk-go/aws/endpoints
github.com/denverdino/aliyungo/ecs
github.com/gogo/protobuf/protoc-gen-gogo/descriptor
github.com/gogo/protobuf/types
github.com/hashicorp/go-discover/provider/aliyun
github.com/aws/aws-sdk-go/internal/sdkio
github.com/jmespath/go-jmespath
github.com/gogo/protobuf/gogoproto
github.com/gogo/googleapis/google/api
github.com/aws/aws-sdk-go/aws/awsutil
github.com/aws/aws-sdk-go/internal/sdkrand
github.com/aws/aws-sdk-go/internal/sdkuri
github.com/aws/aws-sdk-go/aws/credentials/processcreds
golang.org/x/net/context/ctxhttp
cloud.google.com/go/compute/metadata
golang.org/x/oauth2/internal
golang.org/x/oauth2
golang.org/x/oauth2/jws
github.com/aws/aws-sdk-go/aws
google.golang.org/api/googleapi/internal/uritemplates
golang.org/x/oauth2/jwt
google.golang.org/api/googleapi
golang.org/x/oauth2/google
github.com/aws/aws-sdk-go/aws/request
google.golang.org/api/gensupport
google.golang.org/api/internal
go.opencensus.io/internal
go.opencensus.io/trace/tracestate
google.golang.org/api/option
go.opencensus.io/trace
go.opencensus.io/resource
github.com/aws/aws-sdk-go/aws/corehandlers
github.com/aws/aws-sdk-go/aws/client
github.com/aws/aws-sdk-go/private/protocol
github.com/aws/aws-sdk-go/aws/ec2metadata
github.com/aws/aws-sdk-go/aws/csm
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/core
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/type
github.com/gogo/protobuf/jsonpb
github.com/gogo/googleapis/google/rpc
github.com/aws/aws-sdk-go/aws/credentials/ec2rolecreds
github.com/aws/aws-sdk-go/private/protocol/json/jsonutil
github.com/aws/aws-sdk-go/aws/credentials/endpointcreds
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/pkg/util
github.com/aws/aws-sdk-go/private/protocol/rest
github.com/aws/aws-sdk-go/aws/defaults
github.com/aws/aws-sdk-go/private/protocol/query/queryutil
github.com/aws/aws-sdk-go/private/protocol/xml/xmlutil
go.opencensus.io/trace/propagation
go.opencensus.io/plugin/ochttp/propagation/b3
github.com/aws/aws-sdk-go/aws/signer/v4
go.opencensus.io/metric/metricdata
runtime/pprof
github.com/aws/aws-sdk-go/private/protocol/query
github.com/aws/aws-sdk-go/private/protocol/ec2query
github.com/aws/aws-sdk-go/service/sts
github.com/aws/aws-sdk-go/service/ec2
go.opencensus.io/tag
go.opencensus.io/stats/internal
go.opencensus.io/stats
go.opencensus.io/metric/metricproducer
go.opencensus.io/stats/view
github.com/aws/aws-sdk-go/service/sts/stsiface
github.com/aws/aws-sdk-go/aws/credentials/stscreds
github.com/aws/aws-sdk-go/aws/session
go.opencensus.io/plugin/ochttp
google.golang.org/api/googleapi/transport
google.golang.org/api/transport/http/internal/propagation
github.com/hashicorp/mdns
github.com/hashicorp/go-discover/provider/mdns
github.com/gophercloud/gophercloud
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/auth
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/cluster
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/endpoint
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/route
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2/listener
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/network/ext_authz/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/service/auth/v2alpha
google.golang.org/api/transport/http
google.golang.org/api/compute/v1
github.com/gophercloud/gophercloud/pagination
github.com/gophercloud/gophercloud/openstack/identity/v2/tenants
github.com/gophercloud/gophercloud/openstack/identity/v2/tokens
github.com/gophercloud/gophercloud/openstack/identity/v3/tokens
github.com/gophercloud/gophercloud/openstack/utils
github.com/gophercloud/gophercloud/openstack
github.com/gophercloud/gophercloud/openstack/compute/v2/flavors
github.com/gophercloud/gophercloud/openstack/compute/v2/images
github.com/gophercloud/gophercloud/openstack/compute/v2/servers
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/api/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/accesslog/v2
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/config/filter/network/tcp_proxy/v2
github.com/hashicorp/go-discover/provider/os
github.com/packethost/packngo
github.com/hashicorp/consul/vendor/github.com/envoyproxy/go-control-plane/envoy/service/discovery/v2
github.com/hashicorp/go-discover/provider/packet
github.com/prometheus/client_golang/prometheus/promhttp
github.com/hashicorp/consul/agent/xds
net/http/pprof
github.com/hashicorp/go-checkpoint
github.com/hashicorp/consul/command/catalog
github.com/hashicorp/consul/command/catalog/list/dc
github.com/ryanuber/columnize
github.com/hashicorp/consul/command/catalog/list/services
github.com/hashicorp/consul/command/catalog/list/nodes
github.com/hashicorp/consul/command/connect
github.com/hashicorp/consul/command/connect/ca
github.com/hashicorp/consul/command/connect/ca/get
github.com/hashicorp/consul/command/connect/ca/set
github.com/hashicorp/consul/connect
github.com/hashicorp/consul/command/debug
github.com/hashicorp/consul/command/event
github.com/hashicorp/consul/connect/proxy
github.com/hashicorp/consul/command/exec
github.com/hashicorp/consul/command/connect/proxy
github.com/hashicorp/consul/command/forceleave
github.com/hashicorp/consul/command/info
github.com/hashicorp/consul/command/connect/envoy
github.com/hashicorp/consul/command/intention
github.com/hashicorp/consul/command/intention/check
github.com/hashicorp/consul/command/intention/finder
github.com/hashicorp/consul/command/intention/create
github.com/hashicorp/consul/command/intention/delete
github.com/hashicorp/consul/command/intention/get
github.com/hashicorp/consul/command/intention/match
github.com/hashicorp/consul/command/join
github.com/hashicorp/consul/command/keygen
github.com/hashicorp/consul/command/kv
github.com/hashicorp/consul/command/kv/del
github.com/hashicorp/consul/command/kv/impexp
github.com/hashicorp/consul/command/kv/exp
github.com/hashicorp/consul/command/kv/get
github.com/hashicorp/consul/command/kv/imp
github.com/hashicorp/consul/command/kv/put
github.com/hashicorp/consul/command/leave
github.com/hashicorp/consul/command/maint
github.com/hashicorp/consul/command/members
github.com/hashicorp/consul/command/monitor
github.com/hashicorp/consul/command/operator
github.com/hashicorp/consul/command/operator/autopilot
github.com/hashicorp/consul/command/operator/autopilot/get
github.com/hashicorp/consul/command/operator/autopilot/set
github.com/hashicorp/consul/command/operator/raft
github.com/hashicorp/consul/command/operator/raft/listpeers
github.com/hashicorp/consul/command/operator/raft/removepeer
github.com/hashicorp/consul/command/reload
github.com/hashicorp/consul/command/rtt
github.com/hashicorp/consul/command/services
github.com/hashicorp/consul/command/services/deregister
github.com/hashicorp/consul/command/services/register
github.com/hashicorp/consul/command/snapshot
github.com/hashicorp/consul/command/snapshot/inspect
github.com/hashicorp/consul/command/snapshot/restore
github.com/hashicorp/consul/command/snapshot/save
github.com/hashicorp/consul/command/tls
github.com/hashicorp/consul/command/tls/ca
github.com/hashicorp/consul/command/tls/cert
github.com/hashicorp/consul/command/validate
github.com/hashicorp/consul/command/tls/ca/create
github.com/hashicorp/consul/command/tls/cert/create
github.com/hashicorp/consul/command/version
google.golang.org/grpc/health
github.com/hashicorp/consul/agent/mock
github.com/hashicorp/consul/vendor/github.com/hashicorp/go-plugin
github.com/hashicorp/consul/connect/certgen
github.com/hashicorp/consul/agent/connect/ca/plugin
github.com/hashicorp/consul/testrpc
github.com/hashicorp/consul/testutil
github.com/hashicorp/go-discover/provider/aws
github.com/hashicorp/go-discover/provider/gce
github.com/hashicorp/go-discover
github.com/hashicorp/consul/agent
github.com/hashicorp/consul/command/keyring
github.com/hashicorp/consul/command/watch
github.com/hashicorp/consul/command/agent
github.com/hashicorp/consul/command/lock
github.com/hashicorp/consul/command
github.com/hashicorp/consul
make[1]: Leaving directory '/<<PKGBUILDDIR>>'
   debian/rules override_dh_auto_test
make[1]: Entering directory '/<<PKGBUILDDIR>>'
PATH="/<<PKGBUILDDIR>>/_build/bin:${PATH}" \
        DH_GOLANG_EXCLUDES=" api agent/checks agent/connect agent/consul command/tls" \
        dh_auto_test -v --max-parallel=4 -- -short -failfast -timeout 5m
	cd _build && go test -vet=off -v -p 4 -short -failfast -timeout 5m github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/cache github.com/hashicorp/consul/agent/cache-types github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/maint github.com/hashicorp/consul/command/members 
github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/freeport github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/testutil github.com/hashicorp/consul/testutil/retry github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version github.com/hashicorp/consul/watch
testing: warning: no tests to run
PASS
ok  	github.com/hashicorp/consul	0.154s [no tests to run]
=== RUN   TestACL
=== RUN   TestACL/DenyAll
=== RUN   TestACL/DenyAll/DenyACLRead
=== RUN   TestACL/DenyAll/DenyACLWrite
=== RUN   TestACL/DenyAll/DenyAgentRead
=== RUN   TestACL/DenyAll/DenyAgentWrite
=== RUN   TestACL/DenyAll/DenyEventRead
=== RUN   TestACL/DenyAll/DenyEventWrite
=== RUN   TestACL/DenyAll/DenyIntentionDefaultAllow
=== RUN   TestACL/DenyAll/DenyIntentionRead
=== RUN   TestACL/DenyAll/DenyIntentionWrite
=== RUN   TestACL/DenyAll/DenyKeyRead
=== RUN   TestACL/DenyAll/DenyKeyringRead
=== RUN   TestACL/DenyAll/DenyKeyringWrite
=== RUN   TestACL/DenyAll/DenyKeyWrite
=== RUN   TestACL/DenyAll/DenyNodeRead
=== RUN   TestACL/DenyAll/DenyNodeWrite
=== RUN   TestACL/DenyAll/DenyOperatorRead
=== RUN   TestACL/DenyAll/DenyOperatorWrite
=== RUN   TestACL/DenyAll/DenyPreparedQueryRead
=== RUN   TestACL/DenyAll/DenyPreparedQueryWrite
=== RUN   TestACL/DenyAll/DenyServiceRead
=== RUN   TestACL/DenyAll/DenyServiceWrite
=== RUN   TestACL/DenyAll/DenySessionRead
=== RUN   TestACL/DenyAll/DenySessionWrite
=== RUN   TestACL/DenyAll/DenySnapshot
=== RUN   TestACL/AllowAll
=== RUN   TestACL/AllowAll/DenyACLRead
=== RUN   TestACL/AllowAll/DenyACLWrite
=== RUN   TestACL/AllowAll/AllowAgentRead
=== RUN   TestACL/AllowAll/AllowAgentWrite
=== RUN   TestACL/AllowAll/AllowEventRead
=== RUN   TestACL/AllowAll/AllowEventWrite
=== RUN   TestACL/AllowAll/AllowIntentionDefaultAllow
=== RUN   TestACL/AllowAll/AllowIntentionRead
=== RUN   TestACL/AllowAll/AllowIntentionWrite
=== RUN   TestACL/AllowAll/AllowKeyRead
=== RUN   TestACL/AllowAll/AllowKeyringRead
=== RUN   TestACL/AllowAll/AllowKeyringWrite
=== RUN   TestACL/AllowAll/AllowKeyWrite
=== RUN   TestACL/AllowAll/AllowNodeRead
=== RUN   TestACL/AllowAll/AllowNodeWrite
=== RUN   TestACL/AllowAll/AllowOperatorRead
=== RUN   TestACL/AllowAll/AllowOperatorWrite
=== RUN   TestACL/AllowAll/AllowPreparedQueryRead
=== RUN   TestACL/AllowAll/AllowPreparedQueryWrite
=== RUN   TestACL/AllowAll/AllowServiceRead
=== RUN   TestACL/AllowAll/AllowServiceWrite
=== RUN   TestACL/AllowAll/AllowSessionRead
=== RUN   TestACL/AllowAll/AllowSessionWrite
=== RUN   TestACL/AllowAll/DenySnapshot
=== RUN   TestACL/ManageAll
=== RUN   TestACL/ManageAll/AllowACLRead
=== RUN   TestACL/ManageAll/AllowACLWrite
=== RUN   TestACL/ManageAll/AllowAgentRead
=== RUN   TestACL/ManageAll/AllowAgentWrite
=== RUN   TestACL/ManageAll/AllowEventRead
=== RUN   TestACL/ManageAll/AllowEventWrite
=== RUN   TestACL/ManageAll/AllowIntentionDefaultAllow
=== RUN   TestACL/ManageAll/AllowIntentionRead
=== RUN   TestACL/ManageAll/AllowIntentionWrite
=== RUN   TestACL/ManageAll/AllowKeyRead
=== RUN   TestACL/ManageAll/AllowKeyringRead
=== RUN   TestACL/ManageAll/AllowKeyringWrite
=== RUN   TestACL/ManageAll/AllowKeyWrite
=== RUN   TestACL/ManageAll/AllowNodeRead
=== RUN   TestACL/ManageAll/AllowNodeWrite
=== RUN   TestACL/ManageAll/AllowOperatorRead
=== RUN   TestACL/ManageAll/AllowOperatorWrite
=== RUN   TestACL/ManageAll/AllowPreparedQueryRead
=== RUN   TestACL/ManageAll/AllowPreparedQueryWrite
=== RUN   TestACL/ManageAll/AllowServiceRead
=== RUN   TestACL/ManageAll/AllowServiceWrite
=== RUN   TestACL/ManageAll/AllowSessionRead
=== RUN   TestACL/ManageAll/AllowSessionWrite
=== RUN   TestACL/ManageAll/AllowSnapshot
=== RUN   TestACL/AgentBasicDefaultDeny
=== RUN   TestACL/AgentBasicDefaultDeny/DefaultReadDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultDeny/DefaultWriteDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultDeny/ROReadAllowed.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultDeny/ROWriteDenied.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultDeny/RWWriteDenied.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-sub)
=== RUN   TestACL/AgentBasicDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-sub)
=== RUN   TestACL/AgentBasicDefaultAllow
=== RUN   TestACL/AgentBasicDefaultAllow/DefaultReadDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultAllow/DefaultWriteDenied.Prefix(ro)
=== RUN   TestACL/AgentBasicDefaultAllow/ROReadAllowed.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultAllow/ROWriteDenied.Prefix(root)
=== RUN   TestACL/AgentBasicDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentBasicDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultAllow/RWWriteDenied.Prefix(root-rw)
=== RUN   TestACL/AgentBasicDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-sub)
=== RUN   TestACL/AgentBasicDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentBasicDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-sub)
=== RUN   TestACL/AgentBasicDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-sub)
=== RUN   TestACL/PreparedQueryDefaultAllow
=== RUN   TestACL/PreparedQueryDefaultAllow/ReadAllowed.Prefix(foo)
=== RUN   TestACL/PreparedQueryDefaultAllow/WriteAllowed.Prefix(foo)
=== RUN   TestACL/PreparedQueryDefaultAllow/ReadDenied.Prefix(other)
=== RUN   TestACL/PreparedQueryDefaultAllow/WriteDenied.Prefix(other)
=== RUN   TestACL/AgentNestedDefaultDeny
=== RUN   TestACL/AgentNestedDefaultDeny/DefaultReadDenied.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultDeny/DefaultWriteDenied.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/AgentNestedDefaultDeny/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/AgentNestedDefaultAllow
=== RUN   TestACL/AgentNestedDefaultAllow/DefaultReadAllowed.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultAllow/DefaultWriteAllowed.Prefix(nope)
=== RUN   TestACL/AgentNestedDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/AgentNestedDefaultAllow/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/KeyringDefaultAllowPolicyDeny
=== RUN   TestACL/KeyringDefaultAllowPolicyDeny/ReadDenied
=== RUN   TestACL/KeyringDefaultAllowPolicyDeny/WriteDenied
=== RUN   TestACL/KeyringDefaultAllowPolicyRead
=== RUN   TestACL/KeyringDefaultAllowPolicyRead/ReadAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyRead/WriteDenied
=== RUN   TestACL/KeyringDefaultAllowPolicyWrite
=== RUN   TestACL/KeyringDefaultAllowPolicyWrite/ReadAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyWrite/WriteAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyNone
=== RUN   TestACL/KeyringDefaultAllowPolicyNone/ReadAllowed
=== RUN   TestACL/KeyringDefaultAllowPolicyNone/WriteAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyDeny
=== RUN   TestACL/KeyringDefaultDenyPolicyDeny/ReadDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyDeny/WriteDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyRead
=== RUN   TestACL/KeyringDefaultDenyPolicyRead/ReadAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyRead/WriteDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyWrite
=== RUN   TestACL/KeyringDefaultDenyPolicyWrite/ReadAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyWrite/WriteAllowed
=== RUN   TestACL/KeyringDefaultDenyPolicyNone
=== RUN   TestACL/KeyringDefaultDenyPolicyNone/ReadDenied
=== RUN   TestACL/KeyringDefaultDenyPolicyNone/WriteDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyDeny
=== RUN   TestACL/OperatorDefaultAllowPolicyDeny/ReadDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyDeny/WriteDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyRead
=== RUN   TestACL/OperatorDefaultAllowPolicyRead/ReadAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyRead/WriteDenied
=== RUN   TestACL/OperatorDefaultAllowPolicyWrite
=== RUN   TestACL/OperatorDefaultAllowPolicyWrite/ReadAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyWrite/WriteAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyNone
=== RUN   TestACL/OperatorDefaultAllowPolicyNone/ReadAllowed
=== RUN   TestACL/OperatorDefaultAllowPolicyNone/WriteAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyDeny
=== RUN   TestACL/OperatorDefaultDenyPolicyDeny/ReadDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyDeny/WriteDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyRead
=== RUN   TestACL/OperatorDefaultDenyPolicyRead/ReadAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyRead/WriteDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyWrite
=== RUN   TestACL/OperatorDefaultDenyPolicyWrite/ReadAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyWrite/WriteAllowed
=== RUN   TestACL/OperatorDefaultDenyPolicyNone
=== RUN   TestACL/OperatorDefaultDenyPolicyNone/ReadDenied
=== RUN   TestACL/OperatorDefaultDenyPolicyNone/WriteDenied
=== RUN   TestACL/NodeDefaultDeny
=== RUN   TestACL/NodeDefaultDeny/DefaultReadDenied.Prefix(nope)
=== RUN   TestACL/NodeDefaultDeny/DefaultWriteDenied.Prefix(nope)
=== RUN   TestACL/NodeDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultDeny/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultDeny/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultDeny/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultDeny/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultDeny/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultDeny/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultDeny/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/NodeDefaultDeny/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/NodeDefaultAllow
=== RUN   TestACL/NodeDefaultAllow/DefaultReadAllowed.Prefix(nope)
=== RUN   TestACL/NodeDefaultAllow/DefaultWriteAllowed.Prefix(nope)
=== RUN   TestACL/NodeDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/NodeDefaultAllow/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultAllow/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/NodeDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultAllow/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/NodeDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/NodeDefaultAllow/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultAllow/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/NodeDefaultAllow/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/NodeDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/NodeDefaultAllow/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/NodeDefaultAllow/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultDeny
=== RUN   TestACL/SessionDefaultDeny/DefaultReadDenied.Prefix(nope)
=== RUN   TestACL/SessionDefaultDeny/DefaultWriteDenied.Prefix(nope)
=== RUN   TestACL/SessionDefaultDeny/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultDeny/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultDeny/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultDeny/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultDeny/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultDeny/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultDeny/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultDeny/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultDeny/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultDeny/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultDeny/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultAllow
=== RUN   TestACL/SessionDefaultAllow/DefaultReadAllowed.Prefix(nope)
=== RUN   TestACL/SessionDefaultAllow/DefaultWriteAllowed.Prefix(nope)
=== RUN   TestACL/SessionDefaultAllow/DenyReadDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultAllow/DenyWriteDenied.Prefix(root-nope)
=== RUN   TestACL/SessionDefaultAllow/ROReadAllowed.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultAllow/ROWriteDenied.Prefix(root-ro)
=== RUN   TestACL/SessionDefaultAllow/RWReadAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultAllow/RWWriteAllowed.Prefix(root-rw)
=== RUN   TestACL/SessionDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildDenyReadDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope)
=== RUN   TestACL/SessionDefaultAllow/ChildROReadAllowed.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultAllow/ChildROWriteDenied.Prefix(child-ro)
=== RUN   TestACL/SessionDefaultAllow/ChildRWReadAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw)
=== RUN   TestACL/SessionDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix)
=== RUN   TestACL/SessionDefaultAllow/ChildOverrideReadAllowed.Prefix(override)
=== RUN   TestACL/SessionDefaultAllow/ChildOverrideWriteAllowed.Prefix(override)
=== RUN   TestACL/Parent
=== RUN   TestACL/Parent/KeyReadDenied.Prefix(other)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(other)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(other)
=== RUN   TestACL/Parent/KeyReadAllowed.Prefix(foo/test)
=== RUN   TestACL/Parent/KeyWriteAllowed.Prefix(foo/test)
=== RUN   TestACL/Parent/KeyWritePrefixAllowed.Prefix(foo/test)
=== RUN   TestACL/Parent/KeyReadAllowed.Prefix(foo/priv/test)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(foo/priv/test)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(foo/priv/test)
=== RUN   TestACL/Parent/KeyReadDenied.Prefix(bar/any)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(bar/any)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(bar/any)
=== RUN   TestACL/Parent/KeyReadAllowed.Prefix(zip/test)
=== RUN   TestACL/Parent/KeyWriteDenied.Prefix(zip/test)
=== RUN   TestACL/Parent/KeyWritePrefixDenied.Prefix(zip/test)
=== RUN   TestACL/Parent/ServiceReadDenied.Prefix(fail)
=== RUN   TestACL/Parent/ServiceWriteDenied.Prefix(fail)
=== RUN   TestACL/Parent/ServiceReadAllowed.Prefix(other)
=== RUN   TestACL/Parent/ServiceWriteAllowed.Prefix(other)
=== RUN   TestACL/Parent/ServiceReadAllowed.Prefix(foo)
=== RUN   TestACL/Parent/ServiceWriteDenied.Prefix(foo)
=== RUN   TestACL/Parent/ServiceReadDenied.Prefix(bar)
=== RUN   TestACL/Parent/ServiceWriteDenied.Prefix(bar)
=== RUN   TestACL/Parent/PreparedQueryReadAllowed.Prefix(foo)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(foo)
=== RUN   TestACL/Parent/PreparedQueryReadAllowed.Prefix(foobar)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(foobar)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(bar)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(bar)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(barbaz)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(barbaz)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(baz)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(baz)
=== RUN   TestACL/Parent/PreparedQueryReadDenied.Prefix(nope)
=== RUN   TestACL/Parent/PreparedQueryWriteDenied.Prefix(nope)
=== RUN   TestACL/Parent/ACLReadDenied
=== RUN   TestACL/Parent/ACLWriteDenied
=== RUN   TestACL/Parent/SnapshotDenied
=== RUN   TestACL/Parent/IntentionDefaultAllowDenied
=== RUN   TestACL/ComplexDefaultAllow
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(foo/priv/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(bar/any)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(zip/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/)
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteAllowed
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed
=== RUN   TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(zap/test)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(intbaz)
=== RUN   TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(intbaz)
=== RUN   TestACL/ComplexDefaultAllow/IntentionDefaultAllowAllowed
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(other)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(barfo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo)
=== RUN   TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo2)
=== RUN   TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foobar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(bar)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(barbaz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(baz)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(nope)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(nope)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zoo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zoo)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zookeeper)
=== RUN   TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zookeeper)
=== RUN   TestACL/ExactMatchPrecedence
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/AgentWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/AgentReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/AgentWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/KeyWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/KeyReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/KeyWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/ServiceReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/ServiceWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football)#01
=== RUN   TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football)#01
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/IntentionReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/IntentionWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/SessionWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/SessionReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/SessionWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/EventReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/EventWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/EventReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/EventWriteDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(fo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(for)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWriteAllowed.Prefix(foo)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot2)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(food)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryReadDenied.Prefix(football)
=== RUN   TestACL/ExactMatchPrecedence/PreparedQueryWriteDenied.Prefix(football)
=== RUN   TestACL/ACLRead
=== RUN   TestACL/ACLRead/ReadAllowed
=== RUN   TestACL/ACLRead/WriteDenied
=== RUN   TestACL/ACLRead#01
=== RUN   TestACL/ACLRead#01/ReadAllowed
=== RUN   TestACL/ACLRead#01/WriteAllowed
--- PASS: TestACL (0.29s)
    --- PASS: TestACL/DenyAll (0.01s)
        --- PASS: TestACL/DenyAll/DenyACLRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyACLWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyAgentRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyAgentWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyEventRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyEventWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyIntentionDefaultAllow (0.00s)
        --- PASS: TestACL/DenyAll/DenyIntentionRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyIntentionWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyringRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyringWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyKeyWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyNodeRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyNodeWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyOperatorRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyOperatorWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyPreparedQueryRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyPreparedQueryWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenyServiceRead (0.00s)
        --- PASS: TestACL/DenyAll/DenyServiceWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenySessionRead (0.00s)
        --- PASS: TestACL/DenyAll/DenySessionWrite (0.00s)
        --- PASS: TestACL/DenyAll/DenySnapshot (0.00s)
    --- PASS: TestACL/AllowAll (0.01s)
        --- PASS: TestACL/AllowAll/DenyACLRead (0.00s)
        --- PASS: TestACL/AllowAll/DenyACLWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowAgentRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowAgentWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowEventRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowEventWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowIntentionDefaultAllow (0.00s)
        --- PASS: TestACL/AllowAll/AllowIntentionRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowIntentionWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyringRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyringWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowKeyWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowNodeRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowNodeWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowOperatorRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowOperatorWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowPreparedQueryRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowPreparedQueryWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowServiceRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowServiceWrite (0.00s)
        --- PASS: TestACL/AllowAll/AllowSessionRead (0.00s)
        --- PASS: TestACL/AllowAll/AllowSessionWrite (0.00s)
        --- PASS: TestACL/AllowAll/DenySnapshot (0.00s)
    --- PASS: TestACL/ManageAll (0.01s)
        --- PASS: TestACL/ManageAll/AllowACLRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowACLWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowAgentRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowAgentWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowEventRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowEventWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowIntentionDefaultAllow (0.00s)
        --- PASS: TestACL/ManageAll/AllowIntentionRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowIntentionWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyringRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyringWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowKeyWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowNodeRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowNodeWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowOperatorRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowOperatorWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowPreparedQueryRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowPreparedQueryWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowServiceRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowServiceWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowSessionRead (0.00s)
        --- PASS: TestACL/ManageAll/AllowSessionWrite (0.00s)
        --- PASS: TestACL/ManageAll/AllowSnapshot (0.00s)
    --- PASS: TestACL/AgentBasicDefaultDeny (0.01s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DefaultReadDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DefaultWriteDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROReadAllowed.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROWriteDenied.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWWriteDenied.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-sub) (0.00s)
    --- PASS: TestACL/AgentBasicDefaultAllow (0.01s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DefaultReadDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DefaultWriteDenied.Prefix(ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROReadAllowed.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROWriteDenied.Prefix(root) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWWriteDenied.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-sub) (0.00s)
        --- PASS: TestACL/AgentBasicDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-sub) (0.00s)
    --- PASS: TestACL/PreparedQueryDefaultAllow (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/ReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/WriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/ReadDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/PreparedQueryDefaultAllow/WriteDenied.Prefix(other) (0.00s)
    --- PASS: TestACL/AgentNestedDefaultDeny (0.01s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DefaultReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DefaultWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultDeny/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/AgentNestedDefaultAllow (0.01s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DefaultReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DefaultWriteAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/AgentNestedDefaultAllow/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyDeny (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyRead (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyWrite (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyringDefaultAllowPolicyNone (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyNone/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultAllowPolicyNone/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyDeny (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyRead (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyWrite (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/KeyringDefaultDenyPolicyNone (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyNone/ReadDenied (0.00s)
        --- PASS: TestACL/KeyringDefaultDenyPolicyNone/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyDeny (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyRead (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyWrite (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/OperatorDefaultAllowPolicyNone (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyNone/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultAllowPolicyNone/WriteAllowed (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyDeny (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyDeny/ReadDenied (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyDeny/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyRead (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyRead/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyRead/WriteDenied (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyWrite (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyWrite/ReadAllowed (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyWrite/WriteAllowed (0.00s)
    --- PASS: TestACL/OperatorDefaultDenyPolicyNone (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyNone/ReadDenied (0.00s)
        --- PASS: TestACL/OperatorDefaultDenyPolicyNone/WriteDenied (0.00s)
    --- PASS: TestACL/NodeDefaultDeny (0.01s)
        --- PASS: TestACL/NodeDefaultDeny/DefaultReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DefaultWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/NodeDefaultDeny/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/NodeDefaultAllow (0.01s)
        --- PASS: TestACL/NodeDefaultAllow/DefaultReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DefaultWriteAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/NodeDefaultAllow/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/SessionDefaultDeny (0.01s)
        --- PASS: TestACL/SessionDefaultDeny/DefaultReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DefaultWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/SessionDefaultDeny/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/SessionDefaultAllow (0.01s)
        --- PASS: TestACL/SessionDefaultAllow/DefaultReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DefaultWriteAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenyReadDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenyWriteDenied.Prefix(root-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROReadAllowed.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROWriteDenied.Prefix(root-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWReadAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWWriteAllowed.Prefix(root-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenySuffixReadDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/DenySuffixWriteDenied.Prefix(root-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROSuffixReadAllowed.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ROSuffixWriteDenied.Prefix(root-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWSuffixReadAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/RWSuffixWriteAllowed.Prefix(root-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenyReadDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenyWriteDenied.Prefix(child-nope) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROReadAllowed.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROWriteDenied.Prefix(child-ro) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWReadAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWWriteAllowed.Prefix(child-rw) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenySuffixReadDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildDenySuffixWriteDenied.Prefix(child-nope-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROSuffixReadAllowed.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildROSuffixWriteDenied.Prefix(child-ro-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWSuffixReadAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildRWSuffixWriteAllowed.Prefix(child-rw-prefix) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildOverrideReadAllowed.Prefix(override) (0.00s)
        --- PASS: TestACL/SessionDefaultAllow/ChildOverrideWriteAllowed.Prefix(override) (0.00s)
    --- PASS: TestACL/Parent (0.02s)
        --- PASS: TestACL/Parent/KeyReadDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/KeyReadAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/Parent/KeyReadAllowed.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/Parent/KeyReadDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/Parent/KeyReadAllowed.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWriteDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/Parent/KeyWritePrefixDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadDenied.Prefix(fail) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteDenied.Prefix(fail) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/ServiceReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/ServiceWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(foobar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryReadDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/Parent/PreparedQueryWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/Parent/ACLReadDenied (0.00s)
        --- PASS: TestACL/Parent/ACLWriteDenied (0.00s)
        --- PASS: TestACL/Parent/SnapshotDenied (0.00s)
        --- PASS: TestACL/Parent/IntentionDefaultAllowDenied (0.00s)
    --- PASS: TestACL/ComplexDefaultAllow (0.04s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(foo/priv/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(bar/any) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListDenied.Prefix(zip/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(foo/) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyReadAllowed.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWriteDenied.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyWritePrefixDenied.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/KeyListAllowed.Prefix(zap/test) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionReadDenied.Prefix(intbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionWriteDenied.Prefix(intbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/IntentionDefaultAllowAllowed (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(other) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteDenied.Prefix(barfo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceReadAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/ServiceWriteAllowed.Prefix(barfoo2) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventReadAllowed.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/EventWriteDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(foobar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(bar) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(barbaz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(baz) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(nope) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteDenied.Prefix(nope) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zoo) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryReadAllowed.Prefix(zookeeper) (0.00s)
        --- PASS: TestACL/ComplexDefaultAllow/PreparedQueryWriteAllowed.Prefix(zookeeper) (0.00s)
    --- PASS: TestACL/ExactMatchPrecedence (0.07s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/AgentWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/KeyWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/ServiceWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(fo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(fo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(for)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(for)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadAllowed.Prefix(foo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteAllowed.Prefix(foo)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(foot2)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(foot2)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadPrefixAllowed.Prefix(food)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWritePrefixDenied.Prefix(food)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeReadDenied.Prefix(football)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/NodeWriteDenied.Prefix(football)#01 (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/IntentionWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/SessionWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/EventWriteDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(fo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(for) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWriteAllowed.Prefix(foo) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(foot2) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadPrefixAllowed.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWritePrefixDenied.Prefix(food) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryReadDenied.Prefix(football) (0.00s)
        --- PASS: TestACL/ExactMatchPrecedence/PreparedQueryWriteDenied.Prefix(football) (0.00s)
    --- PASS: TestACL/ACLRead (0.00s)
        --- PASS: TestACL/ACLRead/ReadAllowed (0.00s)
        --- PASS: TestACL/ACLRead/WriteDenied (0.00s)
    --- PASS: TestACL/ACLRead#01 (0.00s)
        --- PASS: TestACL/ACLRead#01/ReadAllowed (0.00s)
        --- PASS: TestACL/ACLRead#01/WriteAllowed (0.00s)
=== RUN   TestRootAuthorizer
--- PASS: TestRootAuthorizer (0.00s)
=== RUN   TestACLEnforce
=== RUN   TestACLEnforce/RuleNoneRequireRead
=== RUN   TestACLEnforce/RuleNoneRequireWrite
=== RUN   TestACLEnforce/RuleNoneRequireList
=== RUN   TestACLEnforce/RuleReadRequireRead
=== RUN   TestACLEnforce/RuleReadRequireWrite
=== RUN   TestACLEnforce/RuleReadRequireList
=== RUN   TestACLEnforce/RuleListRequireRead
=== RUN   TestACLEnforce/RuleListRequireWrite
=== RUN   TestACLEnforce/RuleListRequireList
=== RUN   TestACLEnforce/RuleWritetRequireRead
=== RUN   TestACLEnforce/RuleWritetRequireWrite
=== RUN   TestACLEnforce/RuleWritetRequireList
=== RUN   TestACLEnforce/RuleDenyRequireRead
=== RUN   TestACLEnforce/RuleDenyRequireWrite
=== RUN   TestACLEnforce/RuleDenyRequireList
--- PASS: TestACLEnforce (0.01s)
    --- PASS: TestACLEnforce/RuleNoneRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleNoneRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleNoneRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleReadRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleReadRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleReadRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleListRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleListRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleListRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleWritetRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleWritetRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleWritetRequireList (0.00s)
    --- PASS: TestACLEnforce/RuleDenyRequireRead (0.00s)
    --- PASS: TestACLEnforce/RuleDenyRequireWrite (0.00s)
    --- PASS: TestACLEnforce/RuleDenyRequireList (0.00s)
=== RUN   TestPolicySourceParse
=== RUN   TestPolicySourceParse/Legacy_Basic
=== RUN   TestPolicySourceParse/Legacy_(JSON)
=== RUN   TestPolicySourceParse/Service_No_Intentions_(Legacy)
=== RUN   TestPolicySourceParse/Service_Intentions_(Legacy)
=== RUN   TestPolicySourceParse/Service_Intention:_invalid_value_(Legacy)
=== RUN   TestPolicySourceParse/Bad_Policy_-_ACL
=== RUN   TestPolicySourceParse/Bad_Policy_-_Agent
=== RUN   TestPolicySourceParse/Bad_Policy_-_Agent_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Key
=== RUN   TestPolicySourceParse/Bad_Policy_-_Key_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Node
=== RUN   TestPolicySourceParse/Bad_Policy_-_Node_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Service
=== RUN   TestPolicySourceParse/Bad_Policy_-_Service_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Session
=== RUN   TestPolicySourceParse/Bad_Policy_-_Session_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Event
=== RUN   TestPolicySourceParse/Bad_Policy_-_Event_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Prepared_Query
=== RUN   TestPolicySourceParse/Bad_Policy_-_Prepared_Query_Prefix
=== RUN   TestPolicySourceParse/Bad_Policy_-_Keyring
=== RUN   TestPolicySourceParse/Bad_Policy_-_Operator
=== RUN   TestPolicySourceParse/Keyring_Empty
=== RUN   TestPolicySourceParse/Operator_Empty
--- PASS: TestPolicySourceParse (0.04s)
    --- PASS: TestPolicySourceParse/Legacy_Basic (0.00s)
    --- PASS: TestPolicySourceParse/Legacy_(JSON) (0.00s)
    --- PASS: TestPolicySourceParse/Service_No_Intentions_(Legacy) (0.00s)
    --- PASS: TestPolicySourceParse/Service_Intentions_(Legacy) (0.00s)
    --- PASS: TestPolicySourceParse/Service_Intention:_invalid_value_(Legacy) (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_ACL (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Agent (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Agent_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Key (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Key_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Node (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Node_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Service (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Service_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Session (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Session_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Event (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Event_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Prepared_Query (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Prepared_Query_Prefix (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Keyring (0.00s)
    --- PASS: TestPolicySourceParse/Bad_Policy_-_Operator (0.00s)
    --- PASS: TestPolicySourceParse/Keyring_Empty (0.00s)
    --- PASS: TestPolicySourceParse/Operator_Empty (0.00s)
=== RUN   TestMergePolicies
=== RUN   TestMergePolicies/Agents
=== RUN   TestMergePolicies/Events
=== RUN   TestMergePolicies/Node
=== RUN   TestMergePolicies/Keys
=== RUN   TestMergePolicies/Services
=== RUN   TestMergePolicies/Sessions
=== RUN   TestMergePolicies/Prepared_Queries
=== RUN   TestMergePolicies/Write_Precedence
=== RUN   TestMergePolicies/Deny_Precedence
=== RUN   TestMergePolicies/Read_Precedence
--- PASS: TestMergePolicies (0.02s)
    --- PASS: TestMergePolicies/Agents (0.00s)
    --- PASS: TestMergePolicies/Events (0.00s)
    --- PASS: TestMergePolicies/Node (0.00s)
    --- PASS: TestMergePolicies/Keys (0.00s)
    --- PASS: TestMergePolicies/Services (0.00s)
    --- PASS: TestMergePolicies/Sessions (0.00s)
    --- PASS: TestMergePolicies/Prepared_Queries (0.00s)
    --- PASS: TestMergePolicies/Write_Precedence (0.00s)
    --- PASS: TestMergePolicies/Deny_Precedence (0.00s)
    --- PASS: TestMergePolicies/Read_Precedence (0.00s)
=== RUN   TestRulesTranslate
--- PASS: TestRulesTranslate (0.00s)
=== RUN   TestPrecedence
=== RUN   TestPrecedence/Deny_Over_Write
=== RUN   TestPrecedence/Deny_Over_List
=== RUN   TestPrecedence/Deny_Over_Read
=== RUN   TestPrecedence/Deny_Over_Unknown
=== RUN   TestPrecedence/Write_Over_List
=== RUN   TestPrecedence/Write_Over_Read
=== RUN   TestPrecedence/Write_Over_Unknown
=== RUN   TestPrecedence/List_Over_Read
=== RUN   TestPrecedence/List_Over_Unknown
=== RUN   TestPrecedence/Read_Over_Unknown
=== RUN   TestPrecedence/Write_Over_Deny
=== RUN   TestPrecedence/List_Over_Deny
=== RUN   TestPrecedence/Read_Over_Deny
=== RUN   TestPrecedence/Deny_Over_Unknown#01
=== RUN   TestPrecedence/List_Over_Write
=== RUN   TestPrecedence/Read_Over_Write
=== RUN   TestPrecedence/Unknown_Over_Write
=== RUN   TestPrecedence/Read_Over_List
=== RUN   TestPrecedence/Unknown_Over_List
=== RUN   TestPrecedence/Unknown_Over_Read
--- PASS: TestPrecedence (0.01s)
    --- PASS: TestPrecedence/Deny_Over_Write (0.00s)
    --- PASS: TestPrecedence/Deny_Over_List (0.00s)
    --- PASS: TestPrecedence/Deny_Over_Read (0.00s)
    --- PASS: TestPrecedence/Deny_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/Write_Over_List (0.00s)
    --- PASS: TestPrecedence/Write_Over_Read (0.00s)
    --- PASS: TestPrecedence/Write_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/List_Over_Read (0.00s)
    --- PASS: TestPrecedence/List_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/Read_Over_Unknown (0.00s)
    --- PASS: TestPrecedence/Write_Over_Deny (0.00s)
    --- PASS: TestPrecedence/List_Over_Deny (0.00s)
    --- PASS: TestPrecedence/Read_Over_Deny (0.00s)
    --- PASS: TestPrecedence/Deny_Over_Unknown#01 (0.00s)
    --- PASS: TestPrecedence/List_Over_Write (0.00s)
    --- PASS: TestPrecedence/Read_Over_Write (0.00s)
    --- PASS: TestPrecedence/Unknown_Over_Write (0.00s)
    --- PASS: TestPrecedence/Read_Over_List (0.00s)
    --- PASS: TestPrecedence/Unknown_Over_List (0.00s)
    --- PASS: TestPrecedence/Unknown_Over_Read (0.00s)
PASS
ok  	github.com/hashicorp/consul/acl	0.456s
=== RUN   TestACL_Legacy_Disabled_Response
=== PAUSE TestACL_Legacy_Disabled_Response
=== RUN   TestACL_Legacy_Update
=== PAUSE TestACL_Legacy_Update
=== RUN   TestACL_Legacy_UpdateUpsert
=== PAUSE TestACL_Legacy_UpdateUpsert
=== RUN   TestACL_Legacy_Destroy
=== PAUSE TestACL_Legacy_Destroy
=== RUN   TestACL_Legacy_Clone
=== PAUSE TestACL_Legacy_Clone
=== RUN   TestACL_Legacy_Get
=== PAUSE TestACL_Legacy_Get
=== RUN   TestACL_Legacy_List
--- SKIP: TestACL_Legacy_List (0.00s)
    acl_endpoint_legacy_test.go:253: DM-skipped
=== RUN   TestACLReplicationStatus
=== PAUSE TestACLReplicationStatus
=== RUN   TestACL_Disabled_Response
=== PAUSE TestACL_Disabled_Response
=== RUN   TestACL_Bootstrap
=== PAUSE TestACL_Bootstrap
=== RUN   TestACL_HTTP
=== PAUSE TestACL_HTTP
=== RUN   TestACL_Version8
=== PAUSE TestACL_Version8
=== RUN   TestACL_AgentMasterToken
=== PAUSE TestACL_AgentMasterToken
=== RUN   TestACL_RootAuthorizersDenied
=== PAUSE TestACL_RootAuthorizersDenied
=== RUN   TestACL_vetServiceRegister
=== PAUSE TestACL_vetServiceRegister
=== RUN   TestACL_vetServiceUpdate
=== PAUSE TestACL_vetServiceUpdate
=== RUN   TestACL_vetCheckRegister
=== PAUSE TestACL_vetCheckRegister
=== RUN   TestACL_vetCheckUpdate
=== PAUSE TestACL_vetCheckUpdate
=== RUN   TestACL_filterMembers
=== PAUSE TestACL_filterMembers
=== RUN   TestACL_filterServices
=== PAUSE TestACL_filterServices
=== RUN   TestACL_filterChecks
=== PAUSE TestACL_filterChecks
=== RUN   TestAgent_Services
=== PAUSE TestAgent_Services
=== RUN   TestAgent_Services_ExternalConnectProxy
=== PAUSE TestAgent_Services_ExternalConnectProxy
=== RUN   TestAgent_Services_Sidecar
=== PAUSE TestAgent_Services_Sidecar
=== RUN   TestAgent_Services_ACLFilter
=== PAUSE TestAgent_Services_ACLFilter
=== RUN   TestAgent_Service
--- SKIP: TestAgent_Service (0.00s)
    agent_endpoint_test.go:232: DM-skipped
=== RUN   TestAgent_Service_DeprecatedManagedProxy
=== PAUSE TestAgent_Service_DeprecatedManagedProxy
=== RUN   TestAgent_Checks
=== PAUSE TestAgent_Checks
=== RUN   TestAgent_HealthServiceByID
=== PAUSE TestAgent_HealthServiceByID
=== RUN   TestAgent_HealthServiceByName
=== PAUSE TestAgent_HealthServiceByName
=== RUN   TestAgent_Checks_ACLFilter
=== PAUSE TestAgent_Checks_ACLFilter
=== RUN   TestAgent_Self
=== PAUSE TestAgent_Self
=== RUN   TestAgent_Self_ACLDeny
=== PAUSE TestAgent_Self_ACLDeny
=== RUN   TestAgent_Metrics_ACLDeny
=== PAUSE TestAgent_Metrics_ACLDeny
=== RUN   TestAgent_Reload
=== PAUSE TestAgent_Reload
=== RUN   TestAgent_Reload_ACLDeny
=== PAUSE TestAgent_Reload_ACLDeny
=== RUN   TestAgent_Members
=== PAUSE TestAgent_Members
=== RUN   TestAgent_Members_WAN
=== PAUSE TestAgent_Members_WAN
=== RUN   TestAgent_Members_ACLFilter
=== PAUSE TestAgent_Members_ACLFilter
=== RUN   TestAgent_Join
=== PAUSE TestAgent_Join
=== RUN   TestAgent_Join_WAN
=== PAUSE TestAgent_Join_WAN
=== RUN   TestAgent_Join_ACLDeny
=== PAUSE TestAgent_Join_ACLDeny
=== RUN   TestAgent_JoinLANNotify
=== PAUSE TestAgent_JoinLANNotify
=== RUN   TestAgent_Leave
--- SKIP: TestAgent_Leave (0.00s)
    agent_endpoint_test.go:1508: DM-skipped
=== RUN   TestAgent_Leave_ACLDeny
=== PAUSE TestAgent_Leave_ACLDeny
=== RUN   TestAgent_ForceLeave
--- SKIP: TestAgent_ForceLeave (0.00s)
    agent_endpoint_test.go:1576: DM-skipped
=== RUN   TestAgent_ForceLeave_ACLDeny
=== PAUSE TestAgent_ForceLeave_ACLDeny
=== RUN   TestAgent_RegisterCheck
=== PAUSE TestAgent_RegisterCheck
=== RUN   TestAgent_RegisterCheck_Scripts
=== PAUSE TestAgent_RegisterCheck_Scripts
=== RUN   TestAgent_RegisterCheckScriptsExecDisable
=== PAUSE TestAgent_RegisterCheckScriptsExecDisable
=== RUN   TestAgent_RegisterCheckScriptsExecRemoteDisable
=== PAUSE TestAgent_RegisterCheckScriptsExecRemoteDisable
=== RUN   TestAgent_RegisterCheck_Passing
=== PAUSE TestAgent_RegisterCheck_Passing
=== RUN   TestAgent_RegisterCheck_BadStatus
=== PAUSE TestAgent_RegisterCheck_BadStatus
=== RUN   TestAgent_RegisterCheck_ACLDeny
=== PAUSE TestAgent_RegisterCheck_ACLDeny
=== RUN   TestAgent_DeregisterCheck
=== PAUSE TestAgent_DeregisterCheck
=== RUN   TestAgent_DeregisterCheckACLDeny
=== PAUSE TestAgent_DeregisterCheckACLDeny
=== RUN   TestAgent_PassCheck
=== PAUSE TestAgent_PassCheck
=== RUN   TestAgent_PassCheck_ACLDeny
=== PAUSE TestAgent_PassCheck_ACLDeny
=== RUN   TestAgent_WarnCheck
=== PAUSE TestAgent_WarnCheck
=== RUN   TestAgent_WarnCheck_ACLDeny
=== PAUSE TestAgent_WarnCheck_ACLDeny
=== RUN   TestAgent_FailCheck
=== PAUSE TestAgent_FailCheck
=== RUN   TestAgent_FailCheck_ACLDeny
=== PAUSE TestAgent_FailCheck_ACLDeny
=== RUN   TestAgent_UpdateCheck
=== PAUSE TestAgent_UpdateCheck
=== RUN   TestAgent_UpdateCheck_ACLDeny
=== PAUSE TestAgent_UpdateCheck_ACLDeny
=== RUN   TestAgent_RegisterService
=== PAUSE TestAgent_RegisterService
=== RUN   TestAgent_RegisterService_TranslateKeys
=== PAUSE TestAgent_RegisterService_TranslateKeys
=== RUN   TestAgent_RegisterService_ACLDeny
=== PAUSE TestAgent_RegisterService_ACLDeny
=== RUN   TestAgent_RegisterService_InvalidAddress
=== PAUSE TestAgent_RegisterService_InvalidAddress
=== RUN   TestAgent_RegisterService_ManagedConnectProxy
=== PAUSE TestAgent_RegisterService_ManagedConnectProxy
=== RUN   TestAgent_RegisterService_ManagedConnectProxyDeprecated
=== PAUSE TestAgent_RegisterService_ManagedConnectProxyDeprecated
=== RUN   TestAgent_RegisterService_ManagedConnectProxy_Disabled
=== PAUSE TestAgent_RegisterService_ManagedConnectProxy_Disabled
=== RUN   TestAgent_RegisterService_UnmanagedConnectProxy
=== PAUSE TestAgent_RegisterService_UnmanagedConnectProxy
=== RUN   TestAgent_RegisterServiceDeregisterService_Sidecar
=== PAUSE TestAgent_RegisterServiceDeregisterService_Sidecar
=== RUN   TestAgent_RegisterService_UnmanagedConnectProxyInvalid
=== PAUSE TestAgent_RegisterService_UnmanagedConnectProxyInvalid
=== RUN   TestAgent_RegisterService_ConnectNative
=== PAUSE TestAgent_RegisterService_ConnectNative
=== RUN   TestAgent_RegisterService_ScriptCheck_ExecDisable
=== PAUSE TestAgent_RegisterService_ScriptCheck_ExecDisable
=== RUN   TestAgent_RegisterService_ScriptCheck_ExecRemoteDisable
=== PAUSE TestAgent_RegisterService_ScriptCheck_ExecRemoteDisable
=== RUN   TestAgent_DeregisterService
=== PAUSE TestAgent_DeregisterService
=== RUN   TestAgent_DeregisterService_ACLDeny
=== PAUSE TestAgent_DeregisterService_ACLDeny
=== RUN   TestAgent_DeregisterService_withManagedProxy
=== PAUSE TestAgent_DeregisterService_withManagedProxy
=== RUN   TestAgent_DeregisterService_managedProxyDirect
=== PAUSE TestAgent_DeregisterService_managedProxyDirect
=== RUN   TestAgent_ServiceMaintenance_BadRequest
=== PAUSE TestAgent_ServiceMaintenance_BadRequest
=== RUN   TestAgent_ServiceMaintenance_Enable
--- SKIP: TestAgent_ServiceMaintenance_Enable (0.00s)
    agent_endpoint_test.go:3741: DM-skipped
=== RUN   TestAgent_ServiceMaintenance_Disable
=== PAUSE TestAgent_ServiceMaintenance_Disable
=== RUN   TestAgent_ServiceMaintenance_ACLDeny
=== PAUSE TestAgent_ServiceMaintenance_ACLDeny
=== RUN   TestAgent_NodeMaintenance_BadRequest
=== PAUSE TestAgent_NodeMaintenance_BadRequest
=== RUN   TestAgent_NodeMaintenance_Enable
=== PAUSE TestAgent_NodeMaintenance_Enable
=== RUN   TestAgent_NodeMaintenance_Disable
=== PAUSE TestAgent_NodeMaintenance_Disable
=== RUN   TestAgent_NodeMaintenance_ACLDeny
=== PAUSE TestAgent_NodeMaintenance_ACLDeny
=== RUN   TestAgent_RegisterCheck_Service
=== PAUSE TestAgent_RegisterCheck_Service
=== RUN   TestAgent_Monitor
--- SKIP: TestAgent_Monitor (0.00s)
    agent_endpoint_test.go:3994: DM-skipped
=== RUN   TestAgent_Monitor_ACLDeny
=== PAUSE TestAgent_Monitor_ACLDeny
=== RUN   TestAgent_Token
=== PAUSE TestAgent_Token
=== RUN   TestAgentConnectCARoots_empty
=== PAUSE TestAgentConnectCARoots_empty
=== RUN   TestAgentConnectCARoots_list
=== PAUSE TestAgentConnectCARoots_list
=== RUN   TestAgentConnectCALeafCert_aclDefaultDeny
=== PAUSE TestAgentConnectCALeafCert_aclDefaultDeny
=== RUN   TestAgentConnectCALeafCert_aclProxyToken
=== PAUSE TestAgentConnectCALeafCert_aclProxyToken
=== RUN   TestAgentConnectCALeafCert_aclProxyTokenOther
=== PAUSE TestAgentConnectCALeafCert_aclProxyTokenOther
=== RUN   TestAgentConnectCALeafCert_aclServiceWrite
=== PAUSE TestAgentConnectCALeafCert_aclServiceWrite
=== RUN   TestAgentConnectCALeafCert_aclServiceReadDeny
=== PAUSE TestAgentConnectCALeafCert_aclServiceReadDeny
=== RUN   TestAgentConnectCALeafCert_good
=== PAUSE TestAgentConnectCALeafCert_good
=== RUN   TestAgentConnectCALeafCert_goodNotLocal
--- SKIP: TestAgentConnectCALeafCert_goodNotLocal (0.00s)
    agent_endpoint_test.go:4795: DM-skipped
=== RUN   TestAgentConnectProxyConfig_Blocking
--- SKIP: TestAgentConnectProxyConfig_Blocking (0.00s)
    agent_endpoint_test.go:4932: DM-skipped
=== RUN   TestAgentConnectProxyConfig_aclDefaultDeny
=== PAUSE TestAgentConnectProxyConfig_aclDefaultDeny
=== RUN   TestAgentConnectProxyConfig_aclProxyToken
=== PAUSE TestAgentConnectProxyConfig_aclProxyToken
=== RUN   TestAgentConnectProxyConfig_aclServiceWrite
=== PAUSE TestAgentConnectProxyConfig_aclServiceWrite
=== RUN   TestAgentConnectProxyConfig_aclServiceReadDeny
=== PAUSE TestAgentConnectProxyConfig_aclServiceReadDeny
=== RUN   TestAgentConnectProxyConfig_ConfigHandling
--- SKIP: TestAgentConnectProxyConfig_ConfigHandling (0.00s)
    agent_endpoint_test.go:5344: DM-skipped
=== RUN   TestAgentConnectAuthorize_badBody
=== PAUSE TestAgentConnectAuthorize_badBody
=== RUN   TestAgentConnectAuthorize_noTarget
=== PAUSE TestAgentConnectAuthorize_noTarget
=== RUN   TestAgentConnectAuthorize_idInvalidFormat
=== PAUSE TestAgentConnectAuthorize_idInvalidFormat
=== RUN   TestAgentConnectAuthorize_idNotService
=== PAUSE TestAgentConnectAuthorize_idNotService
=== RUN   TestAgentConnectAuthorize_allow
=== PAUSE TestAgentConnectAuthorize_allow
=== RUN   TestAgentConnectAuthorize_deny
=== PAUSE TestAgentConnectAuthorize_deny
=== RUN   TestAgentConnectAuthorize_allowTrustDomain
=== PAUSE TestAgentConnectAuthorize_allowTrustDomain
=== RUN   TestAgentConnectAuthorize_denyWildcard
=== PAUSE TestAgentConnectAuthorize_denyWildcard
=== RUN   TestAgentConnectAuthorize_serviceWrite
=== PAUSE TestAgentConnectAuthorize_serviceWrite
=== RUN   TestAgentConnectAuthorize_defaultDeny
=== PAUSE TestAgentConnectAuthorize_defaultDeny
=== RUN   TestAgentConnectAuthorize_defaultAllow
=== PAUSE TestAgentConnectAuthorize_defaultAllow
=== RUN   TestAgent_Host
=== PAUSE TestAgent_Host
=== RUN   TestAgent_HostBadACL
=== PAUSE TestAgent_HostBadACL
=== RUN   TestAgent_MultiStartStop
=== RUN   TestAgent_MultiStartStop/#00
=== PAUSE TestAgent_MultiStartStop/#00
=== RUN   TestAgent_MultiStartStop/#01
=== PAUSE TestAgent_MultiStartStop/#01
=== RUN   TestAgent_MultiStartStop/#02
=== PAUSE TestAgent_MultiStartStop/#02
=== RUN   TestAgent_MultiStartStop/#03
=== PAUSE TestAgent_MultiStartStop/#03
=== RUN   TestAgent_MultiStartStop/#04
=== PAUSE TestAgent_MultiStartStop/#04
=== RUN   TestAgent_MultiStartStop/#05
=== PAUSE TestAgent_MultiStartStop/#05
=== RUN   TestAgent_MultiStartStop/#06
=== PAUSE TestAgent_MultiStartStop/#06
=== RUN   TestAgent_MultiStartStop/#07
=== PAUSE TestAgent_MultiStartStop/#07
=== RUN   TestAgent_MultiStartStop/#08
=== PAUSE TestAgent_MultiStartStop/#08
=== RUN   TestAgent_MultiStartStop/#09
=== PAUSE TestAgent_MultiStartStop/#09
=== CONT  TestAgent_MultiStartStop/#00
=== CONT  TestAgent_MultiStartStop/#05
=== CONT  TestAgent_MultiStartStop/#03
=== CONT  TestAgent_MultiStartStop/#08
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:19.811559 [WARN] agent: Node name "Node 59440e12-9d94-7630-9049-4a2ca470d528" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:19.812999 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:19.813104 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:19.813488 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:19.813903 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:19.839803 [WARN] agent: Node name "Node 7b55bb07-9a75-bdc9-70ec-a8133feacf7c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:19.840195 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:19.840267 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:19.840435 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:19.840543 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:19.842501 [WARN] agent: Node name "Node d20ce383-2869-3673-23bd-2f2bf3d1d3cf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:19.842952 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:19.843026 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:19.843222 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:19.843332 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:19.850538 [WARN] agent: Node name "Node 01f00cb3-636d-f2dc-7126-ee534c67348a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:19.850985 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:19.851131 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:19.851376 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:19.851557 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:59440e12-9d94-7630-9049-4a2ca470d528 Address:127.0.0.1:11512}]
2019/11/27 02:17:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:01f00cb3-636d-f2dc-7126-ee534c67348a Address:127.0.0.1:11506}]
2019/11/27 02:17:22 [INFO]  raft: Node at 127.0.0.1:11512 [Follower] entering Follower state (Leader: "")
2019/11/27 02:17:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7b55bb07-9a75-bdc9-70ec-a8133feacf7c Address:127.0.0.1:11518}]
2019/11/27 02:17:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d20ce383-2869-3673-23bd-2f2bf3d1d3cf Address:127.0.0.1:11524}]
2019/11/27 02:17:22 [INFO]  raft: Node at 127.0.0.1:11518 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:22.377262 [INFO] serf: EventMemberJoin: Node 01f00cb3-636d-f2dc-7126-ee534c67348a.dc1 127.0.0.1
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:22.383809 [INFO] serf: EventMemberJoin: Node d20ce383-2869-3673-23bd-2f2bf3d1d3cf.dc1 127.0.0.1
2019/11/27 02:17:22 [INFO]  raft: Node at 127.0.0.1:11524 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:22.392539 [INFO] serf: EventMemberJoin: Node 59440e12-9d94-7630-9049-4a2ca470d528.dc1 127.0.0.1
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:22.394392 [INFO] serf: EventMemberJoin: Node d20ce383-2869-3673-23bd-2f2bf3d1d3cf 127.0.0.1
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:22.397744 [INFO] serf: EventMemberJoin: Node 59440e12-9d94-7630-9049-4a2ca470d528 127.0.0.1
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:22.399835 [INFO] serf: EventMemberJoin: Node 7b55bb07-9a75-bdc9-70ec-a8133feacf7c.dc1 127.0.0.1
2019/11/27 02:17:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:22 [INFO]  raft: Node at 127.0.0.1:11512 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:22.427131 [INFO] agent: Started DNS server 127.0.0.1:11507 (udp)
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:22.427984 [INFO] agent: Started DNS server 127.0.0.1:11519 (udp)
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:22.428202 [INFO] serf: EventMemberJoin: Node 7b55bb07-9a75-bdc9-70ec-a8133feacf7c 127.0.0.1
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:22.428392 [INFO] agent: Started DNS server 127.0.0.1:11507 (tcp)
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:22.429521 [INFO] agent: Started DNS server 127.0.0.1:11513 (udp)
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:22.431019 [INFO] agent: Started HTTP server on 127.0.0.1:11508 (tcp)
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:22.431173 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:22.431147 [INFO] consul: Adding LAN server Node 7b55bb07-9a75-bdc9-70ec-a8133feacf7c (Addr: tcp/127.0.0.1:11518) (DC: dc1)
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:22.432712 [INFO] consul: Handled member-join event for server "Node 59440e12-9d94-7630-9049-4a2ca470d528.dc1" in area "wan"
2019/11/27 02:17:22 [INFO]  raft: Node at 127.0.0.1:11506 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:22.433012 [INFO] agent: Started DNS server 127.0.0.1:11519 (tcp)
2019/11/27 02:17:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:22 [INFO]  raft: Node at 127.0.0.1:11524 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:22.434934 [INFO] consul: Adding LAN server Node d20ce383-2869-3673-23bd-2f2bf3d1d3cf (Addr: tcp/127.0.0.1:11524) (DC: dc1)
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:22.435375 [INFO] consul: Handled member-join event for server "Node d20ce383-2869-3673-23bd-2f2bf3d1d3cf.dc1" in area "wan"
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:22.435520 [INFO] agent: Started HTTP server on 127.0.0.1:11520 (tcp)
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:22.435649 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:22.435748 [INFO] consul: Adding LAN server Node 59440e12-9d94-7630-9049-4a2ca470d528 (Addr: tcp/127.0.0.1:11512) (DC: dc1)
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:22.437580 [INFO] agent: Started DNS server 127.0.0.1:11513 (tcp)
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:22.439535 [INFO] agent: Started HTTP server on 127.0.0.1:11514 (tcp)
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:22.439635 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:22.440153 [INFO] consul: Handled member-join event for server "Node 7b55bb07-9a75-bdc9-70ec-a8133feacf7c.dc1" in area "wan"
2019/11/27 02:17:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:22 [INFO]  raft: Node at 127.0.0.1:11518 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:22.449026 [INFO] serf: EventMemberJoin: Node 01f00cb3-636d-f2dc-7126-ee534c67348a 127.0.0.1
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:22.450309 [INFO] agent: Started DNS server 127.0.0.1:11501 (udp)
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:22.450437 [INFO] consul: Adding LAN server Node 01f00cb3-636d-f2dc-7126-ee534c67348a (Addr: tcp/127.0.0.1:11506) (DC: dc1)
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:22.450636 [INFO] consul: Handled member-join event for server "Node 01f00cb3-636d-f2dc-7126-ee534c67348a.dc1" in area "wan"
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:22.450756 [INFO] agent: Started DNS server 127.0.0.1:11501 (tcp)
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:22.452828 [INFO] agent: Started HTTP server on 127.0.0.1:11502 (tcp)
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:22.452958 [INFO] agent: started state syncer
2019/11/27 02:17:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:22 [INFO]  raft: Node at 127.0.0.1:11506 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:23 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:23 [INFO]  raft: Node at 127.0.0.1:11506 [Leader] entering Leader state
2019/11/27 02:17:23 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:23 [INFO]  raft: Node at 127.0.0.1:11524 [Leader] entering Leader state
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:23.459526 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:23.460198 [INFO] consul: New leader elected: Node 01f00cb3-636d-f2dc-7126-ee534c67348a
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:23.460471 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:23.460799 [INFO] consul: New leader elected: Node d20ce383-2869-3673-23bd-2f2bf3d1d3cf
2019/11/27 02:17:23 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:23 [INFO]  raft: Node at 127.0.0.1:11512 [Leader] entering Leader state
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:23.462847 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:23.463263 [INFO] consul: New leader elected: Node 59440e12-9d94-7630-9049-4a2ca470d528
2019/11/27 02:17:23 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:23 [INFO]  raft: Node at 127.0.0.1:11518 [Leader] entering Leader state
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:23.465985 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:23.466356 [INFO] consul: New leader elected: Node 7b55bb07-9a75-bdc9-70ec-a8133feacf7c
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:23.728009 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:23.728135 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:23.728201 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:23.844784 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:23.916832 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:23.916949 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:23.916996 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:23.921968 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:23.922075 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:23.922123 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:23.936310 [INFO] manager: shutting down
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:23.987732 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:23.987836 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:23.987890 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:24.046616 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:24.048298 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:24.055883 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.049089 [INFO] agent: consul server down
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.058803 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.058861 [INFO] agent: Stopping DNS server 127.0.0.1:11501 (tcp)
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.059029 [INFO] agent: Stopping DNS server 127.0.0.1:11501 (udp)
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.059189 [INFO] agent: Stopping HTTP server 127.0.0.1:11502 (tcp)
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.059456 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.059548 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:24.054434 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.049181 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.049244 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_MultiStartStop/#00 - 2019/11/27 02:17:24.060389 [ERR] agent: failed to sync remote state: leadership lost while committing log
=== CONT  TestAgent_MultiStartStop/#02
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:24.055015 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:24.060996 [DEBUG] agent: Node info in sync
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:24.282077 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:24.341286 [WARN] agent: Node name "Node 67e02128-c7b6-7af9-7cf9-0ae696c5b346" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:24.341815 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:24.341882 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:24.342041 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:24.342146 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:24.444880 [INFO] manager: shutting down
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:24.445730 [INFO] manager: shutting down
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:24.756912 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:24.770844 [ERR] agent: failed to sync remote state: No cluster leader
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.934771 [INFO] manager: shutting down
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.935281 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:25.935661 [INFO] agent: consul server down
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:25.935716 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:25.935769 [INFO] agent: Stopping DNS server 127.0.0.1:11507 (tcp)
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.935870 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:25.935927 [INFO] agent: Stopping DNS server 127.0.0.1:11507 (udp)
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:25.936083 [INFO] agent: Stopping HTTP server 127.0.0.1:11508 (tcp)
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.936130 [INFO] agent: consul server down
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.936171 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.936225 [INFO] agent: Stopping DNS server 127.0.0.1:11513 (tcp)
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:25.936282 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.936347 [INFO] agent: Stopping DNS server 127.0.0.1:11513 (udp)
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.936497 [INFO] agent: Stopping HTTP server 127.0.0.1:11514 (tcp)
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:25.936350 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#01
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.936786 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#03 - 2019/11/27 02:17:25.936865 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#07
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:25.936392 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:25.937840 [INFO] agent: consul server down
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:25.937887 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:25.937940 [INFO] agent: Stopping DNS server 127.0.0.1:11519 (tcp)
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:25.938034 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:25.938068 [INFO] agent: Stopping DNS server 127.0.0.1:11519 (udp)
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:25.938210 [INFO] agent: Stopping HTTP server 127.0.0.1:11520 (tcp)
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:25.938241 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#05 - 2019/11/27 02:17:25.937948 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:25.938437 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#08 - 2019/11/27 02:17:25.938499 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#09
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:26.203849 [WARN] agent: Node name "Node 0954f6de-38f0-ac8e-ed88-a9b97af64ec0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:26.205753 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:26.206174 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:26.206837 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:26.208142 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:26.232047 [WARN] agent: Node name "Node 863b5d30-6a82-8a45-fd40-be83ff8298ca" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:26.232623 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:26.232946 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:26.233347 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:26.233790 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:26.267668 [WARN] agent: Node name "Node 27ac68a9-a8f9-f75a-e216-ca1bec620229" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:26.268391 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:26.268492 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:26.271268 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:26.271438 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:67e02128-c7b6-7af9-7cf9-0ae696c5b346 Address:127.0.0.1:11530}]
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:27.519879 [INFO] serf: EventMemberJoin: Node 67e02128-c7b6-7af9-7cf9-0ae696c5b346.dc1 127.0.0.1
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:11530 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:27.534954 [INFO] serf: EventMemberJoin: Node 67e02128-c7b6-7af9-7cf9-0ae696c5b346 127.0.0.1
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:27.536408 [INFO] consul: Adding LAN server Node 67e02128-c7b6-7af9-7cf9-0ae696c5b346 (Addr: tcp/127.0.0.1:11530) (DC: dc1)
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:27.536597 [INFO] consul: Handled member-join event for server "Node 67e02128-c7b6-7af9-7cf9-0ae696c5b346.dc1" in area "wan"
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:27.538166 [INFO] agent: Started DNS server 127.0.0.1:11525 (tcp)
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:27.538865 [INFO] agent: Started DNS server 127.0.0.1:11525 (udp)
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:27.541150 [INFO] agent: Started HTTP server on 127.0.0.1:11526 (tcp)
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:27.541275 [INFO] agent: started state syncer
2019/11/27 02:17:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:11530 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0954f6de-38f0-ac8e-ed88-a9b97af64ec0 Address:127.0.0.1:11548}]
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:11548 [Follower] entering Follower state (Leader: "")
2019/11/27 02:17:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:27ac68a9-a8f9-f75a-e216-ca1bec620229 Address:127.0.0.1:11536}]
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:11536 [Follower] entering Follower state (Leader: "")
2019/11/27 02:17:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:863b5d30-6a82-8a45-fd40-be83ff8298ca Address:127.0.0.1:11542}]
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:11542 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:27.874658 [INFO] serf: EventMemberJoin: Node 27ac68a9-a8f9-f75a-e216-ca1bec620229.dc1 127.0.0.1
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:27.875280 [INFO] serf: EventMemberJoin: Node 863b5d30-6a82-8a45-fd40-be83ff8298ca.dc1 127.0.0.1
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:27.878792 [INFO] serf: EventMemberJoin: Node 0954f6de-38f0-ac8e-ed88-a9b97af64ec0.dc1 127.0.0.1
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:27.878832 [INFO] serf: EventMemberJoin: Node 27ac68a9-a8f9-f75a-e216-ca1bec620229 127.0.0.1
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:27.888330 [INFO] agent: Started DNS server 127.0.0.1:11531 (udp)
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:27.900749 [INFO] serf: EventMemberJoin: Node 863b5d30-6a82-8a45-fd40-be83ff8298ca 127.0.0.1
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:27.900714 [INFO] agent: Started DNS server 127.0.0.1:11531 (tcp)
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:27.906974 [INFO] agent: Started DNS server 127.0.0.1:11537 (udp)
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:27.910763 [INFO] agent: Started DNS server 127.0.0.1:11537 (tcp)
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:27.914989 [INFO] agent: Started HTTP server on 127.0.0.1:11532 (tcp)
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:27.915145 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:27.917158 [INFO] agent: Started HTTP server on 127.0.0.1:11538 (tcp)
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:27.917344 [INFO] agent: started state syncer
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:27.920119 [INFO] consul: Adding LAN server Node 863b5d30-6a82-8a45-fd40-be83ff8298ca (Addr: tcp/127.0.0.1:11542) (DC: dc1)
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:27.920972 [INFO] serf: EventMemberJoin: Node 0954f6de-38f0-ac8e-ed88-a9b97af64ec0 127.0.0.1
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:27.921663 [INFO] consul: Adding LAN server Node 27ac68a9-a8f9-f75a-e216-ca1bec620229 (Addr: tcp/127.0.0.1:11536) (DC: dc1)
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:27.922490 [INFO] consul: Handled member-join event for server "Node 27ac68a9-a8f9-f75a-e216-ca1bec620229.dc1" in area "wan"
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:27.922971 [INFO] agent: Started DNS server 127.0.0.1:11543 (udp)
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:27.923532 [INFO] consul: Adding LAN server Node 0954f6de-38f0-ac8e-ed88-a9b97af64ec0 (Addr: tcp/127.0.0.1:11548) (DC: dc1)
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:27.923777 [INFO] consul: Handled member-join event for server "Node 0954f6de-38f0-ac8e-ed88-a9b97af64ec0.dc1" in area "wan"
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:27.924384 [INFO] agent: Started DNS server 127.0.0.1:11543 (tcp)
2019/11/27 02:17:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:11548 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:27.926670 [INFO] consul: Handled member-join event for server "Node 863b5d30-6a82-8a45-fd40-be83ff8298ca.dc1" in area "wan"
2019/11/27 02:17:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:11536 [Candidate] entering Candidate state in term 2
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:27.929482 [INFO] agent: Started HTTP server on 127.0.0.1:11544 (tcp)
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:27.929785 [INFO] agent: started state syncer
2019/11/27 02:17:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:11542 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:28 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:28 [INFO]  raft: Node at 127.0.0.1:11530 [Leader] entering Leader state
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:28.401333 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:28.402032 [INFO] consul: New leader elected: Node 67e02128-c7b6-7af9-7cf9-0ae696c5b346
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:28.845365 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:28.845468 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:28.845514 [WARN] serf: Shutdown without a Leave
2019/11/27 02:17:28 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:28 [INFO]  raft: Node at 127.0.0.1:11542 [Leader] entering Leader state
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:28.858426 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:28.858960 [INFO] consul: New leader elected: Node 863b5d30-6a82-8a45-fd40-be83ff8298ca
2019/11/27 02:17:28 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:28 [INFO]  raft: Node at 127.0.0.1:11536 [Leader] entering Leader state
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:28.861963 [INFO] consul: cluster leadership acquired
2019/11/27 02:17:28 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:28 [INFO]  raft: Node at 127.0.0.1:11548 [Leader] entering Leader state
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:28.862329 [INFO] consul: New leader elected: Node 27ac68a9-a8f9-f75a-e216-ca1bec620229
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:28.862348 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:28.862692 [INFO] consul: New leader elected: Node 0954f6de-38f0-ac8e-ed88-a9b97af64ec0
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.026517 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.134290 [INFO] manager: shutting down
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.134887 [INFO] agent: consul server down
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.134947 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.135001 [INFO] agent: Stopping DNS server 127.0.0.1:11525 (tcp)
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.135156 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.135179 [INFO] agent: Stopping DNS server 127.0.0.1:11525 (udp)
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.135267 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.135319 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.135382 [INFO] agent: Stopping HTTP server 127.0.0.1:11526 (tcp)
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.135588 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#02 - 2019/11/27 02:17:29.135662 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#06
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.167001 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.167105 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.167166 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:29.191195 [WARN] agent: Node name "Node a0acfdc8-fff5-92fd-392e-68ce03337510" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:29.191603 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:29.191667 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:29.191890 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:29.191992 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:29.240982 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:29.241075 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:29.241128 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.362110 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.362221 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.362270 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.566737 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.567719 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:29.570248 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.855765 [INFO] manager: shutting down
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.855798 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.856119 [INFO] agent: consul server down
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.856176 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.856229 [INFO] agent: Stopping DNS server 127.0.0.1:11531 (tcp)
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.856367 [INFO] agent: Stopping DNS server 127.0.0.1:11531 (udp)
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:29.856484 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.856514 [INFO] agent: Stopping HTTP server 127.0.0.1:11532 (tcp)
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:29.856566 [DEBUG] agent: Node info in sync
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.856762 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#01 - 2019/11/27 02:17:29.856843 [INFO] agent: Endpoints down
=== CONT  TestAgent_MultiStartStop/#04
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:29.861621 [INFO] manager: shutting down
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.863516 [INFO] manager: shutting down
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.869422 [INFO] agent: consul server down
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.869499 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.869560 [INFO] agent: Stopping DNS server 127.0.0.1:11537 (tcp)
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.869704 [INFO] agent: Stopping DNS server 127.0.0.1:11537 (udp)
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.869862 [INFO] agent: Stopping HTTP server 127.0.0.1:11538 (tcp)
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.870062 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.870132 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.870264 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.870383 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgent_MultiStartStop/#09 - 2019/11/27 02:17:29.870436 [ERR] agent: failed to sync remote state: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:29.925044 [WARN] agent: Node name "Node e7b5addd-a7f1-5484-8493-9df3e579a846" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:29.925531 [DEBUG] tlsutil: Update with version 1
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:29.925604 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:29.925821 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:29.925949 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:30.169026 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:30.169225 [INFO] agent: consul server down
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:30.169279 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:30.169337 [INFO] agent: Stopping DNS server 127.0.0.1:11543 (tcp)
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:30.169374 [ERR] consul: failed to establish leadership: raft is already shutdown
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:30.169479 [INFO] agent: Stopping DNS server 127.0.0.1:11543 (udp)
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:30.169628 [INFO] agent: Stopping HTTP server 127.0.0.1:11544 (tcp)
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:30.169903 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#07 - 2019/11/27 02:17:30.169995 [INFO] agent: Endpoints down
2019/11/27 02:17:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a0acfdc8-fff5-92fd-392e-68ce03337510 Address:127.0.0.1:11554}]
2019/11/27 02:17:31 [INFO]  raft: Node at 127.0.0.1:11554 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:31.594348 [INFO] serf: EventMemberJoin: Node a0acfdc8-fff5-92fd-392e-68ce03337510.dc1 127.0.0.1
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:31.598460 [INFO] serf: EventMemberJoin: Node a0acfdc8-fff5-92fd-392e-68ce03337510 127.0.0.1
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:31.599537 [INFO] consul: Adding LAN server Node a0acfdc8-fff5-92fd-392e-68ce03337510 (Addr: tcp/127.0.0.1:11554) (DC: dc1)
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:31.599846 [INFO] consul: Handled member-join event for server "Node a0acfdc8-fff5-92fd-392e-68ce03337510.dc1" in area "wan"
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:31.600142 [INFO] agent: Started DNS server 127.0.0.1:11549 (udp)
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:31.600542 [INFO] agent: Started DNS server 127.0.0.1:11549 (tcp)
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:31.602708 [INFO] agent: Started HTTP server on 127.0.0.1:11550 (tcp)
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:31.602805 [INFO] agent: started state syncer
2019/11/27 02:17:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:31 [INFO]  raft: Node at 127.0.0.1:11554 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e7b5addd-a7f1-5484-8493-9df3e579a846 Address:127.0.0.1:11560}]
2019/11/27 02:17:31 [INFO]  raft: Node at 127.0.0.1:11560 [Follower] entering Follower state (Leader: "")
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:31.905036 [INFO] serf: EventMemberJoin: Node e7b5addd-a7f1-5484-8493-9df3e579a846.dc1 127.0.0.1
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:31.913114 [INFO] serf: EventMemberJoin: Node e7b5addd-a7f1-5484-8493-9df3e579a846 127.0.0.1
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:31.914021 [INFO] consul: Adding LAN server Node e7b5addd-a7f1-5484-8493-9df3e579a846 (Addr: tcp/127.0.0.1:11560) (DC: dc1)
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:31.914556 [INFO] consul: Handled member-join event for server "Node e7b5addd-a7f1-5484-8493-9df3e579a846.dc1" in area "wan"
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:31.917333 [INFO] agent: Started DNS server 127.0.0.1:11555 (tcp)
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:31.917417 [INFO] agent: Started DNS server 127.0.0.1:11555 (udp)
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:31.920542 [INFO] agent: Started HTTP server on 127.0.0.1:11556 (tcp)
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:31.920958 [INFO] agent: started state syncer
2019/11/27 02:17:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:31 [INFO]  raft: Node at 127.0.0.1:11560 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:32 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:32 [INFO]  raft: Node at 127.0.0.1:11554 [Leader] entering Leader state
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:32.379675 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:32.380067 [INFO] consul: New leader elected: Node a0acfdc8-fff5-92fd-392e-68ce03337510
2019/11/27 02:17:32 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:32 [INFO]  raft: Node at 127.0.0.1:11560 [Leader] entering Leader state
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:32.644800 [INFO] consul: cluster leadership acquired
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:32.645315 [INFO] consul: New leader elected: Node e7b5addd-a7f1-5484-8493-9df3e579a846
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:32.729870 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:32.881981 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:32.882081 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:32.882131 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.043930 [INFO] agent: Requesting shutdown
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.044243 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.044273 [INFO] consul: shutting down server
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.044323 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.123267 [INFO] agent: Synced node info
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.123764 [INFO] manager: shutting down
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.124108 [WARN] serf: Shutdown without a Leave
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.233450 [INFO] manager: shutting down
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.233609 [INFO] agent: consul server down
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.233650 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.233666 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.233836 [INFO] agent: Stopping DNS server 127.0.0.1:11549 (tcp)
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.234011 [INFO] agent: Stopping DNS server 127.0.0.1:11549 (udp)
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.234190 [INFO] agent: Stopping HTTP server 127.0.0.1:11550 (tcp)
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.234466 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#06 - 2019/11/27 02:17:33.234530 [INFO] agent: Endpoints down
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.366985 [INFO] agent: consul server down
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.367087 [INFO] agent: shutdown complete
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.367154 [INFO] agent: Stopping DNS server 127.0.0.1:11555 (tcp)
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.367315 [INFO] agent: Stopping DNS server 127.0.0.1:11555 (udp)
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.367506 [INFO] agent: Stopping HTTP server 127.0.0.1:11556 (tcp)
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.367729 [INFO] agent: Waiting for endpoints to shut down
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.367801 [INFO] agent: Endpoints down
--- PASS: TestAgent_MultiStartStop (0.00s)
    --- PASS: TestAgent_MultiStartStop/#00 (5.50s)
    --- PASS: TestAgent_MultiStartStop/#05 (7.38s)
    --- PASS: TestAgent_MultiStartStop/#03 (7.37s)
    --- PASS: TestAgent_MultiStartStop/#08 (7.38s)
    --- PASS: TestAgent_MultiStartStop/#02 (5.08s)
    --- PASS: TestAgent_MultiStartStop/#01 (3.92s)
    --- PASS: TestAgent_MultiStartStop/#09 (3.93s)
    --- PASS: TestAgent_MultiStartStop/#07 (4.23s)
    --- PASS: TestAgent_MultiStartStop/#06 (4.10s)
    --- PASS: TestAgent_MultiStartStop/#04 (3.51s)
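
The ten runs above each start a full agent (DNS and HTTP listeners, a single-node raft cluster, leader election) and then tear it down again. As a minimal sketch of that start / wait-for-leader / stop cycle from outside the test suite, the standalone Go program below polls the standard /v1/status/leader endpoint until the agent reports a raft leader. It is illustrative only: it assumes a local dev agent on 127.0.0.1:8500, whereas the test agents in this log bind randomized ports such as 11502.

    // leaderwait.go - illustrative sketch, not part of the consul test suite.
    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    // waitForLeader polls /v1/status/leader until the agent reports a
    // non-empty leader address or the deadline passes.
    func waitForLeader(base string, timeout time.Duration) error {
        deadline := time.Now().Add(timeout)
        for time.Now().Before(deadline) {
            resp, err := http.Get(base + "/v1/status/leader")
            if err == nil {
                body, _ := io.ReadAll(resp.Body)
                resp.Body.Close()
                if len(body) > 2 { // more than the empty JSON string ""
                    return nil
                }
            }
            time.Sleep(100 * time.Millisecond)
        }
        return fmt.Errorf("no cluster leader after %s", timeout)
    }

    func main() {
        if err := waitForLeader("http://127.0.0.1:8500", 10*time.Second); err != nil {
            fmt.Println("error:", err)
            return
        }
        fmt.Println("cluster leadership acquired")
    }
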
=== RUN   TestAgent_ConnectClusterIDConfig
=== RUN   TestAgent_ConnectClusterIDConfig/default_TestAgent_has_fixed_cluster_id
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.368849 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestAgent_MultiStartStop/#04 - 2019/11/27 02:17:33.369326 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
test - 2019/11/27 02:17:33.523088 [WARN] agent: Node name "Node ad7355da-264a-b2c1-320a-bb3086e5fe23" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test - 2019/11/27 02:17:33.523519 [DEBUG] tlsutil: Update with version 1
test - 2019/11/27 02:17:33.523583 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
test - 2019/11/27 02:17:33.523746 [DEBUG] tlsutil: IncomingRPCConfig with version 1
test - 2019/11/27 02:17:33.523857 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ad7355da-264a-b2c1-320a-bb3086e5fe23 Address:127.0.0.1:11566}]
2019/11/27 02:17:35 [INFO]  raft: Node at 127.0.0.1:11566 [Follower] entering Follower state (Leader: "")
test - 2019/11/27 02:17:35.341264 [INFO] serf: EventMemberJoin: Node ad7355da-264a-b2c1-320a-bb3086e5fe23.dc1 127.0.0.1
test - 2019/11/27 02:17:35.345284 [INFO] serf: EventMemberJoin: Node ad7355da-264a-b2c1-320a-bb3086e5fe23 127.0.0.1
test - 2019/11/27 02:17:35.346299 [INFO] consul: Handled member-join event for server "Node ad7355da-264a-b2c1-320a-bb3086e5fe23.dc1" in area "wan"
test - 2019/11/27 02:17:35.346666 [INFO] consul: Adding LAN server Node ad7355da-264a-b2c1-320a-bb3086e5fe23 (Addr: tcp/127.0.0.1:11566) (DC: dc1)
test - 2019/11/27 02:17:35.347513 [INFO] agent: Started DNS server 127.0.0.1:11561 (udp)
test - 2019/11/27 02:17:35.347616 [INFO] agent: Started DNS server 127.0.0.1:11561 (tcp)
test - 2019/11/27 02:17:35.349751 [INFO] agent: Started HTTP server on 127.0.0.1:11562 (tcp)
test - 2019/11/27 02:17:35.349866 [INFO] agent: started state syncer
2019/11/27 02:17:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:35 [INFO]  raft: Node at 127.0.0.1:11566 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:36 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:36 [INFO]  raft: Node at 127.0.0.1:11566 [Leader] entering Leader state
test - 2019/11/27 02:17:36.352800 [INFO] consul: cluster leadership acquired
test - 2019/11/27 02:17:36.353376 [INFO] consul: New leader elected: Node ad7355da-264a-b2c1-320a-bb3086e5fe23
test - 2019/11/27 02:17:36.431898 [INFO] agent: Requesting shutdown
test - 2019/11/27 02:17:36.431996 [INFO] consul: shutting down server
test - 2019/11/27 02:17:36.432045 [WARN] serf: Shutdown without a Leave
test - 2019/11/27 02:17:36.566473 [WARN] serf: Shutdown without a Leave
test - 2019/11/27 02:17:36.655278 [INFO] manager: shutting down
test - 2019/11/27 02:17:36.655343 [WARN] agent: Syncing node info failed. raft is already shutdown
test - 2019/11/27 02:17:36.655411 [ERR] consul: failed to wait for barrier: raft is already shutdown
test - 2019/11/27 02:17:36.655424 [ERR] agent: failed to sync remote state: raft is already shutdown
test - 2019/11/27 02:17:36.655743 [INFO] agent: consul server down
test - 2019/11/27 02:17:36.655792 [INFO] agent: shutdown complete
test - 2019/11/27 02:17:36.655842 [INFO] agent: Stopping DNS server 127.0.0.1:11561 (tcp)
test - 2019/11/27 02:17:36.655978 [INFO] agent: Stopping DNS server 127.0.0.1:11561 (udp)
test - 2019/11/27 02:17:36.656138 [INFO] agent: Stopping HTTP server 127.0.0.1:11562 (tcp)
test - 2019/11/27 02:17:36.656331 [INFO] agent: Waiting for endpoints to shut down
test - 2019/11/27 02:17:36.656398 [INFO] agent: Endpoints down
=== RUN   TestAgent_ConnectClusterIDConfig/no_cluster_ID_specified_sets_to_test_ID
WARNING: bootstrap = true: do not enable unless necessary
test - 2019/11/27 02:17:36.722891 [WARN] agent: Node name "Node 73bd0e98-9d49-763e-6170-5863b2efb1c6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test - 2019/11/27 02:17:36.723613 [DEBUG] tlsutil: Update with version 1
test - 2019/11/27 02:17:36.723834 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
test - 2019/11/27 02:17:36.724296 [DEBUG] tlsutil: IncomingRPCConfig with version 1
test - 2019/11/27 02:17:36.724674 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:73bd0e98-9d49-763e-6170-5863b2efb1c6 Address:127.0.0.1:11572}]
2019/11/27 02:17:37 [INFO]  raft: Node at 127.0.0.1:11572 [Follower] entering Follower state (Leader: "")
test - 2019/11/27 02:17:37.576032 [INFO] serf: EventMemberJoin: Node 73bd0e98-9d49-763e-6170-5863b2efb1c6.dc1 127.0.0.1
test - 2019/11/27 02:17:37.580288 [INFO] serf: EventMemberJoin: Node 73bd0e98-9d49-763e-6170-5863b2efb1c6 127.0.0.1
test - 2019/11/27 02:17:37.580994 [INFO] consul: Handled member-join event for server "Node 73bd0e98-9d49-763e-6170-5863b2efb1c6.dc1" in area "wan"
test - 2019/11/27 02:17:37.581043 [INFO] consul: Adding LAN server Node 73bd0e98-9d49-763e-6170-5863b2efb1c6 (Addr: tcp/127.0.0.1:11572) (DC: dc1)
test - 2019/11/27 02:17:37.581515 [INFO] agent: Started DNS server 127.0.0.1:11567 (tcp)
test - 2019/11/27 02:17:37.581809 [INFO] agent: Started DNS server 127.0.0.1:11567 (udp)
test - 2019/11/27 02:17:37.584000 [INFO] agent: Started HTTP server on 127.0.0.1:11568 (tcp)
test - 2019/11/27 02:17:37.584109 [INFO] agent: started state syncer
2019/11/27 02:17:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:37 [INFO]  raft: Node at 127.0.0.1:11572 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:38 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:38 [INFO]  raft: Node at 127.0.0.1:11572 [Leader] entering Leader state
test - 2019/11/27 02:17:38.199976 [INFO] consul: cluster leadership acquired
test - 2019/11/27 02:17:38.200488 [INFO] consul: New leader elected: Node 73bd0e98-9d49-763e-6170-5863b2efb1c6
test - 2019/11/27 02:17:38.260499 [INFO] agent: Requesting shutdown
test - 2019/11/27 02:17:38.260598 [INFO] consul: shutting down server
test - 2019/11/27 02:17:38.260643 [WARN] serf: Shutdown without a Leave
test - 2019/11/27 02:17:38.260887 [ERR] agent: failed to sync remote state: No cluster leader
test - 2019/11/27 02:17:38.355143 [WARN] serf: Shutdown without a Leave
test - 2019/11/27 02:17:38.510770 [INFO] manager: shutting down
test - 2019/11/27 02:17:38.590199 [ERR] consul: failed to wait for barrier: leadership lost while committing log
test - 2019/11/27 02:17:38.590362 [WARN] agent: Syncing node info failed. leadership lost while committing log
test - 2019/11/27 02:17:38.590444 [ERR] agent: failed to sync remote state: leadership lost while committing log
test - 2019/11/27 02:17:38.590471 [INFO] agent: consul server down
test - 2019/11/27 02:17:38.590515 [INFO] agent: shutdown complete
test - 2019/11/27 02:17:38.590566 [INFO] agent: Stopping DNS server 127.0.0.1:11567 (tcp)
test - 2019/11/27 02:17:38.590697 [INFO] agent: Stopping DNS server 127.0.0.1:11567 (udp)
test - 2019/11/27 02:17:38.590861 [INFO] agent: Stopping HTTP server 127.0.0.1:11568 (tcp)
test - 2019/11/27 02:17:38.591104 [INFO] agent: Waiting for endpoints to shut down
test - 2019/11/27 02:17:38.591157 [INFO] agent: Endpoints down
=== RUN   TestAgent_ConnectClusterIDConfig/non-UUID_cluster_id_is_fatal
WARNING: bootstrap = true: do not enable unless necessary
test - 2019/11/27 02:17:38.685480 [WARN] agent: Node name "Node 3aa8d637-4dea-456d-e0cb-fcea9100fb8f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test - 2019/11/27 02:17:38.685774 [ERR] connect CA config cluster_id specified but is not a valid UUID, aborting startup
--- PASS: TestAgent_ConnectClusterIDConfig (5.32s)
    --- PASS: TestAgent_ConnectClusterIDConfig/default_TestAgent_has_fixed_cluster_id (3.29s)
    --- PASS: TestAgent_ConnectClusterIDConfig/no_cluster_ID_specified_sets_to_test_ID (1.93s)
    --- PASS: TestAgent_ConnectClusterIDConfig/non-UUID_cluster_id_is_fatal (0.09s)
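
The three subtests above cover the connect CA cluster_id handling: the default TestAgent pins a fixed ID, an unspecified ID falls back to a test ID, and a non-UUID value aborts startup ("cluster_id specified but is not a valid UUID"). The sketch below illustrates that last check with a plain regexp; it is not Consul's actual validation code, and validateClusterID is a hypothetical helper named only for this example. The passing UUID is taken from the log above.

    // clusteridcheck.go - illustrative sketch, not Consul's implementation.
    package main

    import (
        "fmt"
        "regexp"
    )

    // uuidRE matches the canonical 8-4-4-4-12 hex layout of a UUID string.
    var uuidRE = regexp.MustCompile(
        `^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$`)

    // validateClusterID (hypothetical) mirrors the behaviour seen in the log:
    // empty means "pick a test/default ID", non-UUID values are fatal.
    func validateClusterID(id string) error {
        if id == "" {
            return nil
        }
        if !uuidRE.MatchString(id) {
            return fmt.Errorf("connect CA config cluster_id %q is not a valid UUID", id)
        }
        return nil
    }

    func main() {
        for _, id := range []string{"", "not-a-uuid", "ad7355da-264a-b2c1-320a-bb3086e5fe23"} {
            fmt.Printf("%-40q -> %v\n", id, validateClusterID(id))
        }
    }
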
=== RUN   TestAgent_StartStop
=== PAUSE TestAgent_StartStop
=== RUN   TestAgent_RPCPing
=== PAUSE TestAgent_RPCPing
=== RUN   TestAgent_TokenStore
=== PAUSE TestAgent_TokenStore
=== RUN   TestAgent_ReconnectConfigSettings
=== PAUSE TestAgent_ReconnectConfigSettings
=== RUN   TestAgent_ReconnectConfigWanDisabled
=== PAUSE TestAgent_ReconnectConfigWanDisabled
=== RUN   TestAgent_setupNodeID
=== PAUSE TestAgent_setupNodeID
=== RUN   TestAgent_makeNodeID
=== PAUSE TestAgent_makeNodeID
=== RUN   TestAgent_AddService
=== PAUSE TestAgent_AddService
=== RUN   TestAgent_AddServiceNoExec
=== PAUSE TestAgent_AddServiceNoExec
=== RUN   TestAgent_AddServiceNoRemoteExec
=== PAUSE TestAgent_AddServiceNoRemoteExec
=== RUN   TestAgent_RemoveService
=== PAUSE TestAgent_RemoveService
=== RUN   TestAgent_RemoveServiceRemovesAllChecks
=== PAUSE TestAgent_RemoveServiceRemovesAllChecks
=== RUN   TestAgent_IndexChurn
=== PAUSE TestAgent_IndexChurn
=== RUN   TestAgent_AddCheck
=== PAUSE TestAgent_AddCheck
=== RUN   TestAgent_AddCheck_StartPassing
=== PAUSE TestAgent_AddCheck_StartPassing
=== RUN   TestAgent_AddCheck_MinInterval
=== PAUSE TestAgent_AddCheck_MinInterval
=== RUN   TestAgent_AddCheck_MissingService
=== PAUSE TestAgent_AddCheck_MissingService
=== RUN   TestAgent_AddCheck_RestoreState
=== PAUSE TestAgent_AddCheck_RestoreState
=== RUN   TestAgent_AddCheck_ExecDisable
=== PAUSE TestAgent_AddCheck_ExecDisable
=== RUN   TestAgent_AddCheck_ExecRemoteDisable
=== PAUSE TestAgent_AddCheck_ExecRemoteDisable
=== RUN   TestAgent_AddCheck_GRPC
=== PAUSE TestAgent_AddCheck_GRPC
=== RUN   TestAgent_AddCheck_Alias
=== PAUSE TestAgent_AddCheck_Alias
=== RUN   TestAgent_AddCheck_Alias_setToken
=== PAUSE TestAgent_AddCheck_Alias_setToken
=== RUN   TestAgent_AddCheck_Alias_userToken
=== PAUSE TestAgent_AddCheck_Alias_userToken
=== RUN   TestAgent_AddCheck_Alias_userAndSetToken
=== PAUSE TestAgent_AddCheck_Alias_userAndSetToken
=== RUN   TestAgent_RemoveCheck
=== PAUSE TestAgent_RemoveCheck
=== RUN   TestAgent_HTTPCheck_TLSSkipVerify
=== PAUSE TestAgent_HTTPCheck_TLSSkipVerify
=== RUN   TestAgent_HTTPCheck_EnableAgentTLSForChecks
--- SKIP: TestAgent_HTTPCheck_EnableAgentTLSForChecks (0.00s)
    agent_test.go:1339: DM-skipped
=== RUN   TestAgent_updateTTLCheck
=== PAUSE TestAgent_updateTTLCheck
=== RUN   TestAgent_PersistService
=== PAUSE TestAgent_PersistService
=== RUN   TestAgent_persistedService_compat
=== PAUSE TestAgent_persistedService_compat
=== RUN   TestAgent_PurgeService
=== PAUSE TestAgent_PurgeService
=== RUN   TestAgent_PurgeServiceOnDuplicate
=== PAUSE TestAgent_PurgeServiceOnDuplicate
=== RUN   TestAgent_PersistProxy
=== PAUSE TestAgent_PersistProxy
=== RUN   TestAgent_PurgeProxy
=== PAUSE TestAgent_PurgeProxy
=== RUN   TestAgent_PurgeProxyOnDuplicate
=== PAUSE TestAgent_PurgeProxyOnDuplicate
=== RUN   TestAgent_PersistCheck
=== PAUSE TestAgent_PersistCheck
=== RUN   TestAgent_PurgeCheck
--- SKIP: TestAgent_PurgeCheck (0.00s)
    agent_test.go:1950: DM-skipped
=== RUN   TestAgent_PurgeCheckOnDuplicate
=== PAUSE TestAgent_PurgeCheckOnDuplicate
=== RUN   TestAgent_loadChecks_token
=== PAUSE TestAgent_loadChecks_token
=== RUN   TestAgent_unloadChecks
=== PAUSE TestAgent_unloadChecks
=== RUN   TestAgent_loadServices_token
=== PAUSE TestAgent_loadServices_token
=== RUN   TestAgent_loadServices_sidecar
=== PAUSE TestAgent_loadServices_sidecar
=== RUN   TestAgent_loadServices_sidecarSeparateToken
=== PAUSE TestAgent_loadServices_sidecarSeparateToken
=== RUN   TestAgent_loadServices_sidecarInheritMeta
=== PAUSE TestAgent_loadServices_sidecarInheritMeta
=== RUN   TestAgent_loadServices_sidecarOverrideMeta
=== PAUSE TestAgent_loadServices_sidecarOverrideMeta
=== RUN   TestAgent_unloadServices
=== PAUSE TestAgent_unloadServices
=== RUN   TestAgent_loadProxies
=== PAUSE TestAgent_loadProxies
=== RUN   TestAgent_loadProxies_nilProxy
=== PAUSE TestAgent_loadProxies_nilProxy
=== RUN   TestAgent_unloadProxies
=== PAUSE TestAgent_unloadProxies
=== RUN   TestAgent_Service_MaintenanceMode
=== PAUSE TestAgent_Service_MaintenanceMode
=== RUN   TestAgent_Service_Reap
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Service_Reap - 2019/11/27 02:17:38.801571 [WARN] agent: Node name "Node afca63c6-6be9-2e98-35e9-0bab9d703163" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Service_Reap - 2019/11/27 02:17:38.803048 [DEBUG] tlsutil: Update with version 1
TestAgent_Service_Reap - 2019/11/27 02:17:38.803122 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_Service_Reap - 2019/11/27 02:17:38.803290 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_Service_Reap - 2019/11/27 02:17:38.803385 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:afca63c6-6be9-2e98-35e9-0bab9d703163 Address:127.0.0.1:11584}]
2019/11/27 02:17:40 [INFO]  raft: Node at 127.0.0.1:11584 [Follower] entering Follower state (Leader: "")
TestAgent_Service_Reap - 2019/11/27 02:17:40.241314 [INFO] serf: EventMemberJoin: Node afca63c6-6be9-2e98-35e9-0bab9d703163.dc1 127.0.0.1
TestAgent_Service_Reap - 2019/11/27 02:17:40.245247 [INFO] serf: EventMemberJoin: Node afca63c6-6be9-2e98-35e9-0bab9d703163 127.0.0.1
TestAgent_Service_Reap - 2019/11/27 02:17:40.246261 [INFO] consul: Adding LAN server Node afca63c6-6be9-2e98-35e9-0bab9d703163 (Addr: tcp/127.0.0.1:11584) (DC: dc1)
TestAgent_Service_Reap - 2019/11/27 02:17:40.246611 [INFO] consul: Handled member-join event for server "Node afca63c6-6be9-2e98-35e9-0bab9d703163.dc1" in area "wan"
TestAgent_Service_Reap - 2019/11/27 02:17:40.246667 [INFO] agent: Started DNS server 127.0.0.1:11579 (udp)
TestAgent_Service_Reap - 2019/11/27 02:17:40.247135 [INFO] agent: Started DNS server 127.0.0.1:11579 (tcp)
TestAgent_Service_Reap - 2019/11/27 02:17:40.249188 [INFO] agent: Started HTTP server on 127.0.0.1:11580 (tcp)
TestAgent_Service_Reap - 2019/11/27 02:17:40.249277 [INFO] agent: started state syncer
2019/11/27 02:17:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:40 [INFO]  raft: Node at 127.0.0.1:11584 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:41 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:41 [INFO]  raft: Node at 127.0.0.1:11584 [Leader] entering Leader state
TestAgent_Service_Reap - 2019/11/27 02:17:41.102327 [INFO] consul: cluster leadership acquired
TestAgent_Service_Reap - 2019/11/27 02:17:41.102763 [INFO] consul: New leader elected: Node afca63c6-6be9-2e98-35e9-0bab9d703163
TestAgent_Service_Reap - 2019/11/27 02:17:41.611300 [INFO] agent: Synced node info
TestAgent_Service_Reap - 2019/11/27 02:17:42.555471 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_Service_Reap - 2019/11/27 02:17:42.555915 [DEBUG] consul: Skipping self join check for "Node afca63c6-6be9-2e98-35e9-0bab9d703163" since the cluster is too small
TestAgent_Service_Reap - 2019/11/27 02:17:42.556062 [INFO] consul: member 'Node afca63c6-6be9-2e98-35e9-0bab9d703163' joined, marking health alive
TestAgent_Service_Reap - 2019/11/27 02:17:42.763018 [WARN] agent: Check "service:redis" missed TTL, is now critical
TestAgent_Service_Reap - 2019/11/27 02:17:42.838235 [DEBUG] agent: Check "service:redis" status is now passing
TestAgent_Service_Reap - 2019/11/27 02:17:42.867054 [WARN] agent: Check "service:redis" missed TTL, is now critical
TestAgent_Service_Reap - 2019/11/27 02:17:43.071943 [DEBUG] agent: removed check "service:redis"
TestAgent_Service_Reap - 2019/11/27 02:17:43.072043 [DEBUG] agent: removed service "redis"
TestAgent_Service_Reap - 2019/11/27 02:17:43.072117 [INFO] agent: Check "service:redis" for service "redis" has been critical for too long; deregistered service
TestAgent_Service_Reap - 2019/11/27 02:17:43.304624 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgent_Service_Reap - 2019/11/27 02:17:43.455088 [INFO] agent: Deregistered service "redis"
TestAgent_Service_Reap - 2019/11/27 02:17:43.588440 [INFO] agent: Deregistered check "service:redis"
TestAgent_Service_Reap - 2019/11/27 02:17:43.588510 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/11/27 02:17:43.588644 [INFO] agent: Requesting shutdown
TestAgent_Service_Reap - 2019/11/27 02:17:43.588717 [INFO] consul: shutting down server
TestAgent_Service_Reap - 2019/11/27 02:17:43.588766 [WARN] serf: Shutdown without a Leave
TestAgent_Service_Reap - 2019/11/27 02:17:43.588723 [DEBUG] agent: Node info in sync
TestAgent_Service_Reap - 2019/11/27 02:17:43.643594 [WARN] serf: Shutdown without a Leave
TestAgent_Service_Reap - 2019/11/27 02:17:43.699268 [INFO] manager: shutting down
TestAgent_Service_Reap - 2019/11/27 02:17:43.699686 [INFO] agent: consul server down
TestAgent_Service_Reap - 2019/11/27 02:17:43.699737 [INFO] agent: shutdown complete
TestAgent_Service_Reap - 2019/11/27 02:17:43.699799 [INFO] agent: Stopping DNS server 127.0.0.1:11579 (tcp)
TestAgent_Service_Reap - 2019/11/27 02:17:43.699948 [INFO] agent: Stopping DNS server 127.0.0.1:11579 (udp)
TestAgent_Service_Reap - 2019/11/27 02:17:43.700108 [INFO] agent: Stopping HTTP server 127.0.0.1:11580 (tcp)
TestAgent_Service_Reap - 2019/11/27 02:17:43.700319 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Service_Reap - 2019/11/27 02:17:43.700387 [INFO] agent: Endpoints down
--- PASS: TestAgent_Service_Reap (5.01s)
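
TestAgent_Service_Reap above registers a "redis" service with a TTL check, lets the check miss its TTL and go critical, and verifies that the agent reaps the service once it has been "critical for too long". A minimal sketch of the same behaviour through the public Go client (github.com/hashicorp/consul/api) might look like the following; the test drives an in-process agent with millisecond-scale intervals, while a normal agent clamps the deregister window to a minimum. The TestAgent_Service_NoReap run that follows registers an equivalent TTL check without a deregister window, so the check goes critical but the service is never removed.

    // reapsketch.go - minimal sketch of TTL-check reaping via the public client.
    package main

    import (
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        // Talk to the local agent (default 127.0.0.1:8500).
        client, err := api.NewClient(api.DefaultConfig())
        if err != nil {
            log.Fatal(err)
        }

        reg := &api.AgentServiceRegistration{
            Name: "redis",
            Port: 6379,
            Check: &api.AgentServiceCheck{
                TTL:                            "10s", // must be refreshed via the TTL endpoints
                DeregisterCriticalServiceAfter: "1m",  // reap the service once critical this long
            },
        }
        if err := client.Agent().ServiceRegister(reg); err != nil {
            log.Fatal(err)
        }

        // Refresh the check once; stop calling PassTTL and the check misses its
        // TTL, turns critical, and the service is eventually deregistered, as in
        // the "service:redis" lines of the log above.
        if err := client.Agent().PassTTL("service:redis", "redis is alive"); err != nil {
            log.Fatal(err)
        }
    }
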
=== RUN   TestAgent_Service_NoReap
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_Service_NoReap - 2019/11/27 02:17:43.758305 [WARN] agent: Node name "Node b22229c3-bf9a-db10-62e3-49c54984cbc0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_Service_NoReap - 2019/11/27 02:17:43.758718 [DEBUG] tlsutil: Update with version 1
TestAgent_Service_NoReap - 2019/11/27 02:17:43.758789 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_Service_NoReap - 2019/11/27 02:17:43.758974 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_Service_NoReap - 2019/11/27 02:17:43.759094 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b22229c3-bf9a-db10-62e3-49c54984cbc0 Address:127.0.0.1:11590}]
2019/11/27 02:17:44 [INFO]  raft: Node at 127.0.0.1:11590 [Follower] entering Follower state (Leader: "")
TestAgent_Service_NoReap - 2019/11/27 02:17:44.659888 [INFO] serf: EventMemberJoin: Node b22229c3-bf9a-db10-62e3-49c54984cbc0.dc1 127.0.0.1
TestAgent_Service_NoReap - 2019/11/27 02:17:44.663083 [INFO] serf: EventMemberJoin: Node b22229c3-bf9a-db10-62e3-49c54984cbc0 127.0.0.1
TestAgent_Service_NoReap - 2019/11/27 02:17:44.664430 [INFO] agent: Started DNS server 127.0.0.1:11585 (udp)
TestAgent_Service_NoReap - 2019/11/27 02:17:44.664829 [INFO] consul: Adding LAN server Node b22229c3-bf9a-db10-62e3-49c54984cbc0 (Addr: tcp/127.0.0.1:11590) (DC: dc1)
TestAgent_Service_NoReap - 2019/11/27 02:17:44.664944 [INFO] agent: Started DNS server 127.0.0.1:11585 (tcp)
TestAgent_Service_NoReap - 2019/11/27 02:17:44.665033 [INFO] consul: Handled member-join event for server "Node b22229c3-bf9a-db10-62e3-49c54984cbc0.dc1" in area "wan"
TestAgent_Service_NoReap - 2019/11/27 02:17:44.672279 [INFO] agent: Started HTTP server on 127.0.0.1:11586 (tcp)
TestAgent_Service_NoReap - 2019/11/27 02:17:44.672448 [INFO] agent: started state syncer
2019/11/27 02:17:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:44 [INFO]  raft: Node at 127.0.0.1:11590 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:45 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:45 [INFO]  raft: Node at 127.0.0.1:11590 [Leader] entering Leader state
TestAgent_Service_NoReap - 2019/11/27 02:17:45.401054 [INFO] consul: cluster leadership acquired
TestAgent_Service_NoReap - 2019/11/27 02:17:45.401506 [INFO] consul: New leader elected: Node b22229c3-bf9a-db10-62e3-49c54984cbc0
TestAgent_Service_NoReap - 2019/11/27 02:17:45.655604 [WARN] agent: Check "service:redis" missed TTL, is now critical
TestAgent_Service_NoReap - 2019/11/27 02:17:46.179210 [INFO] agent: Synced service "redis"
TestAgent_Service_NoReap - 2019/11/27 02:17:46.179333 [DEBUG] agent: Check "service:redis" in sync
TestAgent_Service_NoReap - 2019/11/27 02:17:46.179373 [DEBUG] agent: Node info in sync
TestAgent_Service_NoReap - 2019/11/27 02:17:46.379897 [INFO] agent: Requesting shutdown
TestAgent_Service_NoReap - 2019/11/27 02:17:46.380045 [INFO] consul: shutting down server
TestAgent_Service_NoReap - 2019/11/27 02:17:46.380107 [WARN] serf: Shutdown without a Leave
TestAgent_Service_NoReap - 2019/11/27 02:17:46.787903 [WARN] serf: Shutdown without a Leave
TestAgent_Service_NoReap - 2019/11/27 02:17:47.165764 [INFO] manager: shutting down
TestAgent_Service_NoReap - 2019/11/27 02:17:47.410573 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestAgent_Service_NoReap - 2019/11/27 02:17:47.410737 [INFO] agent: consul server down
TestAgent_Service_NoReap - 2019/11/27 02:17:47.410973 [INFO] agent: shutdown complete
TestAgent_Service_NoReap - 2019/11/27 02:17:47.411029 [INFO] agent: Stopping DNS server 127.0.0.1:11585 (tcp)
TestAgent_Service_NoReap - 2019/11/27 02:17:47.413618 [INFO] agent: Stopping DNS server 127.0.0.1:11585 (udp)
TestAgent_Service_NoReap - 2019/11/27 02:17:47.413785 [INFO] agent: Stopping HTTP server 127.0.0.1:11586 (tcp)
TestAgent_Service_NoReap - 2019/11/27 02:17:47.413992 [INFO] agent: Waiting for endpoints to shut down
TestAgent_Service_NoReap - 2019/11/27 02:17:47.414060 [INFO] agent: Endpoints down
--- PASS: TestAgent_Service_NoReap (3.71s)
=== RUN   TestAgent_AddService_restoresSnapshot
=== PAUSE TestAgent_AddService_restoresSnapshot
=== RUN   TestAgent_AddCheck_restoresSnapshot
=== PAUSE TestAgent_AddCheck_restoresSnapshot
=== RUN   TestAgent_NodeMaintenanceMode
=== PAUSE TestAgent_NodeMaintenanceMode
=== RUN   TestAgent_checkStateSnapshot
=== PAUSE TestAgent_checkStateSnapshot
=== RUN   TestAgent_loadChecks_checkFails
=== PAUSE TestAgent_loadChecks_checkFails
=== RUN   TestAgent_persistCheckState
=== PAUSE TestAgent_persistCheckState
=== RUN   TestAgent_loadCheckState
=== PAUSE TestAgent_loadCheckState
=== RUN   TestAgent_purgeCheckState
=== PAUSE TestAgent_purgeCheckState
=== RUN   TestAgent_GetCoordinate
=== PAUSE TestAgent_GetCoordinate
=== RUN   TestAgent_reloadWatches
=== PAUSE TestAgent_reloadWatches
=== RUN   TestAgent_reloadWatchesHTTPS
=== PAUSE TestAgent_reloadWatchesHTTPS
=== RUN   TestAgent_AddProxy
--- SKIP: TestAgent_AddProxy (0.00s)
    agent_test.go:3069: DM-skipped
=== RUN   TestAgent_RemoveProxy
=== PAUSE TestAgent_RemoveProxy
=== RUN   TestAgent_ReLoadProxiesFromConfig
=== PAUSE TestAgent_ReLoadProxiesFromConfig
=== RUN   TestAgent_SetupProxyManager
=== PAUSE TestAgent_SetupProxyManager
=== RUN   TestAgent_loadTokens
=== PAUSE TestAgent_loadTokens
=== RUN   TestAgent_ReloadConfigOutgoingRPCConfig
=== PAUSE TestAgent_ReloadConfigOutgoingRPCConfig
=== RUN   TestAgent_ReloadConfigIncomingRPCConfig
=== PAUSE TestAgent_ReloadConfigIncomingRPCConfig
=== RUN   TestAgent_ReloadConfigTLSConfigFailure
=== PAUSE TestAgent_ReloadConfigTLSConfigFailure
=== RUN   TestBlacklist
=== PAUSE TestBlacklist
=== RUN   TestCatalogRegister_Service_InvalidAddress
=== PAUSE TestCatalogRegister_Service_InvalidAddress
=== RUN   TestCatalogDeregister
=== PAUSE TestCatalogDeregister
=== RUN   TestCatalogDatacenters
=== PAUSE TestCatalogDatacenters
=== RUN   TestCatalogNodes
=== PAUSE TestCatalogNodes
=== RUN   TestCatalogNodes_MetaFilter
=== PAUSE TestCatalogNodes_MetaFilter
=== RUN   TestCatalogNodes_WanTranslation
=== PAUSE TestCatalogNodes_WanTranslation
=== RUN   TestCatalogNodes_Blocking
=== PAUSE TestCatalogNodes_Blocking
=== RUN   TestCatalogNodes_DistanceSort
=== PAUSE TestCatalogNodes_DistanceSort
=== RUN   TestCatalogServices
=== PAUSE TestCatalogServices
=== RUN   TestCatalogServices_NodeMetaFilter
=== PAUSE TestCatalogServices_NodeMetaFilter
=== RUN   TestCatalogServiceNodes
=== PAUSE TestCatalogServiceNodes
=== RUN   TestCatalogServiceNodes_NodeMetaFilter
=== PAUSE TestCatalogServiceNodes_NodeMetaFilter
=== RUN   TestCatalogServiceNodes_WanTranslation
--- SKIP: TestCatalogServiceNodes_WanTranslation (0.00s)
    catalog_endpoint_test.go:655: DM-skipped
=== RUN   TestCatalogServiceNodes_DistanceSort
=== PAUSE TestCatalogServiceNodes_DistanceSort
=== RUN   TestCatalogServiceNodes_ConnectProxy
=== PAUSE TestCatalogServiceNodes_ConnectProxy
=== RUN   TestCatalogConnectServiceNodes_good
=== PAUSE TestCatalogConnectServiceNodes_good
=== RUN   TestCatalogNodeServices
=== PAUSE TestCatalogNodeServices
=== RUN   TestCatalogNodeServices_ConnectProxy
=== PAUSE TestCatalogNodeServices_ConnectProxy
=== RUN   TestCatalogNodeServices_WanTranslation
--- SKIP: TestCatalogNodeServices_WanTranslation (0.00s)
    catalog_endpoint_test.go:960: DM-skipped
=== RUN   TestConnectCARoots_empty
=== PAUSE TestConnectCARoots_empty
=== RUN   TestConnectCARoots_list
=== PAUSE TestConnectCARoots_list
=== RUN   TestConnectCAConfig
=== PAUSE TestConnectCAConfig
=== RUN   TestCoordinate_Disabled_Response
=== PAUSE TestCoordinate_Disabled_Response
=== RUN   TestCoordinate_Datacenters
--- SKIP: TestCoordinate_Datacenters (0.00s)
    coordinate_endpoint_test.go:53: DM-skipped
=== RUN   TestCoordinate_Nodes
=== PAUSE TestCoordinate_Nodes
=== RUN   TestCoordinate_Node
=== PAUSE TestCoordinate_Node
=== RUN   TestCoordinate_Update
=== PAUSE TestCoordinate_Update
=== RUN   TestCoordinate_Update_ACLDeny
=== PAUSE TestCoordinate_Update_ACLDeny
=== RUN   TestRecursorAddr
=== PAUSE TestRecursorAddr
=== RUN   TestEncodeKVasRFC1464
--- PASS: TestEncodeKVasRFC1464 (0.00s)
=== RUN   TestDNS_Over_TCP
=== PAUSE TestDNS_Over_TCP
=== RUN   TestDNS_NodeLookup
--- SKIP: TestDNS_NodeLookup (0.00s)
    dns_test.go:177: DM-skipped
=== RUN   TestDNS_CaseInsensitiveNodeLookup
=== PAUSE TestDNS_CaseInsensitiveNodeLookup
=== RUN   TestDNS_NodeLookup_PeriodName
=== PAUSE TestDNS_NodeLookup_PeriodName
=== RUN   TestDNS_NodeLookup_AAAA
=== PAUSE TestDNS_NodeLookup_AAAA
=== RUN   TestDNSCycleRecursorCheck
=== PAUSE TestDNSCycleRecursorCheck
=== RUN   TestDNSCycleRecursorCheckAllFail
--- SKIP: TestDNSCycleRecursorCheckAllFail (0.00s)
    dns_test.go:422: DM-skipped
=== RUN   TestDNS_NodeLookup_CNAME
=== PAUSE TestDNS_NodeLookup_CNAME
=== RUN   TestDNS_NodeLookup_TXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:47.520777 [WARN] agent: Node name "Node e7da9b9c-03cd-f06e-4528-868ac7031a29" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:47.521420 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:47.521591 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:47.521918 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:47.522182 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e7da9b9c-03cd-f06e-4528-868ac7031a29 Address:127.0.0.1:11596}]
2019/11/27 02:17:48 [INFO]  raft: Node at 127.0.0.1:11596 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.362729 [INFO] serf: EventMemberJoin: Node e7da9b9c-03cd-f06e-4528-868ac7031a29.dc1 127.0.0.1
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.365917 [INFO] serf: EventMemberJoin: Node e7da9b9c-03cd-f06e-4528-868ac7031a29 127.0.0.1
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.366556 [INFO] consul: Adding LAN server Node e7da9b9c-03cd-f06e-4528-868ac7031a29 (Addr: tcp/127.0.0.1:11596) (DC: dc1)
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.366620 [INFO] consul: Handled member-join event for server "Node e7da9b9c-03cd-f06e-4528-868ac7031a29.dc1" in area "wan"
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.367146 [INFO] agent: Started DNS server 127.0.0.1:11591 (tcp)
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.367231 [INFO] agent: Started DNS server 127.0.0.1:11591 (udp)
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.369092 [INFO] agent: Started HTTP server on 127.0.0.1:11592 (tcp)
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.369168 [INFO] agent: started state syncer
2019/11/27 02:17:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:48 [INFO]  raft: Node at 127.0.0.1:11596 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:48 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:48 [INFO]  raft: Node at 127.0.0.1:11596 [Leader] entering Leader state
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.888206 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:48.888657 [INFO] consul: New leader elected: Node e7da9b9c-03cd-f06e-4528-868ac7031a29
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.372653 [INFO] agent: Synced node info
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.659472 [DEBUG] dns: request for name google.node.consul. type TXT class IN (took 1.19071ms) from client 127.0.0.1:56911 (udp)
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.660969 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.661067 [INFO] consul: shutting down server
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.661121 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.788387 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.865550 [INFO] manager: shutting down
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.866345 [INFO] agent: consul server down
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.866402 [INFO] agent: shutdown complete
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.866459 [INFO] agent: Stopping DNS server 127.0.0.1:11591 (tcp)
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.866597 [INFO] agent: Stopping DNS server 127.0.0.1:11591 (udp)
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.866748 [ERR] consul: failed to establish leadership: error configuring provider: raft is already shutdown
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.866861 [INFO] agent: Stopping HTTP server 127.0.0.1:11592 (tcp)
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.867057 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_TXT - 2019/11/27 02:17:49.867131 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_TXT (2.42s)
=== RUN   TestDNS_NodeLookup_TXT_DontSuppress
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:49.931826 [WARN] agent: Node name "Node 445ef4c3-18a0-87f1-507c-43457d16845f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:49.932452 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:49.932629 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:49.932911 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:49.933183 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:445ef4c3-18a0-87f1-507c-43457d16845f Address:127.0.0.1:11602}]
2019/11/27 02:17:50 [INFO]  raft: Node at 127.0.0.1:11602 [Follower] entering Follower state (Leader: "")
2019/11/27 02:17:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:50 [INFO]  raft: Node at 127.0.0.1:11602 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.098819 [WARN] raft: Unable to get address for server id 445ef4c3-18a0-87f1-507c-43457d16845f, using fallback address 127.0.0.1:11602: Could not find address for server id 445ef4c3-18a0-87f1-507c-43457d16845f
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.355607 [INFO] serf: EventMemberJoin: Node 445ef4c3-18a0-87f1-507c-43457d16845f.dc1 127.0.0.1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.359241 [INFO] serf: EventMemberJoin: Node 445ef4c3-18a0-87f1-507c-43457d16845f 127.0.0.1
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.360236 [INFO] consul: Adding LAN server Node 445ef4c3-18a0-87f1-507c-43457d16845f (Addr: tcp/127.0.0.1:11602) (DC: dc1)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.360573 [INFO] consul: Handled member-join event for server "Node 445ef4c3-18a0-87f1-507c-43457d16845f.dc1" in area "wan"
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.360651 [INFO] agent: Started DNS server 127.0.0.1:11597 (udp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.361021 [INFO] agent: Started DNS server 127.0.0.1:11597 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.363142 [INFO] agent: Started HTTP server on 127.0.0.1:11598 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.363294 [INFO] agent: started state syncer
2019/11/27 02:17:51 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:51 [INFO]  raft: Node at 127.0.0.1:11602 [Leader] entering Leader state
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.455402 [INFO] consul: New leader elected: Node 445ef4c3-18a0-87f1-507c-43457d16845f
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:51.455863 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:52.145796 [INFO] agent: Synced node info
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:52.146098 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:52.805973 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.258614 [DEBUG] dns: request for name google.node.consul. type TXT class IN (took 791.029µs) from client 127.0.0.1:40109 (udp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.259244 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.259340 [INFO] consul: shutting down server
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.259388 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.376417 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.487558 [INFO] manager: shutting down
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.488379 [INFO] agent: consul server down
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.488440 [INFO] agent: shutdown complete
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.488494 [INFO] agent: Stopping DNS server 127.0.0.1:11597 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.488628 [INFO] agent: Stopping DNS server 127.0.0.1:11597 (udp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.488774 [INFO] agent: Stopping HTTP server 127.0.0.1:11598 (tcp)
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.488970 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.489047 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_TXT_DontSuppress (3.62s)
=== RUN   TestDNS_NodeLookup_ANY
TestDNS_NodeLookup_TXT_DontSuppress - 2019/11/27 02:17:53.519600 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:53.628995 [WARN] agent: Node name "Node ba973b34-8ef8-dc46-df0c-24abe146b2d7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:53.629376 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:53.629440 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:53.629619 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:53.629721 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ba973b34-8ef8-dc46-df0c-24abe146b2d7 Address:127.0.0.1:11608}]
2019/11/27 02:17:54 [INFO]  raft: Node at 127.0.0.1:11608 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.346979 [INFO] serf: EventMemberJoin: Node ba973b34-8ef8-dc46-df0c-24abe146b2d7.dc1 127.0.0.1
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.350327 [INFO] serf: EventMemberJoin: Node ba973b34-8ef8-dc46-df0c-24abe146b2d7 127.0.0.1
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.350972 [INFO] consul: Adding LAN server Node ba973b34-8ef8-dc46-df0c-24abe146b2d7 (Addr: tcp/127.0.0.1:11608) (DC: dc1)
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.351158 [INFO] consul: Handled member-join event for server "Node ba973b34-8ef8-dc46-df0c-24abe146b2d7.dc1" in area "wan"
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.352614 [INFO] agent: Started DNS server 127.0.0.1:11603 (tcp)
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.352711 [INFO] agent: Started DNS server 127.0.0.1:11603 (udp)
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.354647 [INFO] agent: Started HTTP server on 127.0.0.1:11604 (tcp)
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.354727 [INFO] agent: started state syncer
2019/11/27 02:17:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:54 [INFO]  raft: Node at 127.0.0.1:11608 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:54 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:54 [INFO]  raft: Node at 127.0.0.1:11608 [Leader] entering Leader state
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.954641 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:54.955132 [INFO] consul: New leader elected: Node ba973b34-8ef8-dc46-df0c-24abe146b2d7
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:55.688238 [INFO] agent: Synced node info
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:55.989755 [DEBUG] dns: request for name bar.node.consul. type ANY class IN (took 713.36µs) from client 127.0.0.1:55393 (udp)
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:55.989941 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:55.990013 [INFO] consul: shutting down server
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:55.990061 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.120757 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.243074 [INFO] manager: shutting down
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.298555 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.299189 [INFO] agent: consul server down
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.299366 [INFO] agent: shutdown complete
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.299519 [INFO] agent: Stopping DNS server 127.0.0.1:11603 (tcp)
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.299779 [INFO] agent: Stopping DNS server 127.0.0.1:11603 (udp)
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.300074 [INFO] agent: Stopping HTTP server 127.0.0.1:11604 (tcp)
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.300424 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_ANY - 2019/11/27 02:17:56.300578 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_ANY (2.81s)
=== RUN   TestDNS_NodeLookup_ANY_DontSuppressTXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:56.409169 [WARN] agent: Node name "Node 5c71cbf7-6411-d91a-3269-2e390b8f0bd2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:56.409622 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:56.409694 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:56.409882 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:56.409989 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5c71cbf7-6411-d91a-3269-2e390b8f0bd2 Address:127.0.0.1:11614}]
2019/11/27 02:17:57 [INFO]  raft: Node at 127.0.0.1:11614 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:57.420952 [INFO] serf: EventMemberJoin: Node 5c71cbf7-6411-d91a-3269-2e390b8f0bd2.dc1 127.0.0.1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:57.426442 [INFO] serf: EventMemberJoin: Node 5c71cbf7-6411-d91a-3269-2e390b8f0bd2 127.0.0.1
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:57.428614 [INFO] consul: Adding LAN server Node 5c71cbf7-6411-d91a-3269-2e390b8f0bd2 (Addr: tcp/127.0.0.1:11614) (DC: dc1)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:57.429177 [INFO] consul: Handled member-join event for server "Node 5c71cbf7-6411-d91a-3269-2e390b8f0bd2.dc1" in area "wan"
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:57.429326 [INFO] agent: Started DNS server 127.0.0.1:11609 (udp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:57.429391 [INFO] agent: Started DNS server 127.0.0.1:11609 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:57.431909 [INFO] agent: Started HTTP server on 127.0.0.1:11610 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:17:57.432041 [INFO] agent: started state syncer
2019/11/27 02:17:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:57 [INFO]  raft: Node at 127.0.0.1:11614 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:00 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:00 [INFO]  raft: Node at 127.0.0.1:11614 [Leader] entering Leader state
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:00.665427 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:00.665878 [INFO] consul: New leader elected: Node 5c71cbf7-6411-d91a-3269-2e390b8f0bd2
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.089674 [INFO] agent: Synced node info
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.089839 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.657794 [DEBUG] dns: request for name bar.node.consul. type ANY class IN (took 629.024µs) from client 127.0.0.1:44311 (udp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.658154 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.658219 [INFO] consul: shutting down server
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.658263 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.777907 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.865352 [INFO] manager: shutting down
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.987346 [INFO] agent: consul server down
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.987434 [INFO] agent: shutdown complete
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.987491 [INFO] agent: Stopping DNS server 127.0.0.1:11609 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.987636 [INFO] agent: Stopping DNS server 127.0.0.1:11609 (udp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.987792 [INFO] agent: Stopping HTTP server 127.0.0.1:11610 (tcp)
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.988028 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.988117 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_ANY_DontSuppressTXT (5.69s)
=== RUN   TestDNS_NodeLookup_A_SuppressTXT
TestDNS_NodeLookup_ANY_DontSuppressTXT - 2019/11/27 02:18:01.990373 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:02.181529 [WARN] agent: Node name "Node a6d63310-7494-bf08-f597-39341a7ee10d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:02.182007 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:02.182080 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:02.182248 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:02.182354 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a6d63310-7494-bf08-f597-39341a7ee10d Address:127.0.0.1:11620}]
2019/11/27 02:18:03 [INFO]  raft: Node at 127.0.0.1:11620 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.459853 [INFO] serf: EventMemberJoin: Node a6d63310-7494-bf08-f597-39341a7ee10d.dc1 127.0.0.1
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.463501 [INFO] serf: EventMemberJoin: Node a6d63310-7494-bf08-f597-39341a7ee10d 127.0.0.1
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.464641 [INFO] consul: Adding LAN server Node a6d63310-7494-bf08-f597-39341a7ee10d (Addr: tcp/127.0.0.1:11620) (DC: dc1)
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.464912 [INFO] agent: Started DNS server 127.0.0.1:11615 (udp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.465021 [INFO] consul: Handled member-join event for server "Node a6d63310-7494-bf08-f597-39341a7ee10d.dc1" in area "wan"
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.465322 [INFO] agent: Started DNS server 127.0.0.1:11615 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.467494 [INFO] agent: Started HTTP server on 127.0.0.1:11616 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.467610 [INFO] agent: started state syncer
2019/11/27 02:18:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:03 [INFO]  raft: Node at 127.0.0.1:11620 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:03 [INFO]  raft: Node at 127.0.0.1:11620 [Leader] entering Leader state
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.965369 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:03.965803 [INFO] consul: New leader elected: Node a6d63310-7494-bf08-f597-39341a7ee10d
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:04.587929 [INFO] agent: Synced node info
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:04.588063 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:05.668806 [DEBUG] dns: request for name bar.node.consul. type A class IN (took 2.433093ms) from client 127.0.0.1:35688 (udp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:05.673365 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:05.673452 [INFO] consul: shutting down server
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:05.673504 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:05.820183 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.010724 [INFO] manager: shutting down
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.067701 [ERR] agent: failed to sync remote state: No cluster leader
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.353674 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.354088 [INFO] agent: consul server down
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.354158 [INFO] agent: shutdown complete
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.354256 [INFO] agent: Stopping DNS server 127.0.0.1:11615 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.354502 [INFO] agent: Stopping DNS server 127.0.0.1:11615 (udp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.354781 [INFO] agent: Stopping HTTP server 127.0.0.1:11616 (tcp)
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.355138 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_A_SuppressTXT - 2019/11/27 02:18:06.355247 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_A_SuppressTXT (4.37s)
=== RUN   TestDNS_EDNS0
=== PAUSE TestDNS_EDNS0
=== RUN   TestDNS_EDNS0_ECS
=== PAUSE TestDNS_EDNS0_ECS
=== RUN   TestDNS_ReverseLookup
=== PAUSE TestDNS_ReverseLookup
=== RUN   TestDNS_ReverseLookup_CustomDomain
=== PAUSE TestDNS_ReverseLookup_CustomDomain
=== RUN   TestDNS_ReverseLookup_IPV6
=== PAUSE TestDNS_ReverseLookup_IPV6
=== RUN   TestDNS_ServiceReverseLookup
=== PAUSE TestDNS_ServiceReverseLookup
=== RUN   TestDNS_ServiceReverseLookup_IPV6
=== PAUSE TestDNS_ServiceReverseLookup_IPV6
=== RUN   TestDNS_ServiceReverseLookup_CustomDomain
=== PAUSE TestDNS_ServiceReverseLookup_CustomDomain
=== RUN   TestDNS_SOA_Settings
=== PAUSE TestDNS_SOA_Settings
=== RUN   TestDNS_ServiceReverseLookupNodeAddress
=== PAUSE TestDNS_ServiceReverseLookupNodeAddress
=== RUN   TestDNS_ServiceLookupNoMultiCNAME
--- SKIP: TestDNS_ServiceLookupNoMultiCNAME (0.00s)
    dns_test.go:1202: DM-skipped
=== RUN   TestDNS_ServiceLookupPreferNoCNAME
=== PAUSE TestDNS_ServiceLookupPreferNoCNAME
=== RUN   TestDNS_ServiceLookupMultiAddrNoCNAME
=== PAUSE TestDNS_ServiceLookupMultiAddrNoCNAME
=== RUN   TestDNS_ServiceLookup
=== PAUSE TestDNS_ServiceLookup
=== RUN   TestDNS_ServiceLookupWithInternalServiceAddress
=== PAUSE TestDNS_ServiceLookupWithInternalServiceAddress
=== RUN   TestDNS_ConnectServiceLookup
=== PAUSE TestDNS_ConnectServiceLookup
=== RUN   TestDNS_ExternalServiceLookup
=== PAUSE TestDNS_ExternalServiceLookup
=== RUN   TestDNS_InifiniteRecursion
=== PAUSE TestDNS_InifiniteRecursion
=== RUN   TestDNS_ExternalServiceToConsulCNAMELookup
=== PAUSE TestDNS_ExternalServiceToConsulCNAMELookup
=== RUN   TestDNS_NSRecords
--- SKIP: TestDNS_NSRecords (0.00s)
    dns_test.go:1858: DM-skipped
=== RUN   TestDNS_NSRecords_IPV6
=== PAUSE TestDNS_NSRecords_IPV6
=== RUN   TestDNS_ExternalServiceToConsulCNAMENestedLookup
=== PAUSE TestDNS_ExternalServiceToConsulCNAMENestedLookup
=== RUN   TestDNS_ServiceLookup_ServiceAddress_A
=== PAUSE TestDNS_ServiceLookup_ServiceAddress_A
=== RUN   TestDNS_ServiceLookup_ServiceAddress_CNAME
=== PAUSE TestDNS_ServiceLookup_ServiceAddress_CNAME
=== RUN   TestDNS_ServiceLookup_ServiceAddressIPV6
=== PAUSE TestDNS_ServiceLookup_ServiceAddressIPV6
=== RUN   TestDNS_ServiceLookup_WanAddress
--- SKIP: TestDNS_ServiceLookup_WanAddress (0.00s)
    dns_test.go:2352: DM-skipped
=== RUN   TestDNS_CaseInsensitiveServiceLookup
=== PAUSE TestDNS_CaseInsensitiveServiceLookup
=== RUN   TestDNS_ServiceLookup_TagPeriod
=== PAUSE TestDNS_ServiceLookup_TagPeriod
=== RUN   TestDNS_PreparedQueryNearIPEDNS
=== PAUSE TestDNS_PreparedQueryNearIPEDNS
=== RUN   TestDNS_PreparedQueryNearIP
=== PAUSE TestDNS_PreparedQueryNearIP
=== RUN   TestDNS_ServiceLookup_PreparedQueryNamePeriod
=== PAUSE TestDNS_ServiceLookup_PreparedQueryNamePeriod
=== RUN   TestDNS_ServiceLookup_Dedup
=== PAUSE TestDNS_ServiceLookup_Dedup
=== RUN   TestDNS_ServiceLookup_Dedup_SRV
=== PAUSE TestDNS_ServiceLookup_Dedup_SRV
=== RUN   TestDNS_Recurse
=== PAUSE TestDNS_Recurse
=== RUN   TestDNS_Recurse_Truncation
=== PAUSE TestDNS_Recurse_Truncation
=== RUN   TestDNS_RecursorTimeout
=== PAUSE TestDNS_RecursorTimeout
=== RUN   TestDNS_ServiceLookup_FilterCritical
=== PAUSE TestDNS_ServiceLookup_FilterCritical
=== RUN   TestDNS_ServiceLookup_OnlyFailing
=== PAUSE TestDNS_ServiceLookup_OnlyFailing
=== RUN   TestDNS_ServiceLookup_OnlyPassing
=== PAUSE TestDNS_ServiceLookup_OnlyPassing
=== RUN   TestDNS_ServiceLookup_Randomize
=== PAUSE TestDNS_ServiceLookup_Randomize
=== RUN   TestBinarySearch
=== PAUSE TestBinarySearch
=== RUN   TestDNS_TCP_and_UDP_Truncate
--- SKIP: TestDNS_TCP_and_UDP_Truncate (0.00s)
    dns_test.go:3882: DM-skipped
=== RUN   TestDNS_ServiceLookup_Truncate
=== PAUSE TestDNS_ServiceLookup_Truncate
=== RUN   TestDNS_ServiceLookup_LargeResponses
=== PAUSE TestDNS_ServiceLookup_LargeResponses
=== RUN   TestDNS_ServiceLookup_ARecordLimits
--- SKIP: TestDNS_ServiceLookup_ARecordLimits (0.00s)
    dns_test.go:4321: DM-skipped
=== RUN   TestDNS_ServiceLookup_AnswerLimits
=== PAUSE TestDNS_ServiceLookup_AnswerLimits
=== RUN   TestDNS_ServiceLookup_CNAME
--- SKIP: TestDNS_ServiceLookup_CNAME (0.00s)
    dns_test.go:4466: DM-skipped
=== RUN   TestDNS_NodeLookup_TTL
=== PAUSE TestDNS_NodeLookup_TTL
=== RUN   TestDNS_ServiceLookup_TTL
=== PAUSE TestDNS_ServiceLookup_TTL
=== RUN   TestDNS_PreparedQuery_TTL
=== PAUSE TestDNS_PreparedQuery_TTL
=== RUN   TestDNS_PreparedQuery_Failover
=== PAUSE TestDNS_PreparedQuery_Failover
=== RUN   TestDNS_ServiceLookup_SRV_RFC
=== PAUSE TestDNS_ServiceLookup_SRV_RFC
=== RUN   TestDNS_ServiceLookup_SRV_RFC_TCP_Default
=== PAUSE TestDNS_ServiceLookup_SRV_RFC_TCP_Default
=== RUN   TestDNS_ServiceLookup_FilterACL
=== PAUSE TestDNS_ServiceLookup_FilterACL
=== RUN   TestDNS_ServiceLookup_MetaTXT
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:06.742232 [WARN] agent: Node name "Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:06.742689 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:06.742757 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:06.743087 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:06.743233 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bc3dc306-f13d-1351-ae8e-d7540a3a5ddc Address:127.0.0.1:11626}]
2019/11/27 02:18:10 [INFO]  raft: Node at 127.0.0.1:11626 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:10.060442 [INFO] serf: EventMemberJoin: Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc.dc1 127.0.0.1
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:10.072197 [INFO] serf: EventMemberJoin: Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc 127.0.0.1
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:10.079161 [INFO] consul: Handled member-join event for server "Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc.dc1" in area "wan"
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:10.079259 [INFO] consul: Adding LAN server Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc (Addr: tcp/127.0.0.1:11626) (DC: dc1)
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:10.079847 [INFO] agent: Started DNS server 127.0.0.1:11621 (udp)
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:10.083561 [INFO] agent: Started DNS server 127.0.0.1:11621 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:10.085718 [INFO] agent: Started HTTP server on 127.0.0.1:11622 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:10.085942 [INFO] agent: started state syncer
2019/11/27 02:18:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:10 [INFO]  raft: Node at 127.0.0.1:11626 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:11 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:11 [INFO]  raft: Node at 127.0.0.1:11626 [Leader] entering Leader state
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:11.175838 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:11.176338 [INFO] consul: New leader elected: Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:11.781300 [INFO] agent: Synced node info
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:11.781420 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:12.334980 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:13.578523 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 1.067374ms) from client 127.0.0.1:49605 (udp)
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:13.578745 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:13.578829 [INFO] consul: shutting down server
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:13.578877 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:13.864123 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.237263 [WARN] consul: error getting server health from "Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc": rpc error making call: EOF
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.442605 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.443137 [DEBUG] consul: Skipping self join check for "Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc" since the cluster is too small
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.443329 [INFO] consul: member 'Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc' joined, marking health alive
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.609119 [INFO] manager: shutting down
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.798079 [INFO] agent: consul server down
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.798194 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.798298 [INFO] agent: Stopping DNS server 127.0.0.1:11621 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.798524 [INFO] agent: Stopping DNS server 127.0.0.1:11621 (udp)
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.798758 [INFO] agent: Stopping HTTP server 127.0.0.1:11622 (tcp)
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.799029 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.799118 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_MetaTXT (8.41s)
=== RUN   TestDNS_ServiceLookup_SuppressTXT
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:14.801500 [ERR] consul: failed to reconcile member: {Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc 127.0.0.1 11624 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:bc3dc306-f13d-1351-ae8e-d7540a3a5ddc port:11626 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:11625] alive 1 5 2 2 5 4}: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:14.923174 [WARN] agent: Node name "Node 82367347-d378-fc29-f182-293269106b7f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:14.924604 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:14.924844 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:14.925516 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:14.926653 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_MetaTXT - 2019/11/27 02:18:15.231616 [WARN] consul: error getting server health from "Node bc3dc306-f13d-1351-ae8e-d7540a3a5ddc": context deadline exceeded
2019/11/27 02:18:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:82367347-d378-fc29-f182-293269106b7f Address:127.0.0.1:11632}]
2019/11/27 02:18:16 [INFO]  raft: Node at 127.0.0.1:11632 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:16.357188 [INFO] serf: EventMemberJoin: Node 82367347-d378-fc29-f182-293269106b7f.dc1 127.0.0.1
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:16.361299 [INFO] serf: EventMemberJoin: Node 82367347-d378-fc29-f182-293269106b7f 127.0.0.1
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:16.362593 [INFO] consul: Adding LAN server Node 82367347-d378-fc29-f182-293269106b7f (Addr: tcp/127.0.0.1:11632) (DC: dc1)
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:16.363452 [INFO] consul: Handled member-join event for server "Node 82367347-d378-fc29-f182-293269106b7f.dc1" in area "wan"
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:16.364510 [INFO] agent: Started DNS server 127.0.0.1:11627 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:16.364904 [INFO] agent: Started DNS server 127.0.0.1:11627 (udp)
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:16.367385 [INFO] agent: Started HTTP server on 127.0.0.1:11628 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:16.367482 [INFO] agent: started state syncer
2019/11/27 02:18:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:16 [INFO]  raft: Node at 127.0.0.1:11632 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:18 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:18 [INFO]  raft: Node at 127.0.0.1:11632 [Leader] entering Leader state
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:18.731094 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:18.731498 [INFO] consul: New leader elected: Node 82367347-d378-fc29-f182-293269106b7f
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.153509 [INFO] agent: Synced node info
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.455220 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 668.692µs) from client 127.0.0.1:47763 (udp)
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.455421 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.455529 [INFO] consul: shutting down server
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.455586 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.563735 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.686139 [INFO] manager: shutting down
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.730484 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.730749 [INFO] agent: consul server down
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.730804 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.730858 [INFO] agent: Stopping DNS server 127.0.0.1:11627 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.731010 [INFO] agent: Stopping DNS server 127.0.0.1:11627 (udp)
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.731170 [INFO] agent: Stopping HTTP server 127.0.0.1:11628 (tcp)
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.731393 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_SuppressTXT - 2019/11/27 02:18:19.731465 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_SuppressTXT (4.93s)
=== RUN   TestDNS_AddressLookup
=== PAUSE TestDNS_AddressLookup
=== RUN   TestDNS_AddressLookupIPV6
--- SKIP: TestDNS_AddressLookupIPV6 (0.00s)
    dns_test.go:5329: DM-skipped
=== RUN   TestDNS_NonExistingLookup
=== PAUSE TestDNS_NonExistingLookup
=== RUN   TestDNS_NonExistingLookupEmptyAorAAAA
=== PAUSE TestDNS_NonExistingLookupEmptyAorAAAA
=== RUN   TestDNS_PreparedQuery_AllowStale
=== PAUSE TestDNS_PreparedQuery_AllowStale
=== RUN   TestDNS_InvalidQueries
=== PAUSE TestDNS_InvalidQueries
=== RUN   TestDNS_PreparedQuery_AgentSource
=== PAUSE TestDNS_PreparedQuery_AgentSource
=== RUN   TestDNS_trimUDPResponse_NoTrim
=== PAUSE TestDNS_trimUDPResponse_NoTrim
=== RUN   TestDNS_trimUDPResponse_TrimLimit
=== PAUSE TestDNS_trimUDPResponse_TrimLimit
=== RUN   TestDNS_trimUDPResponse_TrimSize
=== PAUSE TestDNS_trimUDPResponse_TrimSize
=== RUN   TestDNS_trimUDPResponse_TrimSizeEDNS
=== PAUSE TestDNS_trimUDPResponse_TrimSizeEDNS
=== RUN   TestDNS_syncExtra
=== PAUSE TestDNS_syncExtra
=== RUN   TestDNS_Compression_trimUDPResponse
=== PAUSE TestDNS_Compression_trimUDPResponse
=== RUN   TestDNS_Compression_Query
=== PAUSE TestDNS_Compression_Query
=== RUN   TestDNS_Compression_ReverseLookup
=== PAUSE TestDNS_Compression_ReverseLookup
=== RUN   TestDNS_Compression_Recurse
=== PAUSE TestDNS_Compression_Recurse
=== RUN   TestDNSInvalidRegex
=== RUN   TestDNSInvalidRegex/Valid_Hostname
=== RUN   TestDNSInvalidRegex/Valid_Hostname#01
=== RUN   TestDNSInvalidRegex/Invalid_Hostname_with_special_chars
=== RUN   TestDNSInvalidRegex/Invalid_Hostname_with_special_chars_in_the_end
=== RUN   TestDNSInvalidRegex/Whitespace
=== RUN   TestDNSInvalidRegex/Only_special_chars
--- PASS: TestDNSInvalidRegex (0.00s)
    --- PASS: TestDNSInvalidRegex/Valid_Hostname (0.00s)
    --- PASS: TestDNSInvalidRegex/Valid_Hostname#01 (0.00s)
    --- PASS: TestDNSInvalidRegex/Invalid_Hostname_with_special_chars (0.00s)
    --- PASS: TestDNSInvalidRegex/Invalid_Hostname_with_special_chars_in_the_end (0.00s)
    --- PASS: TestDNSInvalidRegex/Whitespace (0.00s)
    --- PASS: TestDNSInvalidRegex/Only_special_chars (0.00s)
=== RUN   TestDNS_formatNodeRecord
--- PASS: TestDNS_formatNodeRecord (0.00s)
=== RUN   TestEventFire
=== PAUSE TestEventFire
=== RUN   TestEventFire_token
=== PAUSE TestEventFire_token
=== RUN   TestEventList
=== PAUSE TestEventList
=== RUN   TestEventList_Filter
=== PAUSE TestEventList_Filter
=== RUN   TestEventList_ACLFilter
=== PAUSE TestEventList_ACLFilter
=== RUN   TestEventList_Blocking
=== PAUSE TestEventList_Blocking
=== RUN   TestEventList_EventBufOrder
=== PAUSE TestEventList_EventBufOrder
=== RUN   TestUUIDToUint64
=== PAUSE TestUUIDToUint64
=== RUN   TestHealthChecksInState
--- SKIP: TestHealthChecksInState (0.00s)
    health_endpoint_test.go:22: DM-skipped
=== RUN   TestHealthChecksInState_NodeMetaFilter
=== PAUSE TestHealthChecksInState_NodeMetaFilter
=== RUN   TestHealthChecksInState_DistanceSort
=== PAUSE TestHealthChecksInState_DistanceSort
=== RUN   TestHealthNodeChecks
=== PAUSE TestHealthNodeChecks
=== RUN   TestHealthServiceChecks
=== PAUSE TestHealthServiceChecks
=== RUN   TestHealthServiceChecks_NodeMetaFilter
=== PAUSE TestHealthServiceChecks_NodeMetaFilter
=== RUN   TestHealthServiceChecks_DistanceSort
=== PAUSE TestHealthServiceChecks_DistanceSort
=== RUN   TestHealthServiceNodes
=== PAUSE TestHealthServiceNodes
=== RUN   TestHealthServiceNodes_NodeMetaFilter
=== PAUSE TestHealthServiceNodes_NodeMetaFilter
=== RUN   TestHealthServiceNodes_DistanceSort
=== PAUSE TestHealthServiceNodes_DistanceSort
=== RUN   TestHealthServiceNodes_PassingFilter
--- SKIP: TestHealthServiceNodes_PassingFilter (0.00s)
    health_endpoint_test.go:664: DM-skipped
=== RUN   TestHealthServiceNodes_WanTranslation
=== PAUSE TestHealthServiceNodes_WanTranslation
=== RUN   TestHealthConnectServiceNodes
=== PAUSE TestHealthConnectServiceNodes
=== RUN   TestHealthConnectServiceNodes_PassingFilter
=== PAUSE TestHealthConnectServiceNodes_PassingFilter
=== RUN   TestFilterNonPassing
=== PAUSE TestFilterNonPassing
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:19.902935 [WARN] agent: Node name "Node ee0bf48f-de34-6655-a3c7-581dae087e0c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:19.903348 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:19.907803 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:19.908267 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:19.908385 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ee0bf48f-de34-6655-a3c7-581dae087e0c Address:127.0.0.1:11638}]
2019/11/27 02:18:21 [INFO]  raft: Node at 127.0.0.1:11638 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.245321 [INFO] serf: EventMemberJoin: Node ee0bf48f-de34-6655-a3c7-581dae087e0c.dc1 127.0.0.1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.248842 [INFO] serf: EventMemberJoin: Node ee0bf48f-de34-6655-a3c7-581dae087e0c 127.0.0.1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.249810 [INFO] consul: Adding LAN server Node ee0bf48f-de34-6655-a3c7-581dae087e0c (Addr: tcp/127.0.0.1:11638) (DC: dc1)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.249981 [INFO] consul: Handled member-join event for server "Node ee0bf48f-de34-6655-a3c7-581dae087e0c.dc1" in area "wan"
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.250985 [INFO] agent: Started DNS server 127.0.0.1:11633 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.251070 [INFO] agent: Started DNS server 127.0.0.1:11633 (udp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.253282 [INFO] agent: Started HTTP server on 127.0.0.1:11634 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.253388 [INFO] agent: started state syncer
2019/11/27 02:18:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:21 [INFO]  raft: Node at 127.0.0.1:11638 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:21 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:21 [INFO]  raft: Node at 127.0.0.1:11638 [Leader] entering Leader state
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.741858 [INFO] consul: cluster leadership acquired
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.742279 [INFO] consul: New leader elected: Node ee0bf48f-de34-6655-a3c7-581dae087e0c
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:21.745811 [ERR] agent: failed to sync remote state: ACL not found
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:22.001152 [INFO] acl: initializing acls
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:22.442541 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:22.442643 [WARN] consul: Configuring a non-UUID master token is deprecated
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:22.445126 [INFO] acl: initializing acls
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:22.445269 [WARN] consul: Configuring a non-UUID master token is deprecated
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:22.709406 [INFO] consul: Bootstrapped ACL master token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:23.142041 [INFO] consul: Bootstrapped ACL master token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:23.142751 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:23.142833 [DEBUG] acl: transitioning out of legacy ACL mode
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:23.144104 [INFO] serf: EventMemberUpdate: Node ee0bf48f-de34-6655-a3c7-581dae087e0c
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:23.145579 [INFO] serf: EventMemberUpdate: Node ee0bf48f-de34-6655-a3c7-581dae087e0c.dc1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:23.420114 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:23.421000 [INFO] serf: EventMemberUpdate: Node ee0bf48f-de34-6655-a3c7-581dae087e0c
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:23.421761 [INFO] serf: EventMemberUpdate: Node ee0bf48f-de34-6655-a3c7-581dae087e0c.dc1
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:25.487045 [INFO] agent: Synced node info
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:25.487176 [DEBUG] agent: Node info in sync
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:27.419586 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:27.419996 [DEBUG] consul: Skipping self join check for "Node ee0bf48f-de34-6655-a3c7-581dae087e0c" since the cluster is too small
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:27.420168 [INFO] consul: member 'Node ee0bf48f-de34-6655-a3c7-581dae087e0c' joined, marking health alive
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.421779 [DEBUG] consul: Skipping self join check for "Node ee0bf48f-de34-6655-a3c7-581dae087e0c" since the cluster is too small
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.422240 [DEBUG] consul: Skipping self join check for "Node ee0bf48f-de34-6655-a3c7-581dae087e0c" since the cluster is too small
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.448159 [ERR] http: Request GET /v1/query/, error: failed prepared query lookup: index error: UUID must be 36 characters from=127.0.0.1:60848
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.452307 [DEBUG] http: Request GET /v1/query/ (4.560507ms) from=127.0.0.1:60848
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.470205 [ERR] http: Request PUT /v1/query/, error: Prepared Query lookup failed: failed prepared query lookup: index error: UUID must be 36 characters from=127.0.0.1:60850
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.471175 [DEBUG] http: Request PUT /v1/query/ (1.651396ms) from=127.0.0.1:60850
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.478542 [ERR] http: Request POST /v1/query/, error: method POST not allowed from=127.0.0.1:60852
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.479792 [DEBUG] http: Request POST /v1/query/ (1.30705ms) from=127.0.0.1:60852
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.483375 [ERR] http: Request DELETE /v1/query/, error: Prepared Query lookup failed: failed prepared query lookup: index error: UUID must be 36 characters from=127.0.0.1:60854
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.484542 [DEBUG] http: Request DELETE /v1/query/ (1.613395ms) from=127.0.0.1:60854
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.487807 [ERR] http: Request HEAD /v1/query/, error: method HEAD not allowed from=127.0.0.1:60856
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.488070 [DEBUG] http: Request HEAD /v1/query/ (283.344µs) from=127.0.0.1:60856
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.491092 [DEBUG] http: Request OPTIONS /v1/query/ (1.553392ms) from=127.0.0.1:60856
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.494714 [DEBUG] http: Request GET /v1/query/xxx/execute (763.362µs) from=127.0.0.1:60858
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.497503 [ERR] http: Request PUT /v1/query/xxx/execute, error: method PUT not allowed from=127.0.0.1:60860
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.498411 [DEBUG] http: Request PUT /v1/query/xxx/execute (893.034µs) from=127.0.0.1:60860
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.503793 [ERR] http: Request POST /v1/query/xxx/execute, error: method POST not allowed from=127.0.0.1:60862
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.504478 [DEBUG] http: Request POST /v1/query/xxx/execute (673.359µs) from=127.0.0.1:60862
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.507247 [ERR] http: Request DELETE /v1/query/xxx/execute, error: method DELETE not allowed from=127.0.0.1:60864
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.507925 [DEBUG] http: Request DELETE /v1/query/xxx/execute (686.026µs) from=127.0.0.1:60864
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.510971 [ERR] http: Request HEAD /v1/query/xxx/execute, error: method HEAD not allowed from=127.0.0.1:60866
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.511120 [DEBUG] http: Request HEAD /v1/query/xxx/execute (164.006µs) from=127.0.0.1:60866
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/execute
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.513549 [DEBUG] http: Request OPTIONS /v1/query/xxx/execute (1.081375ms) from=127.0.0.1:60866
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.517709 [DEBUG] http: Request GET /v1/query/xxx/explain (1.244381ms) from=127.0.0.1:60868
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.520941 [ERR] http: Request PUT /v1/query/xxx/explain, error: method PUT not allowed from=127.0.0.1:60870
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.521575 [DEBUG] http: Request PUT /v1/query/xxx/explain (650.025µs) from=127.0.0.1:60870
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.524467 [ERR] http: Request POST /v1/query/xxx/explain, error: method POST not allowed from=127.0.0.1:60872
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.525016 [DEBUG] http: Request POST /v1/query/xxx/explain (553.021µs) from=127.0.0.1:60872
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.527862 [ERR] http: Request DELETE /v1/query/xxx/explain, error: method DELETE not allowed from=127.0.0.1:60874
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.528535 [DEBUG] http: Request DELETE /v1/query/xxx/explain (680.693µs) from=127.0.0.1:60874
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.531865 [ERR] http: Request HEAD /v1/query/xxx/explain, error: method HEAD not allowed from=127.0.0.1:60876
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.532178 [DEBUG] http: Request HEAD /v1/query/xxx/explain (299.012µs) from=127.0.0.1:60876
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/explain
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.535060 [DEBUG] http: Request OPTIONS /v1/query/xxx/explain (1.241714ms) from=127.0.0.1:60876
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.539776 [DEBUG] http: Request GET /v1/query (1.526058ms) from=127.0.0.1:60878
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.543688 [ERR] http: Request PUT /v1/query, error: method PUT not allowed from=127.0.0.1:60880
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.544371 [DEBUG] http: Request PUT /v1/query (678.359µs) from=127.0.0.1:60880
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.548161 [DEBUG] http: Request POST /v1/query (694.693µs) from=127.0.0.1:60882
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.551415 [ERR] http: Request DELETE /v1/query, error: method DELETE not allowed from=127.0.0.1:60884
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.552102 [DEBUG] http: Request DELETE /v1/query (693.359µs) from=127.0.0.1:60884
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.555273 [ERR] http: Request HEAD /v1/query, error: method HEAD not allowed from=127.0.0.1:60886
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.555421 [DEBUG] http: Request HEAD /v1/query (157.672µs) from=127.0.0.1:60886
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.556911 [DEBUG] http: Request OPTIONS /v1/query (14.334µs) from=127.0.0.1:60886
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.558297 [ERR] http: Request GET /v1/agent/check/deregister/, error: method GET not allowed from=127.0.0.1:60886
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.558782 [DEBUG] http: Request GET /v1/agent/check/deregister/ (489.018µs) from=127.0.0.1:60886
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.564163 [ERR] http: Request PUT /v1/agent/check/deregister/, error: Unknown check "" from=127.0.0.1:60888
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.564697 [DEBUG] http: Request PUT /v1/agent/check/deregister/ (3.026782ms) from=127.0.0.1:60888
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.567222 [ERR] http: Request POST /v1/agent/check/deregister/, error: method POST not allowed from=127.0.0.1:60890
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.567885 [DEBUG] http: Request POST /v1/agent/check/deregister/ (669.692µs) from=127.0.0.1:60890
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.570645 [ERR] http: Request DELETE /v1/agent/check/deregister/, error: method DELETE not allowed from=127.0.0.1:60892
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.571302 [DEBUG] http: Request DELETE /v1/agent/check/deregister/ (663.359µs) from=127.0.0.1:60892
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.574741 [ERR] http: Request HEAD /v1/agent/check/deregister/, error: method HEAD not allowed from=127.0.0.1:60894
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.574872 [DEBUG] http: Request HEAD /v1/agent/check/deregister/ (153.006µs) from=127.0.0.1:60894
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.576151 [DEBUG] http: Request OPTIONS /v1/agent/check/deregister/ (14µs) from=127.0.0.1:60894
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.577574 [ERR] http: Request GET /v1/agent/check/pass/, error: method GET not allowed from=127.0.0.1:60894
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.578074 [DEBUG] http: Request GET /v1/agent/check/pass/ (505.685µs) from=127.0.0.1:60894
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.580761 [ERR] http: Request PUT /v1/agent/check/pass/, error: Unknown check "" from=127.0.0.1:60896
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.581260 [DEBUG] http: Request PUT /v1/agent/check/pass/ (636.69µs) from=127.0.0.1:60896
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.583932 [ERR] http: Request POST /v1/agent/check/pass/, error: method POST not allowed from=127.0.0.1:60898
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.584435 [DEBUG] http: Request POST /v1/agent/check/pass/ (503.686µs) from=127.0.0.1:60898
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.587061 [ERR] http: Request DELETE /v1/agent/check/pass/, error: method DELETE not allowed from=127.0.0.1:60900
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.587558 [DEBUG] http: Request DELETE /v1/agent/check/pass/ (504.353µs) from=127.0.0.1:60900
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.589989 [ERR] http: Request HEAD /v1/agent/check/pass/, error: method HEAD not allowed from=127.0.0.1:60902
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.590112 [DEBUG] http: Request HEAD /v1/agent/check/pass/ (140.006µs) from=127.0.0.1:60902
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/pass/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.591548 [DEBUG] http: Request OPTIONS /v1/agent/check/pass/ (15.334µs) from=127.0.0.1:60902
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.593081 [DEBUG] http: Request GET /v1/agent/connect/proxy/ (358.68µs) from=127.0.0.1:60902
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.595617 [ERR] http: Request PUT /v1/agent/connect/proxy/, error: method PUT not allowed from=127.0.0.1:60904
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.596197 [DEBUG] http: Request PUT /v1/agent/connect/proxy/ (583.022µs) from=127.0.0.1:60904
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.598917 [ERR] http: Request POST /v1/agent/connect/proxy/, error: method POST not allowed from=127.0.0.1:60906
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.599494 [DEBUG] http: Request POST /v1/agent/connect/proxy/ (573.022µs) from=127.0.0.1:60906
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.602727 [ERR] http: Request DELETE /v1/agent/connect/proxy/, error: method DELETE not allowed from=127.0.0.1:60908
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.603295 [DEBUG] http: Request DELETE /v1/agent/connect/proxy/ (566.688µs) from=127.0.0.1:60908
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.605835 [ERR] http: Request HEAD /v1/agent/connect/proxy/, error: method HEAD not allowed from=127.0.0.1:60910
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.605973 [DEBUG] http: Request HEAD /v1/agent/connect/proxy/ (155.673µs) from=127.0.0.1:60910
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/proxy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.607285 [DEBUG] http: Request OPTIONS /v1/agent/connect/proxy/ (14.001µs) from=127.0.0.1:60910
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.608415 [ERR] http: Request GET /v1/agent/service/register, error: method GET not allowed from=127.0.0.1:60910
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.608872 [DEBUG] http: Request GET /v1/agent/service/register (460.351µs) from=127.0.0.1:60910
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.611900 [DEBUG] http: Request PUT /v1/agent/service/register (507.686µs) from=127.0.0.1:60912
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.615100 [ERR] http: Request POST /v1/agent/service/register, error: method POST not allowed from=127.0.0.1:60914
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.615584 [DEBUG] http: Request POST /v1/agent/service/register (484.352µs) from=127.0.0.1:60914
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.618279 [ERR] http: Request DELETE /v1/agent/service/register, error: method DELETE not allowed from=127.0.0.1:60916
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.618899 [DEBUG] http: Request DELETE /v1/agent/service/register (623.023µs) from=127.0.0.1:60916
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.621758 [ERR] http: Request HEAD /v1/agent/service/register, error: method HEAD not allowed from=127.0.0.1:60918
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.622065 [DEBUG] http: Request HEAD /v1/agent/service/register (365.68µs) from=127.0.0.1:60918
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.623398 [DEBUG] http: Request OPTIONS /v1/agent/service/register (14.334µs) from=127.0.0.1:60918
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.625091 [ERR] http: Request GET /v1/operator/autopilot/configuration, error: Permission denied from=127.0.0.1:60918
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.625662 [DEBUG] http: Request GET /v1/operator/autopilot/configuration (952.369µs) from=127.0.0.1:60918
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.628879 [DEBUG] http: Request PUT /v1/operator/autopilot/configuration (369.681µs) from=127.0.0.1:60920
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.631912 [ERR] http: Request POST /v1/operator/autopilot/configuration, error: method POST not allowed from=127.0.0.1:60922
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.632430 [DEBUG] http: Request POST /v1/operator/autopilot/configuration (515.687µs) from=127.0.0.1:60922
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.635493 [ERR] http: Request DELETE /v1/operator/autopilot/configuration, error: method DELETE not allowed from=127.0.0.1:60924
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.636145 [DEBUG] http: Request DELETE /v1/operator/autopilot/configuration (651.691µs) from=127.0.0.1:60924
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.640289 [ERR] http: Request HEAD /v1/operator/autopilot/configuration, error: method HEAD not allowed from=127.0.0.1:60926
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.640424 [DEBUG] http: Request HEAD /v1/operator/autopilot/configuration (145.338µs) from=127.0.0.1:60926
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.641931 [DEBUG] http: Request OPTIONS /v1/operator/autopilot/configuration (15.668µs) from=127.0.0.1:60926
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.643840 [ERR] http: Request GET /v1/operator/autopilot/health, error: Permission denied from=127.0.0.1:60926
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.644317 [DEBUG] http: Request GET /v1/operator/autopilot/health (973.371µs) from=127.0.0.1:60926
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.646933 [ERR] http: Request PUT /v1/operator/autopilot/health, error: method PUT not allowed from=127.0.0.1:60928
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.647447 [DEBUG] http: Request PUT /v1/operator/autopilot/health (522.354µs) from=127.0.0.1:60928
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.650033 [ERR] http: Request POST /v1/operator/autopilot/health, error: method POST not allowed from=127.0.0.1:60930
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.650533 [DEBUG] http: Request POST /v1/operator/autopilot/health (499.018µs) from=127.0.0.1:60930
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.653128 [ERR] http: Request DELETE /v1/operator/autopilot/health, error: method DELETE not allowed from=127.0.0.1:60932
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.653783 [DEBUG] http: Request DELETE /v1/operator/autopilot/health (661.025µs) from=127.0.0.1:60932
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.656427 [ERR] http: Request HEAD /v1/operator/autopilot/health, error: method HEAD not allowed from=127.0.0.1:60934
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.656561 [DEBUG] http: Request HEAD /v1/operator/autopilot/health (154.006µs) from=127.0.0.1:60934
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/health
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.657914 [DEBUG] http: Request OPTIONS /v1/operator/autopilot/health (13.667µs) from=127.0.0.1:60934
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.659589 [DEBUG] http: Request GET /v1/session/node/ (363.348µs) from=127.0.0.1:60934
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.662852 [ERR] http: Request PUT /v1/session/node/, error: method PUT not allowed from=127.0.0.1:60936
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.663518 [DEBUG] http: Request PUT /v1/session/node/ (671.025µs) from=127.0.0.1:60936
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.666257 [ERR] http: Request POST /v1/session/node/, error: method POST not allowed from=127.0.0.1:60938
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.666977 [DEBUG] http: Request POST /v1/session/node/ (721.694µs) from=127.0.0.1:60938
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.669920 [ERR] http: Request DELETE /v1/session/node/, error: method DELETE not allowed from=127.0.0.1:60940
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.670571 [DEBUG] http: Request DELETE /v1/session/node/ (655.691µs) from=127.0.0.1:60940
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.674071 [ERR] http: Request HEAD /v1/session/node/, error: method HEAD not allowed from=127.0.0.1:60942
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.674215 [DEBUG] http: Request HEAD /v1/session/node/ (179.007µs) from=127.0.0.1:60942
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.677483 [DEBUG] http: Request OPTIONS /v1/session/node/ (16µs) from=127.0.0.1:60942
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.678945 [ERR] http: Request GET /v1/acl/token/, error: Bad request: Missing token ID from=127.0.0.1:60942
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.679466 [DEBUG] http: Request GET /v1/acl/token/ (528.354µs) from=127.0.0.1:60942
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.690000 [ERR] http: Request PUT /v1/acl/token/, error: Bad request: Token decoding failed: EOF from=127.0.0.1:60944
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.691642 [DEBUG] http: Request PUT /v1/acl/token/ (1.763733ms) from=127.0.0.1:60944
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.696325 [ERR] http: Request POST /v1/acl/token/, error: method POST not allowed from=127.0.0.1:60946
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.703163 [DEBUG] http: Request POST /v1/acl/token/ (6.791925ms) from=127.0.0.1:60946
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.706501 [ERR] http: Request DELETE /v1/acl/token/, error: Bad request: Missing token ID from=127.0.0.1:60948
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.707339 [DEBUG] http: Request DELETE /v1/acl/token/ (838.031µs) from=127.0.0.1:60948
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.733753 [ERR] http: Request HEAD /v1/acl/token/, error: method HEAD not allowed from=127.0.0.1:60950
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.733917 [DEBUG] http: Request HEAD /v1/acl/token/ (182.34µs) from=127.0.0.1:60950
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.735744 [DEBUG] http: Request OPTIONS /v1/acl/token/ (14.334µs) from=127.0.0.1:60950
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.740677 [DEBUG] http: Request GET /v1/health/state/ (1.518724ms) from=127.0.0.1:60950
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.744868 [ERR] http: Request PUT /v1/health/state/, error: method PUT not allowed from=127.0.0.1:60952
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.745728 [DEBUG] http: Request PUT /v1/health/state/ (824.698µs) from=127.0.0.1:60952
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.748419 [ERR] http: Request POST /v1/health/state/, error: method POST not allowed from=127.0.0.1:60954
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.748923 [DEBUG] http: Request POST /v1/health/state/ (516.02µs) from=127.0.0.1:60954
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.752429 [ERR] http: Request DELETE /v1/health/state/, error: method DELETE not allowed from=127.0.0.1:60956
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.752891 [DEBUG] http: Request DELETE /v1/health/state/ (470.684µs) from=127.0.0.1:60956
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.755905 [ERR] http: Request HEAD /v1/health/state/, error: method HEAD not allowed from=127.0.0.1:60958
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.756043 [DEBUG] http: Request HEAD /v1/health/state/ (155.339µs) from=127.0.0.1:60958
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/state/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.757473 [DEBUG] http: Request OPTIONS /v1/health/state/ (17.001µs) from=127.0.0.1:60958
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.759775 [DEBUG] http: Request GET /v1/kv/ (557.021µs) from=127.0.0.1:60958
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.764442 [DEBUG] http: Request PUT /v1/kv/ (756.029µs) from=127.0.0.1:60960
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.767855 [ERR] http: Request POST /v1/kv/, error: method POST not allowed from=127.0.0.1:60962
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.768331 [DEBUG] http: Request POST /v1/kv/ (487.352µs) from=127.0.0.1:60962
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.771456 [DEBUG] http: Request DELETE /v1/kv/ (551.021µs) from=127.0.0.1:60964
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.774338 [ERR] http: Request HEAD /v1/kv/, error: method HEAD not allowed from=127.0.0.1:60966
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.774487 [DEBUG] http: Request HEAD /v1/kv/ (163.673µs) from=127.0.0.1:60966
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/kv/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.776215 [DEBUG] http: Request OPTIONS /v1/kv/ (20.334µs) from=127.0.0.1:60966
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.778529 [ERR] http: Request GET /v1/operator/keyring, error: Reading keyring denied by ACLs from=127.0.0.1:60966
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.779426 [DEBUG] http: Request GET /v1/operator/keyring (1.392053ms) from=127.0.0.1:60966
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.782798 [DEBUG] http: Request PUT /v1/operator/keyring (614.023µs) from=127.0.0.1:60968
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.786134 [DEBUG] http: Request POST /v1/operator/keyring (582.355µs) from=127.0.0.1:60970
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.790736 [DEBUG] http: Request DELETE /v1/operator/keyring (611.023µs) from=127.0.0.1:60972
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.793787 [ERR] http: Request HEAD /v1/operator/keyring, error: method HEAD not allowed from=127.0.0.1:60974
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.793919 [DEBUG] http: Request HEAD /v1/operator/keyring (152.006µs) from=127.0.0.1:60974
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/keyring
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.795751 [DEBUG] http: Request OPTIONS /v1/operator/keyring (15µs) from=127.0.0.1:60974
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.797149 [ERR] http: Request GET /v1/agent/check/register, error: method GET not allowed from=127.0.0.1:60974
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.797969 [DEBUG] http: Request GET /v1/agent/check/register (794.031µs) from=127.0.0.1:60974
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.801574 [DEBUG] http: Request PUT /v1/agent/check/register (406.682µs) from=127.0.0.1:60976
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.805429 [ERR] http: Request POST /v1/agent/check/register, error: method POST not allowed from=127.0.0.1:60978
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.805949 [DEBUG] http: Request POST /v1/agent/check/register (545.687µs) from=127.0.0.1:60978
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.808631 [ERR] http: Request DELETE /v1/agent/check/register, error: method DELETE not allowed from=127.0.0.1:60980
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.809228 [DEBUG] http: Request DELETE /v1/agent/check/register (597.356µs) from=127.0.0.1:60980
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.812497 [ERR] http: Request HEAD /v1/agent/check/register, error: method HEAD not allowed from=127.0.0.1:60982
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.812673 [DEBUG] http: Request HEAD /v1/agent/check/register (205.675µs) from=127.0.0.1:60982
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.814623 [DEBUG] http: Request OPTIONS /v1/agent/check/register (30.001µs) from=127.0.0.1:60982
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.816873 [ERR] http: Request GET /v1/coordinate/node/, error: Permission denied from=127.0.0.1:60982
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.817589 [DEBUG] http: Request GET /v1/coordinate/node/ (1.374719ms) from=127.0.0.1:60982
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.820878 [ERR] http: Request PUT /v1/coordinate/node/, error: method PUT not allowed from=127.0.0.1:60984
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.821361 [DEBUG] http: Request PUT /v1/coordinate/node/ (486.018µs) from=127.0.0.1:60984
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.824186 [ERR] http: Request POST /v1/coordinate/node/, error: method POST not allowed from=127.0.0.1:60986
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.825100 [DEBUG] http: Request POST /v1/coordinate/node/ (902.368µs) from=127.0.0.1:60986
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.828481 [ERR] http: Request DELETE /v1/coordinate/node/, error: method DELETE not allowed from=127.0.0.1:60988
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.829224 [DEBUG] http: Request DELETE /v1/coordinate/node/ (745.028µs) from=127.0.0.1:60988
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.832152 [ERR] http: Request HEAD /v1/coordinate/node/, error: method HEAD not allowed from=127.0.0.1:60990
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.832302 [DEBUG] http: Request HEAD /v1/coordinate/node/ (164.339µs) from=127.0.0.1:60990
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.834144 [DEBUG] http: Request OPTIONS /v1/coordinate/node/ (19.334µs) from=127.0.0.1:60990
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.836270 [ERR] http: Request GET /v1/agent/health/service/id/, error: Bad request: Missing serviceID from=127.0.0.1:60990
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.836848 [DEBUG] http: Request GET /v1/agent/health/service/id/ (586.022µs) from=127.0.0.1:60990
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.839702 [ERR] http: Request PUT /v1/agent/health/service/id/, error: method PUT not allowed from=127.0.0.1:60992
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.840285 [DEBUG] http: Request PUT /v1/agent/health/service/id/ (598.356µs) from=127.0.0.1:60992
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.848816 [ERR] http: Request POST /v1/agent/health/service/id/, error: method POST not allowed from=127.0.0.1:60994
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.849612 [DEBUG] http: Request POST /v1/agent/health/service/id/ (803.031µs) from=127.0.0.1:60994
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.852463 [ERR] http: Request DELETE /v1/agent/health/service/id/, error: method DELETE not allowed from=127.0.0.1:60996
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.853147 [DEBUG] http: Request DELETE /v1/agent/health/service/id/ (674.359µs) from=127.0.0.1:60996
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.856231 [ERR] http: Request HEAD /v1/agent/health/service/id/, error: method HEAD not allowed from=127.0.0.1:60998
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.856384 [DEBUG] http: Request HEAD /v1/agent/health/service/id/ (169.673µs) from=127.0.0.1:60998
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/id/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.858048 [DEBUG] http: Request OPTIONS /v1/agent/health/service/id/ (17.334µs) from=127.0.0.1:60998
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.859370 [ERR] http: Request GET /v1/acl/token, error: method GET not allowed from=127.0.0.1:60998
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.859838 [DEBUG] http: Request GET /v1/acl/token (453.35µs) from=127.0.0.1:60998
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.862847 [ERR] http: Request PUT /v1/acl/token, error: Bad request: Token decoding failed: EOF from=127.0.0.1:32768
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.863678 [DEBUG] http: Request PUT /v1/acl/token (904.701µs) from=127.0.0.1:32768
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.866644 [ERR] http: Request POST /v1/acl/token, error: method POST not allowed from=127.0.0.1:32770
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.867601 [DEBUG] http: Request POST /v1/acl/token (947.369µs) from=127.0.0.1:32770
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.870855 [ERR] http: Request DELETE /v1/acl/token, error: method DELETE not allowed from=127.0.0.1:32772
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.871583 [DEBUG] http: Request DELETE /v1/acl/token (708.027µs) from=127.0.0.1:32772
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.875259 [ERR] http: Request HEAD /v1/acl/token, error: method HEAD not allowed from=127.0.0.1:32774
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.875494 [DEBUG] http: Request HEAD /v1/acl/token (250.676µs) from=127.0.0.1:32774
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.877117 [DEBUG] http: Request OPTIONS /v1/acl/token (13.667µs) from=127.0.0.1:32774
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.879014 [ERR] http: Request GET /v1/agent/self, error: Permission denied from=127.0.0.1:32774
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.879937 [DEBUG] http: Request GET /v1/agent/self (1.065374ms) from=127.0.0.1:32774
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.883291 [ERR] http: Request PUT /v1/agent/self, error: method PUT not allowed from=127.0.0.1:32776
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.883776 [DEBUG] http: Request PUT /v1/agent/self (496.019µs) from=127.0.0.1:32776
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.886966 [ERR] http: Request POST /v1/agent/self, error: method POST not allowed from=127.0.0.1:32778
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.887563 [DEBUG] http: Request POST /v1/agent/self (593.689µs) from=127.0.0.1:32778
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.891902 [ERR] http: Request DELETE /v1/agent/self, error: method DELETE not allowed from=127.0.0.1:32780
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.892784 [DEBUG] http: Request DELETE /v1/agent/self (897.368µs) from=127.0.0.1:32780
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.895950 [ERR] http: Request HEAD /v1/agent/self, error: method HEAD not allowed from=127.0.0.1:32782
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.896122 [DEBUG] http: Request HEAD /v1/agent/self (170.006µs) from=127.0.0.1:32782
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.897844 [DEBUG] http: Request OPTIONS /v1/agent/self (16.667µs) from=127.0.0.1:32782
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.899819 [DEBUG] consul: dropping service "consul" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.900980 [DEBUG] http: Request GET /v1/catalog/services (1.608061ms) from=127.0.0.1:32782
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.906252 [ERR] http: Request PUT /v1/catalog/services, error: method PUT not allowed from=127.0.0.1:32784
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.907118 [DEBUG] http: Request PUT /v1/catalog/services (846.699µs) from=127.0.0.1:32784
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.910434 [ERR] http: Request POST /v1/catalog/services, error: method POST not allowed from=127.0.0.1:32786
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.911141 [DEBUG] http: Request POST /v1/catalog/services (702.026µs) from=127.0.0.1:32786
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.914533 [ERR] http: Request DELETE /v1/catalog/services, error: method DELETE not allowed from=127.0.0.1:32788
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.915148 [DEBUG] http: Request DELETE /v1/catalog/services (620.023µs) from=127.0.0.1:32788
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.917773 [ERR] http: Request HEAD /v1/catalog/services, error: method HEAD not allowed from=127.0.0.1:32790
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.918015 [DEBUG] http: Request HEAD /v1/catalog/services (256.677µs) from=127.0.0.1:32790
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.919378 [DEBUG] http: Request OPTIONS /v1/catalog/services (14µs) from=127.0.0.1:32790
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.922722 [DEBUG] http: Request GET /v1/connect/ca/roots (1.938074ms) from=127.0.0.1:32790
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.926438 [ERR] http: Request PUT /v1/connect/ca/roots, error: method PUT not allowed from=127.0.0.1:32792
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.927087 [DEBUG] http: Request PUT /v1/connect/ca/roots (651.358µs) from=127.0.0.1:32792
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.930110 [ERR] http: Request POST /v1/connect/ca/roots, error: method POST not allowed from=127.0.0.1:32794
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.931038 [DEBUG] http: Request POST /v1/connect/ca/roots (912.701µs) from=127.0.0.1:32794
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.934114 [ERR] http: Request DELETE /v1/connect/ca/roots, error: method DELETE not allowed from=127.0.0.1:32796
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.934705 [DEBUG] http: Request DELETE /v1/connect/ca/roots (597.69µs) from=127.0.0.1:32796
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.938572 [ERR] http: Request HEAD /v1/connect/ca/roots, error: method HEAD not allowed from=127.0.0.1:32798
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.938713 [DEBUG] http: Request HEAD /v1/connect/ca/roots (155.673µs) from=127.0.0.1:32798
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.940019 [DEBUG] http: Request OPTIONS /v1/connect/ca/roots (15.667µs) from=127.0.0.1:32798
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.941328 [ERR] http: Request GET /v1/connect/intentions/match, error: required query parameter 'by' not set from=127.0.0.1:32798
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.941935 [DEBUG] http: Request GET /v1/connect/intentions/match (605.69µs) from=127.0.0.1:32798
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.946203 [ERR] http: Request PUT /v1/connect/intentions/match, error: method PUT not allowed from=127.0.0.1:32800
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.946791 [DEBUG] http: Request PUT /v1/connect/intentions/match (499.685µs) from=127.0.0.1:32800
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.949460 [ERR] http: Request POST /v1/connect/intentions/match, error: method POST not allowed from=127.0.0.1:32802
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.950775 [DEBUG] http: Request POST /v1/connect/intentions/match (1.311384ms) from=127.0.0.1:32802
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.954085 [ERR] http: Request DELETE /v1/connect/intentions/match, error: method DELETE not allowed from=127.0.0.1:32804
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.955182 [DEBUG] http: Request DELETE /v1/connect/intentions/match (1.060374ms) from=127.0.0.1:32804
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.960100 [ERR] http: Request HEAD /v1/connect/intentions/match, error: method HEAD not allowed from=127.0.0.1:32806
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.960259 [DEBUG] http: Request HEAD /v1/connect/intentions/match (162.006µs) from=127.0.0.1:32806
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/match
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.961993 [DEBUG] http: Request OPTIONS /v1/connect/intentions/match (20µs) from=127.0.0.1:32806
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.963695 [ERR] http: Request GET /v1/coordinate/update, error: method GET not allowed from=127.0.0.1:32806
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.964214 [DEBUG] http: Request GET /v1/coordinate/update (522.687µs) from=127.0.0.1:32806
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.967820 [DEBUG] http: Request PUT /v1/coordinate/update (770.696µs) from=127.0.0.1:32808
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.970978 [ERR] http: Request POST /v1/coordinate/update, error: method POST not allowed from=127.0.0.1:32810
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.971866 [DEBUG] http: Request POST /v1/coordinate/update (735.361µs) from=127.0.0.1:32810
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.975035 [ERR] http: Request DELETE /v1/coordinate/update, error: method DELETE not allowed from=127.0.0.1:32812
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.975721 [DEBUG] http: Request DELETE /v1/coordinate/update (681.025µs) from=127.0.0.1:32812
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.979037 [ERR] http: Request HEAD /v1/coordinate/update, error: method HEAD not allowed from=127.0.0.1:32814
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.979187 [DEBUG] http: Request HEAD /v1/coordinate/update (167.34µs) from=127.0.0.1:32814
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.980556 [DEBUG] http: Request OPTIONS /v1/coordinate/update (15.001µs) from=127.0.0.1:32814
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.982153 [ERR] http: Request GET /v1/event/fire/, error: method GET not allowed from=127.0.0.1:32814
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.982668 [DEBUG] http: Request GET /v1/event/fire/ (522.02µs) from=127.0.0.1:32814
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.987888 [DEBUG] http: Request PUT /v1/event/fire/ (583.356µs) from=127.0.0.1:32816
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.992209 [ERR] http: Request POST /v1/event/fire/, error: method POST not allowed from=127.0.0.1:32818
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.992876 [DEBUG] http: Request POST /v1/event/fire/ (647.024µs) from=127.0.0.1:32818
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.996766 [ERR] http: Request DELETE /v1/event/fire/, error: method DELETE not allowed from=127.0.0.1:32820
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:28.997530 [DEBUG] http: Request DELETE /v1/event/fire/ (825.365µs) from=127.0.0.1:32820
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.002749 [ERR] http: Request HEAD /v1/event/fire/, error: method HEAD not allowed from=127.0.0.1:32822
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.003035 [DEBUG] http: Request HEAD /v1/event/fire/ (302.011µs) from=127.0.0.1:32822
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/fire/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.005251 [DEBUG] http: Request OPTIONS /v1/event/fire/ (22.667µs) from=127.0.0.1:32822
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.007351 [ERR] http: Request GET /v1/acl/update, error: method GET not allowed from=127.0.0.1:32822
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.008217 [DEBUG] http: Request GET /v1/acl/update (860.699µs) from=127.0.0.1:32822
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.013740 [DEBUG] http: Request PUT /v1/acl/update (462.684µs) from=127.0.0.1:32824
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.016660 [ERR] http: Request POST /v1/acl/update, error: method POST not allowed from=127.0.0.1:32826
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.017926 [DEBUG] http: Request POST /v1/acl/update (1.245381ms) from=127.0.0.1:32826
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.021581 [ERR] http: Request DELETE /v1/acl/update, error: method DELETE not allowed from=127.0.0.1:32828
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.022476 [DEBUG] http: Request DELETE /v1/acl/update (890.034µs) from=127.0.0.1:32828
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.026877 [ERR] http: Request HEAD /v1/acl/update, error: method HEAD not allowed from=127.0.0.1:32830
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.027056 [DEBUG] http: Request HEAD /v1/acl/update (188.674µs) from=127.0.0.1:32830
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/update
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.029089 [DEBUG] http: Request OPTIONS /v1/acl/update (19.001µs) from=127.0.0.1:32830
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.032041 [DEBUG] http: Request GET /v1/internal/ui/node/ (558.688µs) from=127.0.0.1:32830
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.035353 [ERR] http: Request PUT /v1/internal/ui/node/, error: method PUT not allowed from=127.0.0.1:32832
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.036117 [DEBUG] http: Request PUT /v1/internal/ui/node/ (757.029µs) from=127.0.0.1:32832
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.040238 [ERR] http: Request POST /v1/internal/ui/node/, error: method POST not allowed from=127.0.0.1:32834
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.040860 [DEBUG] http: Request POST /v1/internal/ui/node/ (622.024µs) from=127.0.0.1:32834
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.044349 [ERR] http: Request DELETE /v1/internal/ui/node/, error: method DELETE not allowed from=127.0.0.1:32836
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.045292 [DEBUG] http: Request DELETE /v1/internal/ui/node/ (932.702µs) from=127.0.0.1:32836
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.049353 [ERR] http: Request HEAD /v1/internal/ui/node/, error: method HEAD not allowed from=127.0.0.1:32838
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.049504 [DEBUG] http: Request HEAD /v1/internal/ui/node/ (166.673µs) from=127.0.0.1:32838
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.050963 [DEBUG] http: Request OPTIONS /v1/internal/ui/node/ (13µs) from=127.0.0.1:32838
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.052965 [DEBUG] http: Request GET /v1/status/leader (448.017µs) from=127.0.0.1:32838
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.055454 [ERR] http: Request PUT /v1/status/leader, error: method PUT not allowed from=127.0.0.1:32840
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.056308 [DEBUG] http: Request PUT /v1/status/leader (836.699µs) from=127.0.0.1:32840
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.060659 [ERR] http: Request POST /v1/status/leader, error: method POST not allowed from=127.0.0.1:32842
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.061194 [DEBUG] http: Request POST /v1/status/leader (536.687µs) from=127.0.0.1:32842
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.064742 [ERR] http: Request DELETE /v1/status/leader, error: method DELETE not allowed from=127.0.0.1:32844
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.065735 [DEBUG] http: Request DELETE /v1/status/leader (964.37µs) from=127.0.0.1:32844
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.070469 [ERR] http: Request HEAD /v1/status/leader, error: method HEAD not allowed from=127.0.0.1:32846
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.070662 [DEBUG] http: Request HEAD /v1/status/leader (206.341µs) from=127.0.0.1:32846
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/status/leader
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.072145 [DEBUG] http: Request OPTIONS /v1/status/leader (15.001µs) from=127.0.0.1:32846
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.074773 [DEBUG] http: Request GET /v1/event/list (1.119042ms) from=127.0.0.1:32846
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.078125 [ERR] http: Request PUT /v1/event/list, error: method PUT not allowed from=127.0.0.1:32848
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.078667 [DEBUG] http: Request PUT /v1/event/list (548.354µs) from=127.0.0.1:32848
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.081538 [ERR] http: Request POST /v1/event/list, error: method POST not allowed from=127.0.0.1:32850
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.082256 [DEBUG] http: Request POST /v1/event/list (723.361µs) from=127.0.0.1:32850
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.085551 [ERR] http: Request DELETE /v1/event/list, error: method DELETE not allowed from=127.0.0.1:32852
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.086064 [DEBUG] http: Request DELETE /v1/event/list (520.353µs) from=127.0.0.1:32852
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.089107 [ERR] http: Request HEAD /v1/event/list, error: method HEAD not allowed from=127.0.0.1:32854
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.089239 [DEBUG] http: Request HEAD /v1/event/list (151.673µs) from=127.0.0.1:32854
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.090506 [DEBUG] http: Request OPTIONS /v1/event/list (15.001µs) from=127.0.0.1:32854
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.092254 [ERR] http: Request GET /v1/acl/list, error: Permission denied from=127.0.0.1:32854
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.092719 [DEBUG] http: Request GET /v1/acl/list (904.701µs) from=127.0.0.1:32854
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.095458 [ERR] http: Request PUT /v1/acl/list, error: method PUT not allowed from=127.0.0.1:32856
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.095944 [DEBUG] http: Request PUT /v1/acl/list (494.352µs) from=127.0.0.1:32856
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.098659 [ERR] http: Request POST /v1/acl/list, error: method POST not allowed from=127.0.0.1:32858
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.099134 [DEBUG] http: Request POST /v1/acl/list (481.018µs) from=127.0.0.1:32858
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.101670 [ERR] http: Request DELETE /v1/acl/list, error: method DELETE not allowed from=127.0.0.1:32860
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.102410 [DEBUG] http: Request DELETE /v1/acl/list (752.029µs) from=127.0.0.1:32860
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.106116 [ERR] http: Request HEAD /v1/acl/list, error: method HEAD not allowed from=127.0.0.1:32862
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.106249 [DEBUG] http: Request HEAD /v1/acl/list (158.672µs) from=127.0.0.1:32862
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.107695 [DEBUG] http: Request OPTIONS /v1/acl/list (14.001µs) from=127.0.0.1:32862
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.109885 [DEBUG] http: Request GET /v1/acl/replication (954.37µs) from=127.0.0.1:32862
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.117571 [ERR] http: Request PUT /v1/acl/replication, error: method PUT not allowed from=127.0.0.1:32864
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.118419 [DEBUG] http: Request PUT /v1/acl/replication (838.699µs) from=127.0.0.1:32864
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.121620 [ERR] http: Request POST /v1/acl/replication, error: method POST not allowed from=127.0.0.1:32866
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.122350 [DEBUG] http: Request POST /v1/acl/replication (718.027µs) from=127.0.0.1:32866
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.125221 [ERR] http: Request DELETE /v1/acl/replication, error: method DELETE not allowed from=127.0.0.1:32868
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.125756 [DEBUG] http: Request DELETE /v1/acl/replication (530.353µs) from=127.0.0.1:32868
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.130932 [ERR] http: Request HEAD /v1/acl/replication, error: method HEAD not allowed from=127.0.0.1:32870
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.131076 [DEBUG] http: Request HEAD /v1/acl/replication (160.006µs) from=127.0.0.1:32870
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/replication
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.133548 [DEBUG] http: Request OPTIONS /v1/acl/replication (17µs) from=127.0.0.1:32870
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.134927 [ERR] http: Request GET /v1/agent/token/, error: method GET not allowed from=127.0.0.1:32870
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.135408 [DEBUG] http: Request GET /v1/agent/token/ (484.018µs) from=127.0.0.1:32870
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.138426 [ERR] http: Request PUT /v1/agent/token/, error: Permission denied from=127.0.0.1:32872
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.139068 [DEBUG] http: Request PUT /v1/agent/token/ (748.695µs) from=127.0.0.1:32872
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.142710 [ERR] http: Request POST /v1/agent/token/, error: method POST not allowed from=127.0.0.1:32874
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.143321 [DEBUG] http: Request POST /v1/agent/token/ (611.023µs) from=127.0.0.1:32874
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.146459 [ERR] http: Request DELETE /v1/agent/token/, error: method DELETE not allowed from=127.0.0.1:32876
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.147176 [DEBUG] http: Request DELETE /v1/agent/token/ (718.361µs) from=127.0.0.1:32876
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.149796 [ERR] http: Request HEAD /v1/agent/token/, error: method HEAD not allowed from=127.0.0.1:32878
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.150065 [DEBUG] http: Request HEAD /v1/agent/token/ (282.678µs) from=127.0.0.1:32878
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/token/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.151555 [DEBUG] http: Request OPTIONS /v1/agent/token/ (14µs) from=127.0.0.1:32878
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.153157 [ERR] http: Request GET /v1/agent/maintenance, error: method GET not allowed from=127.0.0.1:32878
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.153773 [DEBUG] http: Request GET /v1/agent/maintenance (617.69µs) from=127.0.0.1:32878
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.157400 [DEBUG] http: Request PUT /v1/agent/maintenance (693.026µs) from=127.0.0.1:32880
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.160086 [ERR] http: Request POST /v1/agent/maintenance, error: method POST not allowed from=127.0.0.1:32882
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.160833 [DEBUG] http: Request POST /v1/agent/maintenance (737.028µs) from=127.0.0.1:32882
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.163588 [ERR] http: Request DELETE /v1/agent/maintenance, error: method DELETE not allowed from=127.0.0.1:32884
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.164201 [DEBUG] http: Request DELETE /v1/agent/maintenance (599.689µs) from=127.0.0.1:32884
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.167035 [ERR] http: Request HEAD /v1/agent/maintenance, error: method HEAD not allowed from=127.0.0.1:32886
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.167180 [DEBUG] http: Request HEAD /v1/agent/maintenance (157.006µs) from=127.0.0.1:32886
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/maintenance
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.168640 [DEBUG] http: Request OPTIONS /v1/agent/maintenance (14.001µs) from=127.0.0.1:32886
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.169963 [ERR] http: Request GET /v1/agent/join/, error: method GET not allowed from=127.0.0.1:32886
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.170424 [DEBUG] http: Request GET /v1/agent/join/ (460.684µs) from=127.0.0.1:32886
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.173628 [ERR] http: Request PUT /v1/agent/join/, error: Permission denied from=127.0.0.1:32888
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.174456 [DEBUG] http: Request PUT /v1/agent/join/ (964.703µs) from=127.0.0.1:32888
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.178994 [ERR] http: Request POST /v1/agent/join/, error: method POST not allowed from=127.0.0.1:32890
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.179597 [DEBUG] http: Request POST /v1/agent/join/ (608.69µs) from=127.0.0.1:32890
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.184878 [ERR] http: Request DELETE /v1/agent/join/, error: method DELETE not allowed from=127.0.0.1:32892
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.185632 [DEBUG] http: Request DELETE /v1/agent/join/ (745.695µs) from=127.0.0.1:32892
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.189209 [ERR] http: Request HEAD /v1/agent/join/, error: method HEAD not allowed from=127.0.0.1:32894
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.189535 [DEBUG] http: Request HEAD /v1/agent/join/ (335.013µs) from=127.0.0.1:32894
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/join/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.193726 [DEBUG] http: Request OPTIONS /v1/agent/join/ (18.668µs) from=127.0.0.1:32894
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.195555 [ERR] http: Request GET /v1/agent/health/service/name/, error: Bad request: Missing service Name from=127.0.0.1:32894
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.196068 [DEBUG] http: Request GET /v1/agent/health/service/name/ (520.02µs) from=127.0.0.1:32894
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.199943 [ERR] http: Request PUT /v1/agent/health/service/name/, error: method PUT not allowed from=127.0.0.1:32896
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.200442 [DEBUG] http: Request PUT /v1/agent/health/service/name/ (500.019µs) from=127.0.0.1:32896
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.203232 [ERR] http: Request POST /v1/agent/health/service/name/, error: method POST not allowed from=127.0.0.1:32898
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.203946 [DEBUG] http: Request POST /v1/agent/health/service/name/ (706.027µs) from=127.0.0.1:32898
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.207449 [ERR] http: Request DELETE /v1/agent/health/service/name/, error: method DELETE not allowed from=127.0.0.1:32900
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.208050 [DEBUG] http: Request DELETE /v1/agent/health/service/name/ (601.356µs) from=127.0.0.1:32900
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.211333 [ERR] http: Request HEAD /v1/agent/health/service/name/, error: method HEAD not allowed from=127.0.0.1:32902
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.211484 [DEBUG] http: Request HEAD /v1/agent/health/service/name/ (180.007µs) from=127.0.0.1:32902
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/name/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.213431 [DEBUG] http: Request OPTIONS /v1/agent/health/service/name/ (15µs) from=127.0.0.1:32902
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.215402 [DEBUG] http: Request GET /v1/acl/info/ (383.015µs) from=127.0.0.1:32902
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.219129 [ERR] http: Request PUT /v1/acl/info/, error: method PUT not allowed from=127.0.0.1:32904
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.219870 [DEBUG] http: Request PUT /v1/acl/info/ (743.029µs) from=127.0.0.1:32904
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.223598 [ERR] http: Request POST /v1/acl/info/, error: method POST not allowed from=127.0.0.1:32906
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.224469 [DEBUG] http: Request POST /v1/acl/info/ (853.699µs) from=127.0.0.1:32906
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.227760 [ERR] http: Request DELETE /v1/acl/info/, error: method DELETE not allowed from=127.0.0.1:32908
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.228620 [DEBUG] http: Request DELETE /v1/acl/info/ (847.698µs) from=127.0.0.1:32908
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.232767 [ERR] http: Request HEAD /v1/acl/info/, error: method HEAD not allowed from=127.0.0.1:32910
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.233155 [DEBUG] http: Request HEAD /v1/acl/info/ (400.349µs) from=127.0.0.1:32910
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.235162 [DEBUG] http: Request OPTIONS /v1/acl/info/ (16µs) from=127.0.0.1:32910
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.237107 [ERR] http: Request GET /v1/acl/policy, error: method GET not allowed from=127.0.0.1:32910
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.238050 [DEBUG] http: Request GET /v1/acl/policy (947.036µs) from=127.0.0.1:32910
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.242486 [ERR] http: Request PUT /v1/acl/policy, error: Bad request: Policy decoding failed: EOF from=127.0.0.1:32912
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.243147 [DEBUG] http: Request PUT /v1/acl/policy (703.027µs) from=127.0.0.1:32912
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.246869 [ERR] http: Request POST /v1/acl/policy, error: method POST not allowed from=127.0.0.1:32914
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.247662 [DEBUG] http: Request POST /v1/acl/policy (805.697µs) from=127.0.0.1:32914
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.251208 [ERR] http: Request DELETE /v1/acl/policy, error: method DELETE not allowed from=127.0.0.1:32916
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.251791 [DEBUG] http: Request DELETE /v1/acl/policy (496.352µs) from=127.0.0.1:32916
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.254359 [ERR] http: Request HEAD /v1/acl/policy, error: method HEAD not allowed from=127.0.0.1:32918
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.254489 [DEBUG] http: Request HEAD /v1/acl/policy (145.338µs) from=127.0.0.1:32918
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.255704 [DEBUG] http: Request OPTIONS /v1/acl/policy (15.001µs) from=127.0.0.1:32918
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.256874 [ERR] http: Request GET /v1/agent/check/update/, error: method GET not allowed from=127.0.0.1:32918
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.257371 [DEBUG] http: Request GET /v1/agent/check/update/ (482.685µs) from=127.0.0.1:32918
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.260382 [DEBUG] http: Request PUT /v1/agent/check/update/ (354.347µs) from=127.0.0.1:32920
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.262914 [ERR] http: Request POST /v1/agent/check/update/, error: method POST not allowed from=127.0.0.1:32922
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.263419 [DEBUG] http: Request POST /v1/agent/check/update/ (503.685µs) from=127.0.0.1:32922
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.266649 [ERR] http: Request DELETE /v1/agent/check/update/, error: method DELETE not allowed from=127.0.0.1:32924
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.267259 [DEBUG] http: Request DELETE /v1/agent/check/update/ (607.023µs) from=127.0.0.1:32924
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.270242 [ERR] http: Request HEAD /v1/agent/check/update/, error: method HEAD not allowed from=127.0.0.1:32926
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.270453 [DEBUG] http: Request HEAD /v1/agent/check/update/ (230.675µs) from=127.0.0.1:32926
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/update/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.272096 [DEBUG] http: Request OPTIONS /v1/agent/check/update/ (15.667µs) from=127.0.0.1:32926
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.273407 [ERR] http: Request GET /v1/catalog/register, error: method GET not allowed from=127.0.0.1:32926
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.273917 [DEBUG] http: Request GET /v1/catalog/register (505.686µs) from=127.0.0.1:32926
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.277419 [DEBUG] http: Request PUT /v1/catalog/register (554.021µs) from=127.0.0.1:32928
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.280544 [ERR] http: Request POST /v1/catalog/register, error: method POST not allowed from=127.0.0.1:32930
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.281059 [DEBUG] http: Request POST /v1/catalog/register (520.353µs) from=127.0.0.1:32930
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.284446 [ERR] http: Request DELETE /v1/catalog/register, error: method DELETE not allowed from=127.0.0.1:32932
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.284971 [DEBUG] http: Request DELETE /v1/catalog/register (528.687µs) from=127.0.0.1:32932
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.288679 [ERR] http: Request HEAD /v1/catalog/register, error: method HEAD not allowed from=127.0.0.1:32934
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.288831 [DEBUG] http: Request HEAD /v1/catalog/register (164.339µs) from=127.0.0.1:32934
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/register
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.292736 [DEBUG] http: Request OPTIONS /v1/catalog/register (17.667µs) from=127.0.0.1:32934
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.294817 [DEBUG] http: Request GET /v1/health/node/ (497.353µs) from=127.0.0.1:32934
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.297773 [ERR] http: Request PUT /v1/health/node/, error: method PUT not allowed from=127.0.0.1:32936
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.298488 [DEBUG] http: Request PUT /v1/health/node/ (719.694µs) from=127.0.0.1:32936
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.301297 [ERR] http: Request POST /v1/health/node/, error: method POST not allowed from=127.0.0.1:32938
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.301910 [DEBUG] http: Request POST /v1/health/node/ (612.357µs) from=127.0.0.1:32938
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.304521 [ERR] http: Request DELETE /v1/health/node/, error: method DELETE not allowed from=127.0.0.1:32940
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.305052 [DEBUG] http: Request DELETE /v1/health/node/ (534.687µs) from=127.0.0.1:32940
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.308139 [ERR] http: Request HEAD /v1/health/node/, error: method HEAD not allowed from=127.0.0.1:32942
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.308302 [DEBUG] http: Request HEAD /v1/health/node/ (186.34µs) from=127.0.0.1:32942
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.309848 [DEBUG] http: Request OPTIONS /v1/health/node/ (15.334µs) from=127.0.0.1:32942
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.311393 [ERR] http: Request GET /v1/acl/create, error: method GET not allowed from=127.0.0.1:32942
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.311976 [DEBUG] http: Request GET /v1/acl/create (566.355µs) from=127.0.0.1:32942
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.315233 [ERR] http: Request PUT /v1/acl/create, error: Permission denied from=127.0.0.1:32944
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.315713 [DEBUG] http: Request PUT /v1/acl/create (910.368µs) from=127.0.0.1:32944
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.318138 [ERR] http: Request POST /v1/acl/create, error: method POST not allowed from=127.0.0.1:32946
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.318721 [DEBUG] http: Request POST /v1/acl/create (584.356µs) from=127.0.0.1:32946
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.321341 [ERR] http: Request DELETE /v1/acl/create, error: method DELETE not allowed from=127.0.0.1:32948
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.322015 [DEBUG] http: Request DELETE /v1/acl/create (626.69µs) from=127.0.0.1:32948
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.324606 [ERR] http: Request HEAD /v1/acl/create, error: method HEAD not allowed from=127.0.0.1:32950
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.324838 [DEBUG] http: Request HEAD /v1/acl/create (297.678µs) from=127.0.0.1:32950
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.326478 [DEBUG] http: Request OPTIONS /v1/acl/create (16.334µs) from=127.0.0.1:32950
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.328546 [ERR] http: Request GET /v1/agent/metrics, error: Permission denied from=127.0.0.1:32950
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.329092 [DEBUG] http: Request GET /v1/agent/metrics (649.691µs) from=127.0.0.1:32950
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.332892 [ERR] http: Request PUT /v1/agent/metrics, error: method PUT not allowed from=127.0.0.1:32952
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.333344 [DEBUG] http: Request PUT /v1/agent/metrics (511.02µs) from=127.0.0.1:32952
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.335894 [ERR] http: Request POST /v1/agent/metrics, error: method POST not allowed from=127.0.0.1:32954
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.336376 [DEBUG] http: Request POST /v1/agent/metrics (485.018µs) from=127.0.0.1:32954
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.339268 [ERR] http: Request DELETE /v1/agent/metrics, error: method DELETE not allowed from=127.0.0.1:32956
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.339747 [DEBUG] http: Request DELETE /v1/agent/metrics (488.686µs) from=127.0.0.1:32956
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.342692 [ERR] http: Request HEAD /v1/agent/metrics, error: method HEAD not allowed from=127.0.0.1:32958
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.342872 [DEBUG] http: Request HEAD /v1/agent/metrics (193.341µs) from=127.0.0.1:32958
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/metrics
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.344299 [DEBUG] http: Request OPTIONS /v1/agent/metrics (13.333µs) from=127.0.0.1:32958
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.345528 [ERR] http: Request GET /v1/agent/leave, error: method GET not allowed from=127.0.0.1:32958
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.346202 [DEBUG] http: Request GET /v1/agent/leave (665.358µs) from=127.0.0.1:32958
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.349351 [ERR] http: Request PUT /v1/agent/leave, error: Permission denied from=127.0.0.1:32960
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.349785 [DEBUG] http: Request PUT /v1/agent/leave (616.69µs) from=127.0.0.1:32960
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.352324 [ERR] http: Request POST /v1/agent/leave, error: method POST not allowed from=127.0.0.1:32962
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.352778 [DEBUG] http: Request POST /v1/agent/leave (467.351µs) from=127.0.0.1:32962
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.355609 [ERR] http: Request DELETE /v1/agent/leave, error: method DELETE not allowed from=127.0.0.1:32964
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.356118 [DEBUG] http: Request DELETE /v1/agent/leave (510.353µs) from=127.0.0.1:32964
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.358807 [ERR] http: Request HEAD /v1/agent/leave, error: method HEAD not allowed from=127.0.0.1:32966
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.358934 [DEBUG] http: Request HEAD /v1/agent/leave (145.672µs) from=127.0.0.1:32966
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.360590 [DEBUG] http: Request OPTIONS /v1/agent/leave (17.334µs) from=127.0.0.1:32966
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.361861 [ERR] http: Request GET /v1/agent/check/warn/, error: method GET not allowed from=127.0.0.1:32966
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.362376 [DEBUG] http: Request GET /v1/agent/check/warn/ (519.02µs) from=127.0.0.1:32966
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.365255 [ERR] http: Request PUT /v1/agent/check/warn/, error: Unknown check "" from=127.0.0.1:32968
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.365781 [DEBUG] http: Request PUT /v1/agent/check/warn/ (704.027µs) from=127.0.0.1:32968
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.368415 [ERR] http: Request POST /v1/agent/check/warn/, error: method POST not allowed from=127.0.0.1:32970
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.368957 [DEBUG] http: Request POST /v1/agent/check/warn/ (527.686µs) from=127.0.0.1:32970
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.371575 [ERR] http: Request DELETE /v1/agent/check/warn/, error: method DELETE not allowed from=127.0.0.1:32972
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.372231 [DEBUG] http: Request DELETE /v1/agent/check/warn/ (713.027µs) from=127.0.0.1:32972
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.375103 [ERR] http: Request HEAD /v1/agent/check/warn/, error: method HEAD not allowed from=127.0.0.1:32974
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.375265 [DEBUG] http: Request HEAD /v1/agent/check/warn/ (175.674µs) from=127.0.0.1:32974
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/warn/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.376912 [DEBUG] http: Request OPTIONS /v1/agent/check/warn/ (15.667µs) from=127.0.0.1:32974
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.378456 [ERR] http: Request GET /v1/agent/service/deregister/, error: method GET not allowed from=127.0.0.1:32974
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.378979 [DEBUG] http: Request GET /v1/agent/service/deregister/ (538.02µs) from=127.0.0.1:32974
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.382213 [ERR] http: Request PUT /v1/agent/service/deregister/, error: Unknown service "" from=127.0.0.1:32976
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.382631 [DEBUG] http: Request PUT /v1/agent/service/deregister/ (545.021µs) from=127.0.0.1:32976
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.385162 [ERR] http: Request POST /v1/agent/service/deregister/, error: method POST not allowed from=127.0.0.1:32978
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.385594 [DEBUG] http: Request POST /v1/agent/service/deregister/ (445.683µs) from=127.0.0.1:32978
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.388556 [ERR] http: Request DELETE /v1/agent/service/deregister/, error: method DELETE not allowed from=127.0.0.1:32980
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.389007 [DEBUG] http: Request DELETE /v1/agent/service/deregister/ (452.017µs) from=127.0.0.1:32980
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.391600 [ERR] http: Request HEAD /v1/agent/service/deregister/, error: method HEAD not allowed from=127.0.0.1:32982
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.391788 [DEBUG] http: Request HEAD /v1/agent/service/deregister/ (197.34µs) from=127.0.0.1:32982
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/deregister/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.393083 [DEBUG] http: Request OPTIONS /v1/agent/service/deregister/ (14.668µs) from=127.0.0.1:32982
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.395462 [DEBUG] http: Request GET /v1/coordinate/datacenters (1.180378ms) from=127.0.0.1:32982
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.400793 [ERR] http: Request PUT /v1/coordinate/datacenters, error: method PUT not allowed from=127.0.0.1:32984
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.401377 [DEBUG] http: Request PUT /v1/coordinate/datacenters (608.023µs) from=127.0.0.1:32984
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.405416 [ERR] http: Request POST /v1/coordinate/datacenters, error: method POST not allowed from=127.0.0.1:32986
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.407157 [DEBUG] http: Request POST /v1/coordinate/datacenters (1.718732ms) from=127.0.0.1:32986
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.412378 [ERR] http: Request DELETE /v1/coordinate/datacenters, error: method DELETE not allowed from=127.0.0.1:32988
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.414012 [DEBUG] http: Request DELETE /v1/coordinate/datacenters (1.610395ms) from=127.0.0.1:32988
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.424076 [ERR] http: Request HEAD /v1/coordinate/datacenters, error: method HEAD not allowed from=127.0.0.1:32990
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.424483 [DEBUG] http: Request HEAD /v1/coordinate/datacenters (425.016µs) from=127.0.0.1:32990
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.427089 [DEBUG] http: Request OPTIONS /v1/coordinate/datacenters (20.001µs) from=127.0.0.1:32990
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.431364 [DEBUG] http: Request GET /v1/coordinate/nodes (1.643729ms) from=127.0.0.1:32990
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.442042 [ERR] http: Request PUT /v1/coordinate/nodes, error: method PUT not allowed from=127.0.0.1:32992
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.442534 [DEBUG] http: Request PUT /v1/coordinate/nodes (507.686µs) from=127.0.0.1:32992
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.445320 [ERR] http: Request POST /v1/coordinate/nodes, error: method POST not allowed from=127.0.0.1:32994
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.445804 [DEBUG] http: Request POST /v1/coordinate/nodes (487.018µs) from=127.0.0.1:32994
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.448726 [ERR] http: Request DELETE /v1/coordinate/nodes, error: method DELETE not allowed from=127.0.0.1:32996
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.449255 [DEBUG] http: Request DELETE /v1/coordinate/nodes (533.02µs) from=127.0.0.1:32996
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.452027 [ERR] http: Request HEAD /v1/coordinate/nodes, error: method HEAD not allowed from=127.0.0.1:32998
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.452165 [DEBUG] http: Request HEAD /v1/coordinate/nodes (154.006µs) from=127.0.0.1:32998
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.453413 [DEBUG] http: Request OPTIONS /v1/coordinate/nodes (12.667µs) from=127.0.0.1:32998
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.455006 [ERR] http: Request GET /v1/operator/raft/configuration, error: Permission denied from=127.0.0.1:32998
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.455453 [DEBUG] http: Request GET /v1/operator/raft/configuration (801.031µs) from=127.0.0.1:32998
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.458264 [ERR] http: Request PUT /v1/operator/raft/configuration, error: method PUT not allowed from=127.0.0.1:33000
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.459019 [DEBUG] http: Request PUT /v1/operator/raft/configuration (727.028µs) from=127.0.0.1:33000
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.467954 [ERR] http: Request POST /v1/operator/raft/configuration, error: method POST not allowed from=127.0.0.1:33002
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.468634 [DEBUG] http: Request POST /v1/operator/raft/configuration (686.692µs) from=127.0.0.1:33002
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.471947 [ERR] http: Request DELETE /v1/operator/raft/configuration, error: method DELETE not allowed from=127.0.0.1:33004
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.472733 [DEBUG] http: Request DELETE /v1/operator/raft/configuration (791.363µs) from=127.0.0.1:33004
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.476166 [ERR] http: Request HEAD /v1/operator/raft/configuration, error: method HEAD not allowed from=127.0.0.1:33006
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.476335 [DEBUG] http: Request HEAD /v1/operator/raft/configuration (185.673µs) from=127.0.0.1:33006
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.478028 [DEBUG] http: Request OPTIONS /v1/operator/raft/configuration (14.001µs) from=127.0.0.1:33006
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.480069 [ERR] http: Request GET /v1/acl/tokens, error: Permission denied from=127.0.0.1:33006
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.480637 [DEBUG] http: Request GET /v1/acl/tokens (1.038039ms) from=127.0.0.1:33006
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.485895 [ERR] http: Request PUT /v1/acl/tokens, error: method PUT not allowed from=127.0.0.1:33008
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.486489 [DEBUG] http: Request PUT /v1/acl/tokens (594.023µs) from=127.0.0.1:33008
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.489647 [ERR] http: Request POST /v1/acl/tokens, error: method POST not allowed from=127.0.0.1:33010
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.490305 [DEBUG] http: Request POST /v1/acl/tokens (664.025µs) from=127.0.0.1:33010
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.493300 [ERR] http: Request DELETE /v1/acl/tokens, error: method DELETE not allowed from=127.0.0.1:33012
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.493888 [DEBUG] http: Request DELETE /v1/acl/tokens (596.689µs) from=127.0.0.1:33012
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.496561 [ERR] http: Request HEAD /v1/acl/tokens, error: method HEAD not allowed from=127.0.0.1:33014
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.496877 [DEBUG] http: Request HEAD /v1/acl/tokens (339.013µs) from=127.0.0.1:33014
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/tokens
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.498247 [DEBUG] http: Request OPTIONS /v1/acl/tokens (15µs) from=127.0.0.1:33014
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.499467 [ERR] http: Request GET /v1/txn, error: method GET not allowed from=127.0.0.1:33014
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.500069 [DEBUG] http: Request GET /v1/txn (602.357µs) from=127.0.0.1:33014
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.503194 [DEBUG] http: Request PUT /v1/txn (449.684µs) from=127.0.0.1:33016
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.512062 [ERR] http: Request POST /v1/txn, error: method POST not allowed from=127.0.0.1:33018
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.512746 [DEBUG] http: Request POST /v1/txn (682.026µs) from=127.0.0.1:33018
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.516487 [ERR] http: Request DELETE /v1/txn, error: method DELETE not allowed from=127.0.0.1:33020
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.517060 [DEBUG] http: Request DELETE /v1/txn (571.355µs) from=127.0.0.1:33020
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.519889 [ERR] http: Request HEAD /v1/txn, error: method HEAD not allowed from=127.0.0.1:33022
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.520071 [DEBUG] http: Request HEAD /v1/txn (191.674µs) from=127.0.0.1:33022
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/txn
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.521622 [DEBUG] http: Request OPTIONS /v1/txn (14.001µs) from=127.0.0.1:33022
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.523321 [ERR] http: Request GET /v1/agent/connect/authorize, error: method GET not allowed from=127.0.0.1:33022
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.523909 [DEBUG] http: Request GET /v1/agent/connect/authorize (577.689µs) from=127.0.0.1:33022
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.529299 [ERR] http: Request PUT /v1/agent/connect/authorize, error: method PUT not allowed from=127.0.0.1:33024
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.529814 [DEBUG] http: Request PUT /v1/agent/connect/authorize (515.687µs) from=127.0.0.1:33024
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.557830 [ERR] http: Request POST /v1/agent/connect/authorize, error: Bad request: Request decode failed: EOF from=127.0.0.1:33026
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.558505 [DEBUG] http: Request POST /v1/agent/connect/authorize (705.36µs) from=127.0.0.1:33026
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.561570 [ERR] http: Request DELETE /v1/agent/connect/authorize, error: method DELETE not allowed from=127.0.0.1:33028
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.562362 [DEBUG] http: Request DELETE /v1/agent/connect/authorize (782.697µs) from=127.0.0.1:33028
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.566674 [ERR] http: Request HEAD /v1/agent/connect/authorize, error: method HEAD not allowed from=127.0.0.1:33030
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.566875 [DEBUG] http: Request HEAD /v1/agent/connect/authorize (222.342µs) from=127.0.0.1:33030
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/authorize
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.568602 [DEBUG] http: Request OPTIONS /v1/agent/connect/authorize (19.334µs) from=127.0.0.1:33030
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.570513 [DEBUG] http: Request GET /v1/health/service/ (576.022µs) from=127.0.0.1:33030
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.573390 [ERR] http: Request PUT /v1/health/service/, error: method PUT not allowed from=127.0.0.1:33032
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.573880 [DEBUG] http: Request PUT /v1/health/service/ (502.019µs) from=127.0.0.1:33032
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.577078 [ERR] http: Request POST /v1/health/service/, error: method POST not allowed from=127.0.0.1:33034
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.577760 [DEBUG] http: Request POST /v1/health/service/ (683.359µs) from=127.0.0.1:33034
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.582031 [ERR] http: Request DELETE /v1/health/service/, error: method DELETE not allowed from=127.0.0.1:33036
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.582727 [DEBUG] http: Request DELETE /v1/health/service/ (673.692µs) from=127.0.0.1:33036
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.587198 [ERR] http: Request HEAD /v1/health/service/, error: method HEAD not allowed from=127.0.0.1:33038
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.587348 [DEBUG] http: Request HEAD /v1/health/service/ (162.672µs) from=127.0.0.1:33038
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.589031 [DEBUG] http: Request OPTIONS /v1/health/service/ (16.334µs) from=127.0.0.1:33038
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.591196 [DEBUG] consul: dropping node "Node ee0bf48f-de34-6655-a3c7-581dae087e0c" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.592324 [DEBUG] http: Request GET /v1/internal/ui/services (1.779067ms) from=127.0.0.1:33038
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.596140 [ERR] http: Request PUT /v1/internal/ui/services, error: method PUT not allowed from=127.0.0.1:33040
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.597176 [DEBUG] http: Request PUT /v1/internal/ui/services (1.008039ms) from=127.0.0.1:33040
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.600659 [ERR] http: Request POST /v1/internal/ui/services, error: method POST not allowed from=127.0.0.1:33042
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.601314 [DEBUG] http: Request POST /v1/internal/ui/services (703.027µs) from=127.0.0.1:33042
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.604332 [ERR] http: Request DELETE /v1/internal/ui/services, error: method DELETE not allowed from=127.0.0.1:33044
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.604902 [DEBUG] http: Request DELETE /v1/internal/ui/services (577.356µs) from=127.0.0.1:33044
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.607936 [ERR] http: Request HEAD /v1/internal/ui/services, error: method HEAD not allowed from=127.0.0.1:33046
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.608178 [DEBUG] http: Request HEAD /v1/internal/ui/services (260.343µs) from=127.0.0.1:33046
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.609814 [DEBUG] http: Request OPTIONS /v1/internal/ui/services (15µs) from=127.0.0.1:33046
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.612860 [DEBUG] http: Request GET /v1/session/list (1.371386ms) from=127.0.0.1:33046
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.619418 [ERR] http: Request PUT /v1/session/list, error: method PUT not allowed from=127.0.0.1:33048
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.620179 [DEBUG] http: Request PUT /v1/session/list (770.363µs) from=127.0.0.1:33048
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.625657 [ERR] http: Request POST /v1/session/list, error: method POST not allowed from=127.0.0.1:33050
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.626354 [DEBUG] http: Request POST /v1/session/list (699.027µs) from=127.0.0.1:33050
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.631273 [ERR] http: Request DELETE /v1/session/list, error: method DELETE not allowed from=127.0.0.1:33052
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.632123 [DEBUG] http: Request DELETE /v1/session/list (898.034µs) from=127.0.0.1:33052
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.636298 [ERR] http: Request HEAD /v1/session/list, error: method HEAD not allowed from=127.0.0.1:33054
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.636663 [DEBUG] http: Request HEAD /v1/session/list (375.681µs) from=127.0.0.1:33054
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/list
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.638467 [DEBUG] http: Request OPTIONS /v1/session/list (17.667µs) from=127.0.0.1:33054
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.643502 [DEBUG] http: Request GET /v1/agent/services (3.450131ms) from=127.0.0.1:33054
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.648291 [ERR] http: Request PUT /v1/agent/services, error: method PUT not allowed from=127.0.0.1:33056
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.648948 [DEBUG] http: Request PUT /v1/agent/services (660.358µs) from=127.0.0.1:33056
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.653927 [ERR] http: Request POST /v1/agent/services, error: method POST not allowed from=127.0.0.1:33058
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.654601 [DEBUG] http: Request POST /v1/agent/services (667.359µs) from=127.0.0.1:33058
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.658907 [ERR] http: Request DELETE /v1/agent/services, error: method DELETE not allowed from=127.0.0.1:33060
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.659486 [DEBUG] http: Request DELETE /v1/agent/services (587.022µs) from=127.0.0.1:33060
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.666291 [ERR] http: Request HEAD /v1/agent/services, error: method HEAD not allowed from=127.0.0.1:33062
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.666535 [DEBUG] http: Request HEAD /v1/agent/services (306.345µs) from=127.0.0.1:33062
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/services
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.684256 [DEBUG] http: Request OPTIONS /v1/agent/services (18µs) from=127.0.0.1:33062
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.686459 [ERR] http: Request GET /v1/acl/destroy/, error: method GET not allowed from=127.0.0.1:33062
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.687082 [DEBUG] http: Request GET /v1/acl/destroy/ (629.691µs) from=127.0.0.1:33062
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.690581 [DEBUG] http: Request PUT /v1/acl/destroy/ (349.679µs) from=127.0.0.1:33064
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.693834 [ERR] http: Request POST /v1/acl/destroy/, error: method POST not allowed from=127.0.0.1:33066
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.694320 [DEBUG] http: Request POST /v1/acl/destroy/ (491.352µs) from=127.0.0.1:33066
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.697942 [ERR] http: Request DELETE /v1/acl/destroy/, error: method DELETE not allowed from=127.0.0.1:33068
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.698555 [DEBUG] http: Request DELETE /v1/acl/destroy/ (617.024µs) from=127.0.0.1:33068
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.702709 [ERR] http: Request HEAD /v1/acl/destroy/, error: method HEAD not allowed from=127.0.0.1:33070
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.702851 [DEBUG] http: Request HEAD /v1/acl/destroy/ (154.005µs) from=127.0.0.1:33070
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.704681 [DEBUG] http: Request OPTIONS /v1/acl/destroy/ (16.334µs) from=127.0.0.1:33070
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.706818 [ERR] http: Request GET /v1/connect/intentions/, error: Bad request: failed intention lookup: index error: UUID must be 36 characters from=127.0.0.1:33070
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.707982 [DEBUG] http: Request GET /v1/connect/intentions/ (979.704µs) from=127.0.0.1:33070
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.712842 [DEBUG] http: Request PUT /v1/connect/intentions/ (1.146043ms) from=127.0.0.1:33072
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.716485 [ERR] http: Request POST /v1/connect/intentions/, error: method POST not allowed from=127.0.0.1:33074
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.717145 [DEBUG] http: Request POST /v1/connect/intentions/ (659.358µs) from=127.0.0.1:33074
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.724716 [ERR] http: Request DELETE /v1/connect/intentions/, error: Intention lookup failed: failed intention lookup: index error: UUID must be 36 characters from=127.0.0.1:33076
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.725167 [DEBUG] http: Request DELETE /v1/connect/intentions/ (940.703µs) from=127.0.0.1:33076
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.739202 [ERR] http: Request HEAD /v1/connect/intentions/, error: method HEAD not allowed from=127.0.0.1:33078
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.739433 [DEBUG] http: Request HEAD /v1/connect/intentions/ (2.304421ms) from=127.0.0.1:33078
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.749326 [DEBUG] http: Request OPTIONS /v1/connect/intentions/ (17.001µs) from=127.0.0.1:33078
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.752779 [ERR] http: Request GET /v1/session/destroy/, error: method GET not allowed from=127.0.0.1:33078
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.753389 [DEBUG] http: Request GET /v1/session/destroy/ (614.69µs) from=127.0.0.1:33078
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.757590 [DEBUG] http: Request PUT /v1/session/destroy/ (945.703µs) from=127.0.0.1:33080
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.761546 [ERR] http: Request POST /v1/session/destroy/, error: method POST not allowed from=127.0.0.1:33082
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.762179 [DEBUG] http: Request POST /v1/session/destroy/ (634.024µs) from=127.0.0.1:33082
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.765809 [ERR] http: Request DELETE /v1/session/destroy/, error: method DELETE not allowed from=127.0.0.1:33084
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.766486 [DEBUG] http: Request DELETE /v1/session/destroy/ (663.025µs) from=127.0.0.1:33084
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.769789 [ERR] http: Request HEAD /v1/session/destroy/, error: method HEAD not allowed from=127.0.0.1:33086
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.769932 [DEBUG] http: Request HEAD /v1/session/destroy/ (154.673µs) from=127.0.0.1:33086
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/destroy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.772097 [DEBUG] http: Request OPTIONS /v1/session/destroy/ (22.668µs) from=127.0.0.1:33086
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.773653 [ERR] http: Request GET /v1/acl/bootstrap, error: method GET not allowed from=127.0.0.1:33086
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.776346 [DEBUG] http: Request GET /v1/acl/bootstrap (2.655434ms) from=127.0.0.1:33086
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.784558 [DEBUG] http: Request PUT /v1/acl/bootstrap (3.302126ms) from=127.0.0.1:33088
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.790284 [ERR] http: Request POST /v1/acl/bootstrap, error: method POST not allowed from=127.0.0.1:33090
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.790922 [DEBUG] http: Request POST /v1/acl/bootstrap (630.024µs) from=127.0.0.1:33090
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.796241 [ERR] http: Request DELETE /v1/acl/bootstrap, error: method DELETE not allowed from=127.0.0.1:33092
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.797146 [DEBUG] http: Request DELETE /v1/acl/bootstrap (900.035µs) from=127.0.0.1:33092
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.801991 [ERR] http: Request HEAD /v1/acl/bootstrap, error: method HEAD not allowed from=127.0.0.1:33094
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.802210 [DEBUG] http: Request HEAD /v1/acl/bootstrap (235.342µs) from=127.0.0.1:33094
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/bootstrap
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.807222 [DEBUG] http: Request OPTIONS /v1/acl/bootstrap (16.667µs) from=127.0.0.1:33094
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.810225 [DEBUG] http: Request GET /v1/agent/connect/ca/roots (1.550059ms) from=127.0.0.1:33094
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.822673 [ERR] http: Request PUT /v1/agent/connect/ca/roots, error: method PUT not allowed from=127.0.0.1:33096
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.823377 [DEBUG] http: Request PUT /v1/agent/connect/ca/roots (732.028µs) from=127.0.0.1:33096
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.827719 [ERR] http: Request POST /v1/agent/connect/ca/roots, error: method POST not allowed from=127.0.0.1:33098
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.828676 [DEBUG] http: Request POST /v1/agent/connect/ca/roots (964.37µs) from=127.0.0.1:33098
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.832867 [ERR] http: Request DELETE /v1/agent/connect/ca/roots, error: method DELETE not allowed from=127.0.0.1:33100
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.834228 [DEBUG] http: Request DELETE /v1/agent/connect/ca/roots (1.314383ms) from=127.0.0.1:33100
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.842218 [ERR] http: Request HEAD /v1/agent/connect/ca/roots, error: method HEAD not allowed from=127.0.0.1:33102
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.842343 [DEBUG] http: Request HEAD /v1/agent/connect/ca/roots (143.339µs) from=127.0.0.1:33102
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/roots
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.845867 [DEBUG] http: Request OPTIONS /v1/agent/connect/ca/roots (14.668µs) from=127.0.0.1:33102
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.848046 [ERR] http: Request GET /v1/connect/intentions/check, error: required query parameter 'source' not set from=127.0.0.1:33102
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.848749 [DEBUG] http: Request GET /v1/connect/intentions/check (710.027µs) from=127.0.0.1:33102
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.853389 [ERR] http: Request PUT /v1/connect/intentions/check, error: method PUT not allowed from=127.0.0.1:33104
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.855664 [DEBUG] http: Request PUT /v1/connect/intentions/check (2.413425ms) from=127.0.0.1:33104
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.859587 [ERR] http: Request POST /v1/connect/intentions/check, error: method POST not allowed from=127.0.0.1:33106
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.860228 [DEBUG] http: Request POST /v1/connect/intentions/check (655.358µs) from=127.0.0.1:33106
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.864477 [ERR] http: Request DELETE /v1/connect/intentions/check, error: method DELETE not allowed from=127.0.0.1:33108
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.865047 [DEBUG] http: Request DELETE /v1/connect/intentions/check (559.021µs) from=127.0.0.1:33108
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.868496 [ERR] http: Request HEAD /v1/connect/intentions/check, error: method HEAD not allowed from=127.0.0.1:33110
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.868625 [DEBUG] http: Request HEAD /v1/connect/intentions/check (141.672µs) from=127.0.0.1:33110
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/check
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.870603 [DEBUG] http: Request OPTIONS /v1/connect/intentions/check (19.334µs) from=127.0.0.1:33110
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.873372 [DEBUG] http: Request GET /v1/agent/checks (1.24538ms) from=127.0.0.1:33110
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.876648 [ERR] http: Request PUT /v1/agent/checks, error: method PUT not allowed from=127.0.0.1:33112
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.877808 [DEBUG] http: Request PUT /v1/agent/checks (1.162378ms) from=127.0.0.1:33112
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.881416 [ERR] http: Request POST /v1/agent/checks, error: method POST not allowed from=127.0.0.1:33114
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.882074 [DEBUG] http: Request POST /v1/agent/checks (654.025µs) from=127.0.0.1:33114
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.885453 [ERR] http: Request DELETE /v1/agent/checks, error: method DELETE not allowed from=127.0.0.1:33116
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.886093 [DEBUG] http: Request DELETE /v1/agent/checks (640.691µs) from=127.0.0.1:33116
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.894684 [ERR] http: Request HEAD /v1/agent/checks, error: method HEAD not allowed from=127.0.0.1:33118
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.894859 [DEBUG] http: Request HEAD /v1/agent/checks (187.007µs) from=127.0.0.1:33118
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/checks
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.898653 [DEBUG] http: Request OPTIONS /v1/agent/checks (16.334µs) from=127.0.0.1:33118
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.907485 [DEBUG] agent: dropping node "Node ee0bf48f-de34-6655-a3c7-581dae087e0c" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.908576 [DEBUG] http: Request GET /v1/agent/members (1.168044ms) from=127.0.0.1:33118
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.926108 [ERR] http: Request PUT /v1/agent/members, error: method PUT not allowed from=127.0.0.1:33120
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.926601 [DEBUG] http: Request PUT /v1/agent/members (487.018µs) from=127.0.0.1:33120
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.932008 [ERR] http: Request POST /v1/agent/members, error: method POST not allowed from=127.0.0.1:33122
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.932622 [DEBUG] http: Request POST /v1/agent/members (612.69µs) from=127.0.0.1:33122
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.936618 [ERR] http: Request DELETE /v1/agent/members, error: method DELETE not allowed from=127.0.0.1:33124
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.937369 [DEBUG] http: Request DELETE /v1/agent/members (745.695µs) from=127.0.0.1:33124
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.942739 [ERR] http: Request HEAD /v1/agent/members, error: method HEAD not allowed from=127.0.0.1:33126
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.942979 [DEBUG] http: Request HEAD /v1/agent/members (367.347µs) from=127.0.0.1:33126
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/members
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.945215 [DEBUG] http: Request OPTIONS /v1/agent/members (20.667µs) from=127.0.0.1:33126
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.946882 [ERR] http: Request GET /v1/agent/check/fail/, error: method GET not allowed from=127.0.0.1:33126
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.947428 [DEBUG] http: Request GET /v1/agent/check/fail/ (547.688µs) from=127.0.0.1:33126
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.952379 [ERR] http: Request PUT /v1/agent/check/fail/, error: Unknown check "" from=127.0.0.1:33128
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.952926 [DEBUG] http: Request PUT /v1/agent/check/fail/ (686.693µs) from=127.0.0.1:33128
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.957405 [ERR] http: Request POST /v1/agent/check/fail/, error: method POST not allowed from=127.0.0.1:33130
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.958222 [DEBUG] http: Request POST /v1/agent/check/fail/ (785.363µs) from=127.0.0.1:33130
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.962056 [ERR] http: Request DELETE /v1/agent/check/fail/, error: method DELETE not allowed from=127.0.0.1:33132
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.962580 [DEBUG] http: Request DELETE /v1/agent/check/fail/ (555.688µs) from=127.0.0.1:33132
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.966538 [ERR] http: Request HEAD /v1/agent/check/fail/, error: method HEAD not allowed from=127.0.0.1:33134
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.966871 [DEBUG] http: Request HEAD /v1/agent/check/fail/ (300.011µs) from=127.0.0.1:33134
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/fail/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.968700 [DEBUG] http: Request OPTIONS /v1/agent/check/fail/ (17µs) from=127.0.0.1:33134
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.971438 [DEBUG] consul: dropping node "Node ee0bf48f-de34-6655-a3c7-581dae087e0c" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.972516 [DEBUG] http: Request GET /v1/catalog/nodes (1.566726ms) from=127.0.0.1:33134
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.975785 [ERR] http: Request PUT /v1/catalog/nodes, error: method PUT not allowed from=127.0.0.1:33136
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.976422 [DEBUG] http: Request PUT /v1/catalog/nodes (693.36µs) from=127.0.0.1:33136
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.979544 [ERR] http: Request POST /v1/catalog/nodes, error: method POST not allowed from=127.0.0.1:33138
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.980149 [DEBUG] http: Request POST /v1/catalog/nodes (604.689µs) from=127.0.0.1:33138
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.983895 [ERR] http: Request DELETE /v1/catalog/nodes, error: method DELETE not allowed from=127.0.0.1:33140
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.984563 [DEBUG] http: Request DELETE /v1/catalog/nodes (661.691µs) from=127.0.0.1:33140
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.990578 [ERR] http: Request HEAD /v1/catalog/nodes, error: method HEAD not allowed from=127.0.0.1:33142
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.990724 [DEBUG] http: Request HEAD /v1/catalog/nodes (160.006µs) from=127.0.0.1:33142
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.992799 [DEBUG] http: Request OPTIONS /v1/catalog/nodes (14.001µs) from=127.0.0.1:33142
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.994236 [ERR] http: Request GET /v1/agent/host, error: Permission denied from=127.0.0.1:33142
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.994694 [DEBUG] http: Request GET /v1/agent/host (623.024µs) from=127.0.0.1:33142
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:29.999748 [ERR] http: Request PUT /v1/agent/host, error: method PUT not allowed from=127.0.0.1:33144
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.000297 [DEBUG] http: Request PUT /v1/agent/host (549.688µs) from=127.0.0.1:33144
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.009698 [ERR] http: Request POST /v1/agent/host, error: method POST not allowed from=127.0.0.1:33146
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.012580 [DEBUG] http: Request POST /v1/agent/host (2.895444ms) from=127.0.0.1:33146
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.020447 [ERR] http: Request DELETE /v1/agent/host, error: method DELETE not allowed from=127.0.0.1:33148
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.020969 [DEBUG] http: Request DELETE /v1/agent/host (533.02µs) from=127.0.0.1:33148
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.028100 [ERR] http: Request HEAD /v1/agent/host, error: method HEAD not allowed from=127.0.0.1:33150
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.028243 [DEBUG] http: Request HEAD /v1/agent/host (163.34µs) from=127.0.0.1:33150
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/host
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.031958 [DEBUG] http: Request OPTIONS /v1/agent/host (15.667µs) from=127.0.0.1:33150
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.033807 [ERR] http: Request GET /v1/agent/connect/ca/leaf/, error: Permission denied from=127.0.0.1:33150
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.034463 [DEBUG] http: Request GET /v1/agent/connect/ca/leaf/ (835.698µs) from=127.0.0.1:33150
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.037981 [ERR] http: Request PUT /v1/agent/connect/ca/leaf/, error: method PUT not allowed from=127.0.0.1:33152
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.038581 [DEBUG] http: Request PUT /v1/agent/connect/ca/leaf/ (531.02µs) from=127.0.0.1:33152
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.043186 [ERR] http: Request POST /v1/agent/connect/ca/leaf/, error: method POST not allowed from=127.0.0.1:33154
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.043893 [DEBUG] http: Request POST /v1/agent/connect/ca/leaf/ (702.694µs) from=127.0.0.1:33154
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.047466 [ERR] http: Request DELETE /v1/agent/connect/ca/leaf/, error: method DELETE not allowed from=127.0.0.1:33156
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.048096 [DEBUG] http: Request DELETE /v1/agent/connect/ca/leaf/ (670.025µs) from=127.0.0.1:33156
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.051497 [ERR] http: Request HEAD /v1/agent/connect/ca/leaf/, error: method HEAD not allowed from=127.0.0.1:33158
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.051871 [DEBUG] http: Request HEAD /v1/agent/connect/ca/leaf/ (392.015µs) from=127.0.0.1:33158
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/leaf/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.053825 [DEBUG] http: Request OPTIONS /v1/agent/connect/ca/leaf/ (17.334µs) from=127.0.0.1:33158
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.055663 [ERR] http: Request GET /v1/agent/service/maintenance/, error: method GET not allowed from=127.0.0.1:33158
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.056322 [DEBUG] http: Request GET /v1/agent/service/maintenance/ (659.358µs) from=127.0.0.1:33158
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.063088 [DEBUG] http: Request PUT /v1/agent/service/maintenance/ (493.019µs) from=127.0.0.1:33160
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.106519 [ERR] http: Request POST /v1/agent/service/maintenance/, error: method POST not allowed from=127.0.0.1:33162
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.107392 [DEBUG] http: Request POST /v1/agent/service/maintenance/ (831.031µs) from=127.0.0.1:33162
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.111575 [ERR] http: Request DELETE /v1/agent/service/maintenance/, error: method DELETE not allowed from=127.0.0.1:33164
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.112319 [DEBUG] http: Request DELETE /v1/agent/service/maintenance/ (741.028µs) from=127.0.0.1:33164
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.117231 [ERR] http: Request HEAD /v1/agent/service/maintenance/, error: method HEAD not allowed from=127.0.0.1:33166
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.117387 [DEBUG] http: Request HEAD /v1/agent/service/maintenance/ (188.34µs) from=127.0.0.1:33166
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/maintenance/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.119650 [DEBUG] http: Request OPTIONS /v1/agent/service/maintenance/ (18.334µs) from=127.0.0.1:33166
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.122109 [DEBUG] http: Request GET /v1/health/checks/ (590.356µs) from=127.0.0.1:33166
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.126490 [ERR] http: Request PUT /v1/health/checks/, error: method PUT not allowed from=127.0.0.1:33168
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.127625 [DEBUG] http: Request PUT /v1/health/checks/ (1.021372ms) from=127.0.0.1:33168
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.131141 [ERR] http: Request POST /v1/health/checks/, error: method POST not allowed from=127.0.0.1:33170
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.132065 [DEBUG] http: Request POST /v1/health/checks/ (907.701µs) from=127.0.0.1:33170
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.135152 [ERR] http: Request DELETE /v1/health/checks/, error: method DELETE not allowed from=127.0.0.1:33172
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.135816 [DEBUG] http: Request DELETE /v1/health/checks/ (644.358µs) from=127.0.0.1:33172
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.139079 [ERR] http: Request HEAD /v1/health/checks/, error: method HEAD not allowed from=127.0.0.1:33174
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.139225 [DEBUG] http: Request HEAD /v1/health/checks/ (163.007µs) from=127.0.0.1:33174
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/checks/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.140950 [DEBUG] http: Request OPTIONS /v1/health/checks/ (18.667µs) from=127.0.0.1:33174
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.143495 [DEBUG] http: Request GET /v1/health/connect/ (632.691µs) from=127.0.0.1:33174
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.147147 [ERR] http: Request PUT /v1/health/connect/, error: method PUT not allowed from=127.0.0.1:33176
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.147890 [DEBUG] http: Request PUT /v1/health/connect/ (733.028µs) from=127.0.0.1:33176
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.151212 [ERR] http: Request POST /v1/health/connect/, error: method POST not allowed from=127.0.0.1:33178
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.151906 [DEBUG] http: Request POST /v1/health/connect/ (675.692µs) from=127.0.0.1:33178
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.158727 [ERR] http: Request DELETE /v1/health/connect/, error: method DELETE not allowed from=127.0.0.1:33180
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.159466 [DEBUG] http: Request DELETE /v1/health/connect/ (717.027µs) from=127.0.0.1:33180
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.162604 [ERR] http: Request HEAD /v1/health/connect/, error: method HEAD not allowed from=127.0.0.1:33182
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.162769 [DEBUG] http: Request HEAD /v1/health/connect/ (182.34µs) from=127.0.0.1:33182
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.165316 [DEBUG] http: Request OPTIONS /v1/health/connect/ (19.001µs) from=127.0.0.1:33182
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.167648 [DEBUG] http: Request GET /v1/agent/service/ (631.357µs) from=127.0.0.1:33182
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.170770 [ERR] http: Request PUT /v1/agent/service/, error: method PUT not allowed from=127.0.0.1:33184
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.171325 [DEBUG] http: Request PUT /v1/agent/service/ (556.355µs) from=127.0.0.1:33184
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.177883 [ERR] http: Request POST /v1/agent/service/, error: method POST not allowed from=127.0.0.1:33186
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.178446 [DEBUG] http: Request POST /v1/agent/service/ (562.355µs) from=127.0.0.1:33186
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.182158 [ERR] http: Request DELETE /v1/agent/service/, error: method DELETE not allowed from=127.0.0.1:33188
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.182801 [DEBUG] http: Request DELETE /v1/agent/service/ (627.023µs) from=127.0.0.1:33188
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.185738 [ERR] http: Request HEAD /v1/agent/service/, error: method HEAD not allowed from=127.0.0.1:33190
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.185896 [DEBUG] http: Request HEAD /v1/agent/service/ (177.007µs) from=127.0.0.1:33190
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.187588 [DEBUG] http: Request OPTIONS /v1/agent/service/ (17.668µs) from=127.0.0.1:33190
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.189202 [ERR] http: Request GET /v1/acl/token/self, error: ACL not found from=127.0.0.1:33190
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.189761 [DEBUG] http: Request GET /v1/acl/token/self (751.695µs) from=127.0.0.1:33190
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.192703 [ERR] http: Request PUT /v1/acl/token/self, error: method PUT not allowed from=127.0.0.1:33192
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.193279 [DEBUG] http: Request PUT /v1/acl/token/self (576.022µs) from=127.0.0.1:33192
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.196658 [ERR] http: Request POST /v1/acl/token/self, error: method POST not allowed from=127.0.0.1:33194
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.197395 [DEBUG] http: Request POST /v1/acl/token/self (739.028µs) from=127.0.0.1:33194
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.201516 [ERR] http: Request DELETE /v1/acl/token/self, error: method DELETE not allowed from=127.0.0.1:33196
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.202153 [DEBUG] http: Request DELETE /v1/acl/token/self (631.357µs) from=127.0.0.1:33196
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.205328 [ERR] http: Request HEAD /v1/acl/token/self, error: method HEAD not allowed from=127.0.0.1:33198
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.205471 [DEBUG] http: Request HEAD /v1/acl/token/self (157.006µs) from=127.0.0.1:33198
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/self
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.207067 [DEBUG] http: Request OPTIONS /v1/acl/token/self (15.001µs) from=127.0.0.1:33198
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.208805 [ERR] http: Request GET /v1/agent/force-leave/, error: method GET not allowed from=127.0.0.1:33198
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.210212 [DEBUG] http: Request GET /v1/agent/force-leave/ (1.40272ms) from=127.0.0.1:33198
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.213578 [ERR] http: Request PUT /v1/agent/force-leave/, error: Permission denied from=127.0.0.1:33200
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.214177 [DEBUG] http: Request PUT /v1/agent/force-leave/ (722.694µs) from=127.0.0.1:33200
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.217882 [ERR] http: Request POST /v1/agent/force-leave/, error: method POST not allowed from=127.0.0.1:33202
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.218759 [DEBUG] http: Request POST /v1/agent/force-leave/ (856.699µs) from=127.0.0.1:33202
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.223412 [ERR] http: Request DELETE /v1/agent/force-leave/, error: method DELETE not allowed from=127.0.0.1:33204
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.224123 [DEBUG] http: Request DELETE /v1/agent/force-leave/ (701.36µs) from=127.0.0.1:33204
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.228829 [ERR] http: Request HEAD /v1/agent/force-leave/, error: method HEAD not allowed from=127.0.0.1:33206
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.228990 [DEBUG] http: Request HEAD /v1/agent/force-leave/ (175.34µs) from=127.0.0.1:33206
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/force-leave/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.230900 [DEBUG] http: Request OPTIONS /v1/agent/force-leave/ (17.001µs) from=127.0.0.1:33206
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.232697 [ERR] http: Request GET /v1/operator/raft/peer, error: method GET not allowed from=127.0.0.1:33206
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.233310 [DEBUG] http: Request GET /v1/operator/raft/peer (588.022µs) from=127.0.0.1:33206
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.236503 [ERR] http: Request PUT /v1/operator/raft/peer, error: method PUT not allowed from=127.0.0.1:33208
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.237266 [DEBUG] http: Request PUT /v1/operator/raft/peer (723.694µs) from=127.0.0.1:33208
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.241914 [ERR] http: Request POST /v1/operator/raft/peer, error: method POST not allowed from=127.0.0.1:33210
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.243124 [DEBUG] http: Request POST /v1/operator/raft/peer (2.137414ms) from=127.0.0.1:33210
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.249511 [DEBUG] http: Request DELETE /v1/operator/raft/peer (972.37µs) from=127.0.0.1:33212
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.252542 [ERR] http: Request HEAD /v1/operator/raft/peer, error: method HEAD not allowed from=127.0.0.1:33214
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.252707 [DEBUG] http: Request HEAD /v1/operator/raft/peer (181.34µs) from=127.0.0.1:33214
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/peer
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.254323 [DEBUG] http: Request OPTIONS /v1/operator/raft/peer (15.334µs) from=127.0.0.1:33214
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.255859 [ERR] http: Request GET /v1/acl/policy/, error: Bad request: Missing policy ID from=127.0.0.1:33214
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.256415 [DEBUG] http: Request GET /v1/acl/policy/ (565.355µs) from=127.0.0.1:33214
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.260111 [ERR] http: Request PUT /v1/acl/policy/, error: Bad request: Policy decoding failed: EOF from=127.0.0.1:33216
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.260633 [DEBUG] http: Request PUT /v1/acl/policy/ (599.023µs) from=127.0.0.1:33216
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.263520 [ERR] http: Request POST /v1/acl/policy/, error: method POST not allowed from=127.0.0.1:33218
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.264049 [DEBUG] http: Request POST /v1/acl/policy/ (536.354µs) from=127.0.0.1:33218
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.267122 [ERR] http: Request DELETE /v1/acl/policy/, error: Bad request: Missing policy ID from=127.0.0.1:33220
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.267825 [DEBUG] http: Request DELETE /v1/acl/policy/ (709.693µs) from=127.0.0.1:33220
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.270610 [ERR] http: Request HEAD /v1/acl/policy/, error: method HEAD not allowed from=127.0.0.1:33222
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.270762 [DEBUG] http: Request HEAD /v1/acl/policy/ (173.674µs) from=127.0.0.1:33222
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.272289 [DEBUG] http: Request OPTIONS /v1/acl/policy/ (15.334µs) from=127.0.0.1:33222
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.273743 [ERR] http: Request GET /v1/acl/rules/translate/, error: Bad request: Missing token ID from=127.0.0.1:33222
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.274234 [DEBUG] http: Request GET /v1/acl/rules/translate/ (491.352µs) from=127.0.0.1:33222
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.277243 [ERR] http: Request PUT /v1/acl/rules/translate/, error: method PUT not allowed from=127.0.0.1:33224
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.277788 [DEBUG] http: Request PUT /v1/acl/rules/translate/ (514.353µs) from=127.0.0.1:33224
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.280816 [ERR] http: Request POST /v1/acl/rules/translate/, error: method POST not allowed from=127.0.0.1:33226
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.281531 [DEBUG] http: Request POST /v1/acl/rules/translate/ (723.361µs) from=127.0.0.1:33226
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.284460 [ERR] http: Request DELETE /v1/acl/rules/translate/, error: method DELETE not allowed from=127.0.0.1:33228
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.285024 [DEBUG] http: Request DELETE /v1/acl/rules/translate/ (561.688µs) from=127.0.0.1:33228
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.287952 [ERR] http: Request HEAD /v1/acl/rules/translate/, error: method HEAD not allowed from=127.0.0.1:33230
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.288102 [DEBUG] http: Request HEAD /v1/acl/rules/translate/ (168.673µs) from=127.0.0.1:33230
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.289802 [DEBUG] http: Request OPTIONS /v1/acl/rules/translate/ (20.667µs) from=127.0.0.1:33230
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.292122 [DEBUG] http: Request GET /v1/catalog/datacenters (915.368µs) from=127.0.0.1:33230
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.296158 [ERR] http: Request PUT /v1/catalog/datacenters, error: method PUT not allowed from=127.0.0.1:33232
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.297054 [DEBUG] http: Request PUT /v1/catalog/datacenters (896.7µs) from=127.0.0.1:33232
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.300030 [ERR] http: Request POST /v1/catalog/datacenters, error: method POST not allowed from=127.0.0.1:33234
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.300779 [DEBUG] http: Request POST /v1/catalog/datacenters (749.695µs) from=127.0.0.1:33234
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.303868 [ERR] http: Request DELETE /v1/catalog/datacenters, error: method DELETE not allowed from=127.0.0.1:33236
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.304425 [DEBUG] http: Request DELETE /v1/catalog/datacenters (558.354µs) from=127.0.0.1:33236
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.307435 [ERR] http: Request HEAD /v1/catalog/datacenters, error: method HEAD not allowed from=127.0.0.1:33238
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.307583 [DEBUG] http: Request HEAD /v1/catalog/datacenters (163.673µs) from=127.0.0.1:33238
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/datacenters
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.309224 [DEBUG] http: Request OPTIONS /v1/catalog/datacenters (14.667µs) from=127.0.0.1:33238
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.311172 [ERR] http: Request GET /v1/connect/ca/configuration, error: Permission denied from=127.0.0.1:33238
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.311837 [DEBUG] http: Request GET /v1/connect/ca/configuration (980.704µs) from=127.0.0.1:33238
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.324444 [DEBUG] http: Request PUT /v1/connect/ca/configuration (577.355µs) from=127.0.0.1:33240
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.327848 [ERR] http: Request POST /v1/connect/ca/configuration, error: method POST not allowed from=127.0.0.1:33242
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.328442 [DEBUG] http: Request POST /v1/connect/ca/configuration (605.023µs) from=127.0.0.1:33242
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.332865 [ERR] http: Request DELETE /v1/connect/ca/configuration, error: method DELETE not allowed from=127.0.0.1:33244
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.333429 [DEBUG] http: Request DELETE /v1/connect/ca/configuration (562.022µs) from=127.0.0.1:33244
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.340107 [ERR] http: Request HEAD /v1/connect/ca/configuration, error: method HEAD not allowed from=127.0.0.1:33246
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.340261 [DEBUG] http: Request HEAD /v1/connect/ca/configuration (175.006µs) from=127.0.0.1:33246
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/configuration
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.342244 [DEBUG] http: Request OPTIONS /v1/connect/ca/configuration (14.668µs) from=127.0.0.1:33246
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.344802 [DEBUG] http: Request GET /v1/session/info/ (496.686µs) from=127.0.0.1:33246
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.348380 [ERR] http: Request PUT /v1/session/info/, error: method PUT not allowed from=127.0.0.1:33248
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.349094 [DEBUG] http: Request PUT /v1/session/info/ (703.36µs) from=127.0.0.1:33248
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.352166 [ERR] http: Request POST /v1/session/info/, error: method POST not allowed from=127.0.0.1:33250
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.352758 [DEBUG] http: Request POST /v1/session/info/ (580.689µs) from=127.0.0.1:33250
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.357099 [ERR] http: Request DELETE /v1/session/info/, error: method DELETE not allowed from=127.0.0.1:33252
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.357718 [DEBUG] http: Request DELETE /v1/session/info/ (615.357µs) from=127.0.0.1:33252
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.361288 [ERR] http: Request HEAD /v1/session/info/, error: method HEAD not allowed from=127.0.0.1:33254
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.361444 [DEBUG] http: Request HEAD /v1/session/info/ (169.34µs) from=127.0.0.1:33254
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/info/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.364228 [DEBUG] http: Request OPTIONS /v1/session/info/ (17µs) from=127.0.0.1:33254
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.366352 [ERR] http: Request GET /v1/acl/policies, error: Permission denied from=127.0.0.1:33254
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.367050 [DEBUG] http: Request GET /v1/acl/policies (1.157711ms) from=127.0.0.1:33254
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.371146 [ERR] http: Request PUT /v1/acl/policies, error: method PUT not allowed from=127.0.0.1:33256
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.371833 [DEBUG] http: Request PUT /v1/acl/policies (670.359µs) from=127.0.0.1:33256
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.376192 [ERR] http: Request POST /v1/acl/policies, error: method POST not allowed from=127.0.0.1:33258
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.376924 [DEBUG] http: Request POST /v1/acl/policies (721.028µs) from=127.0.0.1:33258
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.380437 [ERR] http: Request DELETE /v1/acl/policies, error: method DELETE not allowed from=127.0.0.1:33260
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.381071 [DEBUG] http: Request DELETE /v1/acl/policies (625.357µs) from=127.0.0.1:33260
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.385213 [ERR] http: Request HEAD /v1/acl/policies, error: method HEAD not allowed from=127.0.0.1:33262
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.385375 [DEBUG] http: Request HEAD /v1/acl/policies (176.007µs) from=127.0.0.1:33262
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policies
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.387986 [DEBUG] http: Request OPTIONS /v1/acl/policies (18.001µs) from=127.0.0.1:33262
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.390141 [DEBUG] http: Request GET /v1/catalog/connect/ (537.354µs) from=127.0.0.1:33262
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.393720 [ERR] http: Request PUT /v1/catalog/connect/, error: method PUT not allowed from=127.0.0.1:33264
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.394315 [DEBUG] http: Request PUT /v1/catalog/connect/ (599.689µs) from=127.0.0.1:33264
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.402573 [ERR] http: Request POST /v1/catalog/connect/, error: method POST not allowed from=127.0.0.1:33266
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.403171 [DEBUG] http: Request POST /v1/catalog/connect/ (581.355µs) from=127.0.0.1:33266
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.406468 [ERR] http: Request DELETE /v1/catalog/connect/, error: method DELETE not allowed from=127.0.0.1:33268
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.407350 [DEBUG] http: Request DELETE /v1/catalog/connect/ (866.366µs) from=127.0.0.1:33268
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.411079 [ERR] http: Request HEAD /v1/catalog/connect/, error: method HEAD not allowed from=127.0.0.1:33270
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.411232 [DEBUG] http: Request HEAD /v1/catalog/connect/ (166.339µs) from=127.0.0.1:33270
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/connect/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.413421 [DEBUG] http: Request OPTIONS /v1/catalog/connect/ (17.667µs) from=127.0.0.1:33270
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.415769 [DEBUG] http: Request GET /v1/catalog/service/ (465.684µs) from=127.0.0.1:33270
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.419214 [ERR] http: Request PUT /v1/catalog/service/, error: method PUT not allowed from=127.0.0.1:33272
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.419866 [DEBUG] http: Request PUT /v1/catalog/service/ (644.024µs) from=127.0.0.1:33272
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.424280 [ERR] http: Request POST /v1/catalog/service/, error: method POST not allowed from=127.0.0.1:33274
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.424922 [DEBUG] http: Request POST /v1/catalog/service/ (635.357µs) from=127.0.0.1:33274
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.428747 [ERR] http: Request DELETE /v1/catalog/service/, error: method DELETE not allowed from=127.0.0.1:33276
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.429467 [DEBUG] http: Request DELETE /v1/catalog/service/ (697.027µs) from=127.0.0.1:33276
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.433711 [ERR] http: Request HEAD /v1/catalog/service/, error: method HEAD not allowed from=127.0.0.1:33278
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.433870 [DEBUG] http: Request HEAD /v1/catalog/service/ (176.007µs) from=127.0.0.1:33278
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/service/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.435488 [DEBUG] http: Request OPTIONS /v1/catalog/service/ (20.001µs) from=127.0.0.1:33278
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.438514 [DEBUG] http: Request GET /v1/connect/intentions (1.39872ms) from=127.0.0.1:33278
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.442242 [ERR] http: Request PUT /v1/connect/intentions, error: method PUT not allowed from=127.0.0.1:33280
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.442934 [DEBUG] http: Request PUT /v1/connect/intentions (683.026µs) from=127.0.0.1:33280
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.446127 [ERR] http: Request POST /v1/connect/intentions, error: Failed to decode request body: EOF from=127.0.0.1:33282
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.446930 [DEBUG] http: Request POST /v1/connect/intentions (794.697µs) from=127.0.0.1:33282
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.450252 [ERR] http: Request DELETE /v1/connect/intentions, error: method DELETE not allowed from=127.0.0.1:33284
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.450864 [DEBUG] http: Request DELETE /v1/connect/intentions (619.023µs) from=127.0.0.1:33284
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.454094 [ERR] http: Request HEAD /v1/connect/intentions, error: method HEAD not allowed from=127.0.0.1:33286
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.454242 [DEBUG] http: Request HEAD /v1/connect/intentions (167.006µs) from=127.0.0.1:33286
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.455813 [DEBUG] http: Request OPTIONS /v1/connect/intentions (16.334µs) from=127.0.0.1:33286
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.457971 [DEBUG] consul: dropping node "Node ee0bf48f-de34-6655-a3c7-581dae087e0c" from result due to ACLs
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.460669 [DEBUG] http: Request GET /v1/internal/ui/nodes (3.191122ms) from=127.0.0.1:33286
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.465263 [ERR] http: Request PUT /v1/internal/ui/nodes, error: method PUT not allowed from=127.0.0.1:33288
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.465948 [DEBUG] http: Request PUT /v1/internal/ui/nodes (671.358µs) from=127.0.0.1:33288
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.469852 [ERR] http: Request POST /v1/internal/ui/nodes, error: method POST not allowed from=127.0.0.1:33290
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.470510 [DEBUG] http: Request POST /v1/internal/ui/nodes (645.024µs) from=127.0.0.1:33290
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.474083 [ERR] http: Request DELETE /v1/internal/ui/nodes, error: method DELETE not allowed from=127.0.0.1:33292
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.474925 [DEBUG] http: Request DELETE /v1/internal/ui/nodes (832.699µs) from=127.0.0.1:33292
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.480996 [ERR] http: Request HEAD /v1/internal/ui/nodes, error: method HEAD not allowed from=127.0.0.1:33294
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.481175 [DEBUG] http: Request HEAD /v1/internal/ui/nodes (204.341µs) from=127.0.0.1:33294
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/nodes
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.483260 [DEBUG] http: Request OPTIONS /v1/internal/ui/nodes (20.001µs) from=127.0.0.1:33294
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.486582 [ERR] http: Request GET /v1/session/create, error: method GET not allowed from=127.0.0.1:33294
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.487282 [DEBUG] http: Request GET /v1/session/create (704.026µs) from=127.0.0.1:33294
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.491115 [ERR] http: Request PUT /v1/session/create, error: Permission denied from=127.0.0.1:33296
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.491903 [DEBUG] http: Request PUT /v1/session/create (1.23138ms) from=127.0.0.1:33296
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.495315 [ERR] http: Request POST /v1/session/create, error: method POST not allowed from=127.0.0.1:33298
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.495903 [DEBUG] http: Request POST /v1/session/create (604.356µs) from=127.0.0.1:33298
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.499331 [ERR] http: Request DELETE /v1/session/create, error: method DELETE not allowed from=127.0.0.1:33300
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.499961 [DEBUG] http: Request DELETE /v1/session/create (718.694µs) from=127.0.0.1:33300
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.503974 [ERR] http: Request HEAD /v1/session/create, error: method HEAD not allowed from=127.0.0.1:33302
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.504133 [DEBUG] http: Request HEAD /v1/session/create (175.673µs) from=127.0.0.1:33302
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/create
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.506297 [DEBUG] http: Request OPTIONS /v1/session/create (15.334µs) from=127.0.0.1:33302
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.508315 [ERR] http: Request GET /v1/session/renew/, error: method GET not allowed from=127.0.0.1:33302
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.509030 [DEBUG] http: Request GET /v1/session/renew/ (700.36µs) from=127.0.0.1:33302
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.512801 [DEBUG] http: Request PUT /v1/session/renew/ (705.36µs) from=127.0.0.1:33304
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.517588 [ERR] http: Request POST /v1/session/renew/, error: method POST not allowed from=127.0.0.1:33306
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.519857 [DEBUG] http: Request POST /v1/session/renew/ (2.243419ms) from=127.0.0.1:33306
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.525055 [ERR] http: Request DELETE /v1/session/renew/, error: method DELETE not allowed from=127.0.0.1:33308
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.526071 [DEBUG] http: Request DELETE /v1/session/renew/ (1.002705ms) from=127.0.0.1:33308
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.531153 [ERR] http: Request HEAD /v1/session/renew/, error: method HEAD not allowed from=127.0.0.1:33310
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.531305 [DEBUG] http: Request HEAD /v1/session/renew/ (168.339µs) from=127.0.0.1:33310
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/renew/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.533053 [DEBUG] http: Request OPTIONS /v1/session/renew/ (16.667µs) from=127.0.0.1:33310
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.534859 [ERR] http: Request GET /v1/acl/clone/, error: method GET not allowed from=127.0.0.1:33310
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.535877 [DEBUG] http: Request GET /v1/acl/clone/ (734.695µs) from=127.0.0.1:33310
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.540733 [DEBUG] http: Request PUT /v1/acl/clone/ (803.363µs) from=127.0.0.1:33312
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.548661 [ERR] http: Request POST /v1/acl/clone/, error: method POST not allowed from=127.0.0.1:33314
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.549321 [DEBUG] http: Request POST /v1/acl/clone/ (653.358µs) from=127.0.0.1:33314
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.553784 [ERR] http: Request DELETE /v1/acl/clone/, error: method DELETE not allowed from=127.0.0.1:33316
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.554516 [DEBUG] http: Request DELETE /v1/acl/clone/ (731.361µs) from=127.0.0.1:33316
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.558438 [ERR] http: Request HEAD /v1/acl/clone/, error: method HEAD not allowed from=127.0.0.1:33318
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.558598 [DEBUG] http: Request HEAD /v1/acl/clone/ (172.34µs) from=127.0.0.1:33318
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/clone/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.560232 [DEBUG] http: Request OPTIONS /v1/acl/clone/ (16µs) from=127.0.0.1:33318
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.561965 [ERR] http: Request GET /v1/catalog/deregister, error: method GET not allowed from=127.0.0.1:33318
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.563241 [DEBUG] http: Request GET /v1/catalog/deregister (1.237047ms) from=127.0.0.1:33318
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.567900 [DEBUG] http: Request PUT /v1/catalog/deregister (627.691µs) from=127.0.0.1:33320
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.571492 [ERR] http: Request POST /v1/catalog/deregister, error: method POST not allowed from=127.0.0.1:33322
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.572248 [DEBUG] http: Request POST /v1/catalog/deregister (736.028µs) from=127.0.0.1:33322
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.575194 [ERR] http: Request DELETE /v1/catalog/deregister, error: method DELETE not allowed from=127.0.0.1:33324
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.575906 [DEBUG] http: Request DELETE /v1/catalog/deregister (690.026µs) from=127.0.0.1:33324
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.579318 [ERR] http: Request HEAD /v1/catalog/deregister, error: method HEAD not allowed from=127.0.0.1:33326
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.579469 [DEBUG] http: Request HEAD /v1/catalog/deregister (174.34µs) from=127.0.0.1:33326
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/deregister
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.581405 [DEBUG] http: Request OPTIONS /v1/catalog/deregister (16.334µs) from=127.0.0.1:33326
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.583590 [DEBUG] http: Request GET /v1/catalog/node/ (565.355µs) from=127.0.0.1:33326
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.586436 [ERR] http: Request PUT /v1/catalog/node/, error: method PUT not allowed from=127.0.0.1:33328
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.587731 [DEBUG] http: Request PUT /v1/catalog/node/ (1.277049ms) from=127.0.0.1:33328
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.590922 [ERR] http: Request POST /v1/catalog/node/, error: method POST not allowed from=127.0.0.1:33330
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.591605 [DEBUG] http: Request POST /v1/catalog/node/ (670.358µs) from=127.0.0.1:33330
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.594721 [ERR] http: Request DELETE /v1/catalog/node/, error: method DELETE not allowed from=127.0.0.1:33332
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.595555 [DEBUG] http: Request DELETE /v1/catalog/node/ (820.698µs) from=127.0.0.1:33332
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.598745 [ERR] http: Request HEAD /v1/catalog/node/, error: method HEAD not allowed from=127.0.0.1:33334
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.598881 [DEBUG] http: Request HEAD /v1/catalog/node/ (160.672µs) from=127.0.0.1:33334
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/node/
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.600807 [DEBUG] http: Request OPTIONS /v1/catalog/node/ (17.334µs) from=127.0.0.1:33334
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.602971 [ERR] http: Request GET /v1/snapshot, error: Permission denied from=127.0.0.1:33334
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.603471 [DEBUG] http: Request GET /v1/snapshot (775.029µs) from=127.0.0.1:33334
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.607040 [ERR] http: Request PUT /v1/snapshot, error: Permission denied from=127.0.0.1:33336
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.607833 [DEBUG] http: Request PUT /v1/snapshot (1.217046ms) from=127.0.0.1:33336
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.612281 [ERR] http: Request POST /v1/snapshot, error: method POST not allowed from=127.0.0.1:33338
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.612952 [DEBUG] http: Request POST /v1/snapshot (673.359µs) from=127.0.0.1:33338
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.617722 [ERR] http: Request DELETE /v1/snapshot, error: method DELETE not allowed from=127.0.0.1:33340
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.618466 [DEBUG] http: Request DELETE /v1/snapshot (781.363µs) from=127.0.0.1:33340
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.621627 [ERR] http: Request HEAD /v1/snapshot, error: method HEAD not allowed from=127.0.0.1:33342
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.621850 [DEBUG] http: Request HEAD /v1/snapshot (245.676µs) from=127.0.0.1:33342
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/snapshot
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.623183 [DEBUG] http: Request OPTIONS /v1/snapshot (15.334µs) from=127.0.0.1:33342
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.624547 [ERR] http: Request GET /v1/acl/rules/translate, error: method GET not allowed from=127.0.0.1:33342
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.625258 [DEBUG] http: Request GET /v1/acl/rules/translate (693.027µs) from=127.0.0.1:33342
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.627993 [ERR] http: Request PUT /v1/acl/rules/translate, error: method PUT not allowed from=127.0.0.1:33344
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.628659 [DEBUG] http: Request PUT /v1/acl/rules/translate (659.358µs) from=127.0.0.1:33344
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.632252 [ERR] http: Request POST /v1/acl/rules/translate, error: Permission denied from=127.0.0.1:33346
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.632857 [DEBUG] http: Request POST /v1/acl/rules/translate (742.028µs) from=127.0.0.1:33346
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.636291 [ERR] http: Request DELETE /v1/acl/rules/translate, error: method DELETE not allowed from=127.0.0.1:33348
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.637110 [DEBUG] http: Request DELETE /v1/acl/rules/translate (790.697µs) from=127.0.0.1:33348
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.641379 [ERR] http: Request HEAD /v1/acl/rules/translate, error: method HEAD not allowed from=127.0.0.1:33350
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.641514 [DEBUG] http: Request HEAD /v1/acl/rules/translate (150.339µs) from=127.0.0.1:33350
=== RUN   TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.643370 [DEBUG] http: Request OPTIONS /v1/acl/rules/translate (17.334µs) from=127.0.0.1:33350
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.647315 [INFO] agent: Requesting shutdown
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.647402 [INFO] consul: shutting down server
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.647454 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.785367 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.918728 [INFO] manager: shutting down
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.919624 [INFO] agent: consul server down
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.919678 [INFO] agent: shutdown complete
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.919730 [INFO] agent: Stopping DNS server 127.0.0.1:11633 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.919897 [INFO] agent: Stopping DNS server 127.0.0.1:11633 (udp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.920067 [INFO] agent: Stopping HTTP server 127.0.0.1:11634 (tcp)
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.920545 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_MethodNotAllowed_OSS - 2019/11/27 02:18:30.920717 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_MethodNotAllowed_OSS (11.18s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/ (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/execute (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/deregister/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/ (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/id/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/match (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/fire/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/status/leader (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/replication (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/join/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/datacenters (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/datacenters (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/datacenters (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/coordinate/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/configuration (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/txn (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/authorize (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/authorize (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/list (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/list (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/list (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/services (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/services (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/ (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/bootstrap (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/bootstrap (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/bootstrap (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/bootstrap (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/roots (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/roots (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/roots (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions/check (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions/check (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/checks (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/checks (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/members (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/members (0.02s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/check/fail/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/check/fail/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/host (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/host (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/host (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/maintenance/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/maintenance/ (0.04s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/maintenance/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/health/connect/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/service/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/agent/force-leave/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/operator/raft/peer (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/operator/raft/peer (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/ca/configuration (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/ca/configuration (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/ca/configuration (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/connect/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/service/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/internal/ui/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/internal/ui/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/internal/ui/nodes (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/session/renew/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/session/renew/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/clone/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/clone/ (0.01s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/GET_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/PUT_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/POST_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/DELETE_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/HEAD_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_MethodNotAllowed_OSS/OPTIONS_/v1/acl/rules/translate (0.00s)
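[Editorial aside, not part of the sbuild output.] The block above is a table-driven test that hits every registered /v1 route with each HTTP verb and passes when unsupported verbs are rejected with 405 Method Not Allowed; the few "Permission denied" entries (e.g. PUT /v1/snapshot, POST /v1/acl/rules/translate) are verbs the endpoint does accept, so the request proceeds far enough to fail on ACLs instead. The sketch below is an illustration of that pattern using only the Go standard library; allowMethods and /v1/example are made-up names for this note, not Consul's actual handler code.

// Illustrative sketch (assumption: not Consul's real wrapper): reject
// unsupported verbs with 405 plus an Allow header, then probe the route
// with every verb the way the test table above does.
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"strings"
)

// allowMethods wraps a handler and returns 405 for any verb not listed.
func allowMethods(h http.HandlerFunc, allowed ...string) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		for _, m := range allowed {
			if r.Method == m {
				h(w, r)
				return
			}
		}
		w.Header().Set("Allow", strings.Join(allowed, ","))
		w.WriteHeader(http.StatusMethodNotAllowed)
	}
}

func main() {
	// A stand-in endpoint that accepts only GET and PUT, similar in
	// spirit to /v1/snapshot in the log above.
	mux := http.NewServeMux()
	mux.HandleFunc("/v1/example", allowMethods(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "ok")
	}, "GET", "PUT"))

	// One request per verb, recorded in-process.
	for _, verb := range []string{"GET", "PUT", "POST", "DELETE", "HEAD", "OPTIONS"} {
		req := httptest.NewRequest(verb, "/v1/example", nil)
		rec := httptest.NewRecorder()
		mux.ServeHTTP(rec, req)
		fmt.Printf("%-7s -> %d\n", verb, rec.Code)
	}
}

The TestHTTPAPI_OptionMethod_OSS block that follows issues an OPTIONS request against each of the same routes; in the log those requests produce only DEBUG entries (no [ERR] line), consistent with OPTIONS being answered successfully rather than rejected with 405.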
=== RUN   TestHTTPAPI_OptionMethod_OSS
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:31.019951 [WARN] agent: Node name "Node 3867cc14-2dcb-2c33-3338-0847be35b906" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:31.020538 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:31.020609 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:31.020839 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:31.020947 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3867cc14-2dcb-2c33-3338-0847be35b906 Address:127.0.0.1:11644}]
2019/11/27 02:18:33 [INFO]  raft: Node at 127.0.0.1:11644 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:33.045219 [INFO] serf: EventMemberJoin: Node 3867cc14-2dcb-2c33-3338-0847be35b906.dc1 127.0.0.1
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:33.057544 [INFO] serf: EventMemberJoin: Node 3867cc14-2dcb-2c33-3338-0847be35b906 127.0.0.1
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:33.058249 [INFO] consul: Adding LAN server Node 3867cc14-2dcb-2c33-3338-0847be35b906 (Addr: tcp/127.0.0.1:11644) (DC: dc1)
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:33.058250 [INFO] consul: Handled member-join event for server "Node 3867cc14-2dcb-2c33-3338-0847be35b906.dc1" in area "wan"
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:33.059062 [INFO] agent: Started DNS server 127.0.0.1:11639 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:33.059150 [INFO] agent: Started DNS server 127.0.0.1:11639 (udp)
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:33.062156 [INFO] agent: Started HTTP server on 127.0.0.1:11640 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:33.062281 [INFO] agent: started state syncer
2019/11/27 02:18:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:33 [INFO]  raft: Node at 127.0.0.1:11644 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:36 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:36 [INFO]  raft: Node at 127.0.0.1:11644 [Leader] entering Leader state
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:36.365096 [INFO] consul: cluster leadership acquired
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:36.365638 [INFO] consul: New leader elected: Node 3867cc14-2dcb-2c33-3338-0847be35b906
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:36.503861 [ERR] agent: failed to sync remote state: ACL not found
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:37.844124 [INFO] acl: initializing acls
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:38.100451 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:38.344511 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:38.345396 [INFO] serf: EventMemberUpdate: Node 3867cc14-2dcb-2c33-3338-0847be35b906
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:38.346072 [INFO] serf: EventMemberUpdate: Node 3867cc14-2dcb-2c33-3338-0847be35b906.dc1
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:39.388206 [INFO] agent: Synced node info
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:39.388349 [DEBUG] agent: Node info in sync
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:41.719016 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:41.719545 [DEBUG] consul: Skipping self join check for "Node 3867cc14-2dcb-2c33-3338-0847be35b906" since the cluster is too small
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:41.719776 [INFO] consul: member 'Node 3867cc14-2dcb-2c33-3338-0847be35b906' joined, marking health alive
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.020050 [DEBUG] consul: Skipping self join check for "Node 3867cc14-2dcb-2c33-3338-0847be35b906" since the cluster is too small
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.021843 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/query (82.003µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.023038 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/query/ (555.354µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/execute
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.023810 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/query/xxx/execute (100.671µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/explain
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.024535 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/query/xxx/explain (91.337µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/nodes
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.025127 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/catalog/nodes (14.668µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/host
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.025695 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/host (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/members
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.026242 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/members (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/fail/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.026835 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/check/fail/ (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/checks/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.027405 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/health/checks/ (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/connect/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.027992 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/health/connect/ (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.028561 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/service/ (14.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/leaf/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.029156 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/connect/ca/leaf/ (12.668µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/maintenance/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.029726 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/service/maintenance/ (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/peer
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.030264 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/operator/raft/peer (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.030895 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/policy/ (17.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/self
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.031488 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/token/self (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/force-leave/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.032141 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/force-leave/ (15.668µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/configuration
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.032787 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/connect/ca/configuration (15.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/info/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.033413 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/session/info/ (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policies
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.033993 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/policies (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.034568 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/rules/translate/ (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/datacenters
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.035158 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/catalog/datacenters (15.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.035718 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/connect/intentions (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/nodes
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.036289 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/internal/ui/nodes (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/create
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.037761 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/session/create (15.668µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/renew/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.038425 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/session/renew/ (16.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/clone/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.038894 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/clone/ (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/connect/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.039329 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/catalog/connect/ (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/service/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.039751 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/catalog/service/ (12.668µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/snapshot
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.040169 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/snapshot (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.040614 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/rules/translate (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/deregister
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.041026 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/catalog/deregister (13.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/node/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.041435 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/catalog/node/ (11.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/proxy/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.041950 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/connect/proxy/ (12.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/register
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.042401 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/service/register (16.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/configuration
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.042851 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/operator/autopilot/configuration (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/health
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.043269 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/operator/autopilot/health (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/node/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.043768 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/session/node/ (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.044367 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/token/ (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/deregister/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.044983 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/check/deregister/ (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/pass/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.045698 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/check/pass/ (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/keyring
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.046285 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/operator/keyring (14.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/register
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.046909 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/check/register (14.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/state/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.047537 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/health/state/ (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/kv/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.048314 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/kv/ (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/id/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.048876 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/health/service/id/ (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/node/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.049443 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/coordinate/node/ (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/services
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.050005 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/catalog/services (18.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/roots
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.050556 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/connect/ca/roots (13.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/match
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.051189 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/connect/intentions/match (16.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/update
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.051835 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/coordinate/update (15.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/fire/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.052480 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/event/fire/ (27.668µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/update
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.053169 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/update (15.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.053749 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/token (14.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/self
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.054328 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/self (15µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/list
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.054895 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/event/list (11µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/node/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.055462 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/internal/ui/node/ (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/status/leader
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.056044 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/status/leader (14.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/token/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.056659 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/token/ (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/maintenance
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.057298 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/maintenance (13.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/join/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.057886 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/join/ (14.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/name/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.058655 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/health/service/name/ (16.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/info/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.059074 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/info/ (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/list
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.059498 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/list (13µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/replication
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.059921 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/replication (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/update/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.060323 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/check/update/ (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/register
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.060724 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/catalog/register (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/node/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.061146 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/health/node/ (11.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/create
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.061635 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/create (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.062348 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/policy (19.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/warn/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.063168 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/check/warn/ (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/deregister/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.063600 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/service/deregister/ (14.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/datacenters
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.064017 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/coordinate/datacenters (12.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/nodes
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.064421 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/coordinate/nodes (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/configuration
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.064839 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/operator/raft/configuration (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/tokens
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.065237 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/tokens (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/metrics
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.065639 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/metrics (11.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/leave
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.066036 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/leave (11.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/txn
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.066448 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/txn (15.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/services
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.066943 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/internal/ui/services (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/list
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.067362 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/session/list (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/services
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.067759 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/services (12.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/authorize
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.068173 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/connect/authorize (12µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/service/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.068570 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/health/service/ (11.333µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/destroy/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.068988 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/session/destroy/ (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/bootstrap
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.069439 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/bootstrap (14µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/destroy/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.069992 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/acl/destroy/ (12.667µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.070435 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/connect/intentions/ (15.001µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/checks
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.070853 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/checks (11µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/roots
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.071287 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/agent/connect/ca/roots (13.334µs) from=
=== RUN   TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/check
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.071774 [DEBUG] http: Request OPTIONS http://127.0.0.1:11640/v1/connect/intentions/check (13.667µs) from=
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.071922 [INFO] agent: Requesting shutdown
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.071992 [INFO] consul: shutting down server
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.072092 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.151338 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.152179 [INFO] manager: shutting down
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.152894 [INFO] agent: consul server down
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.152964 [INFO] agent: shutdown complete
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.153030 [INFO] agent: Stopping DNS server 127.0.0.1:11639 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.153171 [INFO] agent: Stopping DNS server 127.0.0.1:11639 (udp)
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.153323 [INFO] agent: Stopping HTTP server 127.0.0.1:11640 (tcp)
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.153523 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_OptionMethod_OSS - 2019/11/27 02:18:42.153593 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_OptionMethod_OSS (11.23s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/execute (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/query/xxx/explain (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/nodes (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/host (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/members (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/checks/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/connect/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/leaf/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/self (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/info/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policies (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/datacenters (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/nodes (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/connect/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/service/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/proxy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/autopilot/health (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/state/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/id/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/services (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/match (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/self (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/event/list (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/status/leader (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/health/service/name/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/info/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/list (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/replication (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/node/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/datacenters (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/coordinate/nodes (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/operator/raft/configuration (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/tokens (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/metrics (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/txn (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/internal/ui/services (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/list (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/services (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/health/service/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/checks (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/agent/connect/ca/roots (0.00s)
    --- PASS: TestHTTPAPI_OptionMethod_OSS/OPTIONS_/v1/connect/intentions/check (0.00s)
=== RUN   TestHTTPAPI_AllowedNets_OSS
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:42.226518 [WARN] agent: Node name "Node fb221cfb-404c-a39e-5b11-45342e02d595" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:42.227239 [DEBUG] tlsutil: Update with version 1
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:42.227312 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:42.227468 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:42.227576 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fb221cfb-404c-a39e-5b11-45342e02d595 Address:127.0.0.1:11650}]
2019/11/27 02:18:43 [INFO]  raft: Node at 127.0.0.1:11650 [Follower] entering Follower state (Leader: "")
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.266093 [INFO] serf: EventMemberJoin: Node fb221cfb-404c-a39e-5b11-45342e02d595.dc1 127.0.0.1
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.273399 [INFO] serf: EventMemberJoin: Node fb221cfb-404c-a39e-5b11-45342e02d595 127.0.0.1
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.274789 [INFO] consul: Adding LAN server Node fb221cfb-404c-a39e-5b11-45342e02d595 (Addr: tcp/127.0.0.1:11650) (DC: dc1)
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.275115 [INFO] consul: Handled member-join event for server "Node fb221cfb-404c-a39e-5b11-45342e02d595.dc1" in area "wan"
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.275464 [INFO] agent: Started DNS server 127.0.0.1:11645 (udp)
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.275667 [INFO] agent: Started DNS server 127.0.0.1:11645 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.278043 [INFO] agent: Started HTTP server on 127.0.0.1:11646 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.278145 [INFO] agent: started state syncer
2019/11/27 02:18:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:43 [INFO]  raft: Node at 127.0.0.1:11650 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:43 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:43 [INFO]  raft: Node at 127.0.0.1:11650 [Leader] entering Leader state
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.985069 [INFO] consul: cluster leadership acquired
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:43.985508 [INFO] consul: New leader elected: Node fb221cfb-404c-a39e-5b11-45342e02d595
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.026546 [INFO] acl: initializing acls
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.177138 [ERR] agent: failed to sync remote state: ACL not found
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.440521 [INFO] acl: initializing acls
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.440852 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.586232 [INFO] consul: Created ACL 'global-management' policy
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.796538 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.796647 [DEBUG] acl: transitioning out of legacy ACL mode
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.797596 [INFO] serf: EventMemberUpdate: Node fb221cfb-404c-a39e-5b11-45342e02d595
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.798262 [INFO] serf: EventMemberUpdate: Node fb221cfb-404c-a39e-5b11-45342e02d595.dc1
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.974314 [INFO] consul: Created ACL anonymous token from configuration
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.975213 [INFO] serf: EventMemberUpdate: Node fb221cfb-404c-a39e-5b11-45342e02d595
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:44.975917 [INFO] serf: EventMemberUpdate: Node fb221cfb-404c-a39e-5b11-45342e02d595.dc1
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:46.686656 [INFO] agent: Synced node info
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:46.686919 [DEBUG] agent: Node info in sync
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.564986 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.565430 [DEBUG] consul: Skipping self join check for "Node fb221cfb-404c-a39e-5b11-45342e02d595" since the cluster is too small
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.565606 [INFO] consul: member 'Node fb221cfb-404c-a39e-5b11-45342e02d595' joined, marking health alive
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.754514 [DEBUG] consul: Skipping self join check for "Node fb221cfb-404c-a39e-5b11-45342e02d595" since the cluster is too small
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.755136 [DEBUG] consul: Skipping self join check for "Node fb221cfb-404c-a39e-5b11-45342e02d595" since the cluster is too small
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/agent/connect/authorize
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.758324 [ERR] http: Request POST http://127.0.0.1:11646/v1/agent/connect/authorize, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.758464 [DEBUG] http: Request POST http://127.0.0.1:11646/v1/agent/connect/authorize (153.339µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/bootstrap
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.759020 [ERR] http: Request PUT http://127.0.0.1:11646/v1/acl/bootstrap, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.759122 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/acl/bootstrap (102.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/destroy/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.759657 [ERR] http: Request PUT http://127.0.0.1:11646/v1/acl/destroy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.759757 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/acl/destroy/ (100.338µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/intentions/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.760260 [ERR] http: Request PUT http://127.0.0.1:11646/v1/connect/intentions/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.760352 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/connect/intentions/ (97.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/connect/intentions/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.760835 [ERR] http: Request DELETE http://127.0.0.1:11646/v1/connect/intentions/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.760935 [DEBUG] http: Request DELETE http://127.0.0.1:11646/v1/connect/intentions/ (96.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/destroy/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.761450 [ERR] http: Request PUT http://127.0.0.1:11646/v1/session/destroy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.761549 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/session/destroy/ (98.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/fail/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.762212 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/check/fail/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.762306 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/check/fail/ (97.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/maintenance/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.762796 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/service/maintenance/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.762888 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/service/maintenance/ (93.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.763378 [ERR] http: Request PUT http://127.0.0.1:11646/v1/acl/policy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.763469 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/acl/policy/ (94.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/policy/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.763949 [ERR] http: Request DELETE http://127.0.0.1:11646/v1/acl/policy/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.764043 [DEBUG] http: Request DELETE http://127.0.0.1:11646/v1/acl/policy/ (92.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/force-leave/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.765186 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/force-leave/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.765299 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/force-leave/ (186.007µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/raft/peer
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.766064 [ERR] http: Request DELETE http://127.0.0.1:11646/v1/operator/raft/peer, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.766163 [DEBUG] http: Request DELETE http://127.0.0.1:11646/v1/operator/raft/peer (102.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/ca/configuration
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.766995 [ERR] http: Request PUT http://127.0.0.1:11646/v1/connect/ca/configuration, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.767104 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/connect/ca/configuration (108.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/create
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.767816 [ERR] http: Request PUT http://127.0.0.1:11646/v1/session/create, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.767915 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/session/create (115.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/renew/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.768722 [ERR] http: Request PUT http://127.0.0.1:11646/v1/session/renew/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.768820 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/session/renew/ (98.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/clone/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.769465 [ERR] http: Request PUT http://127.0.0.1:11646/v1/acl/clone/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.769558 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/acl/clone/ (91.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/connect/intentions
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.770248 [ERR] http: Request POST http://127.0.0.1:11646/v1/connect/intentions, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.770352 [DEBUG] http: Request POST http://127.0.0.1:11646/v1/connect/intentions (106.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/rules/translate
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.771086 [ERR] http: Request POST http://127.0.0.1:11646/v1/acl/rules/translate, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.771189 [DEBUG] http: Request POST http://127.0.0.1:11646/v1/acl/rules/translate (104.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/deregister
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.771933 [ERR] http: Request PUT http://127.0.0.1:11646/v1/catalog/deregister, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.772042 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/catalog/deregister (112.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/snapshot
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.772911 [ERR] http: Request PUT http://127.0.0.1:11646/v1/snapshot, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.773031 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/snapshot (120.005µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/autopilot/configuration
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.773797 [ERR] http: Request PUT http://127.0.0.1:11646/v1/operator/autopilot/configuration, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.773899 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/operator/autopilot/configuration (106.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.774650 [ERR] http: Request PUT http://127.0.0.1:11646/v1/acl/token/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.774750 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/acl/token/ (103.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/token/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.775463 [ERR] http: Request DELETE http://127.0.0.1:11646/v1/acl/token/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.775564 [DEBUG] http: Request DELETE http://127.0.0.1:11646/v1/acl/token/ (103.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/deregister/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.776261 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/check/deregister/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.776362 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/check/deregister/ (119.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/pass/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.777143 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/check/pass/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.777247 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/check/pass/ (106.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/register
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.777943 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/service/register, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.778042 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/service/register (101.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/register
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.778745 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/check/register, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.778852 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/check/register (109.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/kv/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.779590 [ERR] http: Request PUT http://127.0.0.1:11646/v1/kv/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.779687 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/kv/ (99.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/kv/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.780383 [ERR] http: Request DELETE http://127.0.0.1:11646/v1/kv/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.780480 [DEBUG] http: Request DELETE http://127.0.0.1:11646/v1/kv/ (113.338µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/POST_/v1/operator/keyring
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.781171 [ERR] http: Request POST http://127.0.0.1:11646/v1/operator/keyring, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.781267 [DEBUG] http: Request POST http://127.0.0.1:11646/v1/operator/keyring (99.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/keyring
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.781979 [ERR] http: Request PUT http://127.0.0.1:11646/v1/operator/keyring, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.782084 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/operator/keyring (102.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/keyring
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.782887 [ERR] http: Request DELETE http://127.0.0.1:11646/v1/operator/keyring, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.782989 [DEBUG] http: Request DELETE http://127.0.0.1:11646/v1/operator/keyring (103.338µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/coordinate/update
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.783711 [ERR] http: Request PUT http://127.0.0.1:11646/v1/coordinate/update, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.783810 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/coordinate/update (99.338µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/event/fire/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.784481 [ERR] http: Request PUT http://127.0.0.1:11646/v1/event/fire/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.784581 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/event/fire/ (101.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/update
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.785261 [ERR] http: Request PUT http://127.0.0.1:11646/v1/acl/update, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.785441 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/acl/update (180.674µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.786144 [ERR] http: Request PUT http://127.0.0.1:11646/v1/acl/token, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.786246 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/acl/token (102.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/join/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.787079 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/join/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.787184 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/join/ (103.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/token/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.787908 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/token/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.788008 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/token/ (100.67µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/maintenance
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.788678 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/maintenance, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.788777 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/maintenance (104.337µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/register
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.789501 [ERR] http: Request PUT http://127.0.0.1:11646/v1/catalog/register, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.789600 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/catalog/register (98.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/create
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.790274 [ERR] http: Request PUT http://127.0.0.1:11646/v1/acl/create, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.790369 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/acl/create (98.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.791034 [ERR] http: Request PUT http://127.0.0.1:11646/v1/acl/policy, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.791129 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/acl/policy (95.003µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/update/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.791907 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/check/update/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.792008 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/check/update/ (104.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/leave
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.792679 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/leave, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.792790 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/leave (110.671µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/warn/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.793667 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/check/warn/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.793768 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/check/warn/ (101.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/deregister/
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.794507 [ERR] http: Request PUT http://127.0.0.1:11646/v1/agent/service/deregister/, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.794609 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/agent/service/deregister/ (102.004µs) from=192.168.1.2:5555
=== RUN   TestHTTPAPI_AllowedNets_OSS/PUT_/v1/txn
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.795285 [ERR] http: Request PUT http://127.0.0.1:11646/v1/txn, error: Access is restricted from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.795383 [DEBUG] http: Request PUT http://127.0.0.1:11646/v1/txn (99.337µs) from=192.168.1.2:5555
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.795729 [INFO] agent: Requesting shutdown
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.795877 [INFO] consul: shutting down server
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.796088 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.908268 [WARN] serf: Shutdown without a Leave
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.962144 [INFO] manager: shutting down
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.962846 [INFO] agent: consul server down
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.962922 [INFO] agent: shutdown complete
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.962986 [INFO] agent: Stopping DNS server 127.0.0.1:11645 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.963142 [INFO] agent: Stopping DNS server 127.0.0.1:11645 (udp)
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.963302 [INFO] agent: Stopping HTTP server 127.0.0.1:11646 (tcp)
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.963519 [INFO] agent: Waiting for endpoints to shut down
TestHTTPAPI_AllowedNets_OSS - 2019/11/27 02:18:47.963593 [INFO] agent: Endpoints down
--- PASS: TestHTTPAPI_AllowedNets_OSS (5.81s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/agent/connect/authorize (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/bootstrap (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/connect/intentions/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/destroy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/fail/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/maintenance/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/policy/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/force-leave/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/raft/peer (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/connect/ca/configuration (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/create (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/session/renew/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/clone/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/connect/intentions (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/acl/rules/translate (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/deregister (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/snapshot (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/autopilot/configuration (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/acl/token/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/pass/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/register (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/register (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/kv/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/POST_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/DELETE_/v1/operator/keyring (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/coordinate/update (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/event/fire/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/update (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/token (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/join/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/token/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/maintenance (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/catalog/register (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/create (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/acl/policy (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/update/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/leave (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/check/warn/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/agent/service/deregister/ (0.00s)
    --- PASS: TestHTTPAPI_AllowedNets_OSS/PUT_/v1/txn (0.00s)
=== RUN   TestHTTPServer_UnixSocket
=== PAUSE TestHTTPServer_UnixSocket
=== RUN   TestHTTPServer_UnixSocket_FileExists
=== PAUSE TestHTTPServer_UnixSocket_FileExists
=== RUN   TestHTTPServer_H2
=== PAUSE TestHTTPServer_H2
=== RUN   TestSetIndex
=== PAUSE TestSetIndex
=== RUN   TestSetKnownLeader
=== PAUSE TestSetKnownLeader
=== RUN   TestSetLastContact
=== PAUSE TestSetLastContact
=== RUN   TestSetMeta
=== PAUSE TestSetMeta
=== RUN   TestHTTPAPI_BlockEndpoints
=== PAUSE TestHTTPAPI_BlockEndpoints
=== RUN   TestHTTPAPI_Ban_Nonprintable_Characters
--- SKIP: TestHTTPAPI_Ban_Nonprintable_Characters (0.00s)
    http_test.go:323: DM-skipped
=== RUN   TestHTTPAPI_Allow_Nonprintable_Characters_With_Flag
--- SKIP: TestHTTPAPI_Allow_Nonprintable_Characters_With_Flag (0.00s)
    http_test.go:336: DM-skipped
=== RUN   TestHTTPAPI_TranslateAddrHeader
=== PAUSE TestHTTPAPI_TranslateAddrHeader
=== RUN   TestHTTPAPIResponseHeaders
=== PAUSE TestHTTPAPIResponseHeaders
=== RUN   TestContentTypeIsJSON
=== PAUSE TestContentTypeIsJSON
=== RUN   TestHTTP_wrap_obfuscateLog
=== PAUSE TestHTTP_wrap_obfuscateLog
=== RUN   TestPrettyPrint
=== PAUSE TestPrettyPrint
=== RUN   TestPrettyPrintBare
=== PAUSE TestPrettyPrintBare
=== RUN   TestParseSource
=== PAUSE TestParseSource
=== RUN   TestParseCacheControl
=== RUN   TestParseCacheControl/empty_header
=== RUN   TestParseCacheControl/simple_max-age
=== RUN   TestParseCacheControl/zero_max-age
=== RUN   TestParseCacheControl/must-revalidate
=== RUN   TestParseCacheControl/mixes_age,_must-revalidate
=== RUN   TestParseCacheControl/quoted_max-age
=== RUN   TestParseCacheControl/mixed_case_max-age
=== RUN   TestParseCacheControl/simple_stale-if-error
=== RUN   TestParseCacheControl/combined_with_space
=== RUN   TestParseCacheControl/combined_no_space
=== RUN   TestParseCacheControl/unsupported_directive
=== RUN   TestParseCacheControl/mixed_unsupported_directive
=== RUN   TestParseCacheControl/garbage_value
=== RUN   TestParseCacheControl/garbage_value_with_quotes
--- PASS: TestParseCacheControl (0.01s)
    --- PASS: TestParseCacheControl/empty_header (0.00s)
    --- PASS: TestParseCacheControl/simple_max-age (0.00s)
    --- PASS: TestParseCacheControl/zero_max-age (0.00s)
    --- PASS: TestParseCacheControl/must-revalidate (0.00s)
    --- PASS: TestParseCacheControl/mixes_age,_must-revalidate (0.00s)
    --- PASS: TestParseCacheControl/quoted_max-age (0.00s)
    --- PASS: TestParseCacheControl/mixed_case_max-age (0.00s)
    --- PASS: TestParseCacheControl/simple_stale-if-error (0.00s)
    --- PASS: TestParseCacheControl/combined_with_space (0.00s)
    --- PASS: TestParseCacheControl/combined_no_space (0.00s)
    --- PASS: TestParseCacheControl/unsupported_directive (0.00s)
    --- PASS: TestParseCacheControl/mixed_unsupported_directive (0.00s)
    --- PASS: TestParseCacheControl/garbage_value (0.00s)
    --- PASS: TestParseCacheControl/garbage_value_with_quotes (0.00s)
=== RUN   TestParseWait
=== PAUSE TestParseWait
=== RUN   TestPProfHandlers_EnableDebug
=== PAUSE TestPProfHandlers_EnableDebug
=== RUN   TestPProfHandlers_DisableDebugNoACLs
--- SKIP: TestPProfHandlers_DisableDebugNoACLs (0.00s)
    http_test.go:746: DM-skipped
=== RUN   TestPProfHandlers_ACLs
=== PAUSE TestPProfHandlers_ACLs
=== RUN   TestParseWait_InvalidTime
=== PAUSE TestParseWait_InvalidTime
=== RUN   TestParseWait_InvalidIndex
=== PAUSE TestParseWait_InvalidIndex
=== RUN   TestParseConsistency
=== PAUSE TestParseConsistency
=== RUN   TestParseConsistencyAndMaxStale
WARNING: bootstrap = true: do not enable unless necessary
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:48.093128 [WARN] agent: Node name "Node 7ac792f1-3344-363f-c934-7eee0f5e0425" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:48.093646 [DEBUG] tlsutil: Update with version 1
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:48.093720 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:48.093902 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:48.094026 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7ac792f1-3344-363f-c934-7eee0f5e0425 Address:127.0.0.1:11656}]
2019/11/27 02:18:49 [INFO]  raft: Node at 127.0.0.1:11656 [Follower] entering Follower state (Leader: "")
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:49.780238 [INFO] serf: EventMemberJoin: Node 7ac792f1-3344-363f-c934-7eee0f5e0425.dc1 127.0.0.1
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:49.786796 [INFO] serf: EventMemberJoin: Node 7ac792f1-3344-363f-c934-7eee0f5e0425 127.0.0.1
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:49.788509 [INFO] consul: Adding LAN server Node 7ac792f1-3344-363f-c934-7eee0f5e0425 (Addr: tcp/127.0.0.1:11656) (DC: dc1)
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:49.789242 [INFO] consul: Handled member-join event for server "Node 7ac792f1-3344-363f-c934-7eee0f5e0425.dc1" in area "wan"
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:49.790961 [INFO] agent: Started DNS server 127.0.0.1:11651 (tcp)
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:49.791651 [INFO] agent: Started DNS server 127.0.0.1:11651 (udp)
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:49.794253 [INFO] agent: Started HTTP server on 127.0.0.1:11652 (tcp)
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:49.794365 [INFO] agent: started state syncer
2019/11/27 02:18:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:49 [INFO]  raft: Node at 127.0.0.1:11656 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:51 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:51 [INFO]  raft: Node at 127.0.0.1:11656 [Leader] entering Leader state
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.374305 [INFO] consul: cluster leadership acquired
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.374759 [INFO] consul: New leader elected: Node 7ac792f1-3344-363f-c934-7eee0f5e0425
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.616346 [INFO] agent: Requesting shutdown
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.616463 [INFO] consul: shutting down server
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.616528 [WARN] serf: Shutdown without a Leave
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.731517 [WARN] serf: Shutdown without a Leave
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.810175 [INFO] agent: Synced node info
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.810338 [DEBUG] agent: Node info in sync
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.812017 [INFO] manager: shutting down
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.973376 [INFO] agent: consul server down
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.973467 [INFO] agent: shutdown complete
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.973544 [INFO] agent: Stopping DNS server 127.0.0.1:11651 (tcp)
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.973699 [INFO] agent: Stopping DNS server 127.0.0.1:11651 (udp)
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.973857 [INFO] agent: Stopping HTTP server 127.0.0.1:11652 (tcp)
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.973960 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.974087 [INFO] agent: Waiting for endpoints to shut down
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.974158 [INFO] agent: Endpoints down
TestParseConsistencyAndMaxStale - 2019/11/27 02:18:51.974237 [ERR] consul: failed to establish leadership: raft is already shutdown
--- PASS: TestParseConsistencyAndMaxStale (4.00s)
=== RUN   TestParseConsistency_Invalid
=== PAUSE TestParseConsistency_Invalid
=== RUN   TestACLResolution
=== PAUSE TestACLResolution
=== RUN   TestEnableWebUI
=== PAUSE TestEnableWebUI
=== RUN   TestParseToken_ProxyTokenResolve
=== PAUSE TestParseToken_ProxyTokenResolve
=== RUN   TestAllowedNets
--- SKIP: TestAllowedNets (0.00s)
    http_test.go:1212: DM-skipped
=== RUN   TestIntentionsList_empty
=== PAUSE TestIntentionsList_empty
=== RUN   TestIntentionsList_values
=== PAUSE TestIntentionsList_values
=== RUN   TestIntentionsMatch_basic
=== PAUSE TestIntentionsMatch_basic
=== RUN   TestIntentionsMatch_noBy
=== PAUSE TestIntentionsMatch_noBy
=== RUN   TestIntentionsMatch_byInvalid
=== PAUSE TestIntentionsMatch_byInvalid
=== RUN   TestIntentionsMatch_noName
=== PAUSE TestIntentionsMatch_noName
=== RUN   TestIntentionsCheck_basic
=== PAUSE TestIntentionsCheck_basic
=== RUN   TestIntentionsCheck_noSource
=== PAUSE TestIntentionsCheck_noSource
=== RUN   TestIntentionsCheck_noDestination
=== PAUSE TestIntentionsCheck_noDestination
=== RUN   TestIntentionsCreate_good
=== PAUSE TestIntentionsCreate_good
=== RUN   TestIntentionsCreate_noBody
=== PAUSE TestIntentionsCreate_noBody
=== RUN   TestIntentionsSpecificGet_good
=== PAUSE TestIntentionsSpecificGet_good
=== RUN   TestIntentionsSpecificGet_invalidId
=== PAUSE TestIntentionsSpecificGet_invalidId
=== RUN   TestIntentionsSpecificUpdate_good
=== PAUSE TestIntentionsSpecificUpdate_good
=== RUN   TestIntentionsSpecificDelete_good
=== PAUSE TestIntentionsSpecificDelete_good
=== RUN   TestParseIntentionMatchEntry
=== RUN   TestParseIntentionMatchEntry/foo
=== RUN   TestParseIntentionMatchEntry/foo/bar
=== RUN   TestParseIntentionMatchEntry/foo/bar/baz
--- PASS: TestParseIntentionMatchEntry (0.00s)
    --- PASS: TestParseIntentionMatchEntry/foo (0.00s)
    --- PASS: TestParseIntentionMatchEntry/foo/bar (0.00s)
    --- PASS: TestParseIntentionMatchEntry/foo/bar/baz (0.00s)
=== RUN   TestAgent_LoadKeyrings
=== PAUSE TestAgent_LoadKeyrings
=== RUN   TestAgent_InmemKeyrings
=== PAUSE TestAgent_InmemKeyrings
=== RUN   TestAgent_InitKeyring
=== PAUSE TestAgent_InitKeyring
=== RUN   TestAgentKeyring_ACL
=== PAUSE TestAgentKeyring_ACL
=== RUN   TestKVSEndpoint_PUT_GET_DELETE
=== PAUSE TestKVSEndpoint_PUT_GET_DELETE
=== RUN   TestKVSEndpoint_Recurse
=== PAUSE TestKVSEndpoint_Recurse
=== RUN   TestKVSEndpoint_DELETE_CAS
=== PAUSE TestKVSEndpoint_DELETE_CAS
=== RUN   TestKVSEndpoint_CAS
=== PAUSE TestKVSEndpoint_CAS
=== RUN   TestKVSEndpoint_ListKeys
=== PAUSE TestKVSEndpoint_ListKeys
=== RUN   TestKVSEndpoint_AcquireRelease
=== PAUSE TestKVSEndpoint_AcquireRelease
=== RUN   TestKVSEndpoint_GET_Raw
=== PAUSE TestKVSEndpoint_GET_Raw
=== RUN   TestKVSEndpoint_PUT_ConflictingFlags
=== PAUSE TestKVSEndpoint_PUT_ConflictingFlags
=== RUN   TestKVSEndpoint_DELETE_ConflictingFlags
=== PAUSE TestKVSEndpoint_DELETE_ConflictingFlags
=== RUN   TestNotifyGroup
--- PASS: TestNotifyGroup (0.00s)
=== RUN   TestNotifyGroup_Clear
--- PASS: TestNotifyGroup_Clear (0.00s)
=== RUN   TestOperator_RaftConfiguration
=== PAUSE TestOperator_RaftConfiguration
=== RUN   TestOperator_RaftPeer
=== PAUSE TestOperator_RaftPeer
=== RUN   TestOperator_KeyringInstall
=== PAUSE TestOperator_KeyringInstall
=== RUN   TestOperator_KeyringList
=== PAUSE TestOperator_KeyringList
=== RUN   TestOperator_KeyringRemove
=== PAUSE TestOperator_KeyringRemove
=== RUN   TestOperator_KeyringUse
=== PAUSE TestOperator_KeyringUse
=== RUN   TestOperator_Keyring_InvalidRelayFactor
=== PAUSE TestOperator_Keyring_InvalidRelayFactor
=== RUN   TestOperator_AutopilotGetConfiguration
=== PAUSE TestOperator_AutopilotGetConfiguration
=== RUN   TestOperator_AutopilotSetConfiguration
=== PAUSE TestOperator_AutopilotSetConfiguration
=== RUN   TestOperator_AutopilotCASConfiguration
=== PAUSE TestOperator_AutopilotCASConfiguration
=== RUN   TestOperator_ServerHealth
=== PAUSE TestOperator_ServerHealth
=== RUN   TestOperator_ServerHealth_Unhealthy
=== PAUSE TestOperator_ServerHealth_Unhealthy
=== RUN   TestPreparedQuery_Create
=== PAUSE TestPreparedQuery_Create
=== RUN   TestPreparedQuery_List
=== PAUSE TestPreparedQuery_List
=== RUN   TestPreparedQuery_Execute
=== PAUSE TestPreparedQuery_Execute
=== RUN   TestPreparedQuery_ExecuteCached
=== PAUSE TestPreparedQuery_ExecuteCached
=== RUN   TestPreparedQuery_Explain
=== PAUSE TestPreparedQuery_Explain
=== RUN   TestPreparedQuery_Get
=== PAUSE TestPreparedQuery_Get
=== RUN   TestPreparedQuery_Update
=== PAUSE TestPreparedQuery_Update
=== RUN   TestPreparedQuery_Delete
=== PAUSE TestPreparedQuery_Delete
=== RUN   TestPreparedQuery_parseLimit
=== PAUSE TestPreparedQuery_parseLimit
=== RUN   TestPreparedQuery_Integration
--- SKIP: TestPreparedQuery_Integration (0.00s)
    prepared_query_endpoint_test.go:990: DM-skipped
=== RUN   TestRexecWriter
--- PASS: TestRexecWriter (0.30s)
=== RUN   TestRemoteExecGetSpec
=== PAUSE TestRemoteExecGetSpec
=== RUN   TestRemoteExecGetSpec_ACLToken
=== PAUSE TestRemoteExecGetSpec_ACLToken
=== RUN   TestRemoteExecGetSpec_ACLAgentToken
=== PAUSE TestRemoteExecGetSpec_ACLAgentToken
=== RUN   TestRemoteExecGetSpec_ACLDeny
=== PAUSE TestRemoteExecGetSpec_ACLDeny
=== RUN   TestRemoteExecWrites
=== PAUSE TestRemoteExecWrites
=== RUN   TestRemoteExecWrites_ACLToken
=== PAUSE TestRemoteExecWrites_ACLToken
=== RUN   TestRemoteExecWrites_ACLAgentToken
=== PAUSE TestRemoteExecWrites_ACLAgentToken
=== RUN   TestRemoteExecWrites_ACLDeny
=== PAUSE TestRemoteExecWrites_ACLDeny
=== RUN   TestHandleRemoteExec
=== PAUSE TestHandleRemoteExec
=== RUN   TestHandleRemoteExecFailed
=== PAUSE TestHandleRemoteExecFailed
=== RUN   TestSessionCreate
=== PAUSE TestSessionCreate
=== RUN   TestSessionCreate_Delete
=== PAUSE TestSessionCreate_Delete
=== RUN   TestSessionCreate_DefaultCheck
=== PAUSE TestSessionCreate_DefaultCheck
=== RUN   TestSessionCreate_NoCheck
=== PAUSE TestSessionCreate_NoCheck
=== RUN   TestFixupLockDelay
=== PAUSE TestFixupLockDelay
=== RUN   TestSessionDestroy
=== PAUSE TestSessionDestroy
=== RUN   TestSessionCustomTTL
=== PAUSE TestSessionCustomTTL
=== RUN   TestSessionTTLRenew
--- SKIP: TestSessionTTLRenew (0.00s)
    session_endpoint_test.go:372: DM-skipped
=== RUN   TestSessionGet
=== PAUSE TestSessionGet
=== RUN   TestSessionList
=== RUN   TestSessionList/#00
WARNING: bootstrap = true: do not enable unless necessary
TestSessionList/#00 - 2019/11/27 02:18:52.392533 [WARN] agent: Node name "Node 97b5d47e-e682-a124-adba-a5ee837e6935" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSessionList/#00 - 2019/11/27 02:18:52.393104 [DEBUG] tlsutil: Update with version 1
TestSessionList/#00 - 2019/11/27 02:18:52.393283 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestSessionList/#00 - 2019/11/27 02:18:52.393547 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestSessionList/#00 - 2019/11/27 02:18:52.393871 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:53 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:97b5d47e-e682-a124-adba-a5ee837e6935 Address:127.0.0.1:11662}]
2019/11/27 02:18:53 [INFO]  raft: Node at 127.0.0.1:11662 [Follower] entering Follower state (Leader: "")
TestSessionList/#00 - 2019/11/27 02:18:53.466004 [INFO] serf: EventMemberJoin: Node 97b5d47e-e682-a124-adba-a5ee837e6935.dc1 127.0.0.1
TestSessionList/#00 - 2019/11/27 02:18:53.472221 [INFO] serf: EventMemberJoin: Node 97b5d47e-e682-a124-adba-a5ee837e6935 127.0.0.1
TestSessionList/#00 - 2019/11/27 02:18:53.473313 [INFO] consul: Adding LAN server Node 97b5d47e-e682-a124-adba-a5ee837e6935 (Addr: tcp/127.0.0.1:11662) (DC: dc1)
TestSessionList/#00 - 2019/11/27 02:18:53.473647 [INFO] consul: Handled member-join event for server "Node 97b5d47e-e682-a124-adba-a5ee837e6935.dc1" in area "wan"
TestSessionList/#00 - 2019/11/27 02:18:53.473907 [INFO] agent: Started DNS server 127.0.0.1:11657 (udp)
TestSessionList/#00 - 2019/11/27 02:18:53.474162 [INFO] agent: Started DNS server 127.0.0.1:11657 (tcp)
TestSessionList/#00 - 2019/11/27 02:18:53.476405 [INFO] agent: Started HTTP server on 127.0.0.1:11658 (tcp)
TestSessionList/#00 - 2019/11/27 02:18:53.476544 [INFO] agent: started state syncer
2019/11/27 02:18:53 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:53 [INFO]  raft: Node at 127.0.0.1:11662 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:53 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:53 [INFO]  raft: Node at 127.0.0.1:11662 [Leader] entering Leader state
TestSessionList/#00 - 2019/11/27 02:18:53.973874 [INFO] consul: cluster leadership acquired
TestSessionList/#00 - 2019/11/27 02:18:53.974372 [INFO] consul: New leader elected: Node 97b5d47e-e682-a124-adba-a5ee837e6935
TestSessionList/#00 - 2019/11/27 02:18:54.307296 [INFO] agent: Synced node info
TestSessionList/#00 - 2019/11/27 02:18:54.307435 [DEBUG] agent: Node info in sync
TestSessionList/#00 - 2019/11/27 02:18:55.394692 [DEBUG] agent: Node info in sync
TestSessionList/#00 - 2019/11/27 02:18:56.018034 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSessionList/#00 - 2019/11/27 02:18:56.018552 [DEBUG] consul: Skipping self join check for "Node 97b5d47e-e682-a124-adba-a5ee837e6935" since the cluster is too small
TestSessionList/#00 - 2019/11/27 02:18:56.018717 [INFO] consul: member 'Node 97b5d47e-e682-a124-adba-a5ee837e6935' joined, marking health alive
TestSessionList/#00 - 2019/11/27 02:18:56.456897 [INFO] agent: Requesting shutdown
TestSessionList/#00 - 2019/11/27 02:18:56.457026 [INFO] consul: shutting down server
TestSessionList/#00 - 2019/11/27 02:18:56.457089 [WARN] serf: Shutdown without a Leave
TestSessionList/#00 - 2019/11/27 02:18:56.550574 [WARN] serf: Shutdown without a Leave
TestSessionList/#00 - 2019/11/27 02:18:56.639581 [INFO] manager: shutting down
TestSessionList/#00 - 2019/11/27 02:18:56.639968 [INFO] agent: consul server down
TestSessionList/#00 - 2019/11/27 02:18:56.640306 [INFO] agent: shutdown complete
TestSessionList/#00 - 2019/11/27 02:18:56.640532 [INFO] agent: Stopping DNS server 127.0.0.1:11657 (tcp)
TestSessionList/#00 - 2019/11/27 02:18:56.641242 [INFO] agent: Stopping DNS server 127.0.0.1:11657 (udp)
TestSessionList/#00 - 2019/11/27 02:18:56.642258 [INFO] agent: Stopping HTTP server 127.0.0.1:11658 (tcp)
TestSessionList/#00 - 2019/11/27 02:18:56.643105 [INFO] agent: Waiting for endpoints to shut down
TestSessionList/#00 - 2019/11/27 02:18:56.643554 [INFO] agent: Endpoints down
=== RUN   TestSessionList/#01
WARNING: bootstrap = true: do not enable unless necessary
TestSessionList/#01 - 2019/11/27 02:18:56.759883 [WARN] agent: Node name "Node e4c9cb04-cb6b-d94e-360e-6383efc3ce21" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSessionList/#01 - 2019/11/27 02:18:56.760503 [DEBUG] tlsutil: Update with version 1
TestSessionList/#01 - 2019/11/27 02:18:56.760694 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestSessionList/#01 - 2019/11/27 02:18:56.760977 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestSessionList/#01 - 2019/11/27 02:18:56.761251 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e4c9cb04-cb6b-d94e-360e-6383efc3ce21 Address:127.0.0.1:11668}]
2019/11/27 02:18:57 [INFO]  raft: Node at 127.0.0.1:11668 [Follower] entering Follower state (Leader: "")
TestSessionList/#01 - 2019/11/27 02:18:57.846541 [INFO] serf: EventMemberJoin: Node e4c9cb04-cb6b-d94e-360e-6383efc3ce21.dc1 127.0.0.1
TestSessionList/#01 - 2019/11/27 02:18:57.851820 [INFO] serf: EventMemberJoin: Node e4c9cb04-cb6b-d94e-360e-6383efc3ce21 127.0.0.1
TestSessionList/#01 - 2019/11/27 02:18:57.854289 [INFO] consul: Adding LAN server Node e4c9cb04-cb6b-d94e-360e-6383efc3ce21 (Addr: tcp/127.0.0.1:11668) (DC: dc1)
TestSessionList/#01 - 2019/11/27 02:18:57.855017 [INFO] consul: Handled member-join event for server "Node e4c9cb04-cb6b-d94e-360e-6383efc3ce21.dc1" in area "wan"
TestSessionList/#01 - 2019/11/27 02:18:57.857461 [INFO] agent: Started DNS server 127.0.0.1:11663 (tcp)
TestSessionList/#01 - 2019/11/27 02:18:57.857551 [INFO] agent: Started DNS server 127.0.0.1:11663 (udp)
TestSessionList/#01 - 2019/11/27 02:18:57.859502 [INFO] agent: Started HTTP server on 127.0.0.1:11664 (tcp)
TestSessionList/#01 - 2019/11/27 02:18:57.859590 [INFO] agent: started state syncer
2019/11/27 02:18:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:57 [INFO]  raft: Node at 127.0.0.1:11668 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:58 [INFO]  raft: Node at 127.0.0.1:11668 [Leader] entering Leader state
TestSessionList/#01 - 2019/11/27 02:18:58.396803 [INFO] consul: cluster leadership acquired
TestSessionList/#01 - 2019/11/27 02:18:58.397259 [INFO] consul: New leader elected: Node e4c9cb04-cb6b-d94e-360e-6383efc3ce21
TestSessionList/#01 - 2019/11/27 02:18:58.673730 [INFO] agent: Synced node info
TestSessionList/#01 - 2019/11/27 02:18:58.673864 [DEBUG] agent: Node info in sync
TestSessionList/#01 - 2019/11/27 02:18:58.746353 [DEBUG] agent: Node info in sync
TestSessionList/#01 - 2019/11/27 02:19:00.751379 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSessionList/#01 - 2019/11/27 02:19:00.751905 [DEBUG] consul: Skipping self join check for "Node e4c9cb04-cb6b-d94e-360e-6383efc3ce21" since the cluster is too small
TestSessionList/#01 - 2019/11/27 02:19:00.752076 [INFO] consul: member 'Node e4c9cb04-cb6b-d94e-360e-6383efc3ce21' joined, marking health alive
TestSessionList/#01 - 2019/11/27 02:19:02.406964 [INFO] agent: Requesting shutdown
TestSessionList/#01 - 2019/11/27 02:19:02.407080 [INFO] consul: shutting down server
TestSessionList/#01 - 2019/11/27 02:19:02.407136 [WARN] serf: Shutdown without a Leave
TestSessionList/#01 - 2019/11/27 02:19:02.461213 [WARN] serf: Shutdown without a Leave
TestSessionList/#01 - 2019/11/27 02:19:02.516939 [INFO] manager: shutting down
TestSessionList/#01 - 2019/11/27 02:19:02.517554 [INFO] agent: consul server down
TestSessionList/#01 - 2019/11/27 02:19:02.517612 [INFO] agent: shutdown complete
TestSessionList/#01 - 2019/11/27 02:19:02.517673 [INFO] agent: Stopping DNS server 127.0.0.1:11663 (tcp)
TestSessionList/#01 - 2019/11/27 02:19:02.517811 [INFO] agent: Stopping DNS server 127.0.0.1:11663 (udp)
TestSessionList/#01 - 2019/11/27 02:19:02.517951 [INFO] agent: Stopping HTTP server 127.0.0.1:11664 (tcp)
TestSessionList/#01 - 2019/11/27 02:19:02.518161 [INFO] agent: Waiting for endpoints to shut down
TestSessionList/#01 - 2019/11/27 02:19:02.518228 [INFO] agent: Endpoints down
--- PASS: TestSessionList (10.22s)
    --- PASS: TestSessionList/#00 (4.35s)
    --- PASS: TestSessionList/#01 (5.87s)
=== RUN   TestSessionsForNode
=== PAUSE TestSessionsForNode
=== RUN   TestSessionDeleteDestroy
=== PAUSE TestSessionDeleteDestroy
=== RUN   TestAgent_sidecarServiceFromNodeService
=== RUN   TestAgent_sidecarServiceFromNodeService/no_sidecar
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/11/27 02:19:02.655949 [WARN] agent: Node name "Node 96ea3298-4984-8452-8dce-62bd7caf6d71" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/11/27 02:19:02.666958 [DEBUG] tlsutil: Update with version 1
jones - 2019/11/27 02:19:02.667058 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:02.677020 [DEBUG] tlsutil: IncomingRPCConfig with version 1
jones - 2019/11/27 02:19:02.677204 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:19:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:96ea3298-4984-8452-8dce-62bd7caf6d71 Address:127.0.0.1:11674}]
2019/11/27 02:19:03 [INFO]  raft: Node at 127.0.0.1:11674 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:19:03.343110 [INFO] serf: EventMemberJoin: Node 96ea3298-4984-8452-8dce-62bd7caf6d71.dc1 127.0.0.1
jones - 2019/11/27 02:19:03.346662 [INFO] serf: EventMemberJoin: Node 96ea3298-4984-8452-8dce-62bd7caf6d71 127.0.0.1
jones - 2019/11/27 02:19:03.347954 [INFO] consul: Adding LAN server Node 96ea3298-4984-8452-8dce-62bd7caf6d71 (Addr: tcp/127.0.0.1:11674) (DC: dc1)
jones - 2019/11/27 02:19:03.348117 [INFO] consul: Handled member-join event for server "Node 96ea3298-4984-8452-8dce-62bd7caf6d71.dc1" in area "wan"
jones - 2019/11/27 02:19:03.348529 [INFO] agent: Started DNS server 127.0.0.1:11669 (tcp)
jones - 2019/11/27 02:19:03.348604 [INFO] agent: Started DNS server 127.0.0.1:11669 (udp)
jones - 2019/11/27 02:19:03.350710 [INFO] agent: Started HTTP server on 127.0.0.1:11670 (tcp)
jones - 2019/11/27 02:19:03.350796 [INFO] agent: started state syncer
2019/11/27 02:19:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:03 [INFO]  raft: Node at 127.0.0.1:11674 [Candidate] entering Candidate state in term 2
2019/11/27 02:19:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:03 [INFO]  raft: Node at 127.0.0.1:11674 [Leader] entering Leader state
jones - 2019/11/27 02:19:03.828268 [INFO] consul: cluster leadership acquired
jones - 2019/11/27 02:19:03.828712 [INFO] consul: New leader elected: Node 96ea3298-4984-8452-8dce-62bd7caf6d71
=== RUN   TestAgent_sidecarServiceFromNodeService/all_the_defaults
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/11/27 02:19:04.301939 [WARN] agent: Node name "Node 4c613484-61cd-f189-9fd4-637dea8a81e0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/11/27 02:19:04.303099 [DEBUG] tlsutil: Update with version 1
jones - 2019/11/27 02:19:04.303308 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:04.303671 [DEBUG] tlsutil: IncomingRPCConfig with version 1
jones - 2019/11/27 02:19:04.303965 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:05.085483 [INFO] agent: Synced node info
jones - 2019/11/27 02:19:05.085606 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:19:05.676648 [DEBUG] agent: Node info in sync
2019/11/27 02:19:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4c613484-61cd-f189-9fd4-637dea8a81e0 Address:127.0.0.1:11680}]
2019/11/27 02:19:05 [INFO]  raft: Node at 127.0.0.1:11680 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:19:05.998712 [INFO] serf: EventMemberJoin: Node 4c613484-61cd-f189-9fd4-637dea8a81e0.dc1 127.0.0.1
jones - 2019/11/27 02:19:06.004464 [INFO] serf: EventMemberJoin: Node 4c613484-61cd-f189-9fd4-637dea8a81e0 127.0.0.1
jones - 2019/11/27 02:19:06.005514 [INFO] consul: Adding LAN server Node 4c613484-61cd-f189-9fd4-637dea8a81e0 (Addr: tcp/127.0.0.1:11680) (DC: dc1)
jones - 2019/11/27 02:19:06.005681 [INFO] consul: Handled member-join event for server "Node 4c613484-61cd-f189-9fd4-637dea8a81e0.dc1" in area "wan"
jones - 2019/11/27 02:19:06.006448 [INFO] agent: Started DNS server 127.0.0.1:11675 (tcp)
jones - 2019/11/27 02:19:06.007569 [INFO] agent: Started DNS server 127.0.0.1:11675 (udp)
jones - 2019/11/27 02:19:06.009665 [INFO] agent: Started HTTP server on 127.0.0.1:11676 (tcp)
jones - 2019/11/27 02:19:06.009815 [INFO] agent: started state syncer
2019/11/27 02:19:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:06 [INFO]  raft: Node at 127.0.0.1:11680 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:19:06.430866 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:19:06.436620 [DEBUG] consul: Skipping self join check for "Node 96ea3298-4984-8452-8dce-62bd7caf6d71" since the cluster is too small
jones - 2019/11/27 02:19:06.437031 [INFO] consul: member 'Node 96ea3298-4984-8452-8dce-62bd7caf6d71' joined, marking health alive
2019/11/27 02:19:07 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:07 [INFO]  raft: Node at 127.0.0.1:11680 [Leader] entering Leader state
jones - 2019/11/27 02:19:07.330714 [INFO] consul: cluster leadership acquired
jones - 2019/11/27 02:19:07.331179 [INFO] consul: New leader elected: Node 4c613484-61cd-f189-9fd4-637dea8a81e0
jones - 2019/11/27 02:19:07.795457 [INFO] agent: Synced node info
=== RUN   TestAgent_sidecarServiceFromNodeService/all_the_allowed_overrides
jones - 2019/11/27 02:19:07.828029 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/11/27 02:19:07.866324 [WARN] agent: Node name "Node 005cb1c3-f8e5-2827-9833-9849ba78d405" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/11/27 02:19:07.866929 [DEBUG] tlsutil: Update with version 1
jones - 2019/11/27 02:19:07.867000 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:07.868739 [DEBUG] tlsutil: IncomingRPCConfig with version 1
jones - 2019/11/27 02:19:07.868888 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:08.273332 [DEBUG] agent: Node info in sync
2019/11/27 02:19:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:005cb1c3-f8e5-2827-9833-9849ba78d405 Address:127.0.0.1:11686}]
2019/11/27 02:19:09 [INFO]  raft: Node at 127.0.0.1:11686 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:19:09.374777 [INFO] serf: EventMemberJoin: Node 005cb1c3-f8e5-2827-9833-9849ba78d405.dc1 127.0.0.1
jones - 2019/11/27 02:19:09.378217 [INFO] serf: EventMemberJoin: Node 005cb1c3-f8e5-2827-9833-9849ba78d405 127.0.0.1
jones - 2019/11/27 02:19:09.379162 [INFO] consul: Adding LAN server Node 005cb1c3-f8e5-2827-9833-9849ba78d405 (Addr: tcp/127.0.0.1:11686) (DC: dc1)
jones - 2019/11/27 02:19:09.379514 [INFO] consul: Handled member-join event for server "Node 005cb1c3-f8e5-2827-9833-9849ba78d405.dc1" in area "wan"
jones - 2019/11/27 02:19:09.379674 [INFO] agent: Started DNS server 127.0.0.1:11681 (udp)
jones - 2019/11/27 02:19:09.379964 [INFO] agent: Started DNS server 127.0.0.1:11681 (tcp)
jones - 2019/11/27 02:19:09.382129 [INFO] agent: Started HTTP server on 127.0.0.1:11682 (tcp)
jones - 2019/11/27 02:19:09.382225 [INFO] agent: started state syncer
2019/11/27 02:19:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:09 [INFO]  raft: Node at 127.0.0.1:11686 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:19:09.783836 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:19:09.784316 [DEBUG] consul: Skipping self join check for "Node 4c613484-61cd-f189-9fd4-637dea8a81e0" since the cluster is too small
jones - 2019/11/27 02:19:09.784482 [INFO] consul: member 'Node 4c613484-61cd-f189-9fd4-637dea8a81e0' joined, marking health alive
2019/11/27 02:19:10 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:10 [INFO]  raft: Node at 127.0.0.1:11686 [Leader] entering Leader state
jones - 2019/11/27 02:19:10.440286 [INFO] consul: cluster leadership acquired
jones - 2019/11/27 02:19:10.440739 [INFO] consul: New leader elected: Node 005cb1c3-f8e5-2827-9833-9849ba78d405
=== RUN   TestAgent_sidecarServiceFromNodeService/no_auto_ports_available
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/11/27 02:19:10.686280 [WARN] agent: Node name "Node 3a0dee63-0112-ab1b-d438-213ed51c845e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/11/27 02:19:10.686750 [DEBUG] tlsutil: Update with version 1
jones - 2019/11/27 02:19:10.686825 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:10.687011 [DEBUG] tlsutil: IncomingRPCConfig with version 1
jones - 2019/11/27 02:19:10.687127 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:11.695141 [INFO] agent: Synced node info
jones - 2019/11/27 02:19:11.695278 [DEBUG] agent: Node info in sync
2019/11/27 02:19:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3a0dee63-0112-ab1b-d438-213ed51c845e Address:127.0.0.1:11692}]
2019/11/27 02:19:12 [INFO]  raft: Node at 127.0.0.1:11692 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:19:12.836599 [INFO] serf: EventMemberJoin: Node 3a0dee63-0112-ab1b-d438-213ed51c845e.dc1 127.0.0.1
jones - 2019/11/27 02:19:12.846750 [INFO] serf: EventMemberJoin: Node 3a0dee63-0112-ab1b-d438-213ed51c845e 127.0.0.1
jones - 2019/11/27 02:19:12.848419 [INFO] consul: Adding LAN server Node 3a0dee63-0112-ab1b-d438-213ed51c845e (Addr: tcp/127.0.0.1:11692) (DC: dc1)
jones - 2019/11/27 02:19:12.849426 [INFO] consul: Handled member-join event for server "Node 3a0dee63-0112-ab1b-d438-213ed51c845e.dc1" in area "wan"
jones - 2019/11/27 02:19:12.853476 [INFO] agent: Started DNS server 127.0.0.1:11687 (tcp)
jones - 2019/11/27 02:19:12.854282 [INFO] agent: Started DNS server 127.0.0.1:11687 (udp)
jones - 2019/11/27 02:19:12.862771 [INFO] agent: Started HTTP server on 127.0.0.1:11688 (tcp)
jones - 2019/11/27 02:19:12.862900 [INFO] agent: started state syncer
2019/11/27 02:19:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:12 [INFO]  raft: Node at 127.0.0.1:11692 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:19:13.161427 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:19:13.161945 [DEBUG] consul: Skipping self join check for "Node 005cb1c3-f8e5-2827-9833-9849ba78d405" since the cluster is too small
jones - 2019/11/27 02:19:13.162108 [INFO] consul: member 'Node 005cb1c3-f8e5-2827-9833-9849ba78d405' joined, marking health alive
jones - 2019/11/27 02:19:13.178059 [DEBUG] agent: Node info in sync
2019/11/27 02:19:14 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:14 [INFO]  raft: Node at 127.0.0.1:11692 [Leader] entering Leader state
jones - 2019/11/27 02:19:14.738850 [INFO] consul: cluster leadership acquired
jones - 2019/11/27 02:19:14.739323 [INFO] consul: New leader elected: Node 3a0dee63-0112-ab1b-d438-213ed51c845e
jones - 2019/11/27 02:19:15.863236 [INFO] agent: Synced node info
=== RUN   TestAgent_sidecarServiceFromNodeService/auto_ports_disabled
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/11/27 02:19:15.922441 [WARN] agent: Node name "Node 975e7e65-a3c0-20b5-0590-acecff7cdae7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/11/27 02:19:15.922881 [DEBUG] tlsutil: Update with version 1
jones - 2019/11/27 02:19:15.923025 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:15.923267 [DEBUG] tlsutil: IncomingRPCConfig with version 1
jones - 2019/11/27 02:19:15.923435 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:19:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:975e7e65-a3c0-20b5-0590-acecff7cdae7 Address:127.0.0.1:11698}]
2019/11/27 02:19:17 [INFO]  raft: Node at 127.0.0.1:11698 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:19:17.547896 [INFO] serf: EventMemberJoin: Node 975e7e65-a3c0-20b5-0590-acecff7cdae7.dc1 127.0.0.1
2019/11/27 02:19:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:17 [INFO]  raft: Node at 127.0.0.1:11698 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:19:17.554091 [INFO] serf: EventMemberJoin: Node 975e7e65-a3c0-20b5-0590-acecff7cdae7 127.0.0.1
jones - 2019/11/27 02:19:17.556067 [INFO] consul: Adding LAN server Node 975e7e65-a3c0-20b5-0590-acecff7cdae7 (Addr: tcp/127.0.0.1:11698) (DC: dc1)
jones - 2019/11/27 02:19:17.558003 [INFO] consul: Handled member-join event for server "Node 975e7e65-a3c0-20b5-0590-acecff7cdae7.dc1" in area "wan"
jones - 2019/11/27 02:19:17.569298 [INFO] agent: Started DNS server 127.0.0.1:11693 (udp)
jones - 2019/11/27 02:19:17.569790 [INFO] agent: Started DNS server 127.0.0.1:11693 (tcp)
jones - 2019/11/27 02:19:17.573609 [INFO] agent: Started HTTP server on 127.0.0.1:11694 (tcp)
jones - 2019/11/27 02:19:17.573756 [INFO] agent: started state syncer
jones - 2019/11/27 02:19:17.895950 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:19:17.896550 [DEBUG] consul: Skipping self join check for "Node 3a0dee63-0112-ab1b-d438-213ed51c845e" since the cluster is too small
jones - 2019/11/27 02:19:17.896816 [INFO] consul: member 'Node 3a0dee63-0112-ab1b-d438-213ed51c845e' joined, marking health alive
jones - 2019/11/27 02:19:17.898629 [INFO] agent: Synced service "api-proxy-sidecar"
jones - 2019/11/27 02:19:17.898749 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:19:17.898881 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/11/27 02:19:17.898936 [DEBUG] agent: Node info in sync
2019/11/27 02:19:18 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:18 [INFO]  raft: Node at 127.0.0.1:11698 [Leader] entering Leader state
jones - 2019/11/27 02:19:18.375191 [INFO] consul: cluster leadership acquired
jones - 2019/11/27 02:19:18.375667 [INFO] consul: New leader elected: Node 975e7e65-a3c0-20b5-0590-acecff7cdae7
=== RUN   TestAgent_sidecarServiceFromNodeService/inherit_tags_and_meta
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/11/27 02:19:18.590035 [WARN] agent: Node name "Node 403133b7-b420-a957-4062-918c86f7ac39" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/11/27 02:19:18.590880 [DEBUG] tlsutil: Update with version 1
jones - 2019/11/27 02:19:18.591116 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:18.591513 [DEBUG] tlsutil: IncomingRPCConfig with version 1
jones - 2019/11/27 02:19:18.591949 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:18.919872 [INFO] agent: Synced node info
jones - 2019/11/27 02:19:18.919988 [DEBUG] agent: Node info in sync
2019/11/27 02:19:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:403133b7-b420-a957-4062-918c86f7ac39 Address:127.0.0.1:11704}]
2019/11/27 02:19:19 [INFO]  raft: Node at 127.0.0.1:11704 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:19:19.557916 [INFO] serf: EventMemberJoin: Node 403133b7-b420-a957-4062-918c86f7ac39.dc1 127.0.0.1
jones - 2019/11/27 02:19:19.568466 [INFO] serf: EventMemberJoin: Node 403133b7-b420-a957-4062-918c86f7ac39 127.0.0.1
jones - 2019/11/27 02:19:19.570546 [INFO] agent: Started DNS server 127.0.0.1:11699 (udp)
jones - 2019/11/27 02:19:19.572533 [INFO] agent: Started DNS server 127.0.0.1:11699 (tcp)
jones - 2019/11/27 02:19:19.571827 [INFO] consul: Handled member-join event for server "Node 403133b7-b420-a957-4062-918c86f7ac39.dc1" in area "wan"
jones - 2019/11/27 02:19:19.574751 [INFO] consul: Adding LAN server Node 403133b7-b420-a957-4062-918c86f7ac39 (Addr: tcp/127.0.0.1:11704) (DC: dc1)
jones - 2019/11/27 02:19:19.576614 [INFO] agent: Started HTTP server on 127.0.0.1:11700 (tcp)
jones - 2019/11/27 02:19:19.577032 [INFO] agent: started state syncer
2019/11/27 02:19:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:19 [INFO]  raft: Node at 127.0.0.1:11704 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:19:19.779514 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:19:20.860878 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:19:20.861378 [DEBUG] consul: Skipping self join check for "Node 975e7e65-a3c0-20b5-0590-acecff7cdae7" since the cluster is too small
jones - 2019/11/27 02:19:20.861549 [INFO] consul: member 'Node 975e7e65-a3c0-20b5-0590-acecff7cdae7' joined, marking health alive
2019/11/27 02:19:21 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:21 [INFO]  raft: Node at 127.0.0.1:11704 [Leader] entering Leader state
jones - 2019/11/27 02:19:21.062444 [INFO] consul: cluster leadership acquired
jones - 2019/11/27 02:19:21.062893 [INFO] consul: New leader elected: Node 403133b7-b420-a957-4062-918c86f7ac39
jones - 2019/11/27 02:19:21.453959 [INFO] agent: Synced node info
jones - 2019/11/27 02:19:21.454306 [DEBUG] agent: Node info in sync
=== RUN   TestAgent_sidecarServiceFromNodeService/invalid_check_type
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/11/27 02:19:21.544705 [WARN] agent: Node name "Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/11/27 02:19:21.545193 [DEBUG] tlsutil: Update with version 1
jones - 2019/11/27 02:19:21.545321 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:21.546854 [DEBUG] tlsutil: IncomingRPCConfig with version 1
jones - 2019/11/27 02:19:21.547026 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:19:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2347fd59-3fd9-73da-16e7-50f89d3e62a6 Address:127.0.0.1:11710}]
2019/11/27 02:19:23 [INFO]  raft: Node at 127.0.0.1:11710 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:19:23.561415 [INFO] serf: EventMemberJoin: Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6.dc1 127.0.0.1
jones - 2019/11/27 02:19:23.565752 [INFO] serf: EventMemberJoin: Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6 127.0.0.1
jones - 2019/11/27 02:19:23.566644 [INFO] consul: Adding LAN server Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6 (Addr: tcp/127.0.0.1:11710) (DC: dc1)
jones - 2019/11/27 02:19:23.566663 [INFO] consul: Handled member-join event for server "Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6.dc1" in area "wan"
jones - 2019/11/27 02:19:23.567876 [INFO] agent: Started DNS server 127.0.0.1:11705 (udp)
jones - 2019/11/27 02:19:23.568357 [INFO] agent: Started DNS server 127.0.0.1:11705 (tcp)
jones - 2019/11/27 02:19:23.570704 [INFO] agent: Started HTTP server on 127.0.0.1:11706 (tcp)
jones - 2019/11/27 02:19:23.570810 [INFO] agent: started state syncer
2019/11/27 02:19:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:23 [INFO]  raft: Node at 127.0.0.1:11710 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:19:23.776208 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:19:23.776605 [DEBUG] consul: Skipping self join check for "Node 403133b7-b420-a957-4062-918c86f7ac39" since the cluster is too small
jones - 2019/11/27 02:19:23.776805 [INFO] consul: member 'Node 403133b7-b420-a957-4062-918c86f7ac39' joined, marking health alive
jones - 2019/11/27 02:19:24.278620 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:19:24.278701 [DEBUG] agent: Node info in sync
2019/11/27 02:19:24 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:24 [INFO]  raft: Node at 127.0.0.1:11710 [Leader] entering Leader state
jones - 2019/11/27 02:19:24.395258 [INFO] consul: cluster leadership acquired
jones - 2019/11/27 02:19:24.395731 [INFO] consul: New leader elected: Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6
=== RUN   TestAgent_sidecarServiceFromNodeService/invalid_meta
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/11/27 02:19:24.516468 [WARN] agent: Node name "Node ce219ef7-1c48-e006-7f93-81e0fe3f4967" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/11/27 02:19:24.517267 [DEBUG] tlsutil: Update with version 1
jones - 2019/11/27 02:19:24.517429 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:24.517658 [DEBUG] tlsutil: IncomingRPCConfig with version 1
jones - 2019/11/27 02:19:24.517831 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:24.730979 [INFO] agent: Synced node info
jones - 2019/11/27 02:19:24.731086 [DEBUG] agent: Node info in sync
2019/11/27 02:19:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ce219ef7-1c48-e006-7f93-81e0fe3f4967 Address:127.0.0.1:11716}]
2019/11/27 02:19:25 [INFO]  raft: Node at 127.0.0.1:11716 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:19:25.763943 [INFO] serf: EventMemberJoin: Node ce219ef7-1c48-e006-7f93-81e0fe3f4967.dc1 127.0.0.1
jones - 2019/11/27 02:19:25.767451 [INFO] serf: EventMemberJoin: Node ce219ef7-1c48-e006-7f93-81e0fe3f4967 127.0.0.1
jones - 2019/11/27 02:19:25.768447 [INFO] consul: Adding LAN server Node ce219ef7-1c48-e006-7f93-81e0fe3f4967 (Addr: tcp/127.0.0.1:11716) (DC: dc1)
jones - 2019/11/27 02:19:25.768598 [INFO] consul: Handled member-join event for server "Node ce219ef7-1c48-e006-7f93-81e0fe3f4967.dc1" in area "wan"
jones - 2019/11/27 02:19:25.769996 [INFO] agent: Started DNS server 127.0.0.1:11711 (tcp)
jones - 2019/11/27 02:19:25.770398 [INFO] agent: Started DNS server 127.0.0.1:11711 (udp)
jones - 2019/11/27 02:19:25.772429 [INFO] agent: Started HTTP server on 127.0.0.1:11712 (tcp)
jones - 2019/11/27 02:19:25.772522 [INFO] agent: started state syncer
2019/11/27 02:19:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:25 [INFO]  raft: Node at 127.0.0.1:11716 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:19:26.040353 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:19:26.040817 [DEBUG] consul: Skipping self join check for "Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6" since the cluster is too small
jones - 2019/11/27 02:19:26.040971 [INFO] consul: member 'Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6' joined, marking health alive
jones - 2019/11/27 02:19:26.552160 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:19:26.552244 [DEBUG] agent: Node info in sync
2019/11/27 02:19:26 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:26 [INFO]  raft: Node at 127.0.0.1:11716 [Leader] entering Leader state
jones - 2019/11/27 02:19:26.927028 [INFO] consul: cluster leadership acquired
jones - 2019/11/27 02:19:26.927540 [INFO] consul: New leader elected: Node ce219ef7-1c48-e006-7f93-81e0fe3f4967
=== RUN   TestAgent_sidecarServiceFromNodeService/re-registering_same_sidecar_with_no_port_should_pick_same_one
WARNING: bootstrap = true: do not enable unless necessary
jones - 2019/11/27 02:19:27.237887 [WARN] agent: Node name "Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
jones - 2019/11/27 02:19:27.238392 [DEBUG] tlsutil: Update with version 1
jones - 2019/11/27 02:19:27.238463 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:27.238801 [DEBUG] tlsutil: IncomingRPCConfig with version 1
jones - 2019/11/27 02:19:27.238961 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:19:27.649702 [INFO] agent: Synced node info
2019/11/27 02:19:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:36e68f41-fcf1-fd4e-fd91-f1e92fff0f22 Address:127.0.0.1:11722}]
2019/11/27 02:19:28 [INFO]  raft: Node at 127.0.0.1:11722 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:19:28.953165 [INFO] serf: EventMemberJoin: Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22.dc1 127.0.0.1
jones - 2019/11/27 02:19:28.957368 [INFO] serf: EventMemberJoin: Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22 127.0.0.1
jones - 2019/11/27 02:19:28.959070 [INFO] agent: Started DNS server 127.0.0.1:11717 (udp)
jones - 2019/11/27 02:19:28.959448 [INFO] consul: Adding LAN server Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22 (Addr: tcp/127.0.0.1:11722) (DC: dc1)
jones - 2019/11/27 02:19:28.959986 [INFO] agent: Started DNS server 127.0.0.1:11717 (tcp)
jones - 2019/11/27 02:19:28.960054 [INFO] consul: Handled member-join event for server "Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22.dc1" in area "wan"
jones - 2019/11/27 02:19:28.964087 [INFO] agent: Started HTTP server on 127.0.0.1:11718 (tcp)
jones - 2019/11/27 02:19:28.964189 [INFO] agent: started state syncer
2019/11/27 02:19:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:28 [INFO]  raft: Node at 127.0.0.1:11722 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:19:29.639370 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:19:29.642182 [DEBUG] consul: Skipping self join check for "Node ce219ef7-1c48-e006-7f93-81e0fe3f4967" since the cluster is too small
jones - 2019/11/27 02:19:29.642431 [INFO] consul: member 'Node ce219ef7-1c48-e006-7f93-81e0fe3f4967' joined, marking health alive
2019/11/27 02:19:29 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:29 [INFO]  raft: Node at 127.0.0.1:11722 [Leader] entering Leader state
jones - 2019/11/27 02:19:29.906591 [INFO] consul: cluster leadership acquired
jones - 2019/11/27 02:19:29.907076 [INFO] consul: New leader elected: Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22
--- PASS: TestAgent_sidecarServiceFromNodeService (27.40s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/no_sidecar (1.67s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/all_the_defaults (3.61s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/all_the_allowed_overrides (2.81s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/no_auto_ports_available (5.26s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/auto_ports_disabled (2.65s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/inherit_tags_and_meta (2.94s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/invalid_check_type (2.99s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/invalid_meta (2.68s)
    --- PASS: TestAgent_sidecarServiceFromNodeService/re-registering_same_sidecar_with_no_port_should_pick_same_one (2.80s)
=== RUN   TestSnapshot
--- SKIP: TestSnapshot (0.00s)
    snapshot_endpoint_test.go:15: DM-skipped
=== RUN   TestSnapshot_Options
=== PAUSE TestSnapshot_Options
=== RUN   TestStatusLeader
--- SKIP: TestStatusLeader (0.00s)
    status_endpoint_test.go:11: DM-skipped
=== RUN   TestStatusPeers
=== PAUSE TestStatusPeers
=== RUN   TestDefaultConfig
=== RUN   TestDefaultConfig/#00
=== PAUSE TestDefaultConfig/#00
=== RUN   TestDefaultConfig/#01
=== PAUSE TestDefaultConfig/#01
=== RUN   TestDefaultConfig/#02
=== PAUSE TestDefaultConfig/#02
=== RUN   TestDefaultConfig/#03
=== PAUSE TestDefaultConfig/#03
=== RUN   TestDefaultConfig/#04
=== PAUSE TestDefaultConfig/#04
=== RUN   TestDefaultConfig/#05
=== PAUSE TestDefaultConfig/#05
=== RUN   TestDefaultConfig/#06
=== PAUSE TestDefaultConfig/#06
=== RUN   TestDefaultConfig/#07
=== PAUSE TestDefaultConfig/#07
=== RUN   TestDefaultConfig/#08
=== PAUSE TestDefaultConfig/#08
=== RUN   TestDefaultConfig/#09
=== PAUSE TestDefaultConfig/#09
=== RUN   TestDefaultConfig/#10
=== PAUSE TestDefaultConfig/#10
=== RUN   TestDefaultConfig/#11
=== PAUSE TestDefaultConfig/#11
=== RUN   TestDefaultConfig/#12
=== PAUSE TestDefaultConfig/#12
=== RUN   TestDefaultConfig/#13
=== PAUSE TestDefaultConfig/#13
=== RUN   TestDefaultConfig/#14
=== PAUSE TestDefaultConfig/#14
=== RUN   TestDefaultConfig/#15
=== PAUSE TestDefaultConfig/#15
=== RUN   TestDefaultConfig/#16
=== PAUSE TestDefaultConfig/#16
=== RUN   TestDefaultConfig/#17
=== PAUSE TestDefaultConfig/#17
=== RUN   TestDefaultConfig/#18
=== PAUSE TestDefaultConfig/#18
=== RUN   TestDefaultConfig/#19
=== PAUSE TestDefaultConfig/#19
=== RUN   TestDefaultConfig/#20
=== PAUSE TestDefaultConfig/#20
=== RUN   TestDefaultConfig/#21
=== PAUSE TestDefaultConfig/#21
=== RUN   TestDefaultConfig/#22
=== PAUSE TestDefaultConfig/#22
=== RUN   TestDefaultConfig/#23
=== PAUSE TestDefaultConfig/#23
=== RUN   TestDefaultConfig/#24
=== PAUSE TestDefaultConfig/#24
=== RUN   TestDefaultConfig/#25
=== PAUSE TestDefaultConfig/#25
=== RUN   TestDefaultConfig/#26
=== PAUSE TestDefaultConfig/#26
=== RUN   TestDefaultConfig/#27
=== PAUSE TestDefaultConfig/#27
=== RUN   TestDefaultConfig/#28
=== PAUSE TestDefaultConfig/#28
=== RUN   TestDefaultConfig/#29
=== PAUSE TestDefaultConfig/#29
=== RUN   TestDefaultConfig/#30
=== PAUSE TestDefaultConfig/#30
=== RUN   TestDefaultConfig/#31
=== PAUSE TestDefaultConfig/#31
=== RUN   TestDefaultConfig/#32
=== PAUSE TestDefaultConfig/#32
=== RUN   TestDefaultConfig/#33
=== PAUSE TestDefaultConfig/#33
=== RUN   TestDefaultConfig/#34
=== PAUSE TestDefaultConfig/#34
=== RUN   TestDefaultConfig/#35
=== PAUSE TestDefaultConfig/#35
=== RUN   TestDefaultConfig/#36
=== PAUSE TestDefaultConfig/#36
=== RUN   TestDefaultConfig/#37
=== PAUSE TestDefaultConfig/#37
=== RUN   TestDefaultConfig/#38
=== PAUSE TestDefaultConfig/#38
=== RUN   TestDefaultConfig/#39
=== PAUSE TestDefaultConfig/#39
=== RUN   TestDefaultConfig/#40
=== PAUSE TestDefaultConfig/#40
=== RUN   TestDefaultConfig/#41
=== PAUSE TestDefaultConfig/#41
=== RUN   TestDefaultConfig/#42
=== PAUSE TestDefaultConfig/#42
=== RUN   TestDefaultConfig/#43
=== PAUSE TestDefaultConfig/#43
=== RUN   TestDefaultConfig/#44
=== PAUSE TestDefaultConfig/#44
=== RUN   TestDefaultConfig/#45
=== PAUSE TestDefaultConfig/#45
=== RUN   TestDefaultConfig/#46
=== PAUSE TestDefaultConfig/#46
=== RUN   TestDefaultConfig/#47
=== PAUSE TestDefaultConfig/#47
=== RUN   TestDefaultConfig/#48
=== PAUSE TestDefaultConfig/#48
=== RUN   TestDefaultConfig/#49
=== PAUSE TestDefaultConfig/#49
=== RUN   TestDefaultConfig/#50
=== PAUSE TestDefaultConfig/#50
=== RUN   TestDefaultConfig/#51
=== PAUSE TestDefaultConfig/#51
=== RUN   TestDefaultConfig/#52
=== PAUSE TestDefaultConfig/#52
=== RUN   TestDefaultConfig/#53
=== PAUSE TestDefaultConfig/#53
=== RUN   TestDefaultConfig/#54
=== PAUSE TestDefaultConfig/#54
=== RUN   TestDefaultConfig/#55
=== PAUSE TestDefaultConfig/#55
=== RUN   TestDefaultConfig/#56
=== PAUSE TestDefaultConfig/#56
=== RUN   TestDefaultConfig/#57
=== PAUSE TestDefaultConfig/#57
=== RUN   TestDefaultConfig/#58
=== PAUSE TestDefaultConfig/#58
=== RUN   TestDefaultConfig/#59
=== PAUSE TestDefaultConfig/#59
=== RUN   TestDefaultConfig/#60
=== PAUSE TestDefaultConfig/#60
=== RUN   TestDefaultConfig/#61
=== PAUSE TestDefaultConfig/#61
=== RUN   TestDefaultConfig/#62
=== PAUSE TestDefaultConfig/#62
=== RUN   TestDefaultConfig/#63
=== PAUSE TestDefaultConfig/#63
=== RUN   TestDefaultConfig/#64
=== PAUSE TestDefaultConfig/#64
=== RUN   TestDefaultConfig/#65
=== PAUSE TestDefaultConfig/#65
=== RUN   TestDefaultConfig/#66
=== PAUSE TestDefaultConfig/#66
=== RUN   TestDefaultConfig/#67
=== PAUSE TestDefaultConfig/#67
=== RUN   TestDefaultConfig/#68
=== PAUSE TestDefaultConfig/#68
=== RUN   TestDefaultConfig/#69
=== PAUSE TestDefaultConfig/#69
=== RUN   TestDefaultConfig/#70
=== PAUSE TestDefaultConfig/#70
=== RUN   TestDefaultConfig/#71
=== PAUSE TestDefaultConfig/#71
=== RUN   TestDefaultConfig/#72
=== PAUSE TestDefaultConfig/#72
=== RUN   TestDefaultConfig/#73
=== PAUSE TestDefaultConfig/#73
=== RUN   TestDefaultConfig/#74
=== PAUSE TestDefaultConfig/#74
=== RUN   TestDefaultConfig/#75
=== PAUSE TestDefaultConfig/#75
=== RUN   TestDefaultConfig/#76
=== PAUSE TestDefaultConfig/#76
=== RUN   TestDefaultConfig/#77
=== PAUSE TestDefaultConfig/#77
=== RUN   TestDefaultConfig/#78
=== PAUSE TestDefaultConfig/#78
=== RUN   TestDefaultConfig/#79
=== PAUSE TestDefaultConfig/#79
=== RUN   TestDefaultConfig/#80
=== PAUSE TestDefaultConfig/#80
=== RUN   TestDefaultConfig/#81
=== PAUSE TestDefaultConfig/#81
=== RUN   TestDefaultConfig/#82
=== PAUSE TestDefaultConfig/#82
=== RUN   TestDefaultConfig/#83
=== PAUSE TestDefaultConfig/#83
=== RUN   TestDefaultConfig/#84
=== PAUSE TestDefaultConfig/#84
=== RUN   TestDefaultConfig/#85
=== PAUSE TestDefaultConfig/#85
=== RUN   TestDefaultConfig/#86
=== PAUSE TestDefaultConfig/#86
=== RUN   TestDefaultConfig/#87
=== PAUSE TestDefaultConfig/#87
=== RUN   TestDefaultConfig/#88
=== PAUSE TestDefaultConfig/#88
=== RUN   TestDefaultConfig/#89
=== PAUSE TestDefaultConfig/#89
=== RUN   TestDefaultConfig/#90
=== PAUSE TestDefaultConfig/#90
=== RUN   TestDefaultConfig/#91
=== PAUSE TestDefaultConfig/#91
=== RUN   TestDefaultConfig/#92
=== PAUSE TestDefaultConfig/#92
=== RUN   TestDefaultConfig/#93
=== PAUSE TestDefaultConfig/#93
=== RUN   TestDefaultConfig/#94
=== PAUSE TestDefaultConfig/#94
=== RUN   TestDefaultConfig/#95
=== PAUSE TestDefaultConfig/#95
=== RUN   TestDefaultConfig/#96
=== PAUSE TestDefaultConfig/#96
=== RUN   TestDefaultConfig/#97
=== PAUSE TestDefaultConfig/#97
=== RUN   TestDefaultConfig/#98
=== PAUSE TestDefaultConfig/#98
=== RUN   TestDefaultConfig/#99
=== PAUSE TestDefaultConfig/#99
=== RUN   TestDefaultConfig/#100
=== PAUSE TestDefaultConfig/#100
=== RUN   TestDefaultConfig/#101
=== PAUSE TestDefaultConfig/#101
=== RUN   TestDefaultConfig/#102
=== PAUSE TestDefaultConfig/#102
=== RUN   TestDefaultConfig/#103
=== PAUSE TestDefaultConfig/#103
=== RUN   TestDefaultConfig/#104
=== PAUSE TestDefaultConfig/#104
=== RUN   TestDefaultConfig/#105
=== PAUSE TestDefaultConfig/#105
=== RUN   TestDefaultConfig/#106
=== PAUSE TestDefaultConfig/#106
=== RUN   TestDefaultConfig/#107
=== PAUSE TestDefaultConfig/#107
=== RUN   TestDefaultConfig/#108
=== PAUSE TestDefaultConfig/#108
=== RUN   TestDefaultConfig/#109
=== PAUSE TestDefaultConfig/#109
=== RUN   TestDefaultConfig/#110
=== PAUSE TestDefaultConfig/#110
=== RUN   TestDefaultConfig/#111
=== PAUSE TestDefaultConfig/#111
=== RUN   TestDefaultConfig/#112
=== PAUSE TestDefaultConfig/#112
=== RUN   TestDefaultConfig/#113
=== PAUSE TestDefaultConfig/#113
=== RUN   TestDefaultConfig/#114
=== PAUSE TestDefaultConfig/#114
=== RUN   TestDefaultConfig/#115
=== PAUSE TestDefaultConfig/#115
=== RUN   TestDefaultConfig/#116
=== PAUSE TestDefaultConfig/#116
=== RUN   TestDefaultConfig/#117
=== PAUSE TestDefaultConfig/#117
=== RUN   TestDefaultConfig/#118
=== PAUSE TestDefaultConfig/#118
=== RUN   TestDefaultConfig/#119
=== PAUSE TestDefaultConfig/#119
=== RUN   TestDefaultConfig/#120
=== PAUSE TestDefaultConfig/#120
=== RUN   TestDefaultConfig/#121
=== PAUSE TestDefaultConfig/#121
=== RUN   TestDefaultConfig/#122
=== PAUSE TestDefaultConfig/#122
=== RUN   TestDefaultConfig/#123
=== PAUSE TestDefaultConfig/#123
=== RUN   TestDefaultConfig/#124
=== PAUSE TestDefaultConfig/#124
=== RUN   TestDefaultConfig/#125
=== PAUSE TestDefaultConfig/#125
=== RUN   TestDefaultConfig/#126
=== PAUSE TestDefaultConfig/#126
=== RUN   TestDefaultConfig/#127
=== PAUSE TestDefaultConfig/#127
=== RUN   TestDefaultConfig/#128
=== PAUSE TestDefaultConfig/#128
=== RUN   TestDefaultConfig/#129
=== PAUSE TestDefaultConfig/#129
=== RUN   TestDefaultConfig/#130
=== PAUSE TestDefaultConfig/#130
=== RUN   TestDefaultConfig/#131
=== PAUSE TestDefaultConfig/#131
=== RUN   TestDefaultConfig/#132
=== PAUSE TestDefaultConfig/#132
=== RUN   TestDefaultConfig/#133
=== PAUSE TestDefaultConfig/#133
=== RUN   TestDefaultConfig/#134
=== PAUSE TestDefaultConfig/#134
=== RUN   TestDefaultConfig/#135
=== PAUSE TestDefaultConfig/#135
=== RUN   TestDefaultConfig/#136
=== PAUSE TestDefaultConfig/#136
=== RUN   TestDefaultConfig/#137
=== PAUSE TestDefaultConfig/#137
=== RUN   TestDefaultConfig/#138
=== PAUSE TestDefaultConfig/#138
=== RUN   TestDefaultConfig/#139
=== PAUSE TestDefaultConfig/#139
=== RUN   TestDefaultConfig/#140
=== PAUSE TestDefaultConfig/#140
=== RUN   TestDefaultConfig/#141
=== PAUSE TestDefaultConfig/#141
=== RUN   TestDefaultConfig/#142
=== PAUSE TestDefaultConfig/#142
=== RUN   TestDefaultConfig/#143
=== PAUSE TestDefaultConfig/#143
=== RUN   TestDefaultConfig/#144
=== PAUSE TestDefaultConfig/#144
=== RUN   TestDefaultConfig/#145
=== PAUSE TestDefaultConfig/#145
=== RUN   TestDefaultConfig/#146
=== PAUSE TestDefaultConfig/#146
=== RUN   TestDefaultConfig/#147
=== PAUSE TestDefaultConfig/#147
=== RUN   TestDefaultConfig/#148
=== PAUSE TestDefaultConfig/#148
=== RUN   TestDefaultConfig/#149
=== PAUSE TestDefaultConfig/#149
=== RUN   TestDefaultConfig/#150
=== PAUSE TestDefaultConfig/#150
=== RUN   TestDefaultConfig/#151
=== PAUSE TestDefaultConfig/#151
=== RUN   TestDefaultConfig/#152
=== PAUSE TestDefaultConfig/#152
=== RUN   TestDefaultConfig/#153
=== PAUSE TestDefaultConfig/#153
=== RUN   TestDefaultConfig/#154
=== PAUSE TestDefaultConfig/#154
=== RUN   TestDefaultConfig/#155
=== PAUSE TestDefaultConfig/#155
=== RUN   TestDefaultConfig/#156
=== PAUSE TestDefaultConfig/#156
=== RUN   TestDefaultConfig/#157
=== PAUSE TestDefaultConfig/#157
=== RUN   TestDefaultConfig/#158
=== PAUSE TestDefaultConfig/#158
=== RUN   TestDefaultConfig/#159
=== PAUSE TestDefaultConfig/#159
=== RUN   TestDefaultConfig/#160
=== PAUSE TestDefaultConfig/#160
=== RUN   TestDefaultConfig/#161
=== PAUSE TestDefaultConfig/#161
=== RUN   TestDefaultConfig/#162
=== PAUSE TestDefaultConfig/#162
=== RUN   TestDefaultConfig/#163
=== PAUSE TestDefaultConfig/#163
=== RUN   TestDefaultConfig/#164
=== PAUSE TestDefaultConfig/#164
=== RUN   TestDefaultConfig/#165
=== PAUSE TestDefaultConfig/#165
=== RUN   TestDefaultConfig/#166
=== PAUSE TestDefaultConfig/#166
=== RUN   TestDefaultConfig/#167
=== PAUSE TestDefaultConfig/#167
=== RUN   TestDefaultConfig/#168
=== PAUSE TestDefaultConfig/#168
=== RUN   TestDefaultConfig/#169
=== PAUSE TestDefaultConfig/#169
=== RUN   TestDefaultConfig/#170
=== PAUSE TestDefaultConfig/#170
=== RUN   TestDefaultConfig/#171
=== PAUSE TestDefaultConfig/#171
=== RUN   TestDefaultConfig/#172
=== PAUSE TestDefaultConfig/#172
=== RUN   TestDefaultConfig/#173
=== PAUSE TestDefaultConfig/#173
=== RUN   TestDefaultConfig/#174
=== PAUSE TestDefaultConfig/#174
=== RUN   TestDefaultConfig/#175
=== PAUSE TestDefaultConfig/#175
=== RUN   TestDefaultConfig/#176
=== PAUSE TestDefaultConfig/#176
=== RUN   TestDefaultConfig/#177
=== PAUSE TestDefaultConfig/#177
=== RUN   TestDefaultConfig/#178
=== PAUSE TestDefaultConfig/#178
=== RUN   TestDefaultConfig/#179
=== PAUSE TestDefaultConfig/#179
=== RUN   TestDefaultConfig/#180
=== PAUSE TestDefaultConfig/#180
=== RUN   TestDefaultConfig/#181
=== PAUSE TestDefaultConfig/#181
=== RUN   TestDefaultConfig/#182
=== PAUSE TestDefaultConfig/#182
=== RUN   TestDefaultConfig/#183
=== PAUSE TestDefaultConfig/#183
=== RUN   TestDefaultConfig/#184
=== PAUSE TestDefaultConfig/#184
=== RUN   TestDefaultConfig/#185
=== PAUSE TestDefaultConfig/#185
=== RUN   TestDefaultConfig/#186
=== PAUSE TestDefaultConfig/#186
=== RUN   TestDefaultConfig/#187
=== PAUSE TestDefaultConfig/#187
=== RUN   TestDefaultConfig/#188
=== PAUSE TestDefaultConfig/#188
=== RUN   TestDefaultConfig/#189
=== PAUSE TestDefaultConfig/#189
=== RUN   TestDefaultConfig/#190
=== PAUSE TestDefaultConfig/#190
=== RUN   TestDefaultConfig/#191
=== PAUSE TestDefaultConfig/#191
=== RUN   TestDefaultConfig/#192
=== PAUSE TestDefaultConfig/#192
=== RUN   TestDefaultConfig/#193
=== PAUSE TestDefaultConfig/#193
=== RUN   TestDefaultConfig/#194
=== PAUSE TestDefaultConfig/#194
=== RUN   TestDefaultConfig/#195
=== PAUSE TestDefaultConfig/#195
=== RUN   TestDefaultConfig/#196
=== PAUSE TestDefaultConfig/#196
=== RUN   TestDefaultConfig/#197
=== PAUSE TestDefaultConfig/#197
=== RUN   TestDefaultConfig/#198
=== PAUSE TestDefaultConfig/#198
=== RUN   TestDefaultConfig/#199
=== PAUSE TestDefaultConfig/#199
=== RUN   TestDefaultConfig/#200
=== PAUSE TestDefaultConfig/#200
=== RUN   TestDefaultConfig/#201
=== PAUSE TestDefaultConfig/#201
=== RUN   TestDefaultConfig/#202
=== PAUSE TestDefaultConfig/#202
=== RUN   TestDefaultConfig/#203
=== PAUSE TestDefaultConfig/#203
=== RUN   TestDefaultConfig/#204
=== PAUSE TestDefaultConfig/#204
=== RUN   TestDefaultConfig/#205
=== PAUSE TestDefaultConfig/#205
=== RUN   TestDefaultConfig/#206
=== PAUSE TestDefaultConfig/#206
=== RUN   TestDefaultConfig/#207
=== PAUSE TestDefaultConfig/#207
=== RUN   TestDefaultConfig/#208
=== PAUSE TestDefaultConfig/#208
=== RUN   TestDefaultConfig/#209
=== PAUSE TestDefaultConfig/#209
=== RUN   TestDefaultConfig/#210
=== PAUSE TestDefaultConfig/#210
=== RUN   TestDefaultConfig/#211
=== PAUSE TestDefaultConfig/#211
=== RUN   TestDefaultConfig/#212
=== PAUSE TestDefaultConfig/#212
=== RUN   TestDefaultConfig/#213
=== PAUSE TestDefaultConfig/#213
=== RUN   TestDefaultConfig/#214
=== PAUSE TestDefaultConfig/#214
=== RUN   TestDefaultConfig/#215
=== PAUSE TestDefaultConfig/#215
=== RUN   TestDefaultConfig/#216
=== PAUSE TestDefaultConfig/#216
=== RUN   TestDefaultConfig/#217
=== PAUSE TestDefaultConfig/#217
=== RUN   TestDefaultConfig/#218
=== PAUSE TestDefaultConfig/#218
=== RUN   TestDefaultConfig/#219
=== PAUSE TestDefaultConfig/#219
=== RUN   TestDefaultConfig/#220
=== PAUSE TestDefaultConfig/#220
=== RUN   TestDefaultConfig/#221
=== PAUSE TestDefaultConfig/#221
=== RUN   TestDefaultConfig/#222
=== PAUSE TestDefaultConfig/#222
=== RUN   TestDefaultConfig/#223
=== PAUSE TestDefaultConfig/#223
=== RUN   TestDefaultConfig/#224
=== PAUSE TestDefaultConfig/#224
=== RUN   TestDefaultConfig/#225
=== PAUSE TestDefaultConfig/#225
=== RUN   TestDefaultConfig/#226
=== PAUSE TestDefaultConfig/#226
=== RUN   TestDefaultConfig/#227
=== PAUSE TestDefaultConfig/#227
=== RUN   TestDefaultConfig/#228
=== PAUSE TestDefaultConfig/#228
=== RUN   TestDefaultConfig/#229
=== PAUSE TestDefaultConfig/#229
=== RUN   TestDefaultConfig/#230
=== PAUSE TestDefaultConfig/#230
=== RUN   TestDefaultConfig/#231
=== PAUSE TestDefaultConfig/#231
=== RUN   TestDefaultConfig/#232
=== PAUSE TestDefaultConfig/#232
=== RUN   TestDefaultConfig/#233
=== PAUSE TestDefaultConfig/#233
=== RUN   TestDefaultConfig/#234
=== PAUSE TestDefaultConfig/#234
=== RUN   TestDefaultConfig/#235
=== PAUSE TestDefaultConfig/#235
=== RUN   TestDefaultConfig/#236
=== PAUSE TestDefaultConfig/#236
=== RUN   TestDefaultConfig/#237
=== PAUSE TestDefaultConfig/#237
=== RUN   TestDefaultConfig/#238
=== PAUSE TestDefaultConfig/#238
=== RUN   TestDefaultConfig/#239
=== PAUSE TestDefaultConfig/#239
=== RUN   TestDefaultConfig/#240
=== PAUSE TestDefaultConfig/#240
=== RUN   TestDefaultConfig/#241
=== PAUSE TestDefaultConfig/#241
=== RUN   TestDefaultConfig/#242
=== PAUSE TestDefaultConfig/#242
=== RUN   TestDefaultConfig/#243
=== PAUSE TestDefaultConfig/#243
=== RUN   TestDefaultConfig/#244
=== PAUSE TestDefaultConfig/#244
=== RUN   TestDefaultConfig/#245
=== PAUSE TestDefaultConfig/#245
=== RUN   TestDefaultConfig/#246
=== PAUSE TestDefaultConfig/#246
=== RUN   TestDefaultConfig/#247
=== PAUSE TestDefaultConfig/#247
=== RUN   TestDefaultConfig/#248
=== PAUSE TestDefaultConfig/#248
=== RUN   TestDefaultConfig/#249
=== PAUSE TestDefaultConfig/#249
=== RUN   TestDefaultConfig/#250
=== PAUSE TestDefaultConfig/#250
=== RUN   TestDefaultConfig/#251
=== PAUSE TestDefaultConfig/#251
=== RUN   TestDefaultConfig/#252
=== PAUSE TestDefaultConfig/#252
=== RUN   TestDefaultConfig/#253
=== PAUSE TestDefaultConfig/#253
=== RUN   TestDefaultConfig/#254
=== PAUSE TestDefaultConfig/#254
=== RUN   TestDefaultConfig/#255
=== PAUSE TestDefaultConfig/#255
=== RUN   TestDefaultConfig/#256
=== PAUSE TestDefaultConfig/#256
=== RUN   TestDefaultConfig/#257
=== PAUSE TestDefaultConfig/#257
=== RUN   TestDefaultConfig/#258
=== PAUSE TestDefaultConfig/#258
=== RUN   TestDefaultConfig/#259
=== PAUSE TestDefaultConfig/#259
=== RUN   TestDefaultConfig/#260
=== PAUSE TestDefaultConfig/#260
=== RUN   TestDefaultConfig/#261
=== PAUSE TestDefaultConfig/#261
=== RUN   TestDefaultConfig/#262
=== PAUSE TestDefaultConfig/#262
=== RUN   TestDefaultConfig/#263
=== PAUSE TestDefaultConfig/#263
=== RUN   TestDefaultConfig/#264
=== PAUSE TestDefaultConfig/#264
=== RUN   TestDefaultConfig/#265
=== PAUSE TestDefaultConfig/#265
=== RUN   TestDefaultConfig/#266
=== PAUSE TestDefaultConfig/#266
=== RUN   TestDefaultConfig/#267
=== PAUSE TestDefaultConfig/#267
=== RUN   TestDefaultConfig/#268
=== PAUSE TestDefaultConfig/#268
=== RUN   TestDefaultConfig/#269
=== PAUSE TestDefaultConfig/#269
=== RUN   TestDefaultConfig/#270
jones - 2019/11/27 02:19:30.010416 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
=== PAUSE TestDefaultConfig/#270
jones - 2019/11/27 02:19:30.010468 [DEBUG] agent: Node info in sync
=== RUN   TestDefaultConfig/#271
jones - 2019/11/27 02:19:30.010537 [DEBUG] agent: Node info in sync
=== PAUSE TestDefaultConfig/#271
=== RUN   TestDefaultConfig/#272
=== PAUSE TestDefaultConfig/#272
=== RUN   TestDefaultConfig/#273
=== PAUSE TestDefaultConfig/#273
=== RUN   TestDefaultConfig/#274
=== PAUSE TestDefaultConfig/#274
=== RUN   TestDefaultConfig/#275
=== PAUSE TestDefaultConfig/#275
=== RUN   TestDefaultConfig/#276
=== PAUSE TestDefaultConfig/#276
=== RUN   TestDefaultConfig/#277
=== PAUSE TestDefaultConfig/#277
=== RUN   TestDefaultConfig/#278
=== PAUSE TestDefaultConfig/#278
=== RUN   TestDefaultConfig/#279
=== PAUSE TestDefaultConfig/#279
=== RUN   TestDefaultConfig/#280
=== PAUSE TestDefaultConfig/#280
=== RUN   TestDefaultConfig/#281
=== PAUSE TestDefaultConfig/#281
=== RUN   TestDefaultConfig/#282
=== PAUSE TestDefaultConfig/#282
=== RUN   TestDefaultConfig/#283
=== PAUSE TestDefaultConfig/#283
=== RUN   TestDefaultConfig/#284
=== PAUSE TestDefaultConfig/#284
=== RUN   TestDefaultConfig/#285
=== PAUSE TestDefaultConfig/#285
=== RUN   TestDefaultConfig/#286
=== PAUSE TestDefaultConfig/#286
=== RUN   TestDefaultConfig/#287
=== PAUSE TestDefaultConfig/#287
=== RUN   TestDefaultConfig/#288
=== PAUSE TestDefaultConfig/#288
=== RUN   TestDefaultConfig/#289
=== PAUSE TestDefaultConfig/#289
=== RUN   TestDefaultConfig/#290
=== PAUSE TestDefaultConfig/#290
=== RUN   TestDefaultConfig/#291
=== PAUSE TestDefaultConfig/#291
=== RUN   TestDefaultConfig/#292
=== PAUSE TestDefaultConfig/#292
=== RUN   TestDefaultConfig/#293
=== PAUSE TestDefaultConfig/#293
=== RUN   TestDefaultConfig/#294
=== PAUSE TestDefaultConfig/#294
=== RUN   TestDefaultConfig/#295
=== PAUSE TestDefaultConfig/#295
=== RUN   TestDefaultConfig/#296
=== PAUSE TestDefaultConfig/#296
=== RUN   TestDefaultConfig/#297
=== PAUSE TestDefaultConfig/#297
=== RUN   TestDefaultConfig/#298
=== PAUSE TestDefaultConfig/#298
=== RUN   TestDefaultConfig/#299
=== PAUSE TestDefaultConfig/#299
=== RUN   TestDefaultConfig/#300
=== PAUSE TestDefaultConfig/#300
=== RUN   TestDefaultConfig/#301
=== PAUSE TestDefaultConfig/#301
=== RUN   TestDefaultConfig/#302
=== PAUSE TestDefaultConfig/#302
=== RUN   TestDefaultConfig/#303
=== PAUSE TestDefaultConfig/#303
=== RUN   TestDefaultConfig/#304
=== PAUSE TestDefaultConfig/#304
=== RUN   TestDefaultConfig/#305
=== PAUSE TestDefaultConfig/#305
=== RUN   TestDefaultConfig/#306
=== PAUSE TestDefaultConfig/#306
=== RUN   TestDefaultConfig/#307
=== PAUSE TestDefaultConfig/#307
=== RUN   TestDefaultConfig/#308
=== PAUSE TestDefaultConfig/#308
=== RUN   TestDefaultConfig/#309
=== PAUSE TestDefaultConfig/#309
=== RUN   TestDefaultConfig/#310
=== PAUSE TestDefaultConfig/#310
=== RUN   TestDefaultConfig/#311
=== PAUSE TestDefaultConfig/#311
=== RUN   TestDefaultConfig/#312
=== PAUSE TestDefaultConfig/#312
=== RUN   TestDefaultConfig/#313
=== PAUSE TestDefaultConfig/#313
=== RUN   TestDefaultConfig/#314
=== PAUSE TestDefaultConfig/#314
=== RUN   TestDefaultConfig/#315
=== PAUSE TestDefaultConfig/#315
=== RUN   TestDefaultConfig/#316
=== PAUSE TestDefaultConfig/#316
=== RUN   TestDefaultConfig/#317
=== PAUSE TestDefaultConfig/#317
=== RUN   TestDefaultConfig/#318
=== PAUSE TestDefaultConfig/#318
=== RUN   TestDefaultConfig/#319
=== PAUSE TestDefaultConfig/#319
=== RUN   TestDefaultConfig/#320
=== PAUSE TestDefaultConfig/#320
=== RUN   TestDefaultConfig/#321
=== PAUSE TestDefaultConfig/#321
=== RUN   TestDefaultConfig/#322
=== PAUSE TestDefaultConfig/#322
=== RUN   TestDefaultConfig/#323
=== PAUSE TestDefaultConfig/#323
=== RUN   TestDefaultConfig/#324
=== PAUSE TestDefaultConfig/#324
=== RUN   TestDefaultConfig/#325
=== PAUSE TestDefaultConfig/#325
=== RUN   TestDefaultConfig/#326
=== PAUSE TestDefaultConfig/#326
=== RUN   TestDefaultConfig/#327
=== PAUSE TestDefaultConfig/#327
=== RUN   TestDefaultConfig/#328
=== PAUSE TestDefaultConfig/#328
=== RUN   TestDefaultConfig/#329
=== PAUSE TestDefaultConfig/#329
=== RUN   TestDefaultConfig/#330
=== PAUSE TestDefaultConfig/#330
=== RUN   TestDefaultConfig/#331
=== PAUSE TestDefaultConfig/#331
=== RUN   TestDefaultConfig/#332
=== PAUSE TestDefaultConfig/#332
=== RUN   TestDefaultConfig/#333
=== PAUSE TestDefaultConfig/#333
=== RUN   TestDefaultConfig/#334
=== PAUSE TestDefaultConfig/#334
=== RUN   TestDefaultConfig/#335
=== PAUSE TestDefaultConfig/#335
=== RUN   TestDefaultConfig/#336
=== PAUSE TestDefaultConfig/#336
=== RUN   TestDefaultConfig/#337
=== PAUSE TestDefaultConfig/#337
=== RUN   TestDefaultConfig/#338
=== PAUSE TestDefaultConfig/#338
=== RUN   TestDefaultConfig/#339
=== PAUSE TestDefaultConfig/#339
=== RUN   TestDefaultConfig/#340
=== PAUSE TestDefaultConfig/#340
=== RUN   TestDefaultConfig/#341
=== PAUSE TestDefaultConfig/#341
=== RUN   TestDefaultConfig/#342
=== PAUSE TestDefaultConfig/#342
=== RUN   TestDefaultConfig/#343
=== PAUSE TestDefaultConfig/#343
=== RUN   TestDefaultConfig/#344
=== PAUSE TestDefaultConfig/#344
=== RUN   TestDefaultConfig/#345
=== PAUSE TestDefaultConfig/#345
=== RUN   TestDefaultConfig/#346
=== PAUSE TestDefaultConfig/#346
=== RUN   TestDefaultConfig/#347
=== PAUSE TestDefaultConfig/#347
=== RUN   TestDefaultConfig/#348
=== PAUSE TestDefaultConfig/#348
=== RUN   TestDefaultConfig/#349
=== PAUSE TestDefaultConfig/#349
=== RUN   TestDefaultConfig/#350
=== PAUSE TestDefaultConfig/#350
=== RUN   TestDefaultConfig/#351
=== PAUSE TestDefaultConfig/#351
=== RUN   TestDefaultConfig/#352
=== PAUSE TestDefaultConfig/#352
=== RUN   TestDefaultConfig/#353
=== PAUSE TestDefaultConfig/#353
=== RUN   TestDefaultConfig/#354
=== PAUSE TestDefaultConfig/#354
=== RUN   TestDefaultConfig/#355
=== PAUSE TestDefaultConfig/#355
=== RUN   TestDefaultConfig/#356
=== PAUSE TestDefaultConfig/#356
=== RUN   TestDefaultConfig/#357
=== PAUSE TestDefaultConfig/#357
=== RUN   TestDefaultConfig/#358
=== PAUSE TestDefaultConfig/#358
=== RUN   TestDefaultConfig/#359
=== PAUSE TestDefaultConfig/#359
=== RUN   TestDefaultConfig/#360
=== PAUSE TestDefaultConfig/#360
=== RUN   TestDefaultConfig/#361
=== PAUSE TestDefaultConfig/#361
=== RUN   TestDefaultConfig/#362
=== PAUSE TestDefaultConfig/#362
=== RUN   TestDefaultConfig/#363
=== PAUSE TestDefaultConfig/#363
=== RUN   TestDefaultConfig/#364
=== PAUSE TestDefaultConfig/#364
=== RUN   TestDefaultConfig/#365
=== PAUSE TestDefaultConfig/#365
=== RUN   TestDefaultConfig/#366
=== PAUSE TestDefaultConfig/#366
=== RUN   TestDefaultConfig/#367
=== PAUSE TestDefaultConfig/#367
=== RUN   TestDefaultConfig/#368
=== PAUSE TestDefaultConfig/#368
=== RUN   TestDefaultConfig/#369
=== PAUSE TestDefaultConfig/#369
=== RUN   TestDefaultConfig/#370
=== PAUSE TestDefaultConfig/#370
=== RUN   TestDefaultConfig/#371
=== PAUSE TestDefaultConfig/#371
=== RUN   TestDefaultConfig/#372
=== PAUSE TestDefaultConfig/#372
=== RUN   TestDefaultConfig/#373
=== PAUSE TestDefaultConfig/#373
=== RUN   TestDefaultConfig/#374
=== PAUSE TestDefaultConfig/#374
=== RUN   TestDefaultConfig/#375
=== PAUSE TestDefaultConfig/#375
=== RUN   TestDefaultConfig/#376
=== PAUSE TestDefaultConfig/#376
=== RUN   TestDefaultConfig/#377
=== PAUSE TestDefaultConfig/#377
=== RUN   TestDefaultConfig/#378
=== PAUSE TestDefaultConfig/#378
=== RUN   TestDefaultConfig/#379
=== PAUSE TestDefaultConfig/#379
=== RUN   TestDefaultConfig/#380
=== PAUSE TestDefaultConfig/#380
=== RUN   TestDefaultConfig/#381
=== PAUSE TestDefaultConfig/#381
=== RUN   TestDefaultConfig/#382
=== PAUSE TestDefaultConfig/#382
=== RUN   TestDefaultConfig/#383
=== PAUSE TestDefaultConfig/#383
=== RUN   TestDefaultConfig/#384
=== PAUSE TestDefaultConfig/#384
=== RUN   TestDefaultConfig/#385
=== PAUSE TestDefaultConfig/#385
=== RUN   TestDefaultConfig/#386
=== PAUSE TestDefaultConfig/#386
=== RUN   TestDefaultConfig/#387
=== PAUSE TestDefaultConfig/#387
=== RUN   TestDefaultConfig/#388
=== PAUSE TestDefaultConfig/#388
=== RUN   TestDefaultConfig/#389
=== PAUSE TestDefaultConfig/#389
=== RUN   TestDefaultConfig/#390
=== PAUSE TestDefaultConfig/#390
=== RUN   TestDefaultConfig/#391
=== PAUSE TestDefaultConfig/#391
=== RUN   TestDefaultConfig/#392
=== PAUSE TestDefaultConfig/#392
=== RUN   TestDefaultConfig/#393
=== PAUSE TestDefaultConfig/#393
=== RUN   TestDefaultConfig/#394
=== PAUSE TestDefaultConfig/#394
=== RUN   TestDefaultConfig/#395
=== PAUSE TestDefaultConfig/#395
=== RUN   TestDefaultConfig/#396
=== PAUSE TestDefaultConfig/#396
=== RUN   TestDefaultConfig/#397
=== PAUSE TestDefaultConfig/#397
=== RUN   TestDefaultConfig/#398
=== PAUSE TestDefaultConfig/#398
=== RUN   TestDefaultConfig/#399
=== PAUSE TestDefaultConfig/#399
=== RUN   TestDefaultConfig/#400
=== PAUSE TestDefaultConfig/#400
=== RUN   TestDefaultConfig/#401
=== PAUSE TestDefaultConfig/#401
=== RUN   TestDefaultConfig/#402
=== PAUSE TestDefaultConfig/#402
=== RUN   TestDefaultConfig/#403
=== PAUSE TestDefaultConfig/#403
=== RUN   TestDefaultConfig/#404
=== PAUSE TestDefaultConfig/#404
=== RUN   TestDefaultConfig/#405
=== PAUSE TestDefaultConfig/#405
=== RUN   TestDefaultConfig/#406
=== PAUSE TestDefaultConfig/#406
=== RUN   TestDefaultConfig/#407
=== PAUSE TestDefaultConfig/#407
=== RUN   TestDefaultConfig/#408
=== PAUSE TestDefaultConfig/#408
=== RUN   TestDefaultConfig/#409
=== PAUSE TestDefaultConfig/#409
=== RUN   TestDefaultConfig/#410
=== PAUSE TestDefaultConfig/#410
=== RUN   TestDefaultConfig/#411
=== PAUSE TestDefaultConfig/#411
=== RUN   TestDefaultConfig/#412
=== PAUSE TestDefaultConfig/#412
=== RUN   TestDefaultConfig/#413
=== PAUSE TestDefaultConfig/#413
=== RUN   TestDefaultConfig/#414
=== PAUSE TestDefaultConfig/#414
=== RUN   TestDefaultConfig/#415
=== PAUSE TestDefaultConfig/#415
=== RUN   TestDefaultConfig/#416
=== PAUSE TestDefaultConfig/#416
=== RUN   TestDefaultConfig/#417
=== PAUSE TestDefaultConfig/#417
=== RUN   TestDefaultConfig/#418
=== PAUSE TestDefaultConfig/#418
=== RUN   TestDefaultConfig/#419
=== PAUSE TestDefaultConfig/#419
=== RUN   TestDefaultConfig/#420
=== PAUSE TestDefaultConfig/#420
=== RUN   TestDefaultConfig/#421
=== PAUSE TestDefaultConfig/#421
=== RUN   TestDefaultConfig/#422
=== PAUSE TestDefaultConfig/#422
=== RUN   TestDefaultConfig/#423
=== PAUSE TestDefaultConfig/#423
=== RUN   TestDefaultConfig/#424
=== PAUSE TestDefaultConfig/#424
=== RUN   TestDefaultConfig/#425
=== PAUSE TestDefaultConfig/#425
=== RUN   TestDefaultConfig/#426
=== PAUSE TestDefaultConfig/#426
=== RUN   TestDefaultConfig/#427
=== PAUSE TestDefaultConfig/#427
=== RUN   TestDefaultConfig/#428
=== PAUSE TestDefaultConfig/#428
=== RUN   TestDefaultConfig/#429
=== PAUSE TestDefaultConfig/#429
=== RUN   TestDefaultConfig/#430
=== PAUSE TestDefaultConfig/#430
=== RUN   TestDefaultConfig/#431
=== PAUSE TestDefaultConfig/#431
=== RUN   TestDefaultConfig/#432
=== PAUSE TestDefaultConfig/#432
=== RUN   TestDefaultConfig/#433
=== PAUSE TestDefaultConfig/#433
=== RUN   TestDefaultConfig/#434
=== PAUSE TestDefaultConfig/#434
=== RUN   TestDefaultConfig/#435
=== PAUSE TestDefaultConfig/#435
=== RUN   TestDefaultConfig/#436
=== PAUSE TestDefaultConfig/#436
=== RUN   TestDefaultConfig/#437
=== PAUSE TestDefaultConfig/#437
=== RUN   TestDefaultConfig/#438
=== PAUSE TestDefaultConfig/#438
=== RUN   TestDefaultConfig/#439
=== PAUSE TestDefaultConfig/#439
=== RUN   TestDefaultConfig/#440
=== PAUSE TestDefaultConfig/#440
=== RUN   TestDefaultConfig/#441
=== PAUSE TestDefaultConfig/#441
=== RUN   TestDefaultConfig/#442
=== PAUSE TestDefaultConfig/#442
=== RUN   TestDefaultConfig/#443
=== PAUSE TestDefaultConfig/#443
=== RUN   TestDefaultConfig/#444
=== PAUSE TestDefaultConfig/#444
=== RUN   TestDefaultConfig/#445
=== PAUSE TestDefaultConfig/#445
=== RUN   TestDefaultConfig/#446
=== PAUSE TestDefaultConfig/#446
=== RUN   TestDefaultConfig/#447
=== PAUSE TestDefaultConfig/#447
=== RUN   TestDefaultConfig/#448
=== PAUSE TestDefaultConfig/#448
=== RUN   TestDefaultConfig/#449
=== PAUSE TestDefaultConfig/#449
=== RUN   TestDefaultConfig/#450
=== PAUSE TestDefaultConfig/#450
=== RUN   TestDefaultConfig/#451
=== PAUSE TestDefaultConfig/#451
=== RUN   TestDefaultConfig/#452
=== PAUSE TestDefaultConfig/#452
=== RUN   TestDefaultConfig/#453
=== PAUSE TestDefaultConfig/#453
=== RUN   TestDefaultConfig/#454
=== PAUSE TestDefaultConfig/#454
=== RUN   TestDefaultConfig/#455
=== PAUSE TestDefaultConfig/#455
=== RUN   TestDefaultConfig/#456
=== PAUSE TestDefaultConfig/#456
=== RUN   TestDefaultConfig/#457
=== PAUSE TestDefaultConfig/#457
=== RUN   TestDefaultConfig/#458
=== PAUSE TestDefaultConfig/#458
=== RUN   TestDefaultConfig/#459
=== PAUSE TestDefaultConfig/#459
=== RUN   TestDefaultConfig/#460
=== PAUSE TestDefaultConfig/#460
=== RUN   TestDefaultConfig/#461
=== PAUSE TestDefaultConfig/#461
=== RUN   TestDefaultConfig/#462
=== PAUSE TestDefaultConfig/#462
=== RUN   TestDefaultConfig/#463
=== PAUSE TestDefaultConfig/#463
=== RUN   TestDefaultConfig/#464
=== PAUSE TestDefaultConfig/#464
=== RUN   TestDefaultConfig/#465
=== PAUSE TestDefaultConfig/#465
=== RUN   TestDefaultConfig/#466
=== PAUSE TestDefaultConfig/#466
=== RUN   TestDefaultConfig/#467
=== PAUSE TestDefaultConfig/#467
=== RUN   TestDefaultConfig/#468
=== PAUSE TestDefaultConfig/#468
=== RUN   TestDefaultConfig/#469
=== PAUSE TestDefaultConfig/#469
=== RUN   TestDefaultConfig/#470
=== PAUSE TestDefaultConfig/#470
=== RUN   TestDefaultConfig/#471
=== PAUSE TestDefaultConfig/#471
=== RUN   TestDefaultConfig/#472
=== PAUSE TestDefaultConfig/#472
=== RUN   TestDefaultConfig/#473
=== PAUSE TestDefaultConfig/#473
=== RUN   TestDefaultConfig/#474
=== PAUSE TestDefaultConfig/#474
=== RUN   TestDefaultConfig/#475
=== PAUSE TestDefaultConfig/#475
=== RUN   TestDefaultConfig/#476
=== PAUSE TestDefaultConfig/#476
=== RUN   TestDefaultConfig/#477
=== PAUSE TestDefaultConfig/#477
=== RUN   TestDefaultConfig/#478
=== PAUSE TestDefaultConfig/#478
=== RUN   TestDefaultConfig/#479
=== PAUSE TestDefaultConfig/#479
=== RUN   TestDefaultConfig/#480
=== PAUSE TestDefaultConfig/#480
=== RUN   TestDefaultConfig/#481
=== PAUSE TestDefaultConfig/#481
=== RUN   TestDefaultConfig/#482
=== PAUSE TestDefaultConfig/#482
=== RUN   TestDefaultConfig/#483
=== PAUSE TestDefaultConfig/#483
=== RUN   TestDefaultConfig/#484
=== PAUSE TestDefaultConfig/#484
=== RUN   TestDefaultConfig/#485
=== PAUSE TestDefaultConfig/#485
=== RUN   TestDefaultConfig/#486
=== PAUSE TestDefaultConfig/#486
=== RUN   TestDefaultConfig/#487
=== PAUSE TestDefaultConfig/#487
=== RUN   TestDefaultConfig/#488
=== PAUSE TestDefaultConfig/#488
=== RUN   TestDefaultConfig/#489
=== PAUSE TestDefaultConfig/#489
=== RUN   TestDefaultConfig/#490
=== PAUSE TestDefaultConfig/#490
=== RUN   TestDefaultConfig/#491
=== PAUSE TestDefaultConfig/#491
=== RUN   TestDefaultConfig/#492
=== PAUSE TestDefaultConfig/#492
=== RUN   TestDefaultConfig/#493
=== PAUSE TestDefaultConfig/#493
=== RUN   TestDefaultConfig/#494
=== PAUSE TestDefaultConfig/#494
=== RUN   TestDefaultConfig/#495
=== PAUSE TestDefaultConfig/#495
=== RUN   TestDefaultConfig/#496
=== PAUSE TestDefaultConfig/#496
=== RUN   TestDefaultConfig/#497
=== PAUSE TestDefaultConfig/#497
=== RUN   TestDefaultConfig/#498
=== PAUSE TestDefaultConfig/#498
=== RUN   TestDefaultConfig/#499
=== PAUSE TestDefaultConfig/#499
=== CONT  TestDefaultConfig/#00
=== CONT  TestDefaultConfig/#499
=== CONT  TestDefaultConfig/#435
=== CONT  TestDefaultConfig/#385
=== CONT  TestDefaultConfig/#242
=== CONT  TestDefaultConfig/#241
=== CONT  TestDefaultConfig/#240
=== CONT  TestDefaultConfig/#239
jones - 2019/11/27 02:19:30.429136 [INFO] agent: Synced service "web1-sidecar-proxy"
jones - 2019/11/27 02:19:30.429227 [DEBUG] agent: Node info in sync
=== CONT  TestDefaultConfig/#238
=== CONT  TestDefaultConfig/#237
=== CONT  TestDefaultConfig/#236
=== CONT  TestDefaultConfig/#235
=== CONT  TestDefaultConfig/#234
=== CONT  TestDefaultConfig/#233
=== CONT  TestDefaultConfig/#232
=== CONT  TestDefaultConfig/#231
=== CONT  TestDefaultConfig/#230
=== CONT  TestDefaultConfig/#229
=== CONT  TestDefaultConfig/#228
=== CONT  TestDefaultConfig/#227
=== CONT  TestDefaultConfig/#226
=== CONT  TestDefaultConfig/#225
=== CONT  TestDefaultConfig/#224
=== CONT  TestDefaultConfig/#349
=== CONT  TestDefaultConfig/#223
=== CONT  TestDefaultConfig/#222
=== CONT  TestDefaultConfig/#221
=== CONT  TestDefaultConfig/#220
=== CONT  TestDefaultConfig/#219
=== CONT  TestDefaultConfig/#218
=== CONT  TestDefaultConfig/#217
=== CONT  TestDefaultConfig/#216
=== CONT  TestDefaultConfig/#215
=== CONT  TestDefaultConfig/#214
=== CONT  TestDefaultConfig/#213
=== CONT  TestDefaultConfig/#212
=== CONT  TestDefaultConfig/#211
=== CONT  TestDefaultConfig/#210
=== CONT  TestDefaultConfig/#209
jones - 2019/11/27 02:19:31.967147 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:19:31.967567 [DEBUG] consul: Skipping self join check for "Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22" since the cluster is too small
jones - 2019/11/27 02:19:31.967739 [INFO] consul: member 'Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22' joined, marking health alive
=== CONT  TestDefaultConfig/#208
=== CONT  TestDefaultConfig/#153
=== CONT  TestDefaultConfig/#207
=== CONT  TestDefaultConfig/#206
=== CONT  TestDefaultConfig/#184
=== CONT  TestDefaultConfig/#205
=== CONT  TestDefaultConfig/#204
=== CONT  TestDefaultConfig/#203
=== CONT  TestDefaultConfig/#202
=== CONT  TestDefaultConfig/#201
=== CONT  TestDefaultConfig/#200
=== CONT  TestDefaultConfig/#199
jones - 2019/11/27 02:19:32.622613 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:19:32.622718 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/11/27 02:19:32.622762 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:19:32.622852 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/11/27 02:19:32.622895 [DEBUG] agent: Node info in sync
=== CONT  TestDefaultConfig/#198
=== CONT  TestDefaultConfig/#197
=== CONT  TestDefaultConfig/#196
=== CONT  TestDefaultConfig/#195
=== CONT  TestDefaultConfig/#194
=== CONT  TestDefaultConfig/#193
=== CONT  TestDefaultConfig/#192
=== CONT  TestDefaultConfig/#191
=== CONT  TestDefaultConfig/#190
=== CONT  TestDefaultConfig/#189
=== CONT  TestDefaultConfig/#188
=== CONT  TestDefaultConfig/#187
=== CONT  TestDefaultConfig/#186
=== CONT  TestDefaultConfig/#185
=== CONT  TestDefaultConfig/#183
=== CONT  TestDefaultConfig/#182
=== CONT  TestDefaultConfig/#181
=== CONT  TestDefaultConfig/#180
=== CONT  TestDefaultConfig/#179
=== CONT  TestDefaultConfig/#178
=== CONT  TestDefaultConfig/#177
=== CONT  TestDefaultConfig/#176
=== CONT  TestDefaultConfig/#175
=== CONT  TestDefaultConfig/#174
=== CONT  TestDefaultConfig/#173
=== CONT  TestDefaultConfig/#172
=== CONT  TestDefaultConfig/#171
=== CONT  TestDefaultConfig/#170
=== CONT  TestDefaultConfig/#169
=== CONT  TestDefaultConfig/#168
=== CONT  TestDefaultConfig/#167
=== CONT  TestDefaultConfig/#166
=== CONT  TestDefaultConfig/#165
=== CONT  TestDefaultConfig/#348
=== CONT  TestDefaultConfig/#164
=== CONT  TestDefaultConfig/#163
=== CONT  TestDefaultConfig/#162
=== CONT  TestDefaultConfig/#161
=== CONT  TestDefaultConfig/#160
=== CONT  TestDefaultConfig/#159
=== CONT  TestDefaultConfig/#158
=== CONT  TestDefaultConfig/#157
=== CONT  TestDefaultConfig/#156
=== CONT  TestDefaultConfig/#155
=== CONT  TestDefaultConfig/#154
=== CONT  TestDefaultConfig/#152
=== CONT  TestDefaultConfig/#151
=== CONT  TestDefaultConfig/#150
=== CONT  TestDefaultConfig/#149
=== CONT  TestDefaultConfig/#148
=== CONT  TestDefaultConfig/#147
=== CONT  TestDefaultConfig/#146
=== CONT  TestDefaultConfig/#145
=== CONT  TestDefaultConfig/#144
=== CONT  TestDefaultConfig/#143
=== CONT  TestDefaultConfig/#142
=== CONT  TestDefaultConfig/#141
=== CONT  TestDefaultConfig/#140
=== CONT  TestDefaultConfig/#139
=== CONT  TestDefaultConfig/#138
=== CONT  TestDefaultConfig/#137
=== CONT  TestDefaultConfig/#136
=== CONT  TestDefaultConfig/#135
=== CONT  TestDefaultConfig/#134
=== CONT  TestDefaultConfig/#133
=== CONT  TestDefaultConfig/#132
=== CONT  TestDefaultConfig/#131
=== CONT  TestDefaultConfig/#130
=== CONT  TestDefaultConfig/#129
=== CONT  TestDefaultConfig/#128
=== CONT  TestDefaultConfig/#127
=== CONT  TestDefaultConfig/#126
=== CONT  TestDefaultConfig/#125
=== CONT  TestDefaultConfig/#124
=== CONT  TestDefaultConfig/#123
=== CONT  TestDefaultConfig/#122
=== CONT  TestDefaultConfig/#121
=== CONT  TestDefaultConfig/#120
=== CONT  TestDefaultConfig/#119
=== CONT  TestDefaultConfig/#118
=== CONT  TestDefaultConfig/#117
=== CONT  TestDefaultConfig/#116
=== CONT  TestDefaultConfig/#115
=== CONT  TestDefaultConfig/#113
=== CONT  TestDefaultConfig/#112
=== CONT  TestDefaultConfig/#111
=== CONT  TestDefaultConfig/#110
=== CONT  TestDefaultConfig/#109
=== CONT  TestDefaultConfig/#108
=== CONT  TestDefaultConfig/#107
=== CONT  TestDefaultConfig/#105
=== CONT  TestDefaultConfig/#106
=== CONT  TestDefaultConfig/#104
=== CONT  TestDefaultConfig/#103
=== CONT  TestDefaultConfig/#347
=== CONT  TestDefaultConfig/#102
=== CONT  TestDefaultConfig/#101
=== CONT  TestDefaultConfig/#100
=== CONT  TestDefaultConfig/#99
=== CONT  TestDefaultConfig/#98
=== CONT  TestDefaultConfig/#97
=== CONT  TestDefaultConfig/#96
=== CONT  TestDefaultConfig/#95
=== CONT  TestDefaultConfig/#94
=== CONT  TestDefaultConfig/#93
=== CONT  TestDefaultConfig/#92
=== CONT  TestDefaultConfig/#91
=== CONT  TestDefaultConfig/#90
=== CONT  TestDefaultConfig/#89
=== CONT  TestDefaultConfig/#88
=== CONT  TestDefaultConfig/#87
=== CONT  TestDefaultConfig/#86
=== CONT  TestDefaultConfig/#85
=== CONT  TestDefaultConfig/#84
=== CONT  TestDefaultConfig/#83
=== CONT  TestDefaultConfig/#82
=== CONT  TestDefaultConfig/#81
=== CONT  TestDefaultConfig/#80
=== CONT  TestDefaultConfig/#79
=== CONT  TestDefaultConfig/#78
=== CONT  TestDefaultConfig/#77
=== CONT  TestDefaultConfig/#76
=== CONT  TestDefaultConfig/#75
=== CONT  TestDefaultConfig/#74
=== CONT  TestDefaultConfig/#73
=== CONT  TestDefaultConfig/#72
=== CONT  TestDefaultConfig/#71
=== CONT  TestDefaultConfig/#70
=== CONT  TestDefaultConfig/#69
=== CONT  TestDefaultConfig/#68
=== CONT  TestDefaultConfig/#67
=== CONT  TestDefaultConfig/#66
=== CONT  TestDefaultConfig/#65
=== CONT  TestDefaultConfig/#64
=== CONT  TestDefaultConfig/#63
=== CONT  TestDefaultConfig/#62
=== CONT  TestDefaultConfig/#61
=== CONT  TestDefaultConfig/#60
=== CONT  TestDefaultConfig/#59
=== CONT  TestDefaultConfig/#58
=== CONT  TestDefaultConfig/#57
=== CONT  TestDefaultConfig/#56
=== CONT  TestDefaultConfig/#55
=== CONT  TestDefaultConfig/#54
=== CONT  TestDefaultConfig/#53
=== CONT  TestDefaultConfig/#52
=== CONT  TestDefaultConfig/#51
=== CONT  TestDefaultConfig/#50
=== CONT  TestDefaultConfig/#49
=== CONT  TestDefaultConfig/#48
=== CONT  TestDefaultConfig/#47
=== CONT  TestDefaultConfig/#46
=== CONT  TestDefaultConfig/#45
=== CONT  TestDefaultConfig/#44
=== CONT  TestDefaultConfig/#43
=== CONT  TestDefaultConfig/#346
=== CONT  TestDefaultConfig/#42
=== CONT  TestDefaultConfig/#41
=== CONT  TestDefaultConfig/#40
=== CONT  TestDefaultConfig/#39
=== CONT  TestDefaultConfig/#38
=== CONT  TestDefaultConfig/#37
=== CONT  TestDefaultConfig/#36
=== CONT  TestDefaultConfig/#35
=== CONT  TestDefaultConfig/#34
=== CONT  TestDefaultConfig/#33
=== CONT  TestDefaultConfig/#32
=== CONT  TestDefaultConfig/#31
=== CONT  TestDefaultConfig/#30
=== CONT  TestDefaultConfig/#29
=== CONT  TestDefaultConfig/#28
=== CONT  TestDefaultConfig/#27
=== CONT  TestDefaultConfig/#26
=== CONT  TestDefaultConfig/#25
=== CONT  TestDefaultConfig/#24
=== CONT  TestDefaultConfig/#23
=== CONT  TestDefaultConfig/#22
=== CONT  TestDefaultConfig/#21
=== CONT  TestDefaultConfig/#20
=== CONT  TestDefaultConfig/#19
=== CONT  TestDefaultConfig/#18
=== CONT  TestDefaultConfig/#17
=== CONT  TestDefaultConfig/#16
=== CONT  TestDefaultConfig/#15
=== CONT  TestDefaultConfig/#14
=== CONT  TestDefaultConfig/#13
=== CONT  TestDefaultConfig/#12
=== CONT  TestDefaultConfig/#11
=== CONT  TestDefaultConfig/#10
=== CONT  TestDefaultConfig/#09
=== CONT  TestDefaultConfig/#08
=== CONT  TestDefaultConfig/#07
=== CONT  TestDefaultConfig/#06
=== CONT  TestDefaultConfig/#05
=== CONT  TestDefaultConfig/#04
=== CONT  TestDefaultConfig/#03
=== CONT  TestDefaultConfig/#02
=== CONT  TestDefaultConfig/#01
=== CONT  TestDefaultConfig/#345
=== CONT  TestDefaultConfig/#344
=== CONT  TestDefaultConfig/#343
=== CONT  TestDefaultConfig/#342
=== CONT  TestDefaultConfig/#341
=== CONT  TestDefaultConfig/#340
=== CONT  TestDefaultConfig/#338
=== CONT  TestDefaultConfig/#339
=== CONT  TestDefaultConfig/#337
=== CONT  TestDefaultConfig/#336
=== CONT  TestDefaultConfig/#335
=== CONT  TestDefaultConfig/#334
=== CONT  TestDefaultConfig/#333
=== CONT  TestDefaultConfig/#332
=== CONT  TestDefaultConfig/#331
=== CONT  TestDefaultConfig/#330
=== CONT  TestDefaultConfig/#329
=== CONT  TestDefaultConfig/#328
=== CONT  TestDefaultConfig/#318
=== CONT  TestDefaultConfig/#327
=== CONT  TestDefaultConfig/#326
=== CONT  TestDefaultConfig/#325
=== CONT  TestDefaultConfig/#324
=== CONT  TestDefaultConfig/#323
=== CONT  TestDefaultConfig/#322
=== CONT  TestDefaultConfig/#321
=== CONT  TestDefaultConfig/#320
=== CONT  TestDefaultConfig/#319
=== CONT  TestDefaultConfig/#317
=== CONT  TestDefaultConfig/#316
=== CONT  TestDefaultConfig/#315
=== CONT  TestDefaultConfig/#314
=== CONT  TestDefaultConfig/#313
=== CONT  TestDefaultConfig/#312
=== CONT  TestDefaultConfig/#311
=== CONT  TestDefaultConfig/#310
=== CONT  TestDefaultConfig/#309
=== CONT  TestDefaultConfig/#308
=== CONT  TestDefaultConfig/#307
=== CONT  TestDefaultConfig/#306
=== CONT  TestDefaultConfig/#305
=== CONT  TestDefaultConfig/#304
=== CONT  TestDefaultConfig/#303
=== CONT  TestDefaultConfig/#302
=== CONT  TestDefaultConfig/#301
=== CONT  TestDefaultConfig/#300
=== CONT  TestDefaultConfig/#299
=== CONT  TestDefaultConfig/#298
=== CONT  TestDefaultConfig/#297
=== CONT  TestDefaultConfig/#296
=== CONT  TestDefaultConfig/#295
=== CONT  TestDefaultConfig/#294
=== CONT  TestDefaultConfig/#293
=== CONT  TestDefaultConfig/#292
=== CONT  TestDefaultConfig/#291
=== CONT  TestDefaultConfig/#290
=== CONT  TestDefaultConfig/#289
=== CONT  TestDefaultConfig/#288
=== CONT  TestDefaultConfig/#287
=== CONT  TestDefaultConfig/#286
=== CONT  TestDefaultConfig/#285
=== CONT  TestDefaultConfig/#284
=== CONT  TestDefaultConfig/#283
=== CONT  TestDefaultConfig/#282
=== CONT  TestDefaultConfig/#281
=== CONT  TestDefaultConfig/#280
=== CONT  TestDefaultConfig/#279
=== CONT  TestDefaultConfig/#278
=== CONT  TestDefaultConfig/#277
=== CONT  TestDefaultConfig/#276
=== CONT  TestDefaultConfig/#275
=== CONT  TestDefaultConfig/#274
=== CONT  TestDefaultConfig/#273
=== CONT  TestDefaultConfig/#272
=== CONT  TestDefaultConfig/#271
=== CONT  TestDefaultConfig/#270
=== CONT  TestDefaultConfig/#269
=== CONT  TestDefaultConfig/#268
=== CONT  TestDefaultConfig/#267
=== CONT  TestDefaultConfig/#261
=== CONT  TestDefaultConfig/#266
=== CONT  TestDefaultConfig/#265
=== CONT  TestDefaultConfig/#264
=== CONT  TestDefaultConfig/#263
=== CONT  TestDefaultConfig/#262
=== CONT  TestDefaultConfig/#260
=== CONT  TestDefaultConfig/#498
=== CONT  TestDefaultConfig/#497
=== CONT  TestDefaultConfig/#496
=== CONT  TestDefaultConfig/#495
=== CONT  TestDefaultConfig/#494
=== CONT  TestDefaultConfig/#493
=== CONT  TestDefaultConfig/#492
=== CONT  TestDefaultConfig/#491
=== CONT  TestDefaultConfig/#490
=== CONT  TestDefaultConfig/#489
=== CONT  TestDefaultConfig/#488
=== CONT  TestDefaultConfig/#487
=== CONT  TestDefaultConfig/#486
=== CONT  TestDefaultConfig/#477
=== CONT  TestDefaultConfig/#485
=== CONT  TestDefaultConfig/#484
=== CONT  TestDefaultConfig/#483
=== CONT  TestDefaultConfig/#482
=== CONT  TestDefaultConfig/#481
=== CONT  TestDefaultConfig/#480
=== CONT  TestDefaultConfig/#479
=== CONT  TestDefaultConfig/#478
=== CONT  TestDefaultConfig/#476
=== CONT  TestDefaultConfig/#475
=== CONT  TestDefaultConfig/#474
=== CONT  TestDefaultConfig/#473
=== CONT  TestDefaultConfig/#472
=== CONT  TestDefaultConfig/#471
=== CONT  TestDefaultConfig/#470
=== CONT  TestDefaultConfig/#469
=== CONT  TestDefaultConfig/#408
=== CONT  TestDefaultConfig/#468
=== CONT  TestDefaultConfig/#467
=== CONT  TestDefaultConfig/#466
=== CONT  TestDefaultConfig/#465
=== CONT  TestDefaultConfig/#464
=== CONT  TestDefaultConfig/#463
=== CONT  TestDefaultConfig/#462
=== CONT  TestDefaultConfig/#461
=== CONT  TestDefaultConfig/#460
=== CONT  TestDefaultConfig/#459
=== CONT  TestDefaultConfig/#458
=== CONT  TestDefaultConfig/#457
=== CONT  TestDefaultConfig/#456
=== CONT  TestDefaultConfig/#455
=== CONT  TestDefaultConfig/#454
=== CONT  TestDefaultConfig/#453
=== CONT  TestDefaultConfig/#452
=== CONT  TestDefaultConfig/#451
=== CONT  TestDefaultConfig/#450
=== CONT  TestDefaultConfig/#254
=== CONT  TestDefaultConfig/#449
=== CONT  TestDefaultConfig/#448
=== CONT  TestDefaultConfig/#447
=== CONT  TestDefaultConfig/#446
=== CONT  TestDefaultConfig/#445
=== CONT  TestDefaultConfig/#444
=== CONT  TestDefaultConfig/#443
=== CONT  TestDefaultConfig/#442
=== CONT  TestDefaultConfig/#441
=== CONT  TestDefaultConfig/#440
=== CONT  TestDefaultConfig/#439
=== CONT  TestDefaultConfig/#438
=== CONT  TestDefaultConfig/#437
=== CONT  TestDefaultConfig/#436
=== CONT  TestDefaultConfig/#253
=== CONT  TestDefaultConfig/#252
=== CONT  TestDefaultConfig/#251
=== CONT  TestDefaultConfig/#250
=== CONT  TestDefaultConfig/#249
=== CONT  TestDefaultConfig/#248
=== CONT  TestDefaultConfig/#247
=== CONT  TestDefaultConfig/#246
=== CONT  TestDefaultConfig/#245
=== CONT  TestDefaultConfig/#244
=== CONT  TestDefaultConfig/#114
=== CONT  TestDefaultConfig/#434
=== CONT  TestDefaultConfig/#433
=== CONT  TestDefaultConfig/#432
=== CONT  TestDefaultConfig/#431
=== CONT  TestDefaultConfig/#430
=== CONT  TestDefaultConfig/#429
=== CONT  TestDefaultConfig/#428
=== CONT  TestDefaultConfig/#427
=== CONT  TestDefaultConfig/#426
=== CONT  TestDefaultConfig/#425
=== CONT  TestDefaultConfig/#424
=== CONT  TestDefaultConfig/#423
=== CONT  TestDefaultConfig/#422
=== CONT  TestDefaultConfig/#421
=== CONT  TestDefaultConfig/#420
=== CONT  TestDefaultConfig/#419
=== CONT  TestDefaultConfig/#418
=== CONT  TestDefaultConfig/#417
=== CONT  TestDefaultConfig/#416
=== CONT  TestDefaultConfig/#415
=== CONT  TestDefaultConfig/#414
=== CONT  TestDefaultConfig/#413
=== CONT  TestDefaultConfig/#412
=== CONT  TestDefaultConfig/#411
=== CONT  TestDefaultConfig/#410
=== CONT  TestDefaultConfig/#409
=== CONT  TestDefaultConfig/#407
=== CONT  TestDefaultConfig/#406
=== CONT  TestDefaultConfig/#405
=== CONT  TestDefaultConfig/#384
=== CONT  TestDefaultConfig/#383
=== CONT  TestDefaultConfig/#382
=== CONT  TestDefaultConfig/#381
=== CONT  TestDefaultConfig/#380
=== CONT  TestDefaultConfig/#379
=== CONT  TestDefaultConfig/#378
=== CONT  TestDefaultConfig/#377
=== CONT  TestDefaultConfig/#376
=== CONT  TestDefaultConfig/#375
=== CONT  TestDefaultConfig/#374
=== CONT  TestDefaultConfig/#373
=== CONT  TestDefaultConfig/#372
=== CONT  TestDefaultConfig/#243
=== CONT  TestDefaultConfig/#371
=== CONT  TestDefaultConfig/#370
=== CONT  TestDefaultConfig/#369
=== CONT  TestDefaultConfig/#368
=== CONT  TestDefaultConfig/#367
=== CONT  TestDefaultConfig/#366
=== CONT  TestDefaultConfig/#365
=== CONT  TestDefaultConfig/#364
=== CONT  TestDefaultConfig/#363
=== CONT  TestDefaultConfig/#362
=== CONT  TestDefaultConfig/#361
=== CONT  TestDefaultConfig/#360
=== CONT  TestDefaultConfig/#359
=== CONT  TestDefaultConfig/#358
=== CONT  TestDefaultConfig/#357
=== CONT  TestDefaultConfig/#356
=== CONT  TestDefaultConfig/#355
=== CONT  TestDefaultConfig/#354
=== CONT  TestDefaultConfig/#353
=== CONT  TestDefaultConfig/#352
=== CONT  TestDefaultConfig/#351
=== CONT  TestDefaultConfig/#350
=== CONT  TestDefaultConfig/#259
=== CONT  TestDefaultConfig/#258
=== CONT  TestDefaultConfig/#257
=== CONT  TestDefaultConfig/#256
=== CONT  TestDefaultConfig/#255
=== CONT  TestDefaultConfig/#404
=== CONT  TestDefaultConfig/#403
=== CONT  TestDefaultConfig/#402
=== CONT  TestDefaultConfig/#401
=== CONT  TestDefaultConfig/#400
=== CONT  TestDefaultConfig/#399
=== CONT  TestDefaultConfig/#398
=== CONT  TestDefaultConfig/#397
=== CONT  TestDefaultConfig/#396
=== CONT  TestDefaultConfig/#395
=== CONT  TestDefaultConfig/#394
=== CONT  TestDefaultConfig/#393
=== CONT  TestDefaultConfig/#392
=== CONT  TestDefaultConfig/#391
=== CONT  TestDefaultConfig/#390
=== CONT  TestDefaultConfig/#389
=== CONT  TestDefaultConfig/#388
=== CONT  TestDefaultConfig/#387
=== CONT  TestDefaultConfig/#386
--- PASS: TestDefaultConfig (0.17s)
    --- PASS: TestDefaultConfig/#385 (0.22s)
    --- PASS: TestDefaultConfig/#499 (0.25s)
    --- PASS: TestDefaultConfig/#00 (0.27s)
    --- PASS: TestDefaultConfig/#435 (0.29s)
    --- PASS: TestDefaultConfig/#242 (0.16s)
    --- PASS: TestDefaultConfig/#240 (0.15s)
    --- PASS: TestDefaultConfig/#241 (0.19s)
    --- PASS: TestDefaultConfig/#239 (0.15s)
    --- PASS: TestDefaultConfig/#238 (0.12s)
    --- PASS: TestDefaultConfig/#237 (0.11s)
    --- PASS: TestDefaultConfig/#235 (0.19s)
    --- PASS: TestDefaultConfig/#236 (0.23s)
    --- PASS: TestDefaultConfig/#234 (0.21s)
    --- PASS: TestDefaultConfig/#233 (0.19s)
    --- PASS: TestDefaultConfig/#231 (0.12s)
    --- PASS: TestDefaultConfig/#232 (0.14s)
    --- PASS: TestDefaultConfig/#230 (0.10s)
    --- PASS: TestDefaultConfig/#229 (0.15s)
    --- PASS: TestDefaultConfig/#228 (0.23s)
    --- PASS: TestDefaultConfig/#227 (0.24s)
    --- PASS: TestDefaultConfig/#226 (0.28s)
    --- PASS: TestDefaultConfig/#225 (0.18s)
    --- PASS: TestDefaultConfig/#349 (0.10s)
    --- PASS: TestDefaultConfig/#224 (0.15s)
    --- PASS: TestDefaultConfig/#223 (0.12s)
    --- PASS: TestDefaultConfig/#222 (0.27s)
    --- PASS: TestDefaultConfig/#220 (0.23s)
    --- PASS: TestDefaultConfig/#219 (0.24s)
    --- PASS: TestDefaultConfig/#221 (0.31s)
    --- PASS: TestDefaultConfig/#218 (0.15s)
    --- PASS: TestDefaultConfig/#217 (0.14s)
    --- PASS: TestDefaultConfig/#216 (0.20s)
    --- PASS: TestDefaultConfig/#215 (0.25s)
    --- PASS: TestDefaultConfig/#213 (0.27s)
    --- PASS: TestDefaultConfig/#214 (0.31s)
    --- PASS: TestDefaultConfig/#211 (0.17s)
    --- PASS: TestDefaultConfig/#212 (0.25s)
    --- PASS: TestDefaultConfig/#210 (0.11s)
    --- PASS: TestDefaultConfig/#209 (0.19s)
    --- PASS: TestDefaultConfig/#153 (0.19s)
    --- PASS: TestDefaultConfig/#208 (0.25s)
    --- PASS: TestDefaultConfig/#207 (0.21s)
    --- PASS: TestDefaultConfig/#206 (0.20s)
    --- PASS: TestDefaultConfig/#204 (0.12s)
    --- PASS: TestDefaultConfig/#184 (0.21s)
    --- PASS: TestDefaultConfig/#205 (0.18s)
    --- PASS: TestDefaultConfig/#203 (0.18s)
    --- PASS: TestDefaultConfig/#200 (0.23s)
    --- PASS: TestDefaultConfig/#201 (0.26s)
    --- PASS: TestDefaultConfig/#202 (0.30s)
    --- PASS: TestDefaultConfig/#199 (0.22s)
    --- PASS: TestDefaultConfig/#196 (0.12s)
    --- PASS: TestDefaultConfig/#198 (0.16s)
    --- PASS: TestDefaultConfig/#197 (0.20s)
    --- PASS: TestDefaultConfig/#195 (0.35s)
    --- PASS: TestDefaultConfig/#194 (0.34s)
    --- PASS: TestDefaultConfig/#193 (0.34s)
    --- PASS: TestDefaultConfig/#192 (0.36s)
    --- PASS: TestDefaultConfig/#191 (0.16s)
    --- PASS: TestDefaultConfig/#190 (0.17s)
    --- PASS: TestDefaultConfig/#189 (0.14s)
    --- PASS: TestDefaultConfig/#188 (0.18s)
    --- PASS: TestDefaultConfig/#187 (0.32s)
    --- PASS: TestDefaultConfig/#185 (0.27s)
    --- PASS: TestDefaultConfig/#186 (0.31s)
    --- PASS: TestDefaultConfig/#183 (0.28s)
    --- PASS: TestDefaultConfig/#181 (0.16s)
    --- PASS: TestDefaultConfig/#182 (0.18s)
    --- PASS: TestDefaultConfig/#180 (0.18s)
    --- PASS: TestDefaultConfig/#177 (0.24s)
    --- PASS: TestDefaultConfig/#179 (0.28s)
    --- PASS: TestDefaultConfig/#178 (0.27s)
    --- PASS: TestDefaultConfig/#175 (0.16s)
    --- PASS: TestDefaultConfig/#176 (0.37s)
    --- PASS: TestDefaultConfig/#174 (0.20s)
    --- PASS: TestDefaultConfig/#173 (0.19s)
    --- PASS: TestDefaultConfig/#172 (0.22s)
    --- PASS: TestDefaultConfig/#171 (0.29s)
    --- PASS: TestDefaultConfig/#169 (0.32s)
    --- PASS: TestDefaultConfig/#168 (0.18s)
    --- PASS: TestDefaultConfig/#170 (0.43s)
    --- PASS: TestDefaultConfig/#167 (0.18s)
    --- PASS: TestDefaultConfig/#166 (0.18s)
    --- PASS: TestDefaultConfig/#165 (0.22s)
    --- PASS: TestDefaultConfig/#348 (0.32s)
    --- PASS: TestDefaultConfig/#163 (0.26s)
    --- PASS: TestDefaultConfig/#164 (0.35s)
    --- PASS: TestDefaultConfig/#162 (0.22s)
    --- PASS: TestDefaultConfig/#161 (0.13s)
    --- PASS: TestDefaultConfig/#160 (0.12s)
    --- PASS: TestDefaultConfig/#159 (0.09s)
    --- PASS: TestDefaultConfig/#158 (0.24s)
    --- PASS: TestDefaultConfig/#157 (0.25s)
    --- PASS: TestDefaultConfig/#156 (0.25s)
    --- PASS: TestDefaultConfig/#155 (0.25s)
    --- PASS: TestDefaultConfig/#151 (0.15s)
    --- PASS: TestDefaultConfig/#150 (0.16s)
    --- PASS: TestDefaultConfig/#154 (0.25s)
    --- PASS: TestDefaultConfig/#152 (0.20s)
    --- PASS: TestDefaultConfig/#149 (0.37s)
    --- PASS: TestDefaultConfig/#147 (0.35s)
    --- PASS: TestDefaultConfig/#148 (0.37s)
    --- PASS: TestDefaultConfig/#146 (0.42s)
    --- PASS: TestDefaultConfig/#145 (0.19s)
    --- PASS: TestDefaultConfig/#144 (0.20s)
    --- PASS: TestDefaultConfig/#143 (0.19s)
    --- PASS: TestDefaultConfig/#142 (0.27s)
    --- PASS: TestDefaultConfig/#141 (0.35s)
    --- PASS: TestDefaultConfig/#139 (0.35s)
    --- PASS: TestDefaultConfig/#140 (0.42s)
    --- PASS: TestDefaultConfig/#138 (0.26s)
    --- PASS: TestDefaultConfig/#137 (0.17s)
    --- PASS: TestDefaultConfig/#136 (0.17s)
    --- PASS: TestDefaultConfig/#135 (0.18s)
    --- PASS: TestDefaultConfig/#134 (0.21s)
    --- PASS: TestDefaultConfig/#133 (0.26s)
    --- PASS: TestDefaultConfig/#131 (0.26s)
    --- PASS: TestDefaultConfig/#132 (0.32s)
    --- PASS: TestDefaultConfig/#130 (0.27s)
    --- PASS: TestDefaultConfig/#129 (0.16s)
    --- PASS: TestDefaultConfig/#128 (0.16s)
    --- PASS: TestDefaultConfig/#127 (0.19s)
    --- PASS: TestDefaultConfig/#126 (0.39s)
    --- PASS: TestDefaultConfig/#125 (0.44s)
    --- PASS: TestDefaultConfig/#124 (0.41s)
    --- PASS: TestDefaultConfig/#123 (0.43s)
    --- PASS: TestDefaultConfig/#122 (0.25s)
    --- PASS: TestDefaultConfig/#121 (0.22s)
    --- PASS: TestDefaultConfig/#119 (0.24s)
    --- PASS: TestDefaultConfig/#120 (0.32s)
    --- PASS: TestDefaultConfig/#118 (0.27s)
    --- PASS: TestDefaultConfig/#117 (0.25s)
    --- PASS: TestDefaultConfig/#116 (0.20s)
    --- PASS: TestDefaultConfig/#115 (0.20s)
    --- PASS: TestDefaultConfig/#112 (0.14s)
    --- PASS: TestDefaultConfig/#113 (0.16s)
    --- PASS: TestDefaultConfig/#111 (0.17s)
    --- PASS: TestDefaultConfig/#108 (0.21s)
    --- PASS: TestDefaultConfig/#110 (0.23s)
    --- PASS: TestDefaultConfig/#109 (0.25s)
    --- PASS: TestDefaultConfig/#107 (0.14s)
    --- PASS: TestDefaultConfig/#105 (0.11s)
    --- PASS: TestDefaultConfig/#106 (0.14s)
    --- PASS: TestDefaultConfig/#104 (0.13s)
    --- PASS: TestDefaultConfig/#103 (0.13s)
    --- PASS: TestDefaultConfig/#347 (0.24s)
    --- PASS: TestDefaultConfig/#101 (0.26s)
    --- PASS: TestDefaultConfig/#100 (0.25s)
    --- PASS: TestDefaultConfig/#102 (0.30s)
    --- PASS: TestDefaultConfig/#99 (0.17s)
    --- PASS: TestDefaultConfig/#96 (0.12s)
    --- PASS: TestDefaultConfig/#98 (0.19s)
    --- PASS: TestDefaultConfig/#97 (0.20s)
    --- PASS: TestDefaultConfig/#95 (0.30s)
    --- PASS: TestDefaultConfig/#94 (0.27s)
    --- PASS: TestDefaultConfig/#92 (0.21s)
    --- PASS: TestDefaultConfig/#93 (0.28s)
    --- PASS: TestDefaultConfig/#90 (0.11s)
    --- PASS: TestDefaultConfig/#91 (0.15s)
    --- PASS: TestDefaultConfig/#89 (0.17s)
    --- PASS: TestDefaultConfig/#88 (0.24s)
    --- PASS: TestDefaultConfig/#87 (0.31s)
    --- PASS: TestDefaultConfig/#86 (0.29s)
    --- PASS: TestDefaultConfig/#84 (0.16s)
    --- PASS: TestDefaultConfig/#85 (0.29s)
    --- PASS: TestDefaultConfig/#82 (0.17s)
    --- PASS: TestDefaultConfig/#83 (0.19s)
    --- PASS: TestDefaultConfig/#81 (0.33s)
    --- PASS: TestDefaultConfig/#80 (0.37s)
    --- PASS: TestDefaultConfig/#79 (0.37s)
    --- PASS: TestDefaultConfig/#76 (0.15s)
    --- PASS: TestDefaultConfig/#77 (0.20s)
    --- PASS: TestDefaultConfig/#78 (0.42s)
    --- PASS: TestDefaultConfig/#75 (0.29s)
    --- PASS: TestDefaultConfig/#72 (0.27s)
    --- PASS: TestDefaultConfig/#74 (0.34s)
    --- PASS: TestDefaultConfig/#73 (0.36s)
    --- PASS: TestDefaultConfig/#71 (0.19s)
    --- PASS: TestDefaultConfig/#70 (0.21s)
    --- PASS: TestDefaultConfig/#68 (0.16s)
    --- PASS: TestDefaultConfig/#69 (0.21s)
    --- PASS: TestDefaultConfig/#67 (0.31s)
    --- PASS: TestDefaultConfig/#66 (0.26s)
    --- PASS: TestDefaultConfig/#65 (0.26s)
    --- PASS: TestDefaultConfig/#64 (0.26s)
    --- PASS: TestDefaultConfig/#63 (0.12s)
    --- PASS: TestDefaultConfig/#60 (0.09s)
    --- PASS: TestDefaultConfig/#62 (0.14s)
    --- PASS: TestDefaultConfig/#61 (0.15s)
    --- PASS: TestDefaultConfig/#59 (0.20s)
    --- PASS: TestDefaultConfig/#58 (0.26s)
    --- PASS: TestDefaultConfig/#56 (0.30s)
    --- PASS: TestDefaultConfig/#57 (0.34s)
    --- PASS: TestDefaultConfig/#55 (0.28s)
    --- PASS: TestDefaultConfig/#54 (0.28s)
    --- PASS: TestDefaultConfig/#52 (0.20s)
    --- PASS: TestDefaultConfig/#53 (0.24s)
    --- PASS: TestDefaultConfig/#51 (0.30s)
    --- PASS: TestDefaultConfig/#50 (0.30s)
    --- PASS: TestDefaultConfig/#49 (0.33s)
    --- PASS: TestDefaultConfig/#48 (0.37s)
    --- PASS: TestDefaultConfig/#47 (0.25s)
    --- PASS: TestDefaultConfig/#45 (0.17s)
    --- PASS: TestDefaultConfig/#46 (0.29s)
    --- PASS: TestDefaultConfig/#44 (0.30s)
    --- PASS: TestDefaultConfig/#346 (0.27s)
    --- PASS: TestDefaultConfig/#43 (0.33s)
    --- PASS: TestDefaultConfig/#42 (0.28s)
    --- PASS: TestDefaultConfig/#41 (0.19s)
    --- PASS: TestDefaultConfig/#40 (0.16s)
    --- PASS: TestDefaultConfig/#39 (0.16s)
    --- PASS: TestDefaultConfig/#38 (0.32s)
    --- PASS: TestDefaultConfig/#37 (0.32s)
    --- PASS: TestDefaultConfig/#36 (0.35s)
    --- PASS: TestDefaultConfig/#35 (0.34s)
    --- PASS: TestDefaultConfig/#34 (0.16s)
    --- PASS: TestDefaultConfig/#33 (0.16s)
    --- PASS: TestDefaultConfig/#31 (0.27s)
    --- PASS: TestDefaultConfig/#32 (0.32s)
    --- PASS: TestDefaultConfig/#30 (0.27s)
    --- PASS: TestDefaultConfig/#29 (0.25s)
    --- PASS: TestDefaultConfig/#26 (0.18s)
    --- PASS: TestDefaultConfig/#27 (0.19s)
    --- PASS: TestDefaultConfig/#25 (0.17s)
    --- PASS: TestDefaultConfig/#28 (0.25s)
    --- PASS: TestDefaultConfig/#23 (0.17s)
    --- PASS: TestDefaultConfig/#24 (0.25s)
    --- PASS: TestDefaultConfig/#22 (0.28s)
    --- PASS: TestDefaultConfig/#21 (0.28s)
    --- PASS: TestDefaultConfig/#20 (0.19s)
    --- PASS: TestDefaultConfig/#19 (0.17s)
    --- PASS: TestDefaultConfig/#18 (0.17s)
    --- PASS: TestDefaultConfig/#17 (0.18s)
    --- PASS: TestDefaultConfig/#15 (0.25s)
    --- PASS: TestDefaultConfig/#16 (0.31s)
    --- PASS: TestDefaultConfig/#13 (0.26s)
    --- PASS: TestDefaultConfig/#14 (0.29s)
    --- PASS: TestDefaultConfig/#12 (0.13s)
    --- PASS: TestDefaultConfig/#11 (0.13s)
    --- PASS: TestDefaultConfig/#08 (0.13s)
    --- PASS: TestDefaultConfig/#09 (0.18s)
    --- PASS: TestDefaultConfig/#10 (0.21s)
    --- PASS: TestDefaultConfig/#07 (0.30s)
    --- PASS: TestDefaultConfig/#04 (0.21s)
    --- PASS: TestDefaultConfig/#06 (0.25s)
    --- PASS: TestDefaultConfig/#05 (0.28s)
    --- PASS: TestDefaultConfig/#03 (0.17s)
    --- PASS: TestDefaultConfig/#02 (0.20s)
    --- PASS: TestDefaultConfig/#01 (0.28s)
    --- PASS: TestDefaultConfig/#345 (0.32s)
    --- PASS: TestDefaultConfig/#344 (0.36s)
    --- PASS: TestDefaultConfig/#342 (0.25s)
    --- PASS: TestDefaultConfig/#341 (0.19s)
    --- PASS: TestDefaultConfig/#343 (0.38s)
    --- PASS: TestDefaultConfig/#340 (0.15s)
    --- PASS: TestDefaultConfig/#337 (0.26s)
    --- PASS: TestDefaultConfig/#338 (0.28s)
    --- PASS: TestDefaultConfig/#339 (0.29s)
    --- PASS: TestDefaultConfig/#336 (0.32s)
    --- PASS: TestDefaultConfig/#334 (0.22s)
    --- PASS: TestDefaultConfig/#335 (0.24s)
    --- PASS: TestDefaultConfig/#333 (0.22s)
    --- PASS: TestDefaultConfig/#332 (0.20s)
    --- PASS: TestDefaultConfig/#330 (0.22s)
    --- PASS: TestDefaultConfig/#329 (0.27s)
    --- PASS: TestDefaultConfig/#331 (0.30s)
    --- PASS: TestDefaultConfig/#318 (0.11s)
    --- PASS: TestDefaultConfig/#328 (0.25s)
    --- PASS: TestDefaultConfig/#327 (0.14s)
    --- PASS: TestDefaultConfig/#324 (0.08s)
    --- PASS: TestDefaultConfig/#326 (0.17s)
    --- PASS: TestDefaultConfig/#325 (0.13s)
    --- PASS: TestDefaultConfig/#323 (0.23s)
    --- PASS: TestDefaultConfig/#321 (0.24s)
    --- PASS: TestDefaultConfig/#322 (0.29s)
    --- PASS: TestDefaultConfig/#320 (0.32s)
    --- PASS: TestDefaultConfig/#319 (0.14s)
    --- PASS: TestDefaultConfig/#317 (0.16s)
    --- PASS: TestDefaultConfig/#316 (0.15s)
    --- PASS: TestDefaultConfig/#315 (0.16s)
    --- PASS: TestDefaultConfig/#314 (0.26s)
    --- PASS: TestDefaultConfig/#312 (0.18s)
    --- PASS: TestDefaultConfig/#311 (0.17s)
    --- PASS: TestDefaultConfig/#313 (0.30s)
    --- PASS: TestDefaultConfig/#310 (0.12s)
    --- PASS: TestDefaultConfig/#309 (0.12s)
    --- PASS: TestDefaultConfig/#307 (0.21s)
    --- PASS: TestDefaultConfig/#308 (0.27s)
    --- PASS: TestDefaultConfig/#306 (0.30s)
    --- PASS: TestDefaultConfig/#305 (0.27s)
    --- PASS: TestDefaultConfig/#304 (0.16s)
    --- PASS: TestDefaultConfig/#303 (0.17s)
    --- PASS: TestDefaultConfig/#301 (0.15s)
    --- PASS: TestDefaultConfig/#302 (0.15s)
    --- PASS: TestDefaultConfig/#299 (0.20s)
    --- PASS: TestDefaultConfig/#300 (0.26s)
    --- PASS: TestDefaultConfig/#297 (0.27s)
    --- PASS: TestDefaultConfig/#295 (0.11s)
    --- PASS: TestDefaultConfig/#298 (0.31s)
    --- PASS: TestDefaultConfig/#296 (0.18s)
    --- PASS: TestDefaultConfig/#294 (0.12s)
    --- PASS: TestDefaultConfig/#293 (0.21s)
    --- PASS: TestDefaultConfig/#292 (0.30s)
    --- PASS: TestDefaultConfig/#291 (0.30s)
    --- PASS: TestDefaultConfig/#290 (0.26s)
    --- PASS: TestDefaultConfig/#287 (0.11s)
    --- PASS: TestDefaultConfig/#289 (0.26s)
    --- PASS: TestDefaultConfig/#288 (0.14s)
    --- PASS: TestDefaultConfig/#286 (0.12s)
    --- PASS: TestDefaultConfig/#285 (0.18s)
    --- PASS: TestDefaultConfig/#283 (0.29s)
    --- PASS: TestDefaultConfig/#282 (0.34s)
    --- PASS: TestDefaultConfig/#284 (0.41s)
    --- PASS: TestDefaultConfig/#281 (0.30s)
    --- PASS: TestDefaultConfig/#280 (0.23s)
    --- PASS: TestDefaultConfig/#279 (0.20s)
    --- PASS: TestDefaultConfig/#278 (0.18s)
    --- PASS: TestDefaultConfig/#277 (0.32s)
    --- PASS: TestDefaultConfig/#275 (0.31s)
    --- PASS: TestDefaultConfig/#276 (0.37s)
    --- PASS: TestDefaultConfig/#274 (0.36s)
    --- PASS: TestDefaultConfig/#273 (0.22s)
    --- PASS: TestDefaultConfig/#272 (0.13s)
    --- PASS: TestDefaultConfig/#271 (0.12s)
    --- PASS: TestDefaultConfig/#270 (0.24s)
    --- PASS: TestDefaultConfig/#268 (0.31s)
    --- PASS: TestDefaultConfig/#269 (0.34s)
    --- PASS: TestDefaultConfig/#267 (0.33s)
    --- PASS: TestDefaultConfig/#261 (0.17s)
    --- PASS: TestDefaultConfig/#265 (0.10s)
    --- PASS: TestDefaultConfig/#266 (0.18s)
    --- PASS: TestDefaultConfig/#264 (0.16s)
    --- PASS: TestDefaultConfig/#263 (0.18s)
    --- PASS: TestDefaultConfig/#262 (0.32s)
    --- PASS: TestDefaultConfig/#498 (0.28s)
    --- PASS: TestDefaultConfig/#260 (0.31s)
    --- PASS: TestDefaultConfig/#497 (0.28s)
    --- PASS: TestDefaultConfig/#496 (0.11s)
    --- PASS: TestDefaultConfig/#494 (0.11s)
    --- PASS: TestDefaultConfig/#493 (0.26s)
    --- PASS: TestDefaultConfig/#495 (0.38s)
    --- PASS: TestDefaultConfig/#492 (0.32s)
    --- PASS: TestDefaultConfig/#491 (0.32s)
    --- PASS: TestDefaultConfig/#490 (0.23s)
    --- PASS: TestDefaultConfig/#489 (0.17s)
    --- PASS: TestDefaultConfig/#488 (0.13s)
    --- PASS: TestDefaultConfig/#487 (0.15s)
    --- PASS: TestDefaultConfig/#486 (0.22s)
    --- PASS: TestDefaultConfig/#477 (0.21s)
    --- PASS: TestDefaultConfig/#485 (0.25s)
    --- PASS: TestDefaultConfig/#484 (0.25s)
    --- PASS: TestDefaultConfig/#482 (0.24s)
    --- PASS: TestDefaultConfig/#481 (0.22s)
    --- PASS: TestDefaultConfig/#483 (0.29s)
    --- PASS: TestDefaultConfig/#480 (0.24s)
    --- PASS: TestDefaultConfig/#478 (0.32s)
    --- PASS: TestDefaultConfig/#476 (0.31s)
    --- PASS: TestDefaultConfig/#479 (0.38s)
    --- PASS: TestDefaultConfig/#475 (0.34s)
    --- PASS: TestDefaultConfig/#474 (0.20s)
    --- PASS: TestDefaultConfig/#472 (0.20s)
    --- PASS: TestDefaultConfig/#473 (0.25s)
    --- PASS: TestDefaultConfig/#471 (0.21s)
    --- PASS: TestDefaultConfig/#470 (0.24s)
    --- PASS: TestDefaultConfig/#468 (0.26s)
    --- PASS: TestDefaultConfig/#469 (0.30s)
    --- PASS: TestDefaultConfig/#408 (0.30s)
    --- PASS: TestDefaultConfig/#466 (0.16s)
    --- PASS: TestDefaultConfig/#467 (0.26s)
    --- PASS: TestDefaultConfig/#465 (0.19s)
    --- PASS: TestDefaultConfig/#464 (0.18s)
    --- PASS: TestDefaultConfig/#463 (0.27s)
    --- PASS: TestDefaultConfig/#462 (0.31s)
    --- PASS: TestDefaultConfig/#460 (0.34s)
    --- PASS: TestDefaultConfig/#461 (0.38s)
    --- PASS: TestDefaultConfig/#459 (0.17s)
    --- PASS: TestDefaultConfig/#458 (0.15s)
    --- PASS: TestDefaultConfig/#456 (0.25s)
    --- PASS: TestDefaultConfig/#457 (0.32s)
    --- PASS: TestDefaultConfig/#455 (0.32s)
    --- PASS: TestDefaultConfig/#454 (0.32s)
    --- PASS: TestDefaultConfig/#453 (0.22s)
    --- PASS: TestDefaultConfig/#452 (0.23s)
    --- PASS: TestDefaultConfig/#451 (0.19s)
    --- PASS: TestDefaultConfig/#450 (0.17s)
    --- PASS: TestDefaultConfig/#254 (0.16s)
    --- PASS: TestDefaultConfig/#447 (0.22s)
    --- PASS: TestDefaultConfig/#448 (0.24s)
    --- PASS: TestDefaultConfig/#449 (0.29s)
    --- PASS: TestDefaultConfig/#445 (0.12s)
    --- PASS: TestDefaultConfig/#446 (0.29s)
    --- PASS: TestDefaultConfig/#443 (0.12s)
    --- PASS: TestDefaultConfig/#444 (0.19s)
    --- PASS: TestDefaultConfig/#442 (0.19s)
    --- PASS: TestDefaultConfig/#441 (0.25s)
    --- PASS: TestDefaultConfig/#440 (0.25s)
    --- PASS: TestDefaultConfig/#439 (0.24s)
    --- PASS: TestDefaultConfig/#437 (0.11s)
    --- PASS: TestDefaultConfig/#436 (0.12s)
    --- PASS: TestDefaultConfig/#438 (0.25s)
    --- PASS: TestDefaultConfig/#253 (0.11s)
    --- PASS: TestDefaultConfig/#252 (0.20s)
    --- PASS: TestDefaultConfig/#250 (0.21s)
    --- PASS: TestDefaultConfig/#249 (0.22s)
    --- PASS: TestDefaultConfig/#251 (0.25s)
    --- PASS: TestDefaultConfig/#248 (0.16s)
    --- PASS: TestDefaultConfig/#247 (0.18s)
    --- PASS: TestDefaultConfig/#245 (0.16s)
    --- PASS: TestDefaultConfig/#246 (0.20s)
    --- PASS: TestDefaultConfig/#244 (0.26s)
    --- PASS: TestDefaultConfig/#433 (0.33s)
    --- PASS: TestDefaultConfig/#434 (0.37s)
    --- PASS: TestDefaultConfig/#114 (0.39s)
    --- PASS: TestDefaultConfig/#432 (0.18s)
    --- PASS: TestDefaultConfig/#431 (0.16s)
    --- PASS: TestDefaultConfig/#430 (0.22s)
    --- PASS: TestDefaultConfig/#429 (0.22s)
    --- PASS: TestDefaultConfig/#428 (0.27s)
    --- PASS: TestDefaultConfig/#427 (0.30s)
    --- PASS: TestDefaultConfig/#425 (0.24s)
    --- PASS: TestDefaultConfig/#426 (0.30s)
    --- PASS: TestDefaultConfig/#424 (0.20s)
    --- PASS: TestDefaultConfig/#423 (0.13s)
    --- PASS: TestDefaultConfig/#422 (0.13s)
    --- PASS: TestDefaultConfig/#421 (0.30s)
    --- PASS: TestDefaultConfig/#419 (0.30s)
    --- PASS: TestDefaultConfig/#420 (0.37s)
    --- PASS: TestDefaultConfig/#418 (0.34s)
    --- PASS: TestDefaultConfig/#417 (0.21s)
    --- PASS: TestDefaultConfig/#415 (0.16s)
    --- PASS: TestDefaultConfig/#416 (0.23s)
    --- PASS: TestDefaultConfig/#414 (0.15s)
    --- PASS: TestDefaultConfig/#412 (0.29s)
    --- PASS: TestDefaultConfig/#413 (0.37s)
    --- PASS: TestDefaultConfig/#411 (0.34s)
    --- PASS: TestDefaultConfig/#410 (0.37s)
    --- PASS: TestDefaultConfig/#409 (0.21s)
    --- PASS: TestDefaultConfig/#407 (0.16s)
    --- PASS: TestDefaultConfig/#405 (0.12s)
    --- PASS: TestDefaultConfig/#406 (0.21s)
    --- PASS: TestDefaultConfig/#383 (0.29s)
    --- PASS: TestDefaultConfig/#384 (0.35s)
    --- PASS: TestDefaultConfig/#382 (0.34s)
    --- PASS: TestDefaultConfig/#381 (0.31s)
    --- PASS: TestDefaultConfig/#380 (0.18s)
    --- PASS: TestDefaultConfig/#379 (0.18s)
    --- PASS: TestDefaultConfig/#378 (0.15s)
    --- PASS: TestDefaultConfig/#377 (0.19s)
    --- PASS: TestDefaultConfig/#376 (0.31s)
    --- PASS: TestDefaultConfig/#374 (0.26s)
    --- PASS: TestDefaultConfig/#375 (0.28s)
    --- PASS: TestDefaultConfig/#373 (0.24s)
    --- PASS: TestDefaultConfig/#372 (0.14s)
    --- PASS: TestDefaultConfig/#243 (0.18s)
    --- PASS: TestDefaultConfig/#370 (0.16s)
    --- PASS: TestDefaultConfig/#371 (0.31s)
    --- PASS: TestDefaultConfig/#369 (0.35s)
    --- PASS: TestDefaultConfig/#368 (0.32s)
    --- PASS: TestDefaultConfig/#366 (0.19s)
    --- PASS: TestDefaultConfig/#367 (0.33s)
    --- PASS: TestDefaultConfig/#365 (0.14s)
    --- PASS: TestDefaultConfig/#364 (0.12s)
    --- PASS: TestDefaultConfig/#362 (0.11s)
    --- PASS: TestDefaultConfig/#363 (0.21s)
    --- PASS: TestDefaultConfig/#360 (0.16s)
    --- PASS: TestDefaultConfig/#361 (0.20s)
    --- PASS: TestDefaultConfig/#359 (0.20s)
    --- PASS: TestDefaultConfig/#358 (0.14s)
    --- PASS: TestDefaultConfig/#357 (0.13s)
    --- PASS: TestDefaultConfig/#355 (0.19s)
    --- PASS: TestDefaultConfig/#356 (0.35s)
    --- PASS: TestDefaultConfig/#353 (0.29s)
    --- PASS: TestDefaultConfig/#354 (0.32s)
    --- PASS: TestDefaultConfig/#351 (0.14s)
    --- PASS: TestDefaultConfig/#352 (0.28s)
    --- PASS: TestDefaultConfig/#259 (0.14s)
    --- PASS: TestDefaultConfig/#350 (0.20s)
    --- PASS: TestDefaultConfig/#258 (0.23s)
    --- PASS: TestDefaultConfig/#257 (0.30s)
    --- PASS: TestDefaultConfig/#255 (0.28s)
    --- PASS: TestDefaultConfig/#256 (0.37s)
    --- PASS: TestDefaultConfig/#404 (0.21s)
    --- PASS: TestDefaultConfig/#403 (0.16s)
    --- PASS: TestDefaultConfig/#402 (0.12s)
    --- PASS: TestDefaultConfig/#400 (0.15s)
    --- PASS: TestDefaultConfig/#401 (0.22s)
    --- PASS: TestDefaultConfig/#399 (0.21s)
    --- PASS: TestDefaultConfig/#398 (0.22s)
    --- PASS: TestDefaultConfig/#396 (0.13s)
    --- PASS: TestDefaultConfig/#397 (0.19s)
    --- PASS: TestDefaultConfig/#395 (0.15s)
    --- PASS: TestDefaultConfig/#392 (0.15s)
    --- PASS: TestDefaultConfig/#394 (0.23s)
    --- PASS: TestDefaultConfig/#393 (0.31s)
    --- PASS: TestDefaultConfig/#391 (0.28s)
    --- PASS: TestDefaultConfig/#390 (0.23s)
    --- PASS: TestDefaultConfig/#389 (0.28s)
    --- PASS: TestDefaultConfig/#388 (0.18s)
    --- PASS: TestDefaultConfig/#387 (0.18s)
    --- PASS: TestDefaultConfig/#386 (0.13s)
=== RUN   TestTxnEndpoint_Bad_JSON
=== PAUSE TestTxnEndpoint_Bad_JSON
=== RUN   TestTxnEndpoint_Bad_Size_Item
=== PAUSE TestTxnEndpoint_Bad_Size_Item
=== RUN   TestTxnEndpoint_Bad_Size_Net
=== PAUSE TestTxnEndpoint_Bad_Size_Net
=== RUN   TestTxnEndpoint_Bad_Size_Ops
=== PAUSE TestTxnEndpoint_Bad_Size_Ops
=== RUN   TestTxnEndpoint_KV_Actions
=== PAUSE TestTxnEndpoint_KV_Actions
=== RUN   TestUiIndex
=== PAUSE TestUiIndex
=== RUN   TestUiNodes
=== PAUSE TestUiNodes
=== RUN   TestUiNodeInfo
=== PAUSE TestUiNodeInfo
=== RUN   TestSummarizeServices
=== PAUSE TestSummarizeServices
=== RUN   TestValidateUserEventParams
=== PAUSE TestValidateUserEventParams
=== RUN   TestShouldProcessUserEvent
=== PAUSE TestShouldProcessUserEvent
=== RUN   TestIngestUserEvent
=== PAUSE TestIngestUserEvent
=== RUN   TestFireReceiveEvent
=== PAUSE TestFireReceiveEvent
=== RUN   TestUserEventToken
=== PAUSE TestUserEventToken
=== RUN   TestStringHash
=== PAUSE TestStringHash
=== RUN   TestSetFilePermissions
=== PAUSE TestSetFilePermissions
=== RUN   TestDurationFixer
--- PASS: TestDurationFixer (0.00s)
=== RUN   TestHelperProcess
--- PASS: TestHelperProcess (0.00s)
=== RUN   TestForwardSignals
=== RUN   TestForwardSignals/signal-interrupt
=== RUN   TestForwardSignals/signal-terminated
--- PASS: TestForwardSignals (0.49s)
    --- PASS: TestForwardSignals/signal-interrupt (0.25s)
    --- PASS: TestForwardSignals/signal-terminated (0.24s)
=== RUN   TestMakeWatchHandler
=== PAUSE TestMakeWatchHandler
=== RUN   TestMakeHTTPWatchHandler
=== PAUSE TestMakeHTTPWatchHandler
=== CONT  TestACL_Legacy_Disabled_Response
=== CONT  TestMakeHTTPWatchHandler
=== CONT  TestMakeWatchHandler
=== CONT  TestKVSEndpoint_AcquireRelease
2019/11/27 02:19:59 [TRACE] agent: http watch handler 'http://127.0.0.1:37593' output: Ok, i see
=== CONT  TestHTTPServer_UnixSocket
--- PASS: TestMakeHTTPWatchHandler (0.04s)
WARNING: bootstrap = true: do not enable unless necessary
TestACL_Legacy_Disabled_Response - 2019/11/27 02:19:59.954974 [WARN] agent: Node name "Node 2680730c-99fe-349c-5ca5-d943789990eb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestACL_Legacy_Disabled_Response - 2019/11/27 02:19:59.955580 [DEBUG] tlsutil: Update with version 1
TestACL_Legacy_Disabled_Response - 2019/11/27 02:19:59.955776 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestACL_Legacy_Disabled_Response - 2019/11/27 02:19:59.956085 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestACL_Legacy_Disabled_Response - 2019/11/27 02:19:59.956376 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:19:59.978318 [WARN] agent: Node name "Node f1efce21-246d-4cc5-edec-f708204e320a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:19:59.980561 [DEBUG] tlsutil: Update with version 1
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:19:59.980653 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:19:59.980941 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:19:59.981065 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestHTTPServer_UnixSocket - 2019/11/27 02:20:00.000364 [WARN] agent: Node name "Node b8e37649-c470-9a44-a033-1144be05cdce" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHTTPServer_UnixSocket - 2019/11/27 02:20:00.000862 [DEBUG] tlsutil: Update with version 1
TestHTTPServer_UnixSocket - 2019/11/27 02:20:00.000995 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHTTPServer_UnixSocket - 2019/11/27 02:20:00.001261 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHTTPServer_UnixSocket - 2019/11/27 02:20:00.001381 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:00 [DEBUG] agent: watch handler 'bash -c 'echo $CONSUL_INDEX >> handler_index_out && cat >> handler_out'' output: 
--- PASS: TestMakeWatchHandler (0.94s)
=== CONT  TestFilterNonPassing
--- PASS: TestFilterNonPassing (0.00s)
=== CONT  TestHealthConnectServiceNodes_PassingFilter
WARNING: bootstrap = true: do not enable unless necessary
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:00.812500 [WARN] agent: Node name "Node 76cd9012-37cb-e173-62ad-1b5b7e9b0581" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:00.813062 [DEBUG] tlsutil: Update with version 1
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:00.813150 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:00.813359 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:00.813496 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f1efce21-246d-4cc5-edec-f708204e320a Address:127.0.0.1:11734}]
2019/11/27 02:20:01 [INFO]  raft: Node at 127.0.0.1:11734 [Follower] entering Follower state (Leader: "")
2019/11/27 02:20:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b8e37649-c470-9a44-a033-1144be05cdce Address:127.0.0.1:11740}]
2019/11/27 02:20:01 [INFO]  raft: Node at 127.0.0.1:11740 [Follower] entering Follower state (Leader: "")
2019/11/27 02:20:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2680730c-99fe-349c-5ca5-d943789990eb Address:127.0.0.1:11728}]
2019/11/27 02:20:01 [INFO]  raft: Node at 127.0.0.1:11728 [Follower] entering Follower state (Leader: "")
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:01.763381 [INFO] serf: EventMemberJoin: Node f1efce21-246d-4cc5-edec-f708204e320a.dc1 127.0.0.1
TestHTTPServer_UnixSocket - 2019/11/27 02:20:01.763808 [INFO] serf: EventMemberJoin: Node b8e37649-c470-9a44-a033-1144be05cdce.dc1 127.0.0.1
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:01.765911 [INFO] serf: EventMemberJoin: Node 2680730c-99fe-349c-5ca5-d943789990eb.dc1 127.0.0.1
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:01.769024 [INFO] serf: EventMemberJoin: Node f1efce21-246d-4cc5-edec-f708204e320a 127.0.0.1
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:01.776618 [INFO] agent: Started DNS server 127.0.0.1:11729 (udp)
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:01.777552 [INFO] consul: Adding LAN server Node f1efce21-246d-4cc5-edec-f708204e320a (Addr: tcp/127.0.0.1:11734) (DC: dc1)
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:01.777829 [INFO] consul: Handled member-join event for server "Node f1efce21-246d-4cc5-edec-f708204e320a.dc1" in area "wan"
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:01.778491 [INFO] agent: Started DNS server 127.0.0.1:11729 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:01.780756 [INFO] agent: Started HTTP server on 127.0.0.1:11730 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:01.780875 [INFO] agent: started state syncer
TestHTTPServer_UnixSocket - 2019/11/27 02:20:01.783535 [INFO] serf: EventMemberJoin: Node b8e37649-c470-9a44-a033-1144be05cdce 127.0.0.1
TestHTTPServer_UnixSocket - 2019/11/27 02:20:01.785587 [INFO] consul: Handled member-join event for server "Node b8e37649-c470-9a44-a033-1144be05cdce.dc1" in area "wan"
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:01.785631 [INFO] serf: EventMemberJoin: Node 2680730c-99fe-349c-5ca5-d943789990eb 127.0.0.1
TestHTTPServer_UnixSocket - 2019/11/27 02:20:01.785982 [INFO] consul: Adding LAN server Node b8e37649-c470-9a44-a033-1144be05cdce (Addr: tcp/127.0.0.1:11740) (DC: dc1)
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:01.786746 [INFO] consul: Adding LAN server Node 2680730c-99fe-349c-5ca5-d943789990eb (Addr: tcp/127.0.0.1:11728) (DC: dc1)
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:01.787142 [INFO] agent: Started DNS server 127.0.0.1:11723 (udp)
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:01.787369 [INFO] consul: Handled member-join event for server "Node 2680730c-99fe-349c-5ca5-d943789990eb.dc1" in area "wan"
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:01.787766 [INFO] agent: Started DNS server 127.0.0.1:11723 (tcp)
TestHTTPServer_UnixSocket - 2019/11/27 02:20:01.789946 [INFO] agent: Started DNS server 127.0.0.1:11735 (tcp)
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:01.790243 [INFO] agent: Started HTTP server on 127.0.0.1:11724 (tcp)
TestHTTPServer_UnixSocket - 2019/11/27 02:20:01.790983 [INFO] agent: Started DNS server 127.0.0.1:11735 (udp)
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:01.794205 [INFO] agent: started state syncer
TestHTTPServer_UnixSocket - 2019/11/27 02:20:01.795430 [INFO] agent: Started HTTP server on /tmp/consul-test/TestHTTPServer_UnixSocket-consul190312151/test.sock (unix)
TestHTTPServer_UnixSocket - 2019/11/27 02:20:01.795546 [INFO] agent: started state syncer
2019/11/27 02:20:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:01 [INFO]  raft: Node at 127.0.0.1:11734 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:01 [INFO]  raft: Node at 127.0.0.1:11728 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:01 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:01 [INFO]  raft: Node at 127.0.0.1:11740 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:76cd9012-37cb-e173-62ad-1b5b7e9b0581 Address:127.0.0.1:11746}]
2019/11/27 02:20:02 [INFO]  raft: Node at 127.0.0.1:11746 [Follower] entering Follower state (Leader: "")
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:02.189181 [INFO] serf: EventMemberJoin: Node 76cd9012-37cb-e173-62ad-1b5b7e9b0581.dc1 127.0.0.1
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:02.193266 [INFO] serf: EventMemberJoin: Node 76cd9012-37cb-e173-62ad-1b5b7e9b0581 127.0.0.1
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:02.193892 [INFO] consul: Adding LAN server Node 76cd9012-37cb-e173-62ad-1b5b7e9b0581 (Addr: tcp/127.0.0.1:11746) (DC: dc1)
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:02.194117 [INFO] consul: Handled member-join event for server "Node 76cd9012-37cb-e173-62ad-1b5b7e9b0581.dc1" in area "wan"
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:02.195290 [INFO] agent: Started DNS server 127.0.0.1:11741 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:02.195371 [INFO] agent: Started DNS server 127.0.0.1:11741 (udp)
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:02.197397 [INFO] agent: Started HTTP server on 127.0.0.1:11742 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:02.197488 [INFO] agent: started state syncer
2019/11/27 02:20:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:02 [INFO]  raft: Node at 127.0.0.1:11746 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:03 [INFO]  raft: Node at 127.0.0.1:11740 [Leader] entering Leader state
2019/11/27 02:20:03 [INFO]  raft: Node at 127.0.0.1:11734 [Leader] entering Leader state
TestHTTPServer_UnixSocket - 2019/11/27 02:20:03.141840 [INFO] consul: cluster leadership acquired
TestHTTPServer_UnixSocket - 2019/11/27 02:20:03.142391 [INFO] consul: New leader elected: Node b8e37649-c470-9a44-a033-1144be05cdce
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:03.142763 [INFO] consul: cluster leadership acquired
2019/11/27 02:20:03 [INFO]  raft: Election won. Tally: 1
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:03.143161 [INFO] consul: New leader elected: Node f1efce21-246d-4cc5-edec-f708204e320a
2019/11/27 02:20:03 [INFO]  raft: Node at 127.0.0.1:11728 [Leader] entering Leader state
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:03.143519 [INFO] consul: cluster leadership acquired
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:03.143886 [INFO] consul: New leader elected: Node 2680730c-99fe-349c-5ca5-d943789990eb
2019/11/27 02:20:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:03 [INFO]  raft: Node at 127.0.0.1:11746 [Leader] entering Leader state
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:03.671216 [INFO] consul: cluster leadership acquired
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:03.671757 [INFO] consul: New leader elected: Node 76cd9012-37cb-e173-62ad-1b5b7e9b0581
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:03.820528 [INFO] agent: Synced node info
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:03.839252 [INFO] agent: Synced node info
TestHTTPServer_UnixSocket - 2019/11/27 02:20:03.849121 [INFO] agent: Synced node info
=== RUN   TestACL_Legacy_Disabled_Response/0
=== RUN   TestACL_Legacy_Disabled_Response/1
=== RUN   TestACL_Legacy_Disabled_Response/2
=== RUN   TestACL_Legacy_Disabled_Response/3
=== RUN   TestACL_Legacy_Disabled_Response/4
=== RUN   TestACL_Legacy_Disabled_Response/5
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:03.866536 [INFO] agent: Requesting shutdown
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:03.866663 [INFO] consul: shutting down server
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:03.867226 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket - 2019/11/27 02:20:03.895734 [DEBUG] http: Request GET /v1/agent/self (515.106034ms) from=@
TestHTTPServer_UnixSocket - 2019/11/27 02:20:03.902573 [INFO] agent: Requesting shutdown
TestHTTPServer_UnixSocket - 2019/11/27 02:20:03.902680 [INFO] consul: shutting down server
TestHTTPServer_UnixSocket - 2019/11/27 02:20:03.902733 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.037408 [WARN] serf: Shutdown without a Leave
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.038028 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.161879 [INFO] agent: Synced node info
=== RUN   TestHealthConnectServiceNodes_PassingFilter/bc_no_query_value
=== RUN   TestHealthConnectServiceNodes_PassingFilter/passing_true
=== RUN   TestHealthConnectServiceNodes_PassingFilter/passing_false
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.165549 [DEBUG] agent: Node info in sync
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.162117 [INFO] manager: shutting down
jones - 2019/11/27 02:20:04.166002 [DEBUG] consul: Skipping self join check for "Node 96ea3298-4984-8452-8dce-62bd7caf6d71" since the cluster is too small
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.166666 [INFO] manager: shutting down
=== RUN   TestHealthConnectServiceNodes_PassingFilter/passing_bad
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.174439 [INFO] agent: Requesting shutdown
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.175903 [INFO] consul: shutting down server
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.176074 [WARN] serf: Shutdown without a Leave
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.335491 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.335792 [INFO] agent: consul server down
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.335866 [INFO] agent: shutdown complete
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.335944 [INFO] agent: Stopping DNS server 127.0.0.1:11723 (tcp)
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.336109 [INFO] agent: Stopping DNS server 127.0.0.1:11723 (udp)
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.336311 [INFO] agent: Stopping HTTP server 127.0.0.1:11724 (tcp)
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.336558 [INFO] agent: Waiting for endpoints to shut down
TestACL_Legacy_Disabled_Response - 2019/11/27 02:20:04.336646 [INFO] agent: Endpoints down
--- PASS: TestACL_Legacy_Disabled_Response (4.54s)
    --- PASS: TestACL_Legacy_Disabled_Response/0 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/1 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/2 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/3 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/4 (0.00s)
    --- PASS: TestACL_Legacy_Disabled_Response/5 (0.00s)
=== CONT  TestHealthConnectServiceNodes
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.338756 [WARN] serf: Shutdown without a Leave
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.424577 [INFO] agent: consul server down
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.424668 [INFO] agent: shutdown complete
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.424749 [INFO] agent: Stopping DNS server 127.0.0.1:11735 (tcp)
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.424936 [INFO] agent: Stopping DNS server 127.0.0.1:11735 (udp)
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.425139 [INFO] agent: Stopping HTTP server /tmp/consul-test/TestHTTPServer_UnixSocket-consul190312151/test.sock (unix)
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.425816 [INFO] agent: Waiting for endpoints to shut down
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.425904 [INFO] agent: Endpoints down
--- PASS: TestHTTPServer_UnixSocket (4.59s)
=== CONT  TestHealthServiceNodes_WanTranslation
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.429565 [INFO] manager: shutting down
TestHTTPServer_UnixSocket - 2019/11/27 02:20:04.438964 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestHealthConnectServiceNodes - 2019/11/27 02:20:04.477115 [WARN] agent: Node name "Node d2498805-6904-6f66-a990-a38dc7d41caa" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthConnectServiceNodes - 2019/11/27 02:20:04.477750 [DEBUG] tlsutil: Update with version 1
TestHealthConnectServiceNodes - 2019/11/27 02:20:04.477834 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthConnectServiceNodes - 2019/11/27 02:20:04.478129 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthConnectServiceNodes - 2019/11/27 02:20:04.478303 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.490996 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.491320 [INFO] agent: consul server down
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.491377 [INFO] agent: shutdown complete
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.491417 [ERR] consul: failed to establish leadership: raft is already shutdown
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.491432 [INFO] agent: Stopping DNS server 127.0.0.1:11741 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.491648 [INFO] agent: Stopping DNS server 127.0.0.1:11741 (udp)
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.491885 [INFO] agent: Stopping HTTP server 127.0.0.1:11742 (tcp)
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.492109 [INFO] agent: Waiting for endpoints to shut down
TestHealthConnectServiceNodes_PassingFilter - 2019/11/27 02:20:04.492191 [INFO] agent: Endpoints down
--- PASS: TestHealthConnectServiceNodes_PassingFilter (3.75s)
    --- PASS: TestHealthConnectServiceNodes_PassingFilter/bc_no_query_value (0.00s)
    --- PASS: TestHealthConnectServiceNodes_PassingFilter/passing_true (0.00s)
    --- PASS: TestHealthConnectServiceNodes_PassingFilter/passing_false (0.01s)
    --- PASS: TestHealthConnectServiceNodes_PassingFilter/passing_bad (0.00s)
=== CONT  TestHealthServiceNodes_DistanceSort
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:04.572866 [WARN] agent: Node name "Node 893f4515-886d-bb55-fe8d-4842186401f0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:04.573332 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:04.573419 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:04.573585 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:04.573701 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:04.621079 [WARN] agent: Node name "Node addaddc2-1b90-1db6-09e8-e6b355a79168" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:04.621535 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:04.621619 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:04.629900 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:04.630690 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:06.015911 [DEBUG] agent: Node info in sync
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:06.016045 [DEBUG] agent: Node info in sync
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:06.441778 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:06.444567 [DEBUG] consul: Skipping self join check for "Node f1efce21-246d-4cc5-edec-f708204e320a" since the cluster is too small
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:06.444865 [INFO] consul: member 'Node f1efce21-246d-4cc5-edec-f708204e320a' joined, marking health alive
2019/11/27 02:20:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d2498805-6904-6f66-a990-a38dc7d41caa Address:127.0.0.1:11752}]
2019/11/27 02:20:06 [INFO]  raft: Node at 127.0.0.1:11752 [Follower] entering Follower state (Leader: "")
TestHealthConnectServiceNodes - 2019/11/27 02:20:06.774857 [INFO] serf: EventMemberJoin: Node d2498805-6904-6f66-a990-a38dc7d41caa.dc1 127.0.0.1
TestHealthConnectServiceNodes - 2019/11/27 02:20:06.778942 [INFO] serf: EventMemberJoin: Node d2498805-6904-6f66-a990-a38dc7d41caa 127.0.0.1
TestHealthConnectServiceNodes - 2019/11/27 02:20:06.779674 [INFO] consul: Handled member-join event for server "Node d2498805-6904-6f66-a990-a38dc7d41caa.dc1" in area "wan"
TestHealthConnectServiceNodes - 2019/11/27 02:20:06.780044 [INFO] consul: Adding LAN server Node d2498805-6904-6f66-a990-a38dc7d41caa (Addr: tcp/127.0.0.1:11752) (DC: dc1)
TestHealthConnectServiceNodes - 2019/11/27 02:20:06.780398 [INFO] agent: Started DNS server 127.0.0.1:11747 (udp)
TestHealthConnectServiceNodes - 2019/11/27 02:20:06.780468 [INFO] agent: Started DNS server 127.0.0.1:11747 (tcp)
TestHealthConnectServiceNodes - 2019/11/27 02:20:06.783040 [INFO] agent: Started HTTP server on 127.0.0.1:11748 (tcp)
TestHealthConnectServiceNodes - 2019/11/27 02:20:06.783183 [INFO] agent: started state syncer
2019/11/27 02:20:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:06 [INFO]  raft: Node at 127.0.0.1:11752 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:893f4515-886d-bb55-fe8d-4842186401f0 Address:127.0.0.1:11758}]
2019/11/27 02:20:07 [INFO]  raft: Node at 127.0.0.1:11758 [Follower] entering Follower state (Leader: "")
2019/11/27 02:20:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:addaddc2-1b90-1db6-09e8-e6b355a79168 Address:127.0.0.1:11764}]
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:07.250580 [INFO] serf: EventMemberJoin: Node 893f4515-886d-bb55-fe8d-4842186401f0.dc1 127.0.0.1
2019/11/27 02:20:07 [INFO]  raft: Node at 127.0.0.1:11764 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:07.250580 [INFO] serf: EventMemberJoin: Node addaddc2-1b90-1db6-09e8-e6b355a79168.dc1 127.0.0.1
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:07.263221 [INFO] serf: EventMemberJoin: Node addaddc2-1b90-1db6-09e8-e6b355a79168 127.0.0.1
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:07.264462 [INFO] agent: Started DNS server 127.0.0.1:11759 (udp)
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:07.265318 [INFO] consul: Adding LAN server Node addaddc2-1b90-1db6-09e8-e6b355a79168 (Addr: tcp/127.0.0.1:11764) (DC: dc1)
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:07.265523 [INFO] consul: Handled member-join event for server "Node addaddc2-1b90-1db6-09e8-e6b355a79168.dc1" in area "wan"
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:07.267426 [INFO] agent: Started DNS server 127.0.0.1:11759 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:07.270008 [INFO] serf: EventMemberJoin: Node 893f4515-886d-bb55-fe8d-4842186401f0 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:07.276566 [INFO] agent: Started DNS server 127.0.0.1:11753 (udp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:07.277034 [INFO] consul: Adding LAN server Node 893f4515-886d-bb55-fe8d-4842186401f0 (Addr: tcp/127.0.0.1:11758) (DC: dc1)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:07.277248 [INFO] consul: Handled member-join event for server "Node 893f4515-886d-bb55-fe8d-4842186401f0.dc1" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:07.277700 [INFO] agent: Started DNS server 127.0.0.1:11753 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:07.280771 [INFO] agent: Started HTTP server on 127.0.0.1:11754 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:07.281008 [INFO] agent: started state syncer
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:07.282093 [INFO] agent: Started HTTP server on 127.0.0.1:11760 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:07.282275 [INFO] agent: started state syncer
2019/11/27 02:20:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:07 [INFO]  raft: Node at 127.0.0.1:11764 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:07 [INFO]  raft: Node at 127.0.0.1:11758 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:20:07.691775 [DEBUG] consul: Skipping self join check for "Node 4c613484-61cd-f189-9fd4-637dea8a81e0" since the cluster is too small
2019/11/27 02:20:07 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:07 [INFO]  raft: Node at 127.0.0.1:11752 [Leader] entering Leader state
TestHealthConnectServiceNodes - 2019/11/27 02:20:07.828953 [INFO] consul: cluster leadership acquired
TestHealthConnectServiceNodes - 2019/11/27 02:20:07.829433 [INFO] consul: New leader elected: Node d2498805-6904-6f66-a990-a38dc7d41caa
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.127755 [INFO] agent: Requesting shutdown
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.127866 [INFO] consul: shutting down server
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.127915 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.544653 [WARN] serf: Shutdown without a Leave
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.790726 [INFO] manager: shutting down
2019/11/27 02:20:08 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:08 [INFO]  raft: Node at 127.0.0.1:11758 [Leader] entering Leader state
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.791555 [INFO] agent: consul server down
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.791620 [INFO] agent: shutdown complete
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.791752 [INFO] agent: Stopping DNS server 127.0.0.1:11729 (tcp)
2019/11/27 02:20:08 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:08 [INFO]  raft: Node at 127.0.0.1:11764 [Leader] entering Leader state
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:08.792070 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:08.792438 [INFO] consul: New leader elected: Node 893f4515-886d-bb55-fe8d-4842186401f0
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.791926 [INFO] agent: Stopping DNS server 127.0.0.1:11729 (udp)
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.792810 [INFO] agent: Stopping HTTP server 127.0.0.1:11730 (tcp)
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.793068 [INFO] agent: Waiting for endpoints to shut down
TestKVSEndpoint_AcquireRelease - 2019/11/27 02:20:08.793153 [INFO] agent: Endpoints down
--- PASS: TestKVSEndpoint_AcquireRelease (8.98s)
=== CONT  TestHealthServiceNodes_NodeMetaFilter
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:08.802136 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:08.802650 [INFO] consul: New leader elected: Node addaddc2-1b90-1db6-09e8-e6b355a79168
TestHealthConnectServiceNodes - 2019/11/27 02:20:08.803867 [INFO] agent: Synced node info
TestHealthConnectServiceNodes - 2019/11/27 02:20:08.803973 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:08.833597 [INFO] acl: initializing acls
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:08.843771 [ERR] agent: failed to sync remote state: ACL not found
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:08.871859 [WARN] agent: Node name "Node ab39dc96-d97b-1db9-4bd8-b70bdb3d1210" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:08.872481 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:08.873253 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:08.873468 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:08.873595 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:09.415223 [INFO] agent: Synced node info
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:09.415348 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:09.525586 [INFO] consul: Created ACL 'global-management' policy
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:09.531017 [INFO] acl: initializing acls
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.017942 [INFO] agent: Requesting shutdown
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.018135 [INFO] consul: shutting down server
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.018215 [WARN] serf: Shutdown without a Leave
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.401841 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:10.409380 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:10.409486 [DEBUG] acl: transitioning out of legacy ACL mode
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:10.410261 [INFO] serf: EventMemberUpdate: Node 893f4515-886d-bb55-fe8d-4842186401f0
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:10.410872 [INFO] serf: EventMemberUpdate: Node 893f4515-886d-bb55-fe8d-4842186401f0.dc1
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.515653 [INFO] manager: shutting down
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.680694 [INFO] agent: consul server down
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.680767 [INFO] agent: shutdown complete
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.680873 [INFO] agent: Stopping DNS server 127.0.0.1:11747 (tcp)
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.681052 [INFO] agent: Stopping DNS server 127.0.0.1:11747 (udp)
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.681369 [INFO] agent: Stopping HTTP server 127.0.0.1:11748 (tcp)
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.681626 [INFO] agent: Waiting for endpoints to shut down
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.681772 [INFO] agent: Endpoints down
--- PASS: TestHealthConnectServiceNodes (6.34s)
=== CONT  TestHealthServiceNodes
TestHealthConnectServiceNodes - 2019/11/27 02:20:10.683446 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes - 2019/11/27 02:20:10.768413 [WARN] agent: Node name "Node 70e964cf-fd62-8548-1a5b-2d848ac646f3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes - 2019/11/27 02:20:10.768846 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes - 2019/11/27 02:20:10.768919 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes - 2019/11/27 02:20:10.769086 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthServiceNodes - 2019/11/27 02:20:10.769206 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:20:10.880249 [DEBUG] consul: Skipping self join check for "Node 005cb1c3-f8e5-2827-9833-9849ba78d405" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:11.041488 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:11.042376 [INFO] serf: EventMemberUpdate: Node 893f4515-886d-bb55-fe8d-4842186401f0
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:11.043029 [INFO] serf: EventMemberUpdate: Node 893f4515-886d-bb55-fe8d-4842186401f0.dc1
2019/11/27 02:20:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ab39dc96-d97b-1db9-4bd8-b70bdb3d1210 Address:127.0.0.1:11770}]
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:11.175743 [INFO] serf: EventMemberJoin: Node ab39dc96-d97b-1db9-4bd8-b70bdb3d1210.dc1 127.0.0.1
2019/11/27 02:20:11 [INFO]  raft: Node at 127.0.0.1:11770 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:11.185251 [INFO] serf: EventMemberJoin: Node ab39dc96-d97b-1db9-4bd8-b70bdb3d1210 127.0.0.1
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:11.186650 [INFO] consul: Handled member-join event for server "Node ab39dc96-d97b-1db9-4bd8-b70bdb3d1210.dc1" in area "wan"
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:11.186919 [INFO] consul: Adding LAN server Node ab39dc96-d97b-1db9-4bd8-b70bdb3d1210 (Addr: tcp/127.0.0.1:11770) (DC: dc1)
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:11.189237 [INFO] agent: Started DNS server 127.0.0.1:11765 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:11.189347 [INFO] agent: Started DNS server 127.0.0.1:11765 (udp)
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:11.192002 [INFO] agent: Started HTTP server on 127.0.0.1:11766 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:11.192130 [INFO] agent: started state syncer
2019/11/27 02:20:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:11 [INFO]  raft: Node at 127.0.0.1:11770 [Candidate] entering Candidate state in term 2
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:11.456339 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.010523 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.010620 [INFO] consul: shutting down server
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.010670 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.146878 [WARN] serf: Shutdown without a Leave
2019/11/27 02:20:12 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:12 [INFO]  raft: Node at 127.0.0.1:11770 [Leader] entering Leader state
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:12.149704 [INFO] agent: Synced node info
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:12.149812 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:12.150206 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:12.150556 [INFO] consul: New leader elected: Node ab39dc96-d97b-1db9-4bd8-b70bdb3d1210
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:12.332891 [WARN] agent: Node name "Node d919b0b9-ad8b-7924-4434-631c5b7cd32b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:12.333539 [DEBUG] tlsutil: Update with version 1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:12.335909 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:12.336551 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:12.336967 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.346384 [INFO] manager: shutting down
2019/11/27 02:20:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:70e964cf-fd62-8548-1a5b-2d848ac646f3 Address:127.0.0.1:11776}]
2019/11/27 02:20:12 [INFO]  raft: Node at 127.0.0.1:11776 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes - 2019/11/27 02:20:12.350819 [INFO] serf: EventMemberJoin: Node 70e964cf-fd62-8548-1a5b-2d848ac646f3.dc1 127.0.0.1
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.352131 [INFO] agent: consul server down
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.352206 [INFO] agent: shutdown complete
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.352263 [INFO] agent: Stopping DNS server 127.0.0.1:11759 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.352405 [INFO] agent: Stopping DNS server 127.0.0.1:11759 (udp)
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.352572 [INFO] agent: Stopping HTTP server 127.0.0.1:11760 (tcp)
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.352656 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.352778 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_DistanceSort - 2019/11/27 02:20:12.352846 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes_DistanceSort (7.86s)
=== CONT  TestHealthServiceChecks_DistanceSort
TestHealthServiceNodes - 2019/11/27 02:20:12.363140 [INFO] serf: EventMemberJoin: Node 70e964cf-fd62-8548-1a5b-2d848ac646f3 127.0.0.1
TestHealthServiceNodes - 2019/11/27 02:20:12.364014 [INFO] consul: Adding LAN server Node 70e964cf-fd62-8548-1a5b-2d848ac646f3 (Addr: tcp/127.0.0.1:11776) (DC: dc1)
TestHealthServiceNodes - 2019/11/27 02:20:12.364301 [INFO] consul: Handled member-join event for server "Node 70e964cf-fd62-8548-1a5b-2d848ac646f3.dc1" in area "wan"
TestHealthServiceNodes - 2019/11/27 02:20:12.364770 [INFO] agent: Started DNS server 127.0.0.1:11771 (tcp)
TestHealthServiceNodes - 2019/11/27 02:20:12.368383 [INFO] agent: Started DNS server 127.0.0.1:11771 (udp)
TestHealthServiceNodes - 2019/11/27 02:20:12.370435 [INFO] agent: Started HTTP server on 127.0.0.1:11772 (tcp)
TestHealthServiceNodes - 2019/11/27 02:20:12.370558 [INFO] agent: started state syncer
2019/11/27 02:20:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:12 [INFO]  raft: Node at 127.0.0.1:11776 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:12.495688 [WARN] agent: Node name "Node 8a83d70f-899e-d3e8-184b-fd5434cb1283" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:12.496134 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:12.496258 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:12.496492 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:12.496631 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:20:15.277050 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:20:15.277128 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:15.458631 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:15.459261 [DEBUG] consul: Skipping self join check for "Node 893f4515-886d-bb55-fe8d-4842186401f0" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:15.459484 [INFO] consul: member 'Node 893f4515-886d-bb55-fe8d-4842186401f0' joined, marking health alive
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:15.462836 [INFO] agent: Synced node info
jones - 2019/11/27 02:20:15.647350 [DEBUG] consul: Skipping self join check for "Node 3a0dee63-0112-ab1b-d438-213ed51c845e" since the cluster is too small
2019/11/27 02:20:15 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:15 [INFO]  raft: Node at 127.0.0.1:11776 [Leader] entering Leader state
TestHealthServiceNodes - 2019/11/27 02:20:15.925888 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes - 2019/11/27 02:20:15.927117 [INFO] consul: New leader elected: Node 70e964cf-fd62-8548-1a5b-2d848ac646f3
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:15.928849 [DEBUG] consul: Skipping self join check for "Node 893f4515-886d-bb55-fe8d-4842186401f0" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:15.929545 [DEBUG] consul: Skipping self join check for "Node 893f4515-886d-bb55-fe8d-4842186401f0" since the cluster is too small
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:16.209627 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:16.209737 [INFO] consul: shutting down server
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:16.209800 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:16.423478 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:16.423592 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.472259 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.680471 [INFO] manager: shutting down
TestHealthServiceNodes - 2019/11/27 02:20:17.684753 [INFO] agent: Synced node info
TestHealthServiceNodes - 2019/11/27 02:20:17.684881 [DEBUG] agent: Node info in sync
2019/11/27 02:20:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8a83d70f-899e-d3e8-184b-fd5434cb1283 Address:127.0.0.1:11788}]
2019/11/27 02:20:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d919b0b9-ad8b-7924-4434-631c5b7cd32b Address:127.0.0.1:11782}]
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.827164 [INFO] agent: consul server down
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.827239 [INFO] agent: shutdown complete
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.827299 [INFO] agent: Stopping DNS server 127.0.0.1:11765 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.827437 [INFO] agent: Stopping DNS server 127.0.0.1:11765 (udp)
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.827596 [INFO] agent: Stopping HTTP server 127.0.0.1:11766 (tcp)
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.827811 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.827887 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes_NodeMetaFilter (9.03s)
=== CONT  TestDNS_AddressLookup
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:17.828133 [INFO] serf: EventMemberJoin: Node d919b0b9-ad8b-7924-4434-631c5b7cd32b.dc2 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:17.833346 [INFO] serf: EventMemberJoin: Node d919b0b9-ad8b-7924-4434-631c5b7cd32b 127.0.0.1
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:17.833883 [INFO] serf: EventMemberJoin: Node 8a83d70f-899e-d3e8-184b-fd5434cb1283.dc1 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:17.834565 [INFO] agent: Started DNS server 127.0.0.1:11777 (udp)
2019/11/27 02:20:17 [INFO]  raft: Node at 127.0.0.1:11782 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:17.836971 [INFO] consul: Adding LAN server Node d919b0b9-ad8b-7924-4434-631c5b7cd32b (Addr: tcp/127.0.0.1:11782) (DC: dc2)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:17.837260 [INFO] consul: Handled member-join event for server "Node d919b0b9-ad8b-7924-4434-631c5b7cd32b.dc2" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:17.837838 [INFO] agent: Started DNS server 127.0.0.1:11777 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:17.839745 [INFO] agent: Started HTTP server on 127.0.0.1:11778 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:17.839842 [INFO] agent: started state syncer
TestHealthServiceNodes_NodeMetaFilter - 2019/11/27 02:20:17.840354 [ERR] consul: failed to establish leadership: leadership lost while committing log
2019/11/27 02:20:17 [INFO]  raft: Node at 127.0.0.1:11788 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:17.854624 [INFO] serf: EventMemberJoin: Node 8a83d70f-899e-d3e8-184b-fd5434cb1283 127.0.0.1
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:17.856070 [INFO] agent: Started DNS server 127.0.0.1:11783 (udp)
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:17.856543 [INFO] consul: Adding LAN server Node 8a83d70f-899e-d3e8-184b-fd5434cb1283 (Addr: tcp/127.0.0.1:11788) (DC: dc1)
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:17.856837 [INFO] consul: Handled member-join event for server "Node 8a83d70f-899e-d3e8-184b-fd5434cb1283.dc1" in area "wan"
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:17.857391 [INFO] agent: Started DNS server 127.0.0.1:11783 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:17.859617 [INFO] agent: Started HTTP server on 127.0.0.1:11784 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:17.859733 [INFO] agent: started state syncer
2019/11/27 02:20:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:17 [INFO]  raft: Node at 127.0.0.1:11788 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:17 [INFO]  raft: Node at 127.0.0.1:11782 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_AddressLookup - 2019/11/27 02:20:17.925022 [WARN] agent: Node name "Node 8e94a312-9163-9613-2547-2f1180245c98" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_AddressLookup - 2019/11/27 02:20:17.925557 [DEBUG] tlsutil: Update with version 1
TestDNS_AddressLookup - 2019/11/27 02:20:17.925632 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_AddressLookup - 2019/11/27 02:20:17.925817 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_AddressLookup - 2019/11/27 02:20:17.925933 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes - 2019/11/27 02:20:18.569721 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:20:18.669589 [DEBUG] consul: Skipping self join check for "Node 975e7e65-a3c0-20b5-0590-acecff7cdae7" since the cluster is too small
2019/11/27 02:20:18 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:18 [INFO]  raft: Node at 127.0.0.1:11788 [Leader] entering Leader state
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:18.947169 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:18.947793 [INFO] consul: New leader elected: Node 8a83d70f-899e-d3e8-184b-fd5434cb1283
2019/11/27 02:20:19 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:19 [INFO]  raft: Node at 127.0.0.1:11782 [Leader] entering Leader state
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:19.333205 [INFO] consul: cluster leadership acquired
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:19.333715 [INFO] consul: New leader elected: Node d919b0b9-ad8b-7924-4434-631c5b7cd32b
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:19.389235 [INFO] acl: initializing acls
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:19.415678 [ERR] agent: failed to sync remote state: ACL not found
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:19.719793 [INFO] agent: Synced node info
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:19.719922 [DEBUG] agent: Node info in sync
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:19.870263 [INFO] consul: Created ACL 'global-management' policy
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:19.873714 [INFO] acl: initializing acls
2019/11/27 02:20:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8e94a312-9163-9613-2547-2f1180245c98 Address:127.0.0.1:11794}]
2019/11/27 02:20:19 [INFO]  raft: Node at 127.0.0.1:11794 [Follower] entering Follower state (Leader: "")
TestDNS_AddressLookup - 2019/11/27 02:20:19.878131 [INFO] serf: EventMemberJoin: Node 8e94a312-9163-9613-2547-2f1180245c98.dc1 127.0.0.1
TestDNS_AddressLookup - 2019/11/27 02:20:19.881151 [INFO] serf: EventMemberJoin: Node 8e94a312-9163-9613-2547-2f1180245c98 127.0.0.1
TestDNS_AddressLookup - 2019/11/27 02:20:19.882328 [INFO] consul: Adding LAN server Node 8e94a312-9163-9613-2547-2f1180245c98 (Addr: tcp/127.0.0.1:11794) (DC: dc1)
TestDNS_AddressLookup - 2019/11/27 02:20:19.882488 [INFO] consul: Handled member-join event for server "Node 8e94a312-9163-9613-2547-2f1180245c98.dc1" in area "wan"
TestDNS_AddressLookup - 2019/11/27 02:20:19.883936 [INFO] agent: Started DNS server 127.0.0.1:11789 (tcp)
TestDNS_AddressLookup - 2019/11/27 02:20:19.884023 [INFO] agent: Started DNS server 127.0.0.1:11789 (udp)
TestDNS_AddressLookup - 2019/11/27 02:20:19.886025 [INFO] agent: Started HTTP server on 127.0.0.1:11790 (tcp)
TestDNS_AddressLookup - 2019/11/27 02:20:19.886116 [INFO] agent: started state syncer
2019/11/27 02:20:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:19 [INFO]  raft: Node at 127.0.0.1:11794 [Candidate] entering Candidate state in term 2
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:19.915612 [ERR] agent: failed to sync remote state: ACL not found
TestHealthServiceNodes - 2019/11/27 02:20:20.033782 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceNodes - 2019/11/27 02:20:20.034291 [DEBUG] consul: Skipping self join check for "Node 70e964cf-fd62-8548-1a5b-2d848ac646f3" since the cluster is too small
TestHealthServiceNodes - 2019/11/27 02:20:20.034511 [INFO] consul: member 'Node 70e964cf-fd62-8548-1a5b-2d848ac646f3' joined, marking health alive
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:20.323432 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:20.323541 [DEBUG] acl: transitioning out of legacy ACL mode
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:20.330658 [INFO] serf: EventMemberUpdate: Node d919b0b9-ad8b-7924-4434-631c5b7cd32b
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:20.331751 [INFO] serf: EventMemberUpdate: Node d919b0b9-ad8b-7924-4434-631c5b7cd32b.dc2
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:20.870630 [INFO] consul: Created ACL anonymous token from configuration
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:20.871505 [INFO] serf: EventMemberUpdate: Node d919b0b9-ad8b-7924-4434-631c5b7cd32b
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:20.872241 [INFO] serf: EventMemberUpdate: Node d919b0b9-ad8b-7924-4434-631c5b7cd32b.dc2
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:20.963811 [DEBUG] agent: Node info in sync
2019/11/27 02:20:21 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:21 [INFO]  raft: Node at 127.0.0.1:11794 [Leader] entering Leader state
TestDNS_AddressLookup - 2019/11/27 02:20:21.052190 [INFO] consul: cluster leadership acquired
TestDNS_AddressLookup - 2019/11/27 02:20:21.052818 [INFO] consul: New leader elected: Node 8e94a312-9163-9613-2547-2f1180245c98
TestHealthServiceNodes - 2019/11/27 02:20:21.217286 [INFO] agent: Requesting shutdown
TestHealthServiceNodes - 2019/11/27 02:20:21.217380 [INFO] consul: shutting down server
TestHealthServiceNodes - 2019/11/27 02:20:21.217430 [WARN] serf: Shutdown without a Leave
jones - 2019/11/27 02:20:21.412667 [DEBUG] consul: Skipping self join check for "Node 403133b7-b420-a957-4062-918c86f7ac39" since the cluster is too small
TestHealthServiceNodes - 2019/11/27 02:20:21.413252 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes - 2019/11/27 02:20:21.646067 [INFO] manager: shutting down
TestHealthServiceNodes - 2019/11/27 02:20:21.646659 [INFO] agent: consul server down
TestHealthServiceNodes - 2019/11/27 02:20:21.648991 [INFO] agent: shutdown complete
TestHealthServiceNodes - 2019/11/27 02:20:21.649060 [INFO] agent: Stopping DNS server 127.0.0.1:11771 (tcp)
TestHealthServiceNodes - 2019/11/27 02:20:21.649289 [INFO] agent: Stopping DNS server 127.0.0.1:11771 (udp)
TestHealthServiceNodes - 2019/11/27 02:20:21.649505 [INFO] agent: Stopping HTTP server 127.0.0.1:11772 (tcp)
TestHealthServiceNodes - 2019/11/27 02:20:21.649865 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes - 2019/11/27 02:20:21.649991 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes (10.97s)
=== CONT  TestHealthServiceChecks_NodeMetaFilter
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:21.746230 [WARN] agent: Node name "Node 3a25d714-d367-9067-c963-0a92ceac3cef" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:21.746634 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:21.746757 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:21.746945 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:21.747045 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_AddressLookup - 2019/11/27 02:20:21.979849 [INFO] agent: Synced node info
TestDNS_AddressLookup - 2019/11/27 02:20:21.980012 [DEBUG] agent: Node info in sync
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:21.982799 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:21.983239 [DEBUG] consul: Skipping self join check for "Node 8a83d70f-899e-d3e8-184b-fd5434cb1283" since the cluster is too small
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:21.983426 [INFO] consul: member 'Node 8a83d70f-899e-d3e8-184b-fd5434cb1283' joined, marking health alive
TestDNS_AddressLookup - 2019/11/27 02:20:21.999930 [INFO] agent: Requesting shutdown
TestDNS_AddressLookup - 2019/11/27 02:20:21.999989 [DEBUG] dns: request for name 7f000001.addr.dc1.consul. type SRV class IN (took 296.344µs) from client 127.0.0.1:60741 (udp)
TestDNS_AddressLookup - 2019/11/27 02:20:22.000022 [INFO] consul: shutting down server
TestDNS_AddressLookup - 2019/11/27 02:20:22.000102 [WARN] serf: Shutdown without a Leave
TestDNS_AddressLookup - 2019/11/27 02:20:22.173594 [WARN] serf: Shutdown without a Leave
TestDNS_AddressLookup - 2019/11/27 02:20:22.370429 [INFO] manager: shutting down
TestDNS_AddressLookup - 2019/11/27 02:20:22.680615 [INFO] agent: consul server down
TestDNS_AddressLookup - 2019/11/27 02:20:22.680708 [INFO] agent: shutdown complete
TestDNS_AddressLookup - 2019/11/27 02:20:22.680797 [INFO] agent: Stopping DNS server 127.0.0.1:11789 (tcp)
TestDNS_AddressLookup - 2019/11/27 02:20:22.681001 [INFO] agent: Stopping DNS server 127.0.0.1:11789 (udp)
TestDNS_AddressLookup - 2019/11/27 02:20:22.681197 [INFO] agent: Stopping HTTP server 127.0.0.1:11790 (tcp)
TestDNS_AddressLookup - 2019/11/27 02:20:22.681412 [INFO] agent: Waiting for endpoints to shut down
TestDNS_AddressLookup - 2019/11/27 02:20:22.681489 [INFO] agent: Endpoints down
--- PASS: TestDNS_AddressLookup (4.85s)
=== CONT  TestHealthServiceChecks
TestDNS_AddressLookup - 2019/11/27 02:20:22.682047 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestHealthServiceChecks - 2019/11/27 02:20:22.744316 [WARN] agent: Node name "Node 3c62e7ee-d8ce-0e3c-a3b0-00db288e1fa7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthServiceChecks - 2019/11/27 02:20:22.744865 [DEBUG] tlsutil: Update with version 1
TestHealthServiceChecks - 2019/11/27 02:20:22.744943 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceChecks - 2019/11/27 02:20:22.745111 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthServiceChecks - 2019/11/27 02:20:22.745217 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:22.972168 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:22.972634 [DEBUG] consul: Skipping self join check for "Node d919b0b9-ad8b-7924-4434-631c5b7cd32b" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:22.972751 [INFO] consul: member 'Node d919b0b9-ad8b-7924-4434-631c5b7cd32b' joined, marking health alive
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.326011 [DEBUG] consul: Skipping self join check for "Node d919b0b9-ad8b-7924-4434-631c5b7cd32b" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.326552 [DEBUG] consul: Skipping self join check for "Node d919b0b9-ad8b-7924-4434-631c5b7cd32b" since the cluster is too small
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.334739 [INFO] agent: (WAN) joining: [127.0.0.1:11757]
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.336414 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:11757
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.336542 [DEBUG] memberlist: Stream connection from=127.0.0.1:36998
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.340350 [INFO] serf: EventMemberJoin: Node d919b0b9-ad8b-7924-4434-631c5b7cd32b.dc2 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.341261 [INFO] serf: EventMemberJoin: Node 893f4515-886d-bb55-fe8d-4842186401f0.dc1 127.0.0.1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.341269 [INFO] consul: Handled member-join event for server "Node d919b0b9-ad8b-7924-4434-631c5b7cd32b.dc2" in area "wan"
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.341847 [INFO] agent: (WAN) joined: 1 Err: <nil>
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.342184 [INFO] consul: Handled member-join event for server "Node 893f4515-886d-bb55-fe8d-4842186401f0.dc1" in area "wan"
2019/11/27 02:20:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3a25d714-d367-9067-c963-0a92ceac3cef Address:127.0.0.1:11800}]
2019/11/27 02:20:23 [INFO]  raft: Node at 127.0.0.1:11800 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.494738 [INFO] agent: Requesting shutdown
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.494820 [INFO] consul: shutting down server
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.494870 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:23.499497 [INFO] serf: EventMemberJoin: Node 3a25d714-d367-9067-c963-0a92ceac3cef.dc1 127.0.0.1
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:23.508772 [INFO] serf: EventMemberJoin: Node 3a25d714-d367-9067-c963-0a92ceac3cef 127.0.0.1
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:23.510090 [INFO] consul: Adding LAN server Node 3a25d714-d367-9067-c963-0a92ceac3cef (Addr: tcp/127.0.0.1:11800) (DC: dc1)
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:23.510631 [INFO] consul: Handled member-join event for server "Node 3a25d714-d367-9067-c963-0a92ceac3cef.dc1" in area "wan"
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:23.512682 [INFO] agent: Started DNS server 127.0.0.1:11795 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:23.516239 [INFO] agent: Started DNS server 127.0.0.1:11795 (udp)
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:23.518861 [INFO] agent: Started HTTP server on 127.0.0.1:11796 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:23.518973 [INFO] agent: started state syncer
2019/11/27 02:20:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:23 [INFO]  raft: Node at 127.0.0.1:11800 [Candidate] entering Candidate state in term 2
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.656540 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.830392 [DEBUG] serf: messageJoinType: Node d919b0b9-ad8b-7924-4434-631c5b7cd32b.dc2
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.967632 [INFO] manager: shutting down
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.968491 [INFO] agent: consul server down
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.968554 [INFO] agent: shutdown complete
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.968618 [INFO] agent: Stopping DNS server 127.0.0.1:11783 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.968825 [INFO] agent: Stopping DNS server 127.0.0.1:11783 (udp)
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.969006 [INFO] agent: Stopping HTTP server 127.0.0.1:11784 (tcp)
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.969249 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks_DistanceSort - 2019/11/27 02:20:23.969333 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks_DistanceSort (11.62s)
=== CONT  TestHealthNodeChecks
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.979646 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.979746 [INFO] consul: shutting down server
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:23.979792 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestHealthNodeChecks - 2019/11/27 02:20:24.058813 [WARN] agent: Node name "Node 450a416e-35c1-e60c-fdda-637f449114b4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthNodeChecks - 2019/11/27 02:20:24.059885 [DEBUG] tlsutil: Update with version 1
TestHealthNodeChecks - 2019/11/27 02:20:24.060426 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthNodeChecks - 2019/11/27 02:20:24.060789 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthNodeChecks - 2019/11/27 02:20:24.061246 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.168816 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.523213 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.523332 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.524079 [INFO] agent: consul server down
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.524145 [INFO] agent: shutdown complete
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.524198 [INFO] agent: Stopping DNS server 127.0.0.1:11777 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.524358 [INFO] agent: Stopping DNS server 127.0.0.1:11777 (udp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.524547 [INFO] agent: Stopping HTTP server 127.0.0.1:11778 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.524755 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.524834 [INFO] agent: Endpoints down
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.524904 [INFO] agent: Requesting shutdown
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.524969 [INFO] consul: shutting down server
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.525041 [WARN] serf: Shutdown without a Leave
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:24.980809 [WARN] serf: Shutdown without a Leave
2019/11/27 02:20:25 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:25 [INFO]  raft: Node at 127.0.0.1:11800 [Leader] entering Leader state
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:25.302031 [INFO] manager: shutting down
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:25.302099 [INFO] manager: shutting down
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:25.302936 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:25.304802 [INFO] consul: New leader elected: Node 3a25d714-d367-9067-c963-0a92ceac3cef
2019/11/27 02:20:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3c62e7ee-d8ce-0e3c-a3b0-00db288e1fa7 Address:127.0.0.1:11806}]
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:25.303064 [INFO] agent: consul server down
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:25.305657 [INFO] agent: shutdown complete
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:25.305723 [INFO] agent: Stopping DNS server 127.0.0.1:11753 (tcp)
2019/11/27 02:20:25 [INFO]  raft: Node at 127.0.0.1:11806 [Follower] entering Follower state (Leader: "")
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:25.305878 [INFO] agent: Stopping DNS server 127.0.0.1:11753 (udp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:25.306050 [INFO] agent: Stopping HTTP server 127.0.0.1:11754 (tcp)
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:25.306306 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceNodes_WanTranslation - 2019/11/27 02:20:25.306385 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceNodes_WanTranslation (20.88s)
=== CONT  TestHealthChecksInState_DistanceSort
TestHealthServiceChecks - 2019/11/27 02:20:25.332129 [INFO] serf: EventMemberJoin: Node 3c62e7ee-d8ce-0e3c-a3b0-00db288e1fa7.dc1 127.0.0.1
TestHealthServiceChecks - 2019/11/27 02:20:25.345987 [INFO] serf: EventMemberJoin: Node 3c62e7ee-d8ce-0e3c-a3b0-00db288e1fa7 127.0.0.1
TestHealthServiceChecks - 2019/11/27 02:20:25.350760 [INFO] consul: Adding LAN server Node 3c62e7ee-d8ce-0e3c-a3b0-00db288e1fa7 (Addr: tcp/127.0.0.1:11806) (DC: dc1)
2019/11/27 02:20:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:25 [INFO]  raft: Node at 127.0.0.1:11806 [Candidate] entering Candidate state in term 2
TestHealthServiceChecks - 2019/11/27 02:20:25.367845 [INFO] agent: Started DNS server 127.0.0.1:11801 (udp)
TestHealthServiceChecks - 2019/11/27 02:20:25.371234 [INFO] consul: Handled member-join event for server "Node 3c62e7ee-d8ce-0e3c-a3b0-00db288e1fa7.dc1" in area "wan"
TestHealthServiceChecks - 2019/11/27 02:20:25.395256 [INFO] agent: Started DNS server 127.0.0.1:11801 (tcp)
TestHealthServiceChecks - 2019/11/27 02:20:25.402789 [INFO] agent: Started HTTP server on 127.0.0.1:11802 (tcp)
TestHealthServiceChecks - 2019/11/27 02:20:25.402907 [INFO] agent: started state syncer
WARNING: bootstrap = true: do not enable unless necessary
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:25.443015 [WARN] agent: Node name "Node 878b2e5d-bfe2-3c72-a428-8cf4b982260a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:25.443604 [DEBUG] tlsutil: Update with version 1
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:25.443807 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:25.444161 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:25.444439 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:20:25.491919 [DEBUG] consul: Skipping self join check for "Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6" since the cluster is too small
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:26.047144 [INFO] agent: Synced node info
2019/11/27 02:20:26 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:26 [INFO]  raft: Node at 127.0.0.1:11806 [Leader] entering Leader state
2019/11/27 02:20:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:450a416e-35c1-e60c-fdda-637f449114b4 Address:127.0.0.1:11812}]
2019/11/27 02:20:26 [INFO]  raft: Node at 127.0.0.1:11812 [Follower] entering Follower state (Leader: "")
TestHealthServiceChecks - 2019/11/27 02:20:26.382678 [INFO] consul: cluster leadership acquired
TestHealthServiceChecks - 2019/11/27 02:20:26.383025 [INFO] consul: New leader elected: Node 3c62e7ee-d8ce-0e3c-a3b0-00db288e1fa7
TestHealthNodeChecks - 2019/11/27 02:20:26.383038 [INFO] serf: EventMemberJoin: Node 450a416e-35c1-e60c-fdda-637f449114b4.dc1 127.0.0.1
TestHealthNodeChecks - 2019/11/27 02:20:26.387915 [INFO] serf: EventMemberJoin: Node 450a416e-35c1-e60c-fdda-637f449114b4 127.0.0.1
TestHealthNodeChecks - 2019/11/27 02:20:26.389276 [INFO] consul: Adding LAN server Node 450a416e-35c1-e60c-fdda-637f449114b4 (Addr: tcp/127.0.0.1:11812) (DC: dc1)
TestHealthNodeChecks - 2019/11/27 02:20:26.389796 [INFO] consul: Handled member-join event for server "Node 450a416e-35c1-e60c-fdda-637f449114b4.dc1" in area "wan"
TestHealthNodeChecks - 2019/11/27 02:20:26.392740 [INFO] agent: Started DNS server 127.0.0.1:11807 (tcp)
TestHealthNodeChecks - 2019/11/27 02:20:26.392855 [INFO] agent: Started DNS server 127.0.0.1:11807 (udp)
TestHealthNodeChecks - 2019/11/27 02:20:26.396369 [INFO] agent: Started HTTP server on 127.0.0.1:11808 (tcp)
TestHealthNodeChecks - 2019/11/27 02:20:26.396470 [INFO] agent: started state syncer
2019/11/27 02:20:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:26 [INFO]  raft: Node at 127.0.0.1:11812 [Candidate] entering Candidate state in term 2
TestHealthServiceChecks - 2019/11/27 02:20:27.179801 [INFO] agent: Synced node info
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:27.195297 [DEBUG] agent: Node info in sync
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:27.195420 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:20:27.323435 [DEBUG] consul: Skipping self join check for "Node ce219ef7-1c48-e006-7f93-81e0fe3f4967" since the cluster is too small
2019/11/27 02:20:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:878b2e5d-bfe2-3c72-a428-8cf4b982260a Address:127.0.0.1:11818}]
2019/11/27 02:20:27 [INFO]  raft: Node at 127.0.0.1:11818 [Follower] entering Follower state (Leader: "")
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:27.327318 [INFO] serf: EventMemberJoin: Node 878b2e5d-bfe2-3c72-a428-8cf4b982260a.dc1 127.0.0.1
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:27.331004 [INFO] serf: EventMemberJoin: Node 878b2e5d-bfe2-3c72-a428-8cf4b982260a 127.0.0.1
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:27.331886 [INFO] consul: Handled member-join event for server "Node 878b2e5d-bfe2-3c72-a428-8cf4b982260a.dc1" in area "wan"
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:27.332210 [INFO] consul: Adding LAN server Node 878b2e5d-bfe2-3c72-a428-8cf4b982260a (Addr: tcp/127.0.0.1:11818) (DC: dc1)
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:27.332787 [INFO] agent: Started DNS server 127.0.0.1:11813 (udp)
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:27.333409 [INFO] agent: Started DNS server 127.0.0.1:11813 (tcp)
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:27.335476 [INFO] agent: Started HTTP server on 127.0.0.1:11814 (tcp)
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:27.335593 [INFO] agent: started state syncer
2019/11/27 02:20:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:27 [INFO]  raft: Node at 127.0.0.1:11818 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:27 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:27 [INFO]  raft: Node at 127.0.0.1:11812 [Leader] entering Leader state
TestHealthNodeChecks - 2019/11/27 02:20:27.539997 [INFO] consul: cluster leadership acquired
TestHealthNodeChecks - 2019/11/27 02:20:27.541064 [INFO] consul: New leader elected: Node 450a416e-35c1-e60c-fdda-637f449114b4
TestHealthServiceChecks - 2019/11/27 02:20:28.579849 [DEBUG] agent: Node info in sync
TestHealthServiceChecks - 2019/11/27 02:20:28.579969 [DEBUG] agent: Node info in sync
TestHealthNodeChecks - 2019/11/27 02:20:28.690861 [INFO] agent: Synced node info
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:28.693491 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:28.694080 [DEBUG] consul: Skipping self join check for "Node 3a25d714-d367-9067-c963-0a92ceac3cef" since the cluster is too small
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:28.694268 [INFO] consul: member 'Node 3a25d714-d367-9067-c963-0a92ceac3cef' joined, marking health alive
2019/11/27 02:20:29 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:29 [INFO]  raft: Node at 127.0.0.1:11818 [Leader] entering Leader state
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:29.445742 [INFO] consul: cluster leadership acquired
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:29.446208 [INFO] consul: New leader elected: Node 878b2e5d-bfe2-3c72-a428-8cf4b982260a
TestHealthNodeChecks - 2019/11/27 02:20:29.805645 [DEBUG] agent: Node info in sync
TestHealthNodeChecks - 2019/11/27 02:20:29.805761 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:20:29.811445 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:20:29.811525 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/11/27 02:20:29.811563 [DEBUG] agent: Node info in sync
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.060155 [INFO] agent: Requesting shutdown
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.060295 [INFO] consul: shutting down server
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.060355 [WARN] serf: Shutdown without a Leave
jones - 2019/11/27 02:20:30.213565 [DEBUG] consul: Skipping self join check for "Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22" since the cluster is too small
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.214159 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks - 2019/11/27 02:20:30.217360 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthServiceChecks - 2019/11/27 02:20:30.217818 [DEBUG] consul: Skipping self join check for "Node 3c62e7ee-d8ce-0e3c-a3b0-00db288e1fa7" since the cluster is too small
TestHealthServiceChecks - 2019/11/27 02:20:30.217994 [INFO] consul: member 'Node 3c62e7ee-d8ce-0e3c-a3b0-00db288e1fa7' joined, marking health alive
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.384871 [INFO] manager: shutting down
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.387073 [INFO] agent: consul server down
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.387293 [INFO] agent: shutdown complete
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.387728 [INFO] agent: Stopping DNS server 127.0.0.1:11795 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.388066 [INFO] agent: Stopping DNS server 127.0.0.1:11795 (udp)
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.388443 [INFO] agent: Stopping HTTP server 127.0.0.1:11796 (tcp)
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.388826 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks_NodeMetaFilter - 2019/11/27 02:20:30.389015 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks_NodeMetaFilter (8.74s)
=== CONT  TestHealthChecksInState_NodeMetaFilter
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:30.389437 [INFO] agent: Synced node info
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:30.389529 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:30.450456 [WARN] agent: Node name "Node a1ca5744-440f-24b3-838b-0a95785bee6e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:30.450994 [DEBUG] tlsutil: Update with version 1
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:30.451183 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:30.451454 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:30.451651 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthServiceChecks - 2019/11/27 02:20:31.203266 [INFO] agent: Requesting shutdown
TestHealthServiceChecks - 2019/11/27 02:20:31.203428 [INFO] consul: shutting down server
TestHealthServiceChecks - 2019/11/27 02:20:31.203498 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks - 2019/11/27 02:20:31.789391 [WARN] serf: Shutdown without a Leave
TestHealthServiceChecks - 2019/11/27 02:20:32.011668 [INFO] manager: shutting down
TestHealthServiceChecks - 2019/11/27 02:20:32.012539 [INFO] agent: consul server down
TestHealthServiceChecks - 2019/11/27 02:20:32.012600 [INFO] agent: shutdown complete
TestHealthServiceChecks - 2019/11/27 02:20:32.012662 [INFO] agent: Stopping DNS server 127.0.0.1:11801 (tcp)
TestHealthServiceChecks - 2019/11/27 02:20:32.012836 [INFO] agent: Stopping DNS server 127.0.0.1:11801 (udp)
TestHealthServiceChecks - 2019/11/27 02:20:32.013003 [INFO] agent: Stopping HTTP server 127.0.0.1:11802 (tcp)
TestHealthServiceChecks - 2019/11/27 02:20:32.013221 [INFO] agent: Waiting for endpoints to shut down
TestHealthServiceChecks - 2019/11/27 02:20:32.013295 [INFO] agent: Endpoints down
--- PASS: TestHealthServiceChecks (9.33s)
=== CONT  TestAgent_loadServices_sidecarInheritMeta
TestHealthNodeChecks - 2019/11/27 02:20:32.022957 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestHealthNodeChecks - 2019/11/27 02:20:32.023396 [DEBUG] consul: Skipping self join check for "Node 450a416e-35c1-e60c-fdda-637f449114b4" since the cluster is too small
TestHealthNodeChecks - 2019/11/27 02:20:32.023566 [INFO] consul: member 'Node 450a416e-35c1-e60c-fdda-637f449114b4' joined, marking health alive
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:32.176078 [WARN] agent: Node name "Node bf3f18e0-6b57-5bbb-2d54-65a43b8a3e17" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:32.176658 [DEBUG] tlsutil: Update with version 1
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:32.177603 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:32.178059 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:32.178305 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthNodeChecks - 2019/11/27 02:20:32.294344 [INFO] agent: Requesting shutdown
TestHealthNodeChecks - 2019/11/27 02:20:32.294444 [INFO] consul: shutting down server
TestHealthNodeChecks - 2019/11/27 02:20:32.294496 [WARN] serf: Shutdown without a Leave
TestHealthNodeChecks - 2019/11/27 02:20:32.490500 [WARN] serf: Shutdown without a Leave
TestHealthNodeChecks - 2019/11/27 02:20:32.622872 [INFO] manager: shutting down
TestHealthNodeChecks - 2019/11/27 02:20:32.623762 [INFO] agent: consul server down
TestHealthNodeChecks - 2019/11/27 02:20:32.623827 [INFO] agent: shutdown complete
TestHealthNodeChecks - 2019/11/27 02:20:32.623886 [INFO] agent: Stopping DNS server 127.0.0.1:11807 (tcp)
TestHealthNodeChecks - 2019/11/27 02:20:32.624043 [INFO] agent: Stopping DNS server 127.0.0.1:11807 (udp)
TestHealthNodeChecks - 2019/11/27 02:20:32.624264 [INFO] agent: Stopping HTTP server 127.0.0.1:11808 (tcp)
TestHealthNodeChecks - 2019/11/27 02:20:32.624537 [INFO] agent: Waiting for endpoints to shut down
TestHealthNodeChecks - 2019/11/27 02:20:32.624633 [INFO] agent: Endpoints down
--- PASS: TestHealthNodeChecks (8.66s)
=== CONT  TestUUIDToUint64
--- PASS: TestUUIDToUint64 (0.00s)
=== CONT  TestEventList_EventBufOrder
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_EventBufOrder - 2019/11/27 02:20:32.732338 [WARN] agent: Node name "Node 122ba86d-cd5d-f91c-7670-829e58e51c83" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_EventBufOrder - 2019/11/27 02:20:32.733410 [DEBUG] tlsutil: Update with version 1
TestEventList_EventBufOrder - 2019/11/27 02:20:32.733633 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_EventBufOrder - 2019/11/27 02:20:32.733901 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestEventList_EventBufOrder - 2019/11/27 02:20:32.734092 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:32.751089 [INFO] agent: Requesting shutdown
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:32.751191 [INFO] consul: shutting down server
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:32.751239 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:32.815481 [DEBUG] agent: Node info in sync
2019/11/27 02:20:32 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a1ca5744-440f-24b3-838b-0a95785bee6e Address:127.0.0.1:11824}]
2019/11/27 02:20:32 [INFO]  raft: Node at 127.0.0.1:11824 [Follower] entering Follower state (Leader: "")
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:32.849344 [INFO] serf: EventMemberJoin: Node a1ca5744-440f-24b3-838b-0a95785bee6e.dc1 127.0.0.1
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:32.852988 [INFO] serf: EventMemberJoin: Node a1ca5744-440f-24b3-838b-0a95785bee6e 127.0.0.1
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:32.853816 [INFO] consul: Handled member-join event for server "Node a1ca5744-440f-24b3-838b-0a95785bee6e.dc1" in area "wan"
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:32.854091 [INFO] consul: Adding LAN server Node a1ca5744-440f-24b3-838b-0a95785bee6e (Addr: tcp/127.0.0.1:11824) (DC: dc1)
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:32.854379 [INFO] agent: Started DNS server 127.0.0.1:11819 (udp)
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:32.854622 [INFO] agent: Started DNS server 127.0.0.1:11819 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:32.856520 [INFO] agent: Started HTTP server on 127.0.0.1:11820 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:32.856607 [INFO] agent: started state syncer
2019/11/27 02:20:32 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:32 [INFO]  raft: Node at 127.0.0.1:11824 [Candidate] entering Candidate state in term 2
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.022617 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.322570 [INFO] manager: shutting down
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.445255 [INFO] agent: consul server down
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.445358 [INFO] agent: shutdown complete
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.445418 [INFO] agent: Stopping DNS server 127.0.0.1:11813 (tcp)
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.445592 [INFO] agent: Stopping DNS server 127.0.0.1:11813 (udp)
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.445751 [INFO] agent: Stopping HTTP server 127.0.0.1:11814 (tcp)
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.445913 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.445997 [INFO] agent: Waiting for endpoints to shut down
TestHealthChecksInState_DistanceSort - 2019/11/27 02:20:33.446062 [INFO] agent: Endpoints down
--- PASS: TestHealthChecksInState_DistanceSort (8.14s)
=== CONT  TestEventList_Blocking
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_Blocking - 2019/11/27 02:20:33.514849 [WARN] agent: Node name "Node 760f254e-8889-d814-8d1d-174ac4699a23" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_Blocking - 2019/11/27 02:20:33.515416 [DEBUG] tlsutil: Update with version 1
TestEventList_Blocking - 2019/11/27 02:20:33.515574 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_Blocking - 2019/11/27 02:20:33.515842 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestEventList_Blocking - 2019/11/27 02:20:33.516047 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bf3f18e0-6b57-5bbb-2d54-65a43b8a3e17 Address:127.0.0.1:11830}]
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:33.744288 [INFO] serf: EventMemberJoin: Node bf3f18e0-6b57-5bbb-2d54-65a43b8a3e17.dc1 127.0.0.1
2019/11/27 02:20:33 [INFO]  raft: Node at 127.0.0.1:11830 [Follower] entering Follower state (Leader: "")
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:33.753954 [INFO] serf: EventMemberJoin: Node bf3f18e0-6b57-5bbb-2d54-65a43b8a3e17 127.0.0.1
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:33.754805 [INFO] consul: Handled member-join event for server "Node bf3f18e0-6b57-5bbb-2d54-65a43b8a3e17.dc1" in area "wan"
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:33.755073 [INFO] consul: Adding LAN server Node bf3f18e0-6b57-5bbb-2d54-65a43b8a3e17 (Addr: tcp/127.0.0.1:11830) (DC: dc1)
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:33.756392 [INFO] agent: Started DNS server 127.0.0.1:11825 (udp)
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:33.756835 [INFO] agent: Started DNS server 127.0.0.1:11825 (tcp)
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:33.758814 [INFO] agent: Started HTTP server on 127.0.0.1:11826 (tcp)
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:33.758928 [INFO] agent: started state syncer
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:33.762770 [WARN] agent: Check "service:rabbitmq-sidecar-proxy:1" socket connection failed: dial tcp 127.0.0.1:21000: connect: connection refused
2019/11/27 02:20:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:33 [INFO]  raft: Node at 127.0.0.1:11830 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:33 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:33 [INFO]  raft: Node at 127.0.0.1:11824 [Leader] entering Leader state
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:33.883698 [INFO] consul: cluster leadership acquired
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:33.884093 [INFO] consul: New leader elected: Node a1ca5744-440f-24b3-838b-0a95785bee6e
2019/11/27 02:20:34 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:122ba86d-cd5d-f91c-7670-829e58e51c83 Address:127.0.0.1:11836}]
2019/11/27 02:20:34 [INFO]  raft: Node at 127.0.0.1:11836 [Follower] entering Follower state (Leader: "")
TestEventList_EventBufOrder - 2019/11/27 02:20:34.327059 [INFO] serf: EventMemberJoin: Node 122ba86d-cd5d-f91c-7670-829e58e51c83.dc1 127.0.0.1
TestEventList_EventBufOrder - 2019/11/27 02:20:34.330338 [INFO] serf: EventMemberJoin: Node 122ba86d-cd5d-f91c-7670-829e58e51c83 127.0.0.1
TestEventList_EventBufOrder - 2019/11/27 02:20:34.331182 [INFO] consul: Handled member-join event for server "Node 122ba86d-cd5d-f91c-7670-829e58e51c83.dc1" in area "wan"
TestEventList_EventBufOrder - 2019/11/27 02:20:34.331509 [INFO] consul: Adding LAN server Node 122ba86d-cd5d-f91c-7670-829e58e51c83 (Addr: tcp/127.0.0.1:11836) (DC: dc1)
TestEventList_EventBufOrder - 2019/11/27 02:20:34.331847 [INFO] agent: Started DNS server 127.0.0.1:11831 (udp)
TestEventList_EventBufOrder - 2019/11/27 02:20:34.332217 [INFO] agent: Started DNS server 127.0.0.1:11831 (tcp)
TestEventList_EventBufOrder - 2019/11/27 02:20:34.334209 [INFO] agent: Started HTTP server on 127.0.0.1:11832 (tcp)
TestEventList_EventBufOrder - 2019/11/27 02:20:34.334323 [INFO] agent: started state syncer
2019/11/27 02:20:34 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:34 [INFO]  raft: Node at 127.0.0.1:11836 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:20:34.487796 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:20:34.487882 [DEBUG] agent: Node info in sync
2019/11/27 02:20:34 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:34 [INFO]  raft: Node at 127.0.0.1:11830 [Leader] entering Leader state
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:34.691218 [INFO] consul: cluster leadership acquired
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:34.691831 [INFO] consul: New leader elected: Node bf3f18e0-6b57-5bbb-2d54-65a43b8a3e17
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:34.692001 [INFO] agent: Requesting shutdown
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:34.692076 [INFO] consul: shutting down server
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:34.692120 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:34.693447 [INFO] agent: Synced node info
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:34.833536 [WARN] serf: Shutdown without a Leave
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:34.944757 [INFO] manager: shutting down
2019/11/27 02:20:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:760f254e-8889-d814-8d1d-174ac4699a23 Address:127.0.0.1:11842}]
2019/11/27 02:20:35 [INFO]  raft: Node at 127.0.0.1:11842 [Follower] entering Follower state (Leader: "")
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:35.082582 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:35.082934 [INFO] agent: consul server down
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:35.082998 [INFO] agent: shutdown complete
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:35.083057 [INFO] agent: Stopping DNS server 127.0.0.1:11819 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:35.082949 [ERR] consul: failed to establish leadership: raft is already shutdown
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:35.083209 [INFO] agent: Stopping DNS server 127.0.0.1:11819 (udp)
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:35.083392 [INFO] agent: Stopping HTTP server 127.0.0.1:11820 (tcp)
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:35.083616 [INFO] agent: Waiting for endpoints to shut down
TestHealthChecksInState_NodeMetaFilter - 2019/11/27 02:20:35.083689 [INFO] agent: Endpoints down
--- PASS: TestHealthChecksInState_NodeMetaFilter (4.69s)
=== CONT  TestEventList_ACLFilter
TestEventList_Blocking - 2019/11/27 02:20:35.088077 [INFO] serf: EventMemberJoin: Node 760f254e-8889-d814-8d1d-174ac4699a23.dc1 127.0.0.1
TestEventList_Blocking - 2019/11/27 02:20:35.099861 [INFO] serf: EventMemberJoin: Node 760f254e-8889-d814-8d1d-174ac4699a23 127.0.0.1
TestEventList_Blocking - 2019/11/27 02:20:35.100765 [INFO] consul: Adding LAN server Node 760f254e-8889-d814-8d1d-174ac4699a23 (Addr: tcp/127.0.0.1:11842) (DC: dc1)
TestEventList_Blocking - 2019/11/27 02:20:35.102073 [INFO] consul: Handled member-join event for server "Node 760f254e-8889-d814-8d1d-174ac4699a23.dc1" in area "wan"
TestEventList_Blocking - 2019/11/27 02:20:35.105193 [INFO] agent: Started DNS server 127.0.0.1:11837 (tcp)
TestEventList_Blocking - 2019/11/27 02:20:35.105497 [INFO] agent: Started DNS server 127.0.0.1:11837 (udp)
TestEventList_Blocking - 2019/11/27 02:20:35.112252 [INFO] agent: Started HTTP server on 127.0.0.1:11838 (tcp)
TestEventList_Blocking - 2019/11/27 02:20:35.112348 [INFO] agent: started state syncer
2019/11/27 02:20:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:35 [INFO]  raft: Node at 127.0.0.1:11842 [Candidate] entering Candidate state in term 2
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:35.212873 [INFO] agent: Synced service "rabbitmq"
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_ACLFilter - 2019/11/27 02:20:35.222504 [WARN] agent: Node name "Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_ACLFilter - 2019/11/27 02:20:35.222948 [DEBUG] tlsutil: Update with version 1
TestEventList_ACLFilter - 2019/11/27 02:20:35.223028 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_ACLFilter - 2019/11/27 02:20:35.223461 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestEventList_ACLFilter - 2019/11/27 02:20:35.223576 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:35 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:35 [INFO]  raft: Node at 127.0.0.1:11836 [Leader] entering Leader state
TestEventList_EventBufOrder - 2019/11/27 02:20:35.368259 [INFO] consul: cluster leadership acquired
TestEventList_EventBufOrder - 2019/11/27 02:20:35.368662 [INFO] consul: New leader elected: Node 122ba86d-cd5d-f91c-7670-829e58e51c83
TestEventList_EventBufOrder - 2019/11/27 02:20:35.956669 [INFO] agent: Synced node info
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:35.961276 [INFO] agent: Synced service "rabbitmq-sidecar-proxy"
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:35.961380 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:1" in sync
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:35.961432 [DEBUG] agent: Check "service:rabbitmq-sidecar-proxy:2" in sync
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:35.961462 [DEBUG] agent: Node info in sync
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:35.964491 [INFO] agent: Requesting shutdown
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:35.964738 [INFO] consul: shutting down server
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:35.964840 [WARN] serf: Shutdown without a Leave
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.066869 [WARN] serf: Shutdown without a Leave
2019/11/27 02:20:36 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:36 [INFO]  raft: Node at 127.0.0.1:11842 [Leader] entering Leader state
TestEventList_Blocking - 2019/11/27 02:20:36.068671 [INFO] consul: cluster leadership acquired
TestEventList_Blocking - 2019/11/27 02:20:36.069239 [INFO] consul: New leader elected: Node 760f254e-8889-d814-8d1d-174ac4699a23
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.202635 [INFO] manager: shutting down
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.512141 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.512280 [INFO] agent: consul server down
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.512329 [INFO] agent: shutdown complete
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.512381 [INFO] agent: Stopping DNS server 127.0.0.1:11825 (tcp)
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.512535 [INFO] agent: Stopping DNS server 127.0.0.1:11825 (udp)
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.512722 [INFO] agent: Stopping HTTP server 127.0.0.1:11826 (tcp)
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.513091 [INFO] agent: Waiting for endpoints to shut down
TestAgent_loadServices_sidecarInheritMeta - 2019/11/27 02:20:36.513145 [INFO] agent: Endpoints down
--- PASS: TestAgent_loadServices_sidecarInheritMeta (4.50s)
=== CONT  TestEventList_Filter
2019/11/27 02:20:36 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9 Address:127.0.0.1:11848}]
TestEventList_Blocking - 2019/11/27 02:20:36.656796 [INFO] agent: Synced node info
TestEventList_Blocking - 2019/11/27 02:20:36.656918 [DEBUG] agent: Node info in sync
TestEventList_ACLFilter - 2019/11/27 02:20:36.659780 [INFO] serf: EventMemberJoin: Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9.dc1 127.0.0.1
TestEventList_ACLFilter - 2019/11/27 02:20:36.663590 [INFO] serf: EventMemberJoin: Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9 127.0.0.1
TestEventList_ACLFilter - 2019/11/27 02:20:36.664810 [INFO] agent: Started DNS server 127.0.0.1:11843 (udp)
TestEventList_ACLFilter - 2019/11/27 02:20:36.665275 [INFO] consul: Adding LAN server Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9 (Addr: tcp/127.0.0.1:11848) (DC: dc1)
TestEventList_ACLFilter - 2019/11/27 02:20:36.665855 [INFO] agent: Started DNS server 127.0.0.1:11843 (tcp)
WARNING: bootstrap = true: do not enable unless necessary
TestEventList_Filter - 2019/11/27 02:20:36.667351 [WARN] agent: Node name "Node aa46d014-19e2-ac39-54eb-2d8d003bc7e1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList_Filter - 2019/11/27 02:20:36.667703 [DEBUG] tlsutil: Update with version 1
TestEventList_Filter - 2019/11/27 02:20:36.667770 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_Filter - 2019/11/27 02:20:36.667918 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestEventList_Filter - 2019/11/27 02:20:36.668025 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_ACLFilter - 2019/11/27 02:20:36.669159 [INFO] agent: Started HTTP server on 127.0.0.1:11844 (tcp)
TestEventList_ACLFilter - 2019/11/27 02:20:36.669261 [INFO] agent: started state syncer
2019/11/27 02:20:36 [INFO]  raft: Node at 127.0.0.1:11848 [Follower] entering Follower state (Leader: "")
TestEventList_ACLFilter - 2019/11/27 02:20:36.670408 [INFO] consul: Handled member-join event for server "Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9.dc1" in area "wan"
2019/11/27 02:20:36 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:36 [INFO]  raft: Node at 127.0.0.1:11848 [Candidate] entering Candidate state in term 2
TestEventList_Blocking - 2019/11/27 02:20:37.208411 [DEBUG] agent: Node info in sync
TestEventList_EventBufOrder - 2019/11/27 02:20:37.389851 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_EventBufOrder - 2019/11/27 02:20:37.390378 [DEBUG] consul: Skipping self join check for "Node 122ba86d-cd5d-f91c-7670-829e58e51c83" since the cluster is too small
TestEventList_EventBufOrder - 2019/11/27 02:20:37.390542 [INFO] consul: member 'Node 122ba86d-cd5d-f91c-7670-829e58e51c83' joined, marking health alive
2019/11/27 02:20:37 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:37 [INFO]  raft: Node at 127.0.0.1:11848 [Leader] entering Leader state
TestEventList_ACLFilter - 2019/11/27 02:20:37.512989 [INFO] consul: cluster leadership acquired
TestEventList_ACLFilter - 2019/11/27 02:20:37.513621 [INFO] consul: New leader elected: Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9
TestEventList_ACLFilter - 2019/11/27 02:20:37.516151 [ERR] agent: failed to sync remote state: ACL not found
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620022 [DEBUG] consul: User event: foo
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620128 [DEBUG] consul: User event: bar
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620198 [DEBUG] consul: User event: foo
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620245 [DEBUG] consul: User event: foo
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620303 [DEBUG] consul: User event: bar
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620430 [DEBUG] agent: new event: foo (67d0dd4d-e235-e9cb-d9bc-4deb1c7b24e4)
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620525 [DEBUG] agent: new event: bar (75485b4b-598b-08db-cbbe-cbab4f10e676)
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620615 [DEBUG] agent: new event: foo (ff264a9b-2fa2-d281-e9f4-0f20cf1c74e1)
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620701 [DEBUG] agent: new event: foo (e4e9d0b0-3948-2328-a7e7-b170c1bab333)
TestEventList_EventBufOrder - 2019/11/27 02:20:37.620787 [DEBUG] agent: new event: bar (d281d2c3-d6de-3a4b-1366-0693deb6de87)
TestEventList_EventBufOrder - 2019/11/27 02:20:37.645492 [INFO] agent: Requesting shutdown
TestEventList_EventBufOrder - 2019/11/27 02:20:37.645605 [INFO] consul: shutting down server
TestEventList_EventBufOrder - 2019/11/27 02:20:37.645671 [WARN] serf: Shutdown without a Leave
TestEventList_EventBufOrder - 2019/11/27 02:20:37.866788 [WARN] serf: Shutdown without a Leave
TestEventList_EventBufOrder - 2019/11/27 02:20:38.000157 [INFO] manager: shutting down
2019/11/27 02:20:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:aa46d014-19e2-ac39-54eb-2d8d003bc7e1 Address:127.0.0.1:11854}]
TestEventList_EventBufOrder - 2019/11/27 02:20:38.001027 [INFO] agent: consul server down
TestEventList_EventBufOrder - 2019/11/27 02:20:38.001092 [INFO] agent: shutdown complete
TestEventList_EventBufOrder - 2019/11/27 02:20:38.001213 [INFO] agent: Stopping DNS server 127.0.0.1:11831 (tcp)
TestEventList_EventBufOrder - 2019/11/27 02:20:38.001535 [INFO] agent: Stopping DNS server 127.0.0.1:11831 (udp)
TestEventList_EventBufOrder - 2019/11/27 02:20:38.001882 [INFO] agent: Stopping HTTP server 127.0.0.1:11832 (tcp)
TestEventList_EventBufOrder - 2019/11/27 02:20:38.002285 [INFO] agent: Waiting for endpoints to shut down
TestEventList_EventBufOrder - 2019/11/27 02:20:38.002446 [INFO] agent: Endpoints down
--- PASS: TestEventList_EventBufOrder (5.38s)
=== CONT  TestEventList
TestEventList_Filter - 2019/11/27 02:20:38.004415 [INFO] serf: EventMemberJoin: Node aa46d014-19e2-ac39-54eb-2d8d003bc7e1.dc1 127.0.0.1
2019/11/27 02:20:38 [INFO]  raft: Node at 127.0.0.1:11854 [Follower] entering Follower state (Leader: "")
TestEventList_ACLFilter - 2019/11/27 02:20:38.005992 [INFO] acl: initializing acls
TestEventList_Filter - 2019/11/27 02:20:38.009509 [INFO] serf: EventMemberJoin: Node aa46d014-19e2-ac39-54eb-2d8d003bc7e1 127.0.0.1
TestEventList_Filter - 2019/11/27 02:20:38.013349 [INFO] agent: Started DNS server 127.0.0.1:11849 (udp)
TestEventList_Filter - 2019/11/27 02:20:38.014022 [INFO] consul: Adding LAN server Node aa46d014-19e2-ac39-54eb-2d8d003bc7e1 (Addr: tcp/127.0.0.1:11854) (DC: dc1)
TestEventList_Filter - 2019/11/27 02:20:38.014334 [INFO] consul: Handled member-join event for server "Node aa46d014-19e2-ac39-54eb-2d8d003bc7e1.dc1" in area "wan"
TestEventList_Filter - 2019/11/27 02:20:38.014955 [INFO] agent: Started DNS server 127.0.0.1:11849 (tcp)
TestEventList_Filter - 2019/11/27 02:20:38.017498 [INFO] agent: Started HTTP server on 127.0.0.1:11850 (tcp)
TestEventList_Filter - 2019/11/27 02:20:38.017608 [INFO] agent: started state syncer
2019/11/27 02:20:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:38 [INFO]  raft: Node at 127.0.0.1:11854 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestEventList - 2019/11/27 02:20:38.095112 [WARN] agent: Node name "Node eee43763-2298-a63b-851e-901f333301e6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventList - 2019/11/27 02:20:38.095711 [DEBUG] tlsutil: Update with version 1
TestEventList - 2019/11/27 02:20:38.095778 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList - 2019/11/27 02:20:38.096107 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestEventList - 2019/11/27 02:20:38.096220 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_ACLFilter - 2019/11/27 02:20:38.216917 [INFO] acl: initializing acls
TestEventList_Blocking - 2019/11/27 02:20:38.234275 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_Blocking - 2019/11/27 02:20:38.234749 [DEBUG] consul: Skipping self join check for "Node 760f254e-8889-d814-8d1d-174ac4699a23" since the cluster is too small
TestEventList_Blocking - 2019/11/27 02:20:38.234915 [INFO] consul: member 'Node 760f254e-8889-d814-8d1d-174ac4699a23' joined, marking health alive
TestEventList_Blocking - 2019/11/27 02:20:38.437124 [DEBUG] consul: User event: test
TestEventList_Blocking - 2019/11/27 02:20:38.437330 [DEBUG] agent: new event: test (2a4ffdf9-49ed-06f7-1bc7-e342222d5b05)
TestEventList_Blocking - 2019/11/27 02:20:38.487958 [DEBUG] consul: User event: second
TestEventList_Blocking - 2019/11/27 02:20:38.488183 [DEBUG] agent: new event: second (51da6aa1-c75d-0d80-1c51-986bf0d02c3a)
TestEventList_Blocking - 2019/11/27 02:20:38.488653 [INFO] agent: Requesting shutdown
TestEventList_Blocking - 2019/11/27 02:20:38.488739 [INFO] consul: shutting down server
TestEventList_Blocking - 2019/11/27 02:20:38.488852 [WARN] serf: Shutdown without a Leave
TestEventList_ACLFilter - 2019/11/27 02:20:38.534733 [INFO] consul: Created ACL 'global-management' policy
TestEventList_ACLFilter - 2019/11/27 02:20:38.534834 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventList_ACLFilter - 2019/11/27 02:20:38.536928 [INFO] consul: Created ACL 'global-management' policy
TestEventList_ACLFilter - 2019/11/27 02:20:38.537025 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventList_Blocking - 2019/11/27 02:20:38.634334 [WARN] serf: Shutdown without a Leave
2019/11/27 02:20:38 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:38 [INFO]  raft: Node at 127.0.0.1:11854 [Leader] entering Leader state
TestEventList_Filter - 2019/11/27 02:20:38.769129 [INFO] consul: cluster leadership acquired
TestEventList_Filter - 2019/11/27 02:20:38.769594 [INFO] consul: New leader elected: Node aa46d014-19e2-ac39-54eb-2d8d003bc7e1
TestEventList_Blocking - 2019/11/27 02:20:38.772830 [INFO] manager: shutting down
TestEventList_Blocking - 2019/11/27 02:20:38.773333 [INFO] agent: consul server down
TestEventList_Blocking - 2019/11/27 02:20:38.773395 [INFO] agent: shutdown complete
TestEventList_Blocking - 2019/11/27 02:20:38.773457 [INFO] agent: Stopping DNS server 127.0.0.1:11837 (tcp)
TestEventList_Blocking - 2019/11/27 02:20:38.773637 [INFO] agent: Stopping DNS server 127.0.0.1:11837 (udp)
TestEventList_Blocking - 2019/11/27 02:20:38.774021 [INFO] agent: Stopping HTTP server 127.0.0.1:11838 (tcp)
TestEventList_Blocking - 2019/11/27 02:20:38.774265 [INFO] agent: Waiting for endpoints to shut down
TestEventList_Blocking - 2019/11/27 02:20:38.774345 [INFO] agent: Endpoints down
--- PASS: TestEventList_Blocking (5.33s)
=== CONT  TestEventFire_token
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestEventFire_token - 2019/11/27 02:20:38.849683 [WARN] agent: Node name "Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventFire_token - 2019/11/27 02:20:38.850245 [DEBUG] tlsutil: Update with version 1
TestEventFire_token - 2019/11/27 02:20:38.850318 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire_token - 2019/11/27 02:20:38.850708 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestEventFire_token - 2019/11/27 02:20:38.850862 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_ACLFilter - 2019/11/27 02:20:39.124311 [INFO] consul: Bootstrapped ACL master token from configuration
TestEventList_ACLFilter - 2019/11/27 02:20:39.127079 [INFO] consul: Bootstrapped ACL master token from configuration
2019/11/27 02:20:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eee43763-2298-a63b-851e-901f333301e6 Address:127.0.0.1:11860}]
TestEventList - 2019/11/27 02:20:39.251558 [INFO] serf: EventMemberJoin: Node eee43763-2298-a63b-851e-901f333301e6.dc1 127.0.0.1
TestEventList_Filter - 2019/11/27 02:20:39.252880 [INFO] agent: Synced node info
2019/11/27 02:20:39 [INFO]  raft: Node at 127.0.0.1:11860 [Follower] entering Follower state (Leader: "")
TestEventList - 2019/11/27 02:20:39.268280 [INFO] serf: EventMemberJoin: Node eee43763-2298-a63b-851e-901f333301e6 127.0.0.1
TestEventList - 2019/11/27 02:20:39.269888 [INFO] consul: Adding LAN server Node eee43763-2298-a63b-851e-901f333301e6 (Addr: tcp/127.0.0.1:11860) (DC: dc1)
TestEventList - 2019/11/27 02:20:39.270584 [INFO] consul: Handled member-join event for server "Node eee43763-2298-a63b-851e-901f333301e6.dc1" in area "wan"
TestEventList - 2019/11/27 02:20:39.275551 [INFO] agent: Started DNS server 127.0.0.1:11855 (tcp)
TestEventList - 2019/11/27 02:20:39.275945 [INFO] agent: Started DNS server 127.0.0.1:11855 (udp)
TestEventList - 2019/11/27 02:20:39.278165 [INFO] agent: Started HTTP server on 127.0.0.1:11856 (tcp)
TestEventList - 2019/11/27 02:20:39.278282 [INFO] agent: started state syncer
2019/11/27 02:20:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:39 [INFO]  raft: Node at 127.0.0.1:11860 [Candidate] entering Candidate state in term 2
TestEventList_ACLFilter - 2019/11/27 02:20:39.415791 [INFO] consul: Created ACL anonymous token from configuration
TestEventList_ACLFilter - 2019/11/27 02:20:39.416831 [INFO] serf: EventMemberUpdate: Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9
TestEventList_ACLFilter - 2019/11/27 02:20:39.417683 [INFO] serf: EventMemberUpdate: Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9.dc1
TestEventList_ACLFilter - 2019/11/27 02:20:39.871278 [INFO] consul: Created ACL anonymous token from configuration
TestEventList_ACLFilter - 2019/11/27 02:20:39.871513 [DEBUG] acl: transitioning out of legacy ACL mode
TestEventList_ACLFilter - 2019/11/27 02:20:39.872666 [INFO] serf: EventMemberUpdate: Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9
TestEventList_ACLFilter - 2019/11/27 02:20:39.874088 [INFO] serf: EventMemberUpdate: Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9.dc1
2019/11/27 02:20:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:40 [INFO]  raft: Node at 127.0.0.1:11860 [Leader] entering Leader state
TestEventList - 2019/11/27 02:20:40.079580 [INFO] consul: cluster leadership acquired
TestEventList - 2019/11/27 02:20:40.081085 [INFO] consul: New leader elected: Node eee43763-2298-a63b-851e-901f333301e6
2019/11/27 02:20:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8615ab6e-7097-e19a-2aa4-e41d0ed55c66 Address:127.0.0.1:11866}]
2019/11/27 02:20:40 [INFO]  raft: Node at 127.0.0.1:11866 [Follower] entering Follower state (Leader: "")
TestEventFire_token - 2019/11/27 02:20:40.104647 [INFO] serf: EventMemberJoin: Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66.dc1 127.0.0.1
TestEventFire_token - 2019/11/27 02:20:40.123649 [INFO] serf: EventMemberJoin: Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66 127.0.0.1
TestEventFire_token - 2019/11/27 02:20:40.133107 [INFO] agent: Started DNS server 127.0.0.1:11861 (udp)
2019/11/27 02:20:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:40 [INFO]  raft: Node at 127.0.0.1:11866 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/11/27 02:20:40.142671 [INFO] consul: Adding LAN server Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66 (Addr: tcp/127.0.0.1:11866) (DC: dc1)
TestEventFire_token - 2019/11/27 02:20:40.172980 [INFO] agent: Started DNS server 127.0.0.1:11861 (tcp)
TestEventFire_token - 2019/11/27 02:20:40.180843 [INFO] consul: Handled member-join event for server "Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66.dc1" in area "wan"
TestEventFire_token - 2019/11/27 02:20:40.188654 [INFO] agent: Started HTTP server on 127.0.0.1:11862 (tcp)
TestEventFire_token - 2019/11/27 02:20:40.188922 [INFO] agent: started state syncer
TestEventList - 2019/11/27 02:20:40.689750 [INFO] agent: Synced node info
TestEventList - 2019/11/27 02:20:40.689871 [DEBUG] agent: Node info in sync
TestEventList_Filter - 2019/11/27 02:20:40.789782 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_Filter - 2019/11/27 02:20:40.790254 [DEBUG] consul: Skipping self join check for "Node aa46d014-19e2-ac39-54eb-2d8d003bc7e1" since the cluster is too small
TestEventList_Filter - 2019/11/27 02:20:40.790443 [INFO] consul: member 'Node aa46d014-19e2-ac39-54eb-2d8d003bc7e1' joined, marking health alive
TestEventList_ACLFilter - 2019/11/27 02:20:40.916391 [INFO] agent: Synced node info
TestEventList_ACLFilter - 2019/11/27 02:20:40.916540 [DEBUG] agent: Node info in sync
TestEventList_ACLFilter - 2019/11/27 02:20:40.928129 [DEBUG] consul: dropping node "Node 48acd3a6-76b9-8f22-0885-ae7f4c6a0ee9" from result due to ACLs
=== RUN   TestEventList_ACLFilter/no_token
TestEventList_ACLFilter - 2019/11/27 02:20:40.928908 [DEBUG] consul: User event: foo
TestEventList_ACLFilter - 2019/11/27 02:20:40.929092 [DEBUG] agent: new event: foo (57d6cf9b-ada4-3df1-9299-f6a2237cc3af)
TestEventList_ACLFilter - 2019/11/27 02:20:40.929283 [DEBUG] agent: dropping event "foo" from result due to ACLs
=== RUN   TestEventList_ACLFilter/root_token
TestEventList_ACLFilter - 2019/11/27 02:20:40.930238 [INFO] agent: Requesting shutdown
TestEventList_ACLFilter - 2019/11/27 02:20:40.930323 [INFO] consul: shutting down server
TestEventList_ACLFilter - 2019/11/27 02:20:40.930374 [WARN] serf: Shutdown without a Leave
TestEventList_Filter - 2019/11/27 02:20:41.007830 [DEBUG] consul: User event: test
TestEventList_Filter - 2019/11/27 02:20:41.007926 [DEBUG] consul: User event: foo
TestEventList_Filter - 2019/11/27 02:20:41.008061 [DEBUG] agent: new event: test (d4fb4f43-db7a-f061-083b-fd80c331a87d)
TestEventList_Filter - 2019/11/27 02:20:41.008159 [DEBUG] agent: new event: foo (7daaf3b7-fa73-ece9-81c7-7881054e64c3)
TestEventList_Filter - 2019/11/27 02:20:41.033128 [INFO] agent: Requesting shutdown
TestEventList_Filter - 2019/11/27 02:20:41.033248 [INFO] consul: shutting down server
TestEventList_Filter - 2019/11/27 02:20:41.033301 [WARN] serf: Shutdown without a Leave
2019/11/27 02:20:41 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:41 [INFO]  raft: Node at 127.0.0.1:11866 [Leader] entering Leader state
TestEventList_ACLFilter - 2019/11/27 02:20:41.056152 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/11/27 02:20:41.058027 [INFO] consul: cluster leadership acquired
TestEventFire_token - 2019/11/27 02:20:41.058451 [INFO] consul: New leader elected: Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66
TestEventFire_token - 2019/11/27 02:20:41.082428 [ERR] agent: failed to sync remote state: ACL not found
TestEventList_Filter - 2019/11/27 02:20:41.177768 [WARN] serf: Shutdown without a Leave
TestEventList_ACLFilter - 2019/11/27 02:20:41.178482 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList_ACLFilter - 2019/11/27 02:20:41.182636 [INFO] manager: shutting down
TestEventList_ACLFilter - 2019/11/27 02:20:41.183011 [INFO] agent: consul server down
TestEventList_ACLFilter - 2019/11/27 02:20:41.188411 [INFO] agent: shutdown complete
TestEventList_ACLFilter - 2019/11/27 02:20:41.188472 [INFO] agent: Stopping DNS server 127.0.0.1:11843 (tcp)
TestEventList_ACLFilter - 2019/11/27 02:20:41.188679 [INFO] agent: Stopping DNS server 127.0.0.1:11843 (udp)
TestEventList_ACLFilter - 2019/11/27 02:20:41.188845 [INFO] agent: Stopping HTTP server 127.0.0.1:11844 (tcp)
TestEventList_ACLFilter - 2019/11/27 02:20:41.189077 [INFO] agent: Waiting for endpoints to shut down
TestEventList_ACLFilter - 2019/11/27 02:20:41.189142 [INFO] agent: Endpoints down
--- PASS: TestEventList_ACLFilter (6.11s)
    --- PASS: TestEventList_ACLFilter/no_token (0.00s)
    --- PASS: TestEventList_ACLFilter/root_token (0.00s)
=== CONT  TestEventFire
WARNING: bootstrap = true: do not enable unless necessary
TestEventFire - 2019/11/27 02:20:41.253493 [WARN] agent: Node name "Node ae7d515f-9b81-ec5b-5aef-af3ea1c60ec6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventFire - 2019/11/27 02:20:41.253889 [DEBUG] tlsutil: Update with version 1
TestEventFire - 2019/11/27 02:20:41.253958 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventFire - 2019/11/27 02:20:41.254119 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestEventFire - 2019/11/27 02:20:41.254231 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList_Filter - 2019/11/27 02:20:41.300080 [INFO] manager: shutting down
TestEventList_Filter - 2019/11/27 02:20:41.300452 [INFO] agent: consul server down
TestEventList_Filter - 2019/11/27 02:20:41.300522 [INFO] agent: shutdown complete
TestEventList_Filter - 2019/11/27 02:20:41.300580 [INFO] agent: Stopping DNS server 127.0.0.1:11849 (tcp)
TestEventList_Filter - 2019/11/27 02:20:41.300773 [INFO] agent: Stopping DNS server 127.0.0.1:11849 (udp)
TestEventList_Filter - 2019/11/27 02:20:41.300969 [INFO] agent: Stopping HTTP server 127.0.0.1:11850 (tcp)
TestEventList_Filter - 2019/11/27 02:20:41.301196 [INFO] agent: Waiting for endpoints to shut down
TestEventList_Filter - 2019/11/27 02:20:41.301278 [INFO] agent: Endpoints down
--- PASS: TestEventList_Filter (4.79s)
=== CONT  TestDNS_Compression_Recurse
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Compression_Recurse - 2019/11/27 02:20:41.375282 [WARN] agent: Node name "Node 2efb3a1e-8ce2-7c7b-a37a-76c4bd41d5e3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Compression_Recurse - 2019/11/27 02:20:41.375687 [DEBUG] tlsutil: Update with version 1
TestDNS_Compression_Recurse - 2019/11/27 02:20:41.375757 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Compression_Recurse - 2019/11/27 02:20:41.375973 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_Compression_Recurse - 2019/11/27 02:20:41.376920 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventList - 2019/11/27 02:20:41.573506 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/11/27 02:20:41.707625 [INFO] acl: initializing acls
TestEventFire_token - 2019/11/27 02:20:41.807419 [INFO] acl: initializing acls
TestEventList_ACLFilter - 2019/11/27 02:20:41.876927 [ERR] autopilot: Error updating cluster health: error getting Raft configuration raft is already shutdown
jones - 2019/11/27 02:20:42.018767 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:20:42.018876 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/11/27 02:20:42.680970 [ERR] agent: failed to sync remote state: ACL not found
TestEventList - 2019/11/27 02:20:42.883622 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventList - 2019/11/27 02:20:42.884049 [DEBUG] consul: Skipping self join check for "Node eee43763-2298-a63b-851e-901f333301e6" since the cluster is too small
TestEventList - 2019/11/27 02:20:42.884276 [INFO] consul: member 'Node eee43763-2298-a63b-851e-901f333301e6' joined, marking health alive
TestEventFire_token - 2019/11/27 02:20:43.000779 [INFO] consul: Created ACL 'global-management' policy
TestEventFire_token - 2019/11/27 02:20:43.000895 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventFire_token - 2019/11/27 02:20:43.001205 [INFO] consul: Created ACL 'global-management' policy
TestEventFire_token - 2019/11/27 02:20:43.001283 [WARN] consul: Configuring a non-UUID master token is deprecated
TestEventList - 2019/11/27 02:20:43.126131 [DEBUG] consul: User event: test
TestEventList - 2019/11/27 02:20:43.126323 [DEBUG] agent: new event: test (ac30629e-e3a1-67db-9dd3-d410f0ce9ea2)
TestEventList - 2019/11/27 02:20:43.151562 [INFO] agent: Requesting shutdown
TestEventList - 2019/11/27 02:20:43.151663 [INFO] consul: shutting down server
TestEventList - 2019/11/27 02:20:43.151783 [WARN] serf: Shutdown without a Leave
TestEventList - 2019/11/27 02:20:43.335658 [WARN] serf: Shutdown without a Leave
TestEventList - 2019/11/27 02:20:43.436615 [INFO] manager: shutting down
TestEventList - 2019/11/27 02:20:43.437511 [INFO] agent: consul server down
TestEventList - 2019/11/27 02:20:43.437580 [INFO] agent: shutdown complete
TestEventList - 2019/11/27 02:20:43.437647 [INFO] agent: Stopping DNS server 127.0.0.1:11855 (tcp)
TestEventList - 2019/11/27 02:20:43.437830 [INFO] agent: Stopping DNS server 127.0.0.1:11855 (udp)
TestEventList - 2019/11/27 02:20:43.438018 [INFO] agent: Stopping HTTP server 127.0.0.1:11856 (tcp)
TestEventList - 2019/11/27 02:20:43.438264 [INFO] agent: Waiting for endpoints to shut down
TestEventList - 2019/11/27 02:20:43.438351 [INFO] agent: Endpoints down
--- PASS: TestEventList (5.44s)
=== CONT  TestDNS_Compression_ReverseLookup
2019/11/27 02:20:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ae7d515f-9b81-ec5b-5aef-af3ea1c60ec6 Address:127.0.0.1:11872}]
2019/11/27 02:20:43 [INFO]  raft: Node at 127.0.0.1:11872 [Follower] entering Follower state (Leader: "")
TestEventFire - 2019/11/27 02:20:43.444534 [INFO] serf: EventMemberJoin: Node ae7d515f-9b81-ec5b-5aef-af3ea1c60ec6.dc1 127.0.0.1
TestEventFire - 2019/11/27 02:20:43.449672 [INFO] serf: EventMemberJoin: Node ae7d515f-9b81-ec5b-5aef-af3ea1c60ec6 127.0.0.1
TestEventFire - 2019/11/27 02:20:43.450499 [INFO] consul: Handled member-join event for server "Node ae7d515f-9b81-ec5b-5aef-af3ea1c60ec6.dc1" in area "wan"
TestEventFire - 2019/11/27 02:20:43.450810 [INFO] consul: Adding LAN server Node ae7d515f-9b81-ec5b-5aef-af3ea1c60ec6 (Addr: tcp/127.0.0.1:11872) (DC: dc1)
TestEventFire - 2019/11/27 02:20:43.451357 [INFO] agent: Started DNS server 127.0.0.1:11867 (tcp)
TestEventFire - 2019/11/27 02:20:43.451782 [INFO] agent: Started DNS server 127.0.0.1:11867 (udp)
TestEventFire - 2019/11/27 02:20:43.454082 [INFO] agent: Started HTTP server on 127.0.0.1:11868 (tcp)
TestEventFire - 2019/11/27 02:20:43.454182 [INFO] agent: started state syncer
2019/11/27 02:20:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:43 [INFO]  raft: Node at 127.0.0.1:11872 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:43.527256 [WARN] agent: Node name "Node 1432dd6e-0915-3c58-3a41-ff38152fa68b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:43.527666 [DEBUG] tlsutil: Update with version 1
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:43.527737 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:43.527907 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:43.528021 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:43 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2efb3a1e-8ce2-7c7b-a37a-76c4bd41d5e3 Address:127.0.0.1:11878}]
2019/11/27 02:20:43 [INFO]  raft: Node at 127.0.0.1:11878 [Follower] entering Follower state (Leader: "")
TestDNS_Compression_Recurse - 2019/11/27 02:20:43.661458 [INFO] serf: EventMemberJoin: Node 2efb3a1e-8ce2-7c7b-a37a-76c4bd41d5e3.dc1 127.0.0.1
TestDNS_Compression_Recurse - 2019/11/27 02:20:43.668211 [INFO] serf: EventMemberJoin: Node 2efb3a1e-8ce2-7c7b-a37a-76c4bd41d5e3 127.0.0.1
TestDNS_Compression_Recurse - 2019/11/27 02:20:43.669836 [INFO] agent: Started DNS server 127.0.0.1:11873 (udp)
TestDNS_Compression_Recurse - 2019/11/27 02:20:43.670574 [INFO] consul: Adding LAN server Node 2efb3a1e-8ce2-7c7b-a37a-76c4bd41d5e3 (Addr: tcp/127.0.0.1:11878) (DC: dc1)
TestDNS_Compression_Recurse - 2019/11/27 02:20:43.670974 [INFO] consul: Handled member-join event for server "Node 2efb3a1e-8ce2-7c7b-a37a-76c4bd41d5e3.dc1" in area "wan"
TestDNS_Compression_Recurse - 2019/11/27 02:20:43.671566 [INFO] agent: Started DNS server 127.0.0.1:11873 (tcp)
TestDNS_Compression_Recurse - 2019/11/27 02:20:43.680596 [INFO] agent: Started HTTP server on 127.0.0.1:11874 (tcp)
TestDNS_Compression_Recurse - 2019/11/27 02:20:43.680861 [INFO] agent: started state syncer
2019/11/27 02:20:43 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:43 [INFO]  raft: Node at 127.0.0.1:11878 [Candidate] entering Candidate state in term 2
TestEventFire_token - 2019/11/27 02:20:43.758087 [INFO] consul: Bootstrapped ACL master token from configuration
TestEventFire_token - 2019/11/27 02:20:43.762608 [INFO] consul: Bootstrapped ACL master token from configuration
TestEventList_ACLFilter - 2019/11/27 02:20:43.876932 [ERR] autopilot: Error updating cluster health: error getting Raft configuration raft is already shutdown
TestEventFire_token - 2019/11/27 02:20:44.228577 [INFO] consul: Created ACL anonymous token from configuration
TestEventFire_token - 2019/11/27 02:20:44.230287 [INFO] serf: EventMemberUpdate: Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66
TestEventFire_token - 2019/11/27 02:20:44.231192 [INFO] serf: EventMemberUpdate: Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66.dc1
TestEventFire_token - 2019/11/27 02:20:44.232034 [INFO] consul: Created ACL anonymous token from configuration
TestEventFire_token - 2019/11/27 02:20:44.232104 [DEBUG] acl: transitioning out of legacy ACL mode
TestEventFire_token - 2019/11/27 02:20:44.239367 [INFO] serf: EventMemberUpdate: Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66
TestEventFire_token - 2019/11/27 02:20:44.240997 [INFO] serf: EventMemberUpdate: Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66.dc1
2019/11/27 02:20:44 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:44 [INFO]  raft: Node at 127.0.0.1:11872 [Leader] entering Leader state
TestEventFire - 2019/11/27 02:20:44.242083 [INFO] consul: cluster leadership acquired
TestEventFire - 2019/11/27 02:20:44.242568 [INFO] consul: New leader elected: Node ae7d515f-9b81-ec5b-5aef-af3ea1c60ec6
2019/11/27 02:20:44 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:44 [INFO]  raft: Node at 127.0.0.1:11878 [Leader] entering Leader state
TestDNS_Compression_Recurse - 2019/11/27 02:20:44.535992 [INFO] consul: cluster leadership acquired
TestDNS_Compression_Recurse - 2019/11/27 02:20:44.536444 [INFO] consul: New leader elected: Node 2efb3a1e-8ce2-7c7b-a37a-76c4bd41d5e3
2019/11/27 02:20:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1432dd6e-0915-3c58-3a41-ff38152fa68b Address:127.0.0.1:11884}]
TestEventFire - 2019/11/27 02:20:44.745167 [INFO] agent: Synced node info
2019/11/27 02:20:44 [INFO]  raft: Node at 127.0.0.1:11884 [Follower] entering Follower state (Leader: "")
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:44.748116 [INFO] serf: EventMemberJoin: Node 1432dd6e-0915-3c58-3a41-ff38152fa68b.dc1 127.0.0.1
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:44.754289 [INFO] serf: EventMemberJoin: Node 1432dd6e-0915-3c58-3a41-ff38152fa68b 127.0.0.1
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:44.754845 [INFO] consul: Handled member-join event for server "Node 1432dd6e-0915-3c58-3a41-ff38152fa68b.dc1" in area "wan"
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:44.755131 [INFO] consul: Adding LAN server Node 1432dd6e-0915-3c58-3a41-ff38152fa68b (Addr: tcp/127.0.0.1:11884) (DC: dc1)
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:44.755398 [INFO] agent: Started DNS server 127.0.0.1:11879 (tcp)
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:44.755472 [INFO] agent: Started DNS server 127.0.0.1:11879 (udp)
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:44.757511 [INFO] agent: Started HTTP server on 127.0.0.1:11880 (tcp)
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:44.757610 [INFO] agent: started state syncer
2019/11/27 02:20:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:44 [INFO]  raft: Node at 127.0.0.1:11884 [Candidate] entering Candidate state in term 2
TestDNS_Compression_Recurse - 2019/11/27 02:20:45.014936 [INFO] agent: Synced node info
TestDNS_Compression_Recurse - 2019/11/27 02:20:45.015052 [DEBUG] agent: Node info in sync
TestEventFire - 2019/11/27 02:20:46.109356 [DEBUG] agent: Node info in sync
TestEventFire - 2019/11/27 02:20:46.109469 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/11/27 02:20:46.148562 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventFire_token - 2019/11/27 02:20:46.149020 [DEBUG] consul: Skipping self join check for "Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66" since the cluster is too small
TestEventFire_token - 2019/11/27 02:20:46.149126 [INFO] consul: member 'Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66' joined, marking health alive
2019/11/27 02:20:46 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:46 [INFO]  raft: Node at 127.0.0.1:11884 [Leader] entering Leader state
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:46.294988 [INFO] consul: cluster leadership acquired
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:46.295498 [INFO] consul: New leader elected: Node 1432dd6e-0915-3c58-3a41-ff38152fa68b
TestEventFire_token - 2019/11/27 02:20:46.441555 [DEBUG] consul: Skipping self join check for "Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66" since the cluster is too small
TestEventFire_token - 2019/11/27 02:20:46.442311 [DEBUG] consul: Skipping self join check for "Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66" since the cluster is too small
TestEventFire_token - 2019/11/27 02:20:46.452567 [DEBUG] consul: dropping node "Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66" from result due to ACLs
TestDNS_Compression_Recurse - 2019/11/27 02:20:46.804720 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:20:47.089515 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:20:47.089588 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/11/27 02:20:48.179369 [WARN] consul: user event "foo" blocked by ACLs
TestEventFire_token - 2019/11/27 02:20:48.179982 [WARN] consul: user event "bar" blocked by ACLs
TestEventFire_token - 2019/11/27 02:20:48.180716 [INFO] agent: Requesting shutdown
TestEventFire_token - 2019/11/27 02:20:48.180801 [INFO] consul: shutting down server
TestEventFire_token - 2019/11/27 02:20:48.180855 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/11/27 02:20:48.184584 [DEBUG] consul: User event: baz
TestEventFire_token - 2019/11/27 02:20:48.188053 [DEBUG] agent: new event: baz (c0bc9914-c441-6a25-bc5a-1c3ad20622db)
TestEventFire_token - 2019/11/27 02:20:48.189234 [WARN] consul: error getting server health from "Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66": rpc error making call: EOF
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:48.333616 [INFO] agent: Synced node info
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:48.333751 [DEBUG] agent: Node info in sync
TestEventFire_token - 2019/11/27 02:20:48.336522 [WARN] serf: Shutdown without a Leave
TestEventFire - 2019/11/27 02:20:48.444877 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventFire - 2019/11/27 02:20:48.445302 [DEBUG] consul: Skipping self join check for "Node ae7d515f-9b81-ec5b-5aef-af3ea1c60ec6" since the cluster is too small
TestEventFire - 2019/11/27 02:20:48.445469 [INFO] consul: member 'Node ae7d515f-9b81-ec5b-5aef-af3ea1c60ec6' joined, marking health alive
TestEventFire_token - 2019/11/27 02:20:48.556071 [INFO] manager: shutting down
TestEventFire_token - 2019/11/27 02:20:48.556614 [INFO] agent: consul server down
TestEventFire_token - 2019/11/27 02:20:48.556731 [INFO] agent: shutdown complete
TestEventFire_token - 2019/11/27 02:20:48.556898 [INFO] agent: Stopping DNS server 127.0.0.1:11861 (tcp)
TestEventFire_token - 2019/11/27 02:20:48.557094 [INFO] agent: Stopping DNS server 127.0.0.1:11861 (udp)
TestEventFire_token - 2019/11/27 02:20:48.557264 [INFO] agent: Stopping HTTP server 127.0.0.1:11862 (tcp)
TestEventFire_token - 2019/11/27 02:20:48.557546 [INFO] agent: Waiting for endpoints to shut down
TestEventFire_token - 2019/11/27 02:20:48.557641 [INFO] agent: Endpoints down
--- PASS: TestEventFire_token (9.78s)
=== CONT  TestDNS_Compression_Query
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Compression_Query - 2019/11/27 02:20:48.635303 [WARN] agent: Node name "Node 70e9d50a-0b36-02f7-2ac2-038666bc8fad" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Compression_Query - 2019/11/27 02:20:48.635915 [DEBUG] tlsutil: Update with version 1
TestDNS_Compression_Query - 2019/11/27 02:20:48.636198 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Compression_Query - 2019/11/27 02:20:48.636418 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_Compression_Query - 2019/11/27 02:20:48.636553 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Compression_Recurse - 2019/11/27 02:20:48.666896 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_Compression_Recurse - 2019/11/27 02:20:48.667321 [DEBUG] consul: Skipping self join check for "Node 2efb3a1e-8ce2-7c7b-a37a-76c4bd41d5e3" since the cluster is too small
TestDNS_Compression_Recurse - 2019/11/27 02:20:48.667504 [INFO] consul: member 'Node 2efb3a1e-8ce2-7c7b-a37a-76c4bd41d5e3' joined, marking health alive
TestEventFire - 2019/11/27 02:20:48.681152 [INFO] agent: Requesting shutdown
TestEventFire - 2019/11/27 02:20:48.681261 [INFO] consul: shutting down server
TestEventFire - 2019/11/27 02:20:48.681314 [WARN] serf: Shutdown without a Leave
TestEventFire - 2019/11/27 02:20:48.681326 [DEBUG] consul: User event: test
TestEventFire - 2019/11/27 02:20:48.988309 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:49.012413 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (868.699µs) from client 127.0.0.1:59101 (udp)
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:49.013676 [DEBUG] dns: request for {2.0.0.127.in-addr.arpa. 255 1} (560.353µs) from client 127.0.0.1:59101 (udp)
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:49.013876 [INFO] agent: Requesting shutdown
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:49.013968 [INFO] consul: shutting down server
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:49.014055 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Recurse - 2019/11/27 02:20:49.022709 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (1.439719ms) Recursor queried: 127.0.0.1:59713
TestDNS_Compression_Recurse - 2019/11/27 02:20:49.023084 [DEBUG] dns: request for {apple.com. 255 1} (udp) (2.449423ms) from client 127.0.0.1:49209 (udp)
TestDNS_Compression_Recurse - 2019/11/27 02:20:49.024678 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (479.684µs) Recursor queried: 127.0.0.1:59713
TestDNS_Compression_Recurse - 2019/11/27 02:20:49.024985 [DEBUG] dns: request for {apple.com. 255 1} (udp) (1.327715ms) from client 127.0.0.1:49209 (udp)
TestDNS_Compression_Recurse - 2019/11/27 02:20:49.025307 [INFO] agent: Requesting shutdown
TestDNS_Compression_Recurse - 2019/11/27 02:20:49.025405 [INFO] consul: shutting down server
TestDNS_Compression_Recurse - 2019/11/27 02:20:49.025470 [WARN] serf: Shutdown without a Leave
TestEventFire_token - 2019/11/27 02:20:49.183414 [WARN] consul: error getting server health from "Node 8615ab6e-7097-e19a-2aa4-e41d0ed55c66": context deadline exceeded
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:49.648403 [DEBUG] agent: Node info in sync
TestEventFire - 2019/11/27 02:20:49.812976 [INFO] manager: shutting down
TestEventFire - 2019/11/27 02:20:49.813581 [INFO] agent: consul server down
TestEventFire - 2019/11/27 02:20:49.813635 [INFO] agent: shutdown complete
TestEventFire - 2019/11/27 02:20:49.813688 [INFO] agent: Stopping DNS server 127.0.0.1:11867 (tcp)
TestEventFire - 2019/11/27 02:20:49.813834 [INFO] agent: Stopping DNS server 127.0.0.1:11867 (udp)
TestEventFire - 2019/11/27 02:20:49.813988 [INFO] agent: Stopping HTTP server 127.0.0.1:11868 (tcp)
TestEventFire - 2019/11/27 02:20:49.814196 [INFO] agent: Waiting for endpoints to shut down
TestEventFire - 2019/11/27 02:20:49.814277 [INFO] agent: Endpoints down
--- PASS: TestEventFire (8.62s)
=== CONT  TestDNS_Compression_trimUDPResponse
=== CONT  TestDNS_syncExtra
--- PASS: TestDNS_syncExtra (0.00s)
=== CONT  TestDNS_trimUDPResponse_TrimSizeEDNS
--- PASS: TestDNS_Compression_trimUDPResponse (0.05s)
TestEventList_ACLFilter - 2019/11/27 02:20:49.876941 [ERR] autopilot: Error promoting servers: error checking for non-voters to promote: failed to get raft configuration: raft is already shutdown
TestEventList_ACLFilter - 2019/11/27 02:20:49.877090 [ERR] autopilot: Error checking for dead servers to remove: raft is already shutdown
TestDNS_Compression_Recurse - 2019/11/27 02:20:49.922148 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:49.922722 [WARN] serf: Shutdown without a Leave
--- PASS: TestDNS_trimUDPResponse_TrimSizeEDNS (0.06s)
=== CONT  TestDNS_trimUDPResponse_TrimSize
--- PASS: TestDNS_trimUDPResponse_TrimSize (0.05s)
=== CONT  TestDNS_trimUDPResponse_TrimLimit
--- PASS: TestDNS_trimUDPResponse_TrimLimit (0.04s)
=== CONT  TestDNS_trimUDPResponse_NoTrim
=== CONT  TestDNS_PreparedQuery_AgentSource
--- PASS: TestDNS_trimUDPResponse_NoTrim (0.04s)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:50.120978 [WARN] agent: Node name "Node b894dd0f-e077-ac1a-940f-82d2ca283312" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:50.121388 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:50.121457 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:50.121630 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:50.121817 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:50.188358 [INFO] manager: shutting down
TestDNS_Compression_Recurse - 2019/11/27 02:20:50.188376 [INFO] manager: shutting down
TestDNS_Compression_Recurse - 2019/11/27 02:20:50.189239 [INFO] agent: consul server down
TestDNS_Compression_Recurse - 2019/11/27 02:20:50.189308 [INFO] agent: shutdown complete
TestDNS_Compression_Recurse - 2019/11/27 02:20:50.189369 [INFO] agent: Stopping DNS server 127.0.0.1:11873 (tcp)
TestDNS_Compression_Recurse - 2019/11/27 02:20:50.189545 [INFO] agent: Stopping DNS server 127.0.0.1:11873 (udp)
TestDNS_Compression_Recurse - 2019/11/27 02:20:50.189728 [INFO] agent: Stopping HTTP server 127.0.0.1:11874 (tcp)
TestDNS_Compression_Recurse - 2019/11/27 02:20:50.189999 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Compression_Recurse - 2019/11/27 02:20:50.190110 [INFO] agent: Endpoints down
--- PASS: TestDNS_Compression_Recurse (8.89s)
=== CONT  TestDNS_InvalidQueries
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:50.301165 [INFO] agent: consul server down
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:50.301266 [INFO] agent: shutdown complete
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:50.301394 [INFO] agent: Stopping DNS server 127.0.0.1:11879 (tcp)
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:50.301627 [INFO] agent: Stopping DNS server 127.0.0.1:11879 (udp)
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:50.302033 [INFO] agent: Stopping HTTP server 127.0.0.1:11880 (tcp)
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:50.302321 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:50.302419 [INFO] agent: Endpoints down
--- PASS: TestDNS_Compression_ReverseLookup (6.86s)
=== CONT  TestDNS_PreparedQuery_AllowStale
TestDNS_Compression_ReverseLookup - 2019/11/27 02:20:50.331835 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_InvalidQueries - 2019/11/27 02:20:50.358710 [WARN] agent: Node name "Node 527abf9d-4ce6-d45d-ef9e-ea921ca2252c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_InvalidQueries - 2019/11/27 02:20:50.359164 [DEBUG] tlsutil: Update with version 1
TestDNS_InvalidQueries - 2019/11/27 02:20:50.359240 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_InvalidQueries - 2019/11/27 02:20:50.359420 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_InvalidQueries - 2019/11/27 02:20:50.359528 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:50.440238 [WARN] agent: Node name "Node 8ea9e535-e813-f6d4-ea99-dd070557faf5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:50.440982 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:50.441216 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:50.442063 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:50.442367 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:70e9d50a-0b36-02f7-2ac2-038666bc8fad Address:127.0.0.1:11890}]
2019/11/27 02:20:51 [INFO]  raft: Node at 127.0.0.1:11890 [Follower] entering Follower state (Leader: "")
TestDNS_Compression_Query - 2019/11/27 02:20:51.464267 [INFO] serf: EventMemberJoin: Node 70e9d50a-0b36-02f7-2ac2-038666bc8fad.dc1 127.0.0.1
TestDNS_Compression_Query - 2019/11/27 02:20:51.474378 [INFO] serf: EventMemberJoin: Node 70e9d50a-0b36-02f7-2ac2-038666bc8fad 127.0.0.1
TestDNS_Compression_Query - 2019/11/27 02:20:51.476670 [INFO] consul: Adding LAN server Node 70e9d50a-0b36-02f7-2ac2-038666bc8fad (Addr: tcp/127.0.0.1:11890) (DC: dc1)
TestDNS_Compression_Query - 2019/11/27 02:20:51.477371 [INFO] consul: Handled member-join event for server "Node 70e9d50a-0b36-02f7-2ac2-038666bc8fad.dc1" in area "wan"
TestDNS_Compression_Query - 2019/11/27 02:20:51.480494 [INFO] agent: Started DNS server 127.0.0.1:11885 (tcp)
TestDNS_Compression_Query - 2019/11/27 02:20:51.480707 [INFO] agent: Started DNS server 127.0.0.1:11885 (udp)
TestDNS_Compression_Query - 2019/11/27 02:20:51.485620 [INFO] agent: Started HTTP server on 127.0.0.1:11886 (tcp)
TestDNS_Compression_Query - 2019/11/27 02:20:51.488650 [INFO] agent: started state syncer
2019/11/27 02:20:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:51 [INFO]  raft: Node at 127.0.0.1:11890 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b894dd0f-e077-ac1a-940f-82d2ca283312 Address:127.0.0.1:11896}]
2019/11/27 02:20:51 [INFO]  raft: Node at 127.0.0.1:11896 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:51.848420 [INFO] serf: EventMemberJoin: Node b894dd0f-e077-ac1a-940f-82d2ca283312.dc1 127.0.0.1
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:51.852622 [INFO] serf: EventMemberJoin: Node b894dd0f-e077-ac1a-940f-82d2ca283312 127.0.0.1
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:51.853498 [INFO] consul: Adding LAN server Node b894dd0f-e077-ac1a-940f-82d2ca283312 (Addr: tcp/127.0.0.1:11896) (DC: dc1)
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:51.854004 [INFO] consul: Handled member-join event for server "Node b894dd0f-e077-ac1a-940f-82d2ca283312.dc1" in area "wan"
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:51.855173 [INFO] agent: Started DNS server 127.0.0.1:11891 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:51.855631 [INFO] agent: Started DNS server 127.0.0.1:11891 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:51.857836 [INFO] agent: Started HTTP server on 127.0.0.1:11892 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:51.857922 [INFO] agent: started state syncer
2019/11/27 02:20:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:51 [INFO]  raft: Node at 127.0.0.1:11896 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:527abf9d-4ce6-d45d-ef9e-ea921ca2252c Address:127.0.0.1:11902}]
2019/11/27 02:20:52 [INFO]  raft: Node at 127.0.0.1:11902 [Follower] entering Follower state (Leader: "")
TestDNS_InvalidQueries - 2019/11/27 02:20:52.228064 [INFO] serf: EventMemberJoin: Node 527abf9d-4ce6-d45d-ef9e-ea921ca2252c.dc1 127.0.0.1
2019/11/27 02:20:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8ea9e535-e813-f6d4-ea99-dd070557faf5 Address:127.0.0.1:11908}]
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:52.234601 [INFO] serf: EventMemberJoin: Node 8ea9e535-e813-f6d4-ea99-dd070557faf5.dc1 127.0.0.1
2019/11/27 02:20:52 [INFO]  raft: Node at 127.0.0.1:11908 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:52.249685 [INFO] serf: EventMemberJoin: Node 8ea9e535-e813-f6d4-ea99-dd070557faf5 127.0.0.1
TestDNS_InvalidQueries - 2019/11/27 02:20:52.250786 [INFO] serf: EventMemberJoin: Node 527abf9d-4ce6-d45d-ef9e-ea921ca2252c 127.0.0.1
TestDNS_InvalidQueries - 2019/11/27 02:20:52.251437 [INFO] consul: Adding LAN server Node 527abf9d-4ce6-d45d-ef9e-ea921ca2252c (Addr: tcp/127.0.0.1:11902) (DC: dc1)
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:52.252219 [INFO] consul: Adding LAN server Node 8ea9e535-e813-f6d4-ea99-dd070557faf5 (Addr: tcp/127.0.0.1:11908) (DC: dc1)
TestDNS_InvalidQueries - 2019/11/27 02:20:52.252962 [INFO] consul: Handled member-join event for server "Node 527abf9d-4ce6-d45d-ef9e-ea921ca2252c.dc1" in area "wan"
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:52.253660 [INFO] consul: Handled member-join event for server "Node 8ea9e535-e813-f6d4-ea99-dd070557faf5.dc1" in area "wan"
TestDNS_InvalidQueries - 2019/11/27 02:20:52.256468 [INFO] agent: Started DNS server 127.0.0.1:11897 (tcp)
TestDNS_InvalidQueries - 2019/11/27 02:20:52.256559 [INFO] agent: Started DNS server 127.0.0.1:11897 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:52.257705 [INFO] agent: Started DNS server 127.0.0.1:11903 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:52.258076 [INFO] agent: Started DNS server 127.0.0.1:11903 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:52.260106 [INFO] agent: Started HTTP server on 127.0.0.1:11904 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:52.260417 [INFO] agent: started state syncer
2019/11/27 02:20:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:52 [INFO]  raft: Node at 127.0.0.1:11902 [Candidate] entering Candidate state in term 2
TestDNS_InvalidQueries - 2019/11/27 02:20:52.274151 [INFO] agent: Started HTTP server on 127.0.0.1:11898 (tcp)
TestDNS_InvalidQueries - 2019/11/27 02:20:52.274275 [INFO] agent: started state syncer
2019/11/27 02:20:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:52 [INFO]  raft: Node at 127.0.0.1:11908 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:52 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:52 [INFO]  raft: Node at 127.0.0.1:11890 [Leader] entering Leader state
TestDNS_Compression_Query - 2019/11/27 02:20:52.501331 [INFO] consul: cluster leadership acquired
TestDNS_Compression_Query - 2019/11/27 02:20:52.501914 [INFO] consul: New leader elected: Node 70e9d50a-0b36-02f7-2ac2-038666bc8fad
jones - 2019/11/27 02:20:52.748178 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:20:52.748268 [DEBUG] agent: Node info in sync
2019/11/27 02:20:52 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:52 [INFO]  raft: Node at 127.0.0.1:11896 [Leader] entering Leader state
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:52.791338 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:52.791889 [INFO] consul: New leader elected: Node b894dd0f-e077-ac1a-940f-82d2ca283312
TestDNS_Compression_Query - 2019/11/27 02:20:53.089054 [INFO] agent: Synced node info
TestDNS_Compression_Query - 2019/11/27 02:20:53.089188 [DEBUG] agent: Node info in sync
2019/11/27 02:20:53 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:53 [INFO]  raft: Node at 127.0.0.1:11902 [Leader] entering Leader state
TestDNS_InvalidQueries - 2019/11/27 02:20:53.221793 [INFO] consul: cluster leadership acquired
TestDNS_InvalidQueries - 2019/11/27 02:20:53.222236 [INFO] consul: New leader elected: Node 527abf9d-4ce6-d45d-ef9e-ea921ca2252c
2019/11/27 02:20:53 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:53 [INFO]  raft: Node at 127.0.0.1:11908 [Leader] entering Leader state
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:53.470828 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:53.471273 [INFO] consul: New leader elected: Node 8ea9e535-e813-f6d4-ea99-dd070557faf5
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.590514 [INFO] agent: Synced node info
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.604227 [WARN] consul: endpoint injected; this should only be used for testing
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.605925 [DEBUG] dns: request for name foo.query.consul. type SRV class IN (took 450.016µs) from client 127.0.0.1:58754 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.606001 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.606063 [INFO] consul: shutting down server
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.606112 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.777016 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.932525 [INFO] manager: shutting down
TestDNS_InvalidQueries - 2019/11/27 02:20:53.933388 [INFO] agent: Synced node info
TestDNS_InvalidQueries - 2019/11/27 02:20:53.933507 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.936034 [INFO] agent: consul server down
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.936105 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.936162 [INFO] agent: Stopping DNS server 127.0.0.1:11891 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.936314 [INFO] agent: Stopping DNS server 127.0.0.1:11891 (udp)
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.936497 [INFO] agent: Stopping HTTP server 127.0.0.1:11892 (tcp)
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.936794 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.936882 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_AgentSource (3.88s)
=== CONT  TestDNS_NonExistingLookupEmptyAorAAAA
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.938114 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_PreparedQuery_AgentSource - 2019/11/27 02:20:53.938520 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_InvalidQueries - 2019/11/27 02:20:53.939166 [WARN] dns: QName invalid: 
TestDNS_InvalidQueries - 2019/11/27 02:20:53.939578 [DEBUG] dns: request for name consul. type SRV class IN (took 362.013µs) from client 127.0.0.1:51376 (udp)
TestDNS_InvalidQueries - 2019/11/27 02:20:53.940274 [WARN] dns: QName invalid: node.
TestDNS_InvalidQueries - 2019/11/27 02:20:53.940852 [DEBUG] dns: request for name node.consul. type SRV class IN (took 546.02µs) from client 127.0.0.1:45973 (udp)
TestDNS_InvalidQueries - 2019/11/27 02:20:53.941802 [WARN] dns: QName invalid: service.
TestDNS_InvalidQueries - 2019/11/27 02:20:53.942623 [DEBUG] dns: request for name service.consul. type SRV class IN (took 773.695µs) from client 127.0.0.1:46987 (udp)
TestDNS_InvalidQueries - 2019/11/27 02:20:53.942975 [WARN] dns: QName invalid: query.
TestDNS_InvalidQueries - 2019/11/27 02:20:53.943776 [DEBUG] dns: request for name query.consul. type SRV class IN (took 761.361µs) from client 127.0.0.1:36104 (udp)
TestDNS_InvalidQueries - 2019/11/27 02:20:53.944368 [WARN] dns: QName invalid: foo.node.dc1.extra.
TestDNS_InvalidQueries - 2019/11/27 02:20:53.945132 [DEBUG] dns: request for name foo.node.dc1.extra.consul. type SRV class IN (took 702.026µs) from client 127.0.0.1:56459 (udp)
TestDNS_InvalidQueries - 2019/11/27 02:20:53.946230 [WARN] dns: QName invalid: foo.service.dc1.extra.
TestDNS_InvalidQueries - 2019/11/27 02:20:53.946957 [DEBUG] dns: request for name foo.service.dc1.extra.consul. type SRV class IN (took 681.358µs) from client 127.0.0.1:54215 (udp)
TestDNS_InvalidQueries - 2019/11/27 02:20:53.947745 [WARN] dns: QName invalid: foo.query.dc1.extra.
TestDNS_InvalidQueries - 2019/11/27 02:20:53.948352 [INFO] agent: Requesting shutdown
TestDNS_InvalidQueries - 2019/11/27 02:20:53.948560 [INFO] consul: shutting down server
TestDNS_InvalidQueries - 2019/11/27 02:20:53.948742 [WARN] serf: Shutdown without a Leave
TestDNS_InvalidQueries - 2019/11/27 02:20:53.950643 [DEBUG] dns: request for name foo.query.dc1.extra.consul. type SRV class IN (took 2.872772ms) from client 127.0.0.1:41447 (udp)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:54.007818 [WARN] agent: Node name "Node 694a80d0-4393-aa36-aea1-2e462f9615a6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:54.008407 [DEBUG] tlsutil: Update with version 1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:54.008472 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:54.008747 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:54.008907 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.166975 [INFO] agent: Synced node info
TestDNS_InvalidQueries - 2019/11/27 02:20:54.167008 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.172127 [WARN] consul: endpoint injected; this should only be used for testing
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.174790 [WARN] dns: Query results too stale, re-requesting
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.175229 [DEBUG] dns: request for name nope.query.consul. type SRV class IN (took 570.688µs) from client 127.0.0.1:40539 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.175482 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.175553 [INFO] consul: shutting down server
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.175596 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.244856 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.245126 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.312040 [WARN] serf: Shutdown without a Leave
TestDNS_InvalidQueries - 2019/11/27 02:20:54.312142 [INFO] manager: shutting down
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.473080 [INFO] manager: shutting down
TestDNS_InvalidQueries - 2019/11/27 02:20:54.643889 [INFO] agent: consul server down
TestDNS_InvalidQueries - 2019/11/27 02:20:54.644002 [INFO] agent: shutdown complete
TestDNS_InvalidQueries - 2019/11/27 02:20:54.644085 [INFO] agent: Stopping DNS server 127.0.0.1:11897 (tcp)
TestDNS_InvalidQueries - 2019/11/27 02:20:54.644256 [INFO] agent: Stopping DNS server 127.0.0.1:11897 (udp)
TestDNS_InvalidQueries - 2019/11/27 02:20:54.644426 [INFO] agent: Stopping HTTP server 127.0.0.1:11898 (tcp)
TestDNS_InvalidQueries - 2019/11/27 02:20:54.644665 [INFO] agent: Waiting for endpoints to shut down
TestDNS_InvalidQueries - 2019/11/27 02:20:54.644742 [INFO] agent: Endpoints down
--- PASS: TestDNS_InvalidQueries (4.45s)
=== CONT  TestDNS_NonExistingLookup
TestDNS_InvalidQueries - 2019/11/27 02:20:54.649963 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_Compression_Query - 2019/11/27 02:20:54.651053 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 636.689µs) from client 127.0.0.1:59202 (udp)
TestDNS_Compression_Query - 2019/11/27 02:20:54.652042 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 574.354µs) from client 127.0.0.1:59202 (udp)
TestDNS_Compression_Query - 2019/11/27 02:20:54.653871 [DEBUG] dns: request for name 667a2cfc-e404-2c91-dfa0-de224359f495.query.consul. type SRV class IN (took 848.031µs) from client 127.0.0.1:34999 (udp)
TestDNS_Compression_Query - 2019/11/27 02:20:54.654891 [INFO] agent: Requesting shutdown
TestDNS_Compression_Query - 2019/11/27 02:20:54.654971 [INFO] consul: shutting down server
TestDNS_Compression_Query - 2019/11/27 02:20:54.655019 [WARN] serf: Shutdown without a Leave
TestDNS_Compression_Query - 2019/11/27 02:20:54.654912 [DEBUG] dns: request for name 667a2cfc-e404-2c91-dfa0-de224359f495.query.consul. type SRV class IN (took 647.691µs) from client 127.0.0.1:34999 (udp)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NonExistingLookup - 2019/11/27 02:20:54.784412 [WARN] agent: Node name "Node 36fd0194-f315-e599-097a-dc0928d41dc6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NonExistingLookup - 2019/11/27 02:20:54.787701 [DEBUG] tlsutil: Update with version 1
TestDNS_NonExistingLookup - 2019/11/27 02:20:54.787989 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NonExistingLookup - 2019/11/27 02:20:54.788655 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NonExistingLookup - 2019/11/27 02:20:54.789045 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Compression_Query - 2019/11/27 02:20:54.948225 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.951034 [INFO] agent: consul server down
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.951098 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.951159 [INFO] agent: Stopping DNS server 127.0.0.1:11903 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.951301 [INFO] agent: Stopping DNS server 127.0.0.1:11903 (udp)
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.951443 [INFO] agent: Stopping HTTP server 127.0.0.1:11904 (tcp)
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.951623 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.951757 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_AllowStale (4.65s)
=== CONT  TestDNS_ServiceLookup_FilterACL
=== RUN   TestDNS_ServiceLookup_FilterACL/ACLToken_==_root
TestDNS_PreparedQuery_AllowStale - 2019/11/27 02:20:54.953228 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:55.026977 [WARN] agent: Node name "Node 51ab641d-22df-eb61-f737-13797501dd8b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:55.027539 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:55.027727 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:55.027991 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:55.028253 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Compression_Query - 2019/11/27 02:20:55.150091 [DEBUG] agent: Node info in sync
TestDNS_Compression_Query - 2019/11/27 02:20:55.176966 [INFO] manager: shutting down
TestDNS_Compression_Query - 2019/11/27 02:20:55.310773 [INFO] agent: consul server down
TestDNS_Compression_Query - 2019/11/27 02:20:55.310862 [INFO] agent: shutdown complete
TestDNS_Compression_Query - 2019/11/27 02:20:55.310940 [INFO] agent: Stopping DNS server 127.0.0.1:11885 (tcp)
TestDNS_Compression_Query - 2019/11/27 02:20:55.311111 [INFO] agent: Stopping DNS server 127.0.0.1:11885 (udp)
TestDNS_Compression_Query - 2019/11/27 02:20:55.311308 [INFO] agent: Stopping HTTP server 127.0.0.1:11886 (tcp)
TestDNS_Compression_Query - 2019/11/27 02:20:55.311559 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Compression_Query - 2019/11/27 02:20:55.311655 [INFO] agent: Endpoints down
--- PASS: TestDNS_Compression_Query (6.75s)
=== CONT  TestDNS_ServiceLookup_SRV_RFC_TCP_Default
TestDNS_Compression_Query - 2019/11/27 02:20:55.314804 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:55.391363 [WARN] agent: Node name "Node eb9dd2c5-ad88-1ae7-7281-27ba73a005cf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:55.392040 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:55.392132 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:55.392319 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:55.392465 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:694a80d0-4393-aa36-aea1-2e462f9615a6 Address:127.0.0.1:11914}]
2019/11/27 02:20:56 [INFO]  raft: Node at 127.0.0.1:11914 [Follower] entering Follower state (Leader: "")
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:56.584820 [INFO] serf: EventMemberJoin: Node 694a80d0-4393-aa36-aea1-2e462f9615a6.dc1 127.0.0.1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:56.589459 [INFO] serf: EventMemberJoin: Node 694a80d0-4393-aa36-aea1-2e462f9615a6 127.0.0.1
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:56.590184 [INFO] consul: Adding LAN server Node 694a80d0-4393-aa36-aea1-2e462f9615a6 (Addr: tcp/127.0.0.1:11914) (DC: dc1)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:56.590705 [INFO] consul: Handled member-join event for server "Node 694a80d0-4393-aa36-aea1-2e462f9615a6.dc1" in area "wan"
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:56.591454 [INFO] agent: Started DNS server 127.0.0.1:11909 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:56.591820 [INFO] agent: Started DNS server 127.0.0.1:11909 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:56.593792 [INFO] agent: Started HTTP server on 127.0.0.1:11910 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:56.593867 [INFO] agent: started state syncer
2019/11/27 02:20:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:56 [INFO]  raft: Node at 127.0.0.1:11914 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:36fd0194-f315-e599-097a-dc0928d41dc6 Address:127.0.0.1:11920}]
2019/11/27 02:20:57 [INFO]  raft: Node at 127.0.0.1:11920 [Follower] entering Follower state (Leader: "")
TestDNS_NonExistingLookup - 2019/11/27 02:20:57.247923 [INFO] serf: EventMemberJoin: Node 36fd0194-f315-e599-097a-dc0928d41dc6.dc1 127.0.0.1
TestDNS_NonExistingLookup - 2019/11/27 02:20:57.252503 [INFO] serf: EventMemberJoin: Node 36fd0194-f315-e599-097a-dc0928d41dc6 127.0.0.1
TestDNS_NonExistingLookup - 2019/11/27 02:20:57.256901 [INFO] consul: Adding LAN server Node 36fd0194-f315-e599-097a-dc0928d41dc6 (Addr: tcp/127.0.0.1:11920) (DC: dc1)
TestDNS_NonExistingLookup - 2019/11/27 02:20:57.257433 [INFO] consul: Handled member-join event for server "Node 36fd0194-f315-e599-097a-dc0928d41dc6.dc1" in area "wan"
TestDNS_NonExistingLookup - 2019/11/27 02:20:57.258852 [INFO] agent: Started DNS server 127.0.0.1:11915 (tcp)
TestDNS_NonExistingLookup - 2019/11/27 02:20:57.259382 [INFO] agent: Started DNS server 127.0.0.1:11915 (udp)
TestDNS_NonExistingLookup - 2019/11/27 02:20:57.261419 [INFO] agent: Started HTTP server on 127.0.0.1:11916 (tcp)
TestDNS_NonExistingLookup - 2019/11/27 02:20:57.261519 [INFO] agent: started state syncer
2019/11/27 02:20:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:57 [INFO]  raft: Node at 127.0.0.1:11920 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:51ab641d-22df-eb61-f737-13797501dd8b Address:127.0.0.1:11926}]
2019/11/27 02:20:57 [INFO]  raft: Node at 127.0.0.1:11926 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:57.394531 [INFO] serf: EventMemberJoin: Node 51ab641d-22df-eb61-f737-13797501dd8b.dc1 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:57.399551 [INFO] serf: EventMemberJoin: Node 51ab641d-22df-eb61-f737-13797501dd8b 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:57.400311 [INFO] consul: Handled member-join event for server "Node 51ab641d-22df-eb61-f737-13797501dd8b.dc1" in area "wan"
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:57.400646 [INFO] consul: Adding LAN server Node 51ab641d-22df-eb61-f737-13797501dd8b (Addr: tcp/127.0.0.1:11926) (DC: dc1)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:57.400796 [INFO] agent: Started DNS server 127.0.0.1:11921 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:57.401202 [INFO] agent: Started DNS server 127.0.0.1:11921 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:57.403155 [INFO] agent: Started HTTP server on 127.0.0.1:11922 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:57.403266 [INFO] agent: started state syncer
2019/11/27 02:20:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:57 [INFO]  raft: Node at 127.0.0.1:11926 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:57 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:57 [INFO]  raft: Node at 127.0.0.1:11914 [Leader] entering Leader state
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:57.700570 [INFO] consul: cluster leadership acquired
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:57.701118 [INFO] consul: New leader elected: Node 694a80d0-4393-aa36-aea1-2e462f9615a6
2019/11/27 02:20:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eb9dd2c5-ad88-1ae7-7281-27ba73a005cf Address:127.0.0.1:11932}]
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:57.882135 [INFO] serf: EventMemberJoin: Node eb9dd2c5-ad88-1ae7-7281-27ba73a005cf.dc1 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:57.886015 [INFO] serf: EventMemberJoin: Node eb9dd2c5-ad88-1ae7-7281-27ba73a005cf 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:57.887865 [INFO] agent: Started DNS server 127.0.0.1:11927 (udp)
2019/11/27 02:20:57 [INFO]  raft: Node at 127.0.0.1:11932 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:57.890342 [INFO] consul: Adding LAN server Node eb9dd2c5-ad88-1ae7-7281-27ba73a005cf (Addr: tcp/127.0.0.1:11932) (DC: dc1)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:57.890390 [INFO] consul: Handled member-join event for server "Node eb9dd2c5-ad88-1ae7-7281-27ba73a005cf.dc1" in area "wan"
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:57.890800 [INFO] agent: Started DNS server 127.0.0.1:11927 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:57.893356 [INFO] agent: Started HTTP server on 127.0.0.1:11928 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:57.893452 [INFO] agent: started state syncer
2019/11/27 02:20:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:57 [INFO]  raft: Node at 127.0.0.1:11932 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:20:57.931929 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:20:57.932028 [DEBUG] agent: Node info in sync
2019/11/27 02:20:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:58 [INFO]  raft: Node at 127.0.0.1:11920 [Leader] entering Leader state
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:20:58.690504 [INFO] agent: Synced node info
TestDNS_NonExistingLookup - 2019/11/27 02:20:58.690657 [INFO] consul: cluster leadership acquired
TestDNS_NonExistingLookup - 2019/11/27 02:20:58.691015 [INFO] consul: New leader elected: Node 36fd0194-f315-e599-097a-dc0928d41dc6
2019/11/27 02:20:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:58 [INFO]  raft: Node at 127.0.0.1:11926 [Leader] entering Leader state
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:58.802336 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:58.802742 [INFO] consul: New leader elected: Node 51ab641d-22df-eb61-f737-13797501dd8b
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:58.902877 [ERR] agent: failed to sync remote state: ACL not found
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:58.952202 [INFO] acl: initializing acls
2019/11/27 02:20:59 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:59 [INFO]  raft: Node at 127.0.0.1:11932 [Leader] entering Leader state
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:59.049678 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:59.050222 [INFO] consul: New leader elected: Node eb9dd2c5-ad88-1ae7-7281-27ba73a005cf
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.173709 [INFO] agent: Synced node info
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.191927 [WARN] dns: QName invalid: nonexisting.
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.192375 [DEBUG] dns: request for name nonexisting.consul. type ANY class IN (took 405.014µs) from client 127.0.0.1:39818 (udp)
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.192610 [INFO] agent: Requesting shutdown
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.192680 [INFO] consul: shutting down server
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.192734 [WARN] serf: Shutdown without a Leave
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.476576 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:59.480017 [INFO] consul: Created ACL 'global-management' policy
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:59.480097 [WARN] consul: Configuring a non-UUID master token is deprecated
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:59.481298 [INFO] acl: initializing acls
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:59.481415 [WARN] consul: Configuring a non-UUID master token is deprecated
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.587713 [INFO] manager: shutting down
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.587765 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.588140 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.588156 [INFO] agent: consul server down
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.588280 [INFO] agent: shutdown complete
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.588336 [INFO] agent: Stopping DNS server 127.0.0.1:11915 (tcp)
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.588492 [INFO] agent: Stopping DNS server 127.0.0.1:11915 (udp)
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.588648 [INFO] agent: Stopping HTTP server 127.0.0.1:11916 (tcp)
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.588861 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NonExistingLookup - 2019/11/27 02:20:59.588941 [INFO] agent: Endpoints down
--- PASS: TestDNS_NonExistingLookup (4.94s)
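TestDNS_NonExistingLookup above boils down to a single query for a name that nothing registers ("nonexisting.consul.", type ANY), which the agent answers with no records. For orientation, a minimal client-side sketch of the same kind of query using github.com/miekg/dns, the DNS library Consul itself builds on; the 127.0.0.1:8600 address is an assumption (the conventional agent DNS port), since the test agents in this log bind randomized ports.

    // Illustrative sketch only: ask a Consul agent's DNS endpoint for a name
    // no service registers and inspect the response code and answer count.
    package main

    import (
        "fmt"
        "log"

        "github.com/miekg/dns"
    )

    func main() {
        m := new(dns.Msg)
        m.SetQuestion("nonexisting.consul.", dns.TypeANY)

        c := new(dns.Client)
        in, _, err := c.Exchange(m, "127.0.0.1:8600") // assumed agent DNS address
        if err != nil {
            log.Fatal(err)
        }
        // A name outside anything registered should come back with
        // no answer records, as in the test run logged above.
        fmt.Println("rcode:", dns.RcodeToString[in.Rcode], "answers:", len(in.Answer))
    }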
=== CONT  TestDNS_ServiceLookup_SRV_RFC
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:20:59.699721 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:20:59.717388 [WARN] agent: Node name "Node 6d8d3a2c-01db-fe1c-e3be-7b0d22d4c3b0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:20:59.718728 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:20:59.718974 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:20:59.719200 [INFO] consul: Bootstrapped ACL master token from configuration
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:20:59.719734 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:20:59.720052 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.303967 [DEBUG] dns: request for name _db._tcp.service.dc1.consul. type SRV class IN (took 774.695µs) from client 127.0.0.1:45420 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.307186 [DEBUG] dns: request for name _db._tcp.service.consul. type SRV class IN (took 886.366µs) from client 127.0.0.1:34871 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.309224 [DEBUG] dns: request for name _db._tcp.dc1.consul. type SRV class IN (took 744.027µs) from client 127.0.0.1:55335 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.311584 [DEBUG] dns: request for name _db._tcp.consul. type SRV class IN (took 750.36µs) from client 127.0.0.1:45814 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.311995 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.312090 [INFO] consul: shutting down server
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.312154 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.326458 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.326591 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:21:00.501169 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:21:00.501254 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.613705 [INFO] agent: Synced node info
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.613824 [INFO] consul: Bootstrapped ACL master token from configuration
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.614681 [INFO] serf: EventMemberUpdate: Node 51ab641d-22df-eb61-f737-13797501dd8b
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:00.618852 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.613808 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.620269 [INFO] consul: Created ACL anonymous token from configuration
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.620836 [DEBUG] acl: transitioning out of legacy ACL mode
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.620852 [INFO] serf: EventMemberUpdate: Node 51ab641d-22df-eb61-f737-13797501dd8b.dc1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.621673 [INFO] serf: EventMemberUpdate: Node 51ab641d-22df-eb61-f737-13797501dd8b
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.622775 [INFO] serf: EventMemberUpdate: Node 51ab641d-22df-eb61-f737-13797501dd8b.dc1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.623757 [DEBUG] consul: dropping node "Node 51ab641d-22df-eb61-f737-13797501dd8b" from result due to ACLs
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:00.624550 [DEBUG] consul: dropping node "Node 51ab641d-22df-eb61-f737-13797501dd8b" from result due to ACLs
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:01.158836 [DEBUG] agent: Node info in sync
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:01.159192 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:01.465469 [INFO] manager: shutting down
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:01.465685 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:01.466117 [INFO] agent: consul server down
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:01.466330 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:01.466388 [INFO] agent: Stopping DNS server 127.0.0.1:11927 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:01.466533 [INFO] agent: Stopping DNS server 127.0.0.1:11927 (udp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:01.466768 [INFO] agent: Stopping HTTP server 127.0.0.1:11928 (tcp)
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:01.467023 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_SRV_RFC_TCP_Default - 2019/11/27 02:21:01.467098 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_SRV_RFC_TCP_Default (6.16s)
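The TestDNS_ServiceLookup_SRV_RFC_TCP_Default run that just passed issues RFC 2782-style lookups of the form _db._tcp.service.consul. A hedged sketch of such an SRV query from the client side, again with github.com/miekg/dns and an assumed agent at 127.0.0.1:8600; the "db" service name mirrors the queries logged above.

    // Illustrative sketch only: an RFC 2782-style SRV lookup
    // (_<service>._<protocol>) against a Consul agent's DNS endpoint.
    package main

    import (
        "fmt"
        "log"

        "github.com/miekg/dns"
    )

    func main() {
        m := new(dns.Msg)
        m.SetQuestion("_db._tcp.service.consul.", dns.TypeSRV)

        c := new(dns.Client)
        in, _, err := c.Exchange(m, "127.0.0.1:8600") // assumed agent DNS address
        if err != nil {
            log.Fatal(err)
        }
        for _, rr := range in.Answer {
            if srv, ok := rr.(*dns.SRV); ok {
                // Each SRV record carries the target node and the service port.
                fmt.Printf("%s -> %s:%d (prio %d, weight %d)\n",
                    srv.Hdr.Name, srv.Target, srv.Port, srv.Priority, srv.Weight)
            }
        }
    }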
=== CONT  TestDNS_PreparedQuery_Failover
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:01.548731 [WARN] agent: Node name "Node 9e4cc3e3-4a2d-b679-188a-0994869927d2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:01.549214 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:01.549381 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:01.549627 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:01.550584 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:21:03.348897 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 96ea3298-4984-8452-8dce-62bd7caf6d71.dc1 (Addr: tcp/127.0.0.1:11674) (DC: dc1)
jones - 2019/11/27 02:21:06.006593 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 4c613484-61cd-f189-9fd4-637dea8a81e0.dc1 (Addr: tcp/127.0.0.1:11680) (DC: dc1)
2019/11/27 02:21:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6d8d3a2c-01db-fe1c-e3be-7b0d22d4c3b0 Address:127.0.0.1:11938}]
2019/11/27 02:21:07 [INFO]  raft: Node at 127.0.0.1:11938 [Follower] entering Follower state (Leader: "")
jones - 2019/11/27 02:21:07.237649 [DEBUG] consul: Skipping self join check for "Node 96ea3298-4984-8452-8dce-62bd7caf6d71" since the cluster is too small
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.238843 [DEBUG] dns: request for name foo.service.consul. type A class IN (took 1.142708ms) from client 127.0.0.1:53288 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.244917 [DEBUG] dns: request for name webv4.service.consul. type AAAA class IN (took 567.02µs) from client 127.0.0.1:40014 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:07.248162 [INFO] serf: EventMemberJoin: Node 6d8d3a2c-01db-fe1c-e3be-7b0d22d4c3b0.dc1 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.249732 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.249843 [INFO] consul: shutting down server
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.249934 [WARN] serf: Shutdown without a Leave
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.250297 [DEBUG] dns: request for name webv4.query.consul. type AAAA class IN (took 4.511498ms) from client 127.0.0.1:33084 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:07.253000 [INFO] serf: EventMemberJoin: Node 6d8d3a2c-01db-fe1c-e3be-7b0d22d4c3b0 127.0.0.1
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:07.254265 [INFO] agent: Started DNS server 127.0.0.1:11933 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:07.255183 [INFO] consul: Adding LAN server Node 6d8d3a2c-01db-fe1c-e3be-7b0d22d4c3b0 (Addr: tcp/127.0.0.1:11938) (DC: dc1)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:07.255378 [INFO] consul: Handled member-join event for server "Node 6d8d3a2c-01db-fe1c-e3be-7b0d22d4c3b0.dc1" in area "wan"
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:07.255843 [INFO] agent: Started DNS server 127.0.0.1:11933 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.258588 [DEBUG] dns: request for name webv6.service.consul. type A class IN (took 575.688µs) from client 127.0.0.1:41051 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.259833 [DEBUG] dns: request for name webv6.query.consul. type A class IN (took 555.687µs) from client 127.0.0.1:49656 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.259903 [INFO] agent: Requesting shutdown
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.259973 [INFO] consul: shutting down server
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.260018 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:07.265725 [INFO] agent: Started HTTP server on 127.0.0.1:11934 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:07.265829 [INFO] agent: started state syncer
2019/11/27 02:21:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:07 [INFO]  raft: Node at 127.0.0.1:11938 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.498279 [WARN] serf: Shutdown without a Leave
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.506899 [WARN] serf: Shutdown without a Leave
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.631819 [INFO] manager: shutting down
jones - 2019/11/27 02:21:07.632599 [DEBUG] consul: Skipping self join check for "Node 4c613484-61cd-f189-9fd4-637dea8a81e0" since the cluster is too small
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.632849 [ERR] consul: failed to establish leadership: error configuring provider: raft is already shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.632907 [INFO] manager: shutting down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.633380 [INFO] agent: consul server down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.633441 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.633504 [INFO] agent: Stopping DNS server 127.0.0.1:11921 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.633655 [INFO] agent: Stopping DNS server 127.0.0.1:11921 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.633852 [INFO] agent: Stopping HTTP server 127.0.0.1:11922 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.634084 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_root - 2019/11/27 02:21:07.634155 [INFO] agent: Endpoints down
=== RUN   TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.633654 [INFO] agent: consul server down
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.637842 [INFO] agent: shutdown complete
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.637920 [INFO] agent: Stopping DNS server 127.0.0.1:11909 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.636119 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.638122 [INFO] agent: Stopping DNS server 127.0.0.1:11909 (udp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.638309 [INFO] agent: Stopping HTTP server 127.0.0.1:11910 (tcp)
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.638520 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NonExistingLookupEmptyAorAAAA - 2019/11/27 02:21:07.638591 [INFO] agent: Endpoints down
--- PASS: TestDNS_NonExistingLookupEmptyAorAAAA (13.70s)
=== CONT  TestDNS_PreparedQuery_TTL
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:07.784893 [WARN] agent: Node name "Node 63684143-d351-c059-4f09-72ac341187f1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:07.785304 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:07.785371 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:07.785560 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:07.785659 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:07.816487 [WARN] agent: Node name "Node 4f04d904-4a9f-1a89-d926-be1f642f58cb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:07.822417 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:07.825573 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:07.825908 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:07.826113 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:21:08 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:08 [INFO]  raft: Node at 127.0.0.1:11938 [Leader] entering Leader state
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:08.154870 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:08.155328 [INFO] consul: New leader elected: Node 6d8d3a2c-01db-fe1c-e3be-7b0d22d4c3b0
2019/11/27 02:21:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9e4cc3e3-4a2d-b679-188a-0994869927d2 Address:127.0.0.1:11944}]
2019/11/27 02:21:08 [INFO]  raft: Node at 127.0.0.1:11944 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:08.249393 [INFO] serf: EventMemberJoin: Node 9e4cc3e3-4a2d-b679-188a-0994869927d2.dc1 127.0.0.1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:08.252966 [INFO] serf: EventMemberJoin: Node 9e4cc3e3-4a2d-b679-188a-0994869927d2 127.0.0.1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:08.254375 [INFO] consul: Adding LAN server Node 9e4cc3e3-4a2d-b679-188a-0994869927d2 (Addr: tcp/127.0.0.1:11944) (DC: dc1)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:08.255053 [INFO] consul: Handled member-join event for server "Node 9e4cc3e3-4a2d-b679-188a-0994869927d2.dc1" in area "wan"
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:08.256959 [INFO] agent: Started DNS server 127.0.0.1:11939 (tcp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:08.257038 [INFO] agent: Started DNS server 127.0.0.1:11939 (udp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:08.258923 [INFO] agent: Started HTTP server on 127.0.0.1:11940 (tcp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:08.258992 [INFO] agent: started state syncer
2019/11/27 02:21:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:08 [INFO]  raft: Node at 127.0.0.1:11944 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:08.888200 [INFO] agent: Synced node info
2019/11/27 02:21:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:63684143-d351-c059-4f09-72ac341187f1 Address:127.0.0.1:11950}]
2019/11/27 02:21:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4f04d904-4a9f-1a89-d926-be1f642f58cb Address:127.0.0.1:11956}]
2019/11/27 02:21:09 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:09 [INFO]  raft: Node at 127.0.0.1:11944 [Leader] entering Leader state
2019/11/27 02:21:09 [INFO]  raft: Node at 127.0.0.1:11950 [Follower] entering Follower state (Leader: "")
2019/11/27 02:21:09 [INFO]  raft: Node at 127.0.0.1:11956 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:09.128081 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:09.130444 [INFO] consul: New leader elected: Node 9e4cc3e3-4a2d-b679-188a-0994869927d2
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.136991 [INFO] serf: EventMemberJoin: Node 63684143-d351-c059-4f09-72ac341187f1.dc1 127.0.0.1
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.144601 [INFO] serf: EventMemberJoin: Node 4f04d904-4a9f-1a89-d926-be1f642f58cb.dc1 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.155255 [INFO] serf: EventMemberJoin: Node 63684143-d351-c059-4f09-72ac341187f1 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.164349 [INFO] agent: Started DNS server 127.0.0.1:11945 (udp)
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.181435 [INFO] serf: EventMemberJoin: Node 4f04d904-4a9f-1a89-d926-be1f642f58cb 127.0.0.1
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.183708 [INFO] agent: Started DNS server 127.0.0.1:11951 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.193083 [INFO] consul: Adding LAN server Node 63684143-d351-c059-4f09-72ac341187f1 (Addr: tcp/127.0.0.1:11950) (DC: dc1)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.197074 [INFO] agent: Started DNS server 127.0.0.1:11945 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.198351 [INFO] consul: Handled member-join event for server "Node 63684143-d351-c059-4f09-72ac341187f1.dc1" in area "wan"
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.203766 [INFO] agent: Started HTTP server on 127.0.0.1:11946 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.203870 [INFO] agent: started state syncer
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:09.217963 [ERR] agent: failed to sync remote state: ACL not found
2019/11/27 02:21:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:09 [INFO]  raft: Node at 127.0.0.1:11950 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:09 [INFO]  raft: Node at 127.0.0.1:11956 [Candidate] entering Candidate state in term 2
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.224426 [INFO] consul: Handled member-join event for server "Node 4f04d904-4a9f-1a89-d926-be1f642f58cb.dc1" in area "wan"
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.234482 [INFO] consul: Adding LAN server Node 4f04d904-4a9f-1a89-d926-be1f642f58cb (Addr: tcp/127.0.0.1:11956) (DC: dc1)
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.235254 [INFO] agent: Started DNS server 127.0.0.1:11951 (tcp)
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.238047 [INFO] agent: Started HTTP server on 127.0.0.1:11952 (tcp)
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.238163 [INFO] agent: started state syncer
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.359440 [DEBUG] dns: request for name _db._master.service.dc1.consul. type SRV class IN (took 837.03µs) from client 127.0.0.1:46348 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.360848 [DEBUG] dns: request for name _db._master.service.consul. type SRV class IN (took 542.02µs) from client 127.0.0.1:55661 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.362128 [DEBUG] dns: request for name _db._master.dc1.consul. type SRV class IN (took 600.689µs) from client 127.0.0.1:44098 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.363800 [DEBUG] dns: request for name _db._master.consul. type SRV class IN (took 487.018µs) from client 127.0.0.1:48505 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.363895 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.363971 [INFO] consul: shutting down server
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.364023 [WARN] serf: Shutdown without a Leave
jones - 2019/11/27 02:21:09.379836 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 005cb1c3-f8e5-2827-9833-9849ba78d405.dc1 (Addr: tcp/127.0.0.1:11686) (DC: dc1)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.453765 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.564891 [INFO] manager: shutting down
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.653974 [INFO] agent: consul server down
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.654057 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:09.654014 [INFO] acl: initializing acls
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.654122 [INFO] agent: Stopping DNS server 127.0.0.1:11933 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.654494 [INFO] agent: Stopping DNS server 127.0.0.1:11933 (udp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.654681 [INFO] agent: Stopping HTTP server 127.0.0.1:11934 (tcp)
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.654883 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.654961 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_SRV_RFC (10.07s)
=== CONT  TestDNS_ServiceLookup_TTL
TestDNS_ServiceLookup_SRV_RFC - 2019/11/27 02:21:09.654073 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:09.724784 [WARN] agent: Node name "Node 9a7b0a42-9752-710d-f245-9a0233b36870" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:09.725137 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:09.725203 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:09.725365 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:09.725469 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:09.806472 [INFO] acl: initializing acls
2019/11/27 02:21:09 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:09 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:09 [INFO]  raft: Node at 127.0.0.1:11956 [Leader] entering Leader state
2019/11/27 02:21:09 [INFO]  raft: Node at 127.0.0.1:11950 [Leader] entering Leader state
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:09.878374 [INFO] consul: Created ACL 'global-management' policy
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.878726 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:09.879034 [INFO] consul: New leader elected: Node 4f04d904-4a9f-1a89-d926-be1f642f58cb
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.879237 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.879541 [INFO] consul: New leader elected: Node 63684143-d351-c059-4f09-72ac341187f1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:09.968165 [INFO] acl: initializing acls
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:10.059133 [ERR] agent: failed to sync remote state: ACL not found
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:10.331888 [INFO] acl: initializing acls
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:10.332371 [INFO] consul: Created ACL 'global-management' policy
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:10.332443 [WARN] consul: Configuring a non-UUID master token is deprecated
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:10.334134 [INFO] agent: Synced node info
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:10.334278 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.444014 [INFO] consul: Created ACL anonymous token from configuration
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.444988 [INFO] serf: EventMemberUpdate: Node 9e4cc3e3-4a2d-b679-188a-0994869927d2
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.445636 [INFO] serf: EventMemberUpdate: Node 9e4cc3e3-4a2d-b679-188a-0994869927d2.dc1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.447586 [INFO] consul: Created ACL 'global-management' policy
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.447756 [DEBUG] acl: transitioning out of legacy ACL mode
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.448536 [INFO] serf: EventMemberUpdate: Node 9e4cc3e3-4a2d-b679-188a-0994869927d2
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.449206 [INFO] serf: EventMemberUpdate: Node 9e4cc3e3-4a2d-b679-188a-0994869927d2.dc1
jones - 2019/11/27 02:21:10.469656 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:21:10.469745 [DEBUG] agent: Service "web1-sidecar-proxy" in sync
jones - 2019/11/27 02:21:10.469781 [DEBUG] agent: Node info in sync
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.532833 [WARN] agent: Node name "Node 77722f74-8595-b3ee-90e4-e60c9478590d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.533671 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.533828 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.534121 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:10.534315 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:21:10.677656 [DEBUG] consul: Skipping self join check for "Node 005cb1c3-f8e5-2827-9833-9849ba78d405" since the cluster is too small
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:10.826427 [ERR] agent: failed to sync remote state: ACL not found
2019/11/27 02:21:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9a7b0a42-9752-710d-f245-9a0233b36870 Address:127.0.0.1:11962}]
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:10.924557 [INFO] serf: EventMemberJoin: Node 9a7b0a42-9752-710d-f245-9a0233b36870.dc1 127.0.0.1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:10.928156 [INFO] consul: Bootstrapped ACL master token from configuration
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:10.929784 [INFO] consul: Created ACL 'global-management' policy
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:10.929859 [WARN] consul: Configuring a non-UUID master token is deprecated
2019/11/27 02:21:10 [INFO]  raft: Node at 127.0.0.1:11962 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:10.932012 [INFO] serf: EventMemberJoin: Node 9a7b0a42-9752-710d-f245-9a0233b36870 127.0.0.1
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:10.933313 [INFO] agent: Started DNS server 127.0.0.1:11957 (udp)
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:10.933713 [INFO] consul: Adding LAN server Node 9a7b0a42-9752-710d-f245-9a0233b36870 (Addr: tcp/127.0.0.1:11962) (DC: dc1)
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:10.934185 [INFO] agent: Started DNS server 127.0.0.1:11957 (tcp)
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:10.936423 [INFO] consul: Handled member-join event for server "Node 9a7b0a42-9752-710d-f245-9a0233b36870.dc1" in area "wan"
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:10.937726 [INFO] agent: Started HTTP server on 127.0.0.1:11958 (tcp)
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:10.937909 [INFO] agent: started state syncer
2019/11/27 02:21:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:10 [INFO]  raft: Node at 127.0.0.1:11962 [Candidate] entering Candidate state in term 2
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:11.000656 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:11.611810 [INFO] consul: Created ACL anonymous token from configuration
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:11.611904 [DEBUG] acl: transitioning out of legacy ACL mode
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:11.612663 [INFO] serf: EventMemberUpdate: Node 63684143-d351-c059-4f09-72ac341187f1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:11.613275 [INFO] serf: EventMemberUpdate: Node 63684143-d351-c059-4f09-72ac341187f1.dc1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:12.087934 [INFO] consul: Created ACL anonymous token from configuration
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:12.088840 [INFO] serf: EventMemberUpdate: Node 63684143-d351-c059-4f09-72ac341187f1
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:12.089576 [INFO] serf: EventMemberUpdate: Node 63684143-d351-c059-4f09-72ac341187f1.dc1
2019/11/27 02:21:12 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:12 [INFO]  raft: Node at 127.0.0.1:11962 [Leader] entering Leader state
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.302626 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:12.304295 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:12.304813 [INFO] consul: New leader elected: Node 9a7b0a42-9752-710d-f245-9a0233b36870
2019/11/27 02:21:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:77722f74-8595-b3ee-90e4-e60c9478590d Address:127.0.0.1:11968}]
2019/11/27 02:21:12 [INFO]  raft: Node at 127.0.0.1:11968 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.700786 [INFO] serf: EventMemberJoin: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2 127.0.0.1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.704554 [INFO] serf: EventMemberJoin: Node 77722f74-8595-b3ee-90e4-e60c9478590d 127.0.0.1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.705312 [INFO] consul: Handled member-join event for server "Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2" in area "wan"
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.705628 [INFO] consul: Adding LAN server Node 77722f74-8595-b3ee-90e4-e60c9478590d (Addr: tcp/127.0.0.1:11968) (DC: dc2)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.705895 [INFO] agent: Started DNS server 127.0.0.1:11963 (udp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.706166 [INFO] agent: Started DNS server 127.0.0.1:11963 (tcp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.708506 [INFO] agent: Started HTTP server on 127.0.0.1:11964 (tcp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.708849 [INFO] agent: started state syncer
2019/11/27 02:21:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:12 [INFO]  raft: Node at 127.0.0.1:11968 [Candidate] entering Candidate state in term 2
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.813586 [INFO] agent: Synced node info
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.813711 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.813810 [DEBUG] consul: Skipping self join check for "Node 9e4cc3e3-4a2d-b679-188a-0994869927d2" since the cluster is too small
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:12.813983 [INFO] consul: member 'Node 9e4cc3e3-4a2d-b679-188a-0994869927d2' joined, marking health alive
jones - 2019/11/27 02:21:12.853150 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 3a0dee63-0112-ab1b-d438-213ed51c845e.dc1 (Addr: tcp/127.0.0.1:11692) (DC: dc1)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:13.556949 [DEBUG] consul: Skipping self join check for "Node 9e4cc3e3-4a2d-b679-188a-0994869927d2" since the cluster is too small
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:13.557625 [DEBUG] consul: Skipping self join check for "Node 9e4cc3e3-4a2d-b679-188a-0994869927d2" since the cluster is too small
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:13.562856 [INFO] agent: Synced node info
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:13.562968 [DEBUG] agent: Node info in sync
2019/11/27 02:21:14 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:14 [INFO]  raft: Node at 127.0.0.1:11968 [Leader] entering Leader state
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:14.278964 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:14.279410 [INFO] consul: New leader elected: Node 77722f74-8595-b3ee-90e4-e60c9478590d
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:14.352205 [ERR] agent: failed to sync remote state: ACL not found
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:14.398742 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:14.399374 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:14.399763 [DEBUG] consul: Skipping self join check for "Node 63684143-d351-c059-4f09-72ac341187f1" since the cluster is too small
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:14.399868 [INFO] consul: member 'Node 63684143-d351-c059-4f09-72ac341187f1' joined, marking health alive
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:14.698110 [DEBUG] consul: Skipping self join check for "Node 4f04d904-4a9f-1a89-d926-be1f642f58cb" since the cluster is too small
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:14.698344 [INFO] consul: member 'Node 4f04d904-4a9f-1a89-d926-be1f642f58cb' joined, marking health alive
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:14.700450 [DEBUG] consul: Skipping self join check for "Node 63684143-d351-c059-4f09-72ac341187f1" since the cluster is too small
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:14.700955 [DEBUG] consul: Skipping self join check for "Node 63684143-d351-c059-4f09-72ac341187f1" since the cluster is too small
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:14.719976 [DEBUG] consul: dropping node "Node 63684143-d351-c059-4f09-72ac341187f1" from result due to ACLs
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:14.847837 [INFO] acl: initializing acls
jones - 2019/11/27 02:21:15.481509 [DEBUG] consul: Skipping self join check for "Node 3a0dee63-0112-ab1b-d438-213ed51c845e" since the cluster is too small
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:15.484021 [INFO] consul: Created ACL 'global-management' policy
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:15.529161 [DEBUG] consul: dropping node "foo" from result due to ACLs
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:15.543302 [DEBUG] dns: request for name foo.service.consul. type A class IN (took 14.705535ms) from client 127.0.0.1:33449 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:15.543712 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:15.543889 [INFO] consul: shutting down server
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:15.544078 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:15.779992 [DEBUG] agent: Node info in sync
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:15.853405 [ERR] agent: failed to sync remote state: ACL not found
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:15.857247 [INFO] acl: initializing acls
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:16.204375 [WARN] consul: error getting server health from "Node 4f04d904-4a9f-1a89-d926-be1f642f58cb": context deadline exceeded
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:16.561477 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:16.675644 [INFO] manager: shutting down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:16.676327 [INFO] agent: consul server down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:16.676384 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:16.676437 [INFO] agent: Stopping DNS server 127.0.0.1:11945 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:16.676589 [INFO] agent: Stopping DNS server 127.0.0.1:11945 (udp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:16.676832 [INFO] agent: Stopping HTTP server 127.0.0.1:11946 (tcp)
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:16.677041 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous - 2019/11/27 02:21:16.677107 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_FilterACL (21.73s)
    --- PASS: TestDNS_ServiceLookup_FilterACL/ACLToken_==_root (12.68s)
    --- PASS: TestDNS_ServiceLookup_FilterACL/ACLToken_==_anonymous (9.04s)
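The FilterACL subtests contrast the management token ("root") with the anonymous token: with ACLs enforced, the agent drops the registered service from the result, as the "dropping node ... from result due to ACLs" lines show. The same filtering applies to the HTTP health endpoints; a sketch using the official Go client, github.com/hashicorp/consul/api, where the agent address and the "root" token are assumptions mirroring the test setup rather than anything this build log configures.

    // Illustrative sketch only: list health entries for service "foo" with
    // two different ACL tokens and compare how many instances are visible.
    package main

    import (
        "fmt"
        "log"

        consul "github.com/hashicorp/consul/api"
    )

    func lookup(token string) int {
        cfg := consul.DefaultConfig() // 127.0.0.1:8500 unless overridden by env
        cfg.Token = token
        client, err := consul.NewClient(cfg)
        if err != nil {
            log.Fatal(err)
        }
        entries, _, err := client.Health().Service("foo", "", false, nil)
        if err != nil {
            log.Fatal(err)
        }
        return len(entries)
    }

    func main() {
        // With the management token the instance is visible; with the
        // anonymous token (empty string) the agent filters it out,
        // matching the behaviour logged above.
        fmt.Println("root:", lookup("root"), "anonymous:", lookup(""))
    }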
=== CONT  TestDNS_NodeLookup_TTL
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.679572 [INFO] consul: Created ACL anonymous token from configuration
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.680548 [INFO] serf: EventMemberUpdate: Node 77722f74-8595-b3ee-90e4-e60c9478590d
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.682288 [INFO] serf: EventMemberUpdate: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.702558 [INFO] agent: (WAN) joining: [127.0.0.1:11943]
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.703347 [DEBUG] memberlist: Stream connection from=127.0.0.1:58154
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.703608 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:11943
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.706510 [INFO] serf: EventMemberJoin: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2 127.0.0.1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.707487 [INFO] consul: Handled member-join event for server "Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2" in area "wan"
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.708594 [INFO] serf: EventMemberJoin: Node 9e4cc3e3-4a2d-b679-188a-0994869927d2.dc1 127.0.0.1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.709045 [INFO] agent: (WAN) joined: 1 Err: <nil>
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.710121 [INFO] consul: Handled member-join event for server "Node 9e4cc3e3-4a2d-b679-188a-0994869927d2.dc1" in area "wan"
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:16.749661 [WARN] agent: Node name "Node edb206ab-5e46-0c38-e003-c29d0e26a0f7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:16.750069 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:16.750141 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:16.750313 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:16.751207 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.979336 [INFO] consul: Created ACL anonymous token from configuration
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.979423 [DEBUG] acl: transitioning out of legacy ACL mode
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.980241 [INFO] serf: EventMemberUpdate: Node 77722f74-8595-b3ee-90e4-e60c9478590d
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:16.980854 [INFO] serf: EventMemberUpdate: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2
=== RUN   TestDNS_ServiceLookup_TTL/db.service.consul.
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:16.986482 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 613.356µs) from client 127.0.0.1:38485 (udp)
=== RUN   TestDNS_ServiceLookup_TTL/dblb.service.consul.
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:16.988202 [DEBUG] dns: request for name dblb.service.consul. type SRV class IN (took 512.352µs) from client 127.0.0.1:48575 (udp)
=== RUN   TestDNS_ServiceLookup_TTL/dk.service.consul.
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:16.989625 [DEBUG] dns: request for name dk.service.consul. type SRV class IN (took 476.684µs) from client 127.0.0.1:45513 (udp)
=== RUN   TestDNS_ServiceLookup_TTL/api.service.consul.
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:16.991130 [DEBUG] dns: request for name api.service.consul. type SRV class IN (took 536.686µs) from client 127.0.0.1:52940 (udp)
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:16.991430 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:16.991489 [INFO] consul: shutting down server
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:16.991530 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.122163 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.205140 [INFO] serf: EventMemberUpdate: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.205288 [DEBUG] serf: messageJoinType: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.219989 [INFO] manager: shutting down
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.220956 [INFO] agent: consul server down
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.221016 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.221076 [INFO] agent: Stopping DNS server 127.0.0.1:11957 (tcp)
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.221244 [INFO] agent: Stopping DNS server 127.0.0.1:11957 (udp)
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.221406 [INFO] agent: Stopping HTTP server 127.0.0.1:11958 (tcp)
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.221674 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.221835 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_TTL (7.57s)
    --- PASS: TestDNS_ServiceLookup_TTL/db.service.consul. (0.01s)
    --- PASS: TestDNS_ServiceLookup_TTL/dblb.service.consul. (0.00s)
    --- PASS: TestDNS_ServiceLookup_TTL/dk.service.consul. (0.00s)
    --- PASS: TestDNS_ServiceLookup_TTL/api.service.consul. (0.00s)
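TestDNS_ServiceLookup_TTL checks that per-service TTLs (Consul's dns_config service_ttl settings) show up in the answers for db, dblb, dk and api. A short sketch of reading that TTL back from the resource-record header of an SRV answer; the address and service name are again assumptions chosen to mirror the queries logged above.

    // Illustrative sketch only: the configured per-service TTL is visible in
    // the header of each returned resource record.
    package main

    import (
        "fmt"
        "log"

        "github.com/miekg/dns"
    )

    func main() {
        m := new(dns.Msg)
        m.SetQuestion("db.service.consul.", dns.TypeSRV)

        c := new(dns.Client)
        in, _, err := c.Exchange(m, "127.0.0.1:8600") // assumed agent DNS address
        if err != nil {
            log.Fatal(err)
        }
        for _, rr := range in.Answer {
            // The TTL reflects the service-specific setting, or the wildcard
            // entry when no per-service TTL is configured.
            fmt.Printf("%s TTL=%ds\n", rr.Header().Name, rr.Header().Ttl)
        }
    }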
=== CONT  TestDNS_ServiceLookup_AnswerLimits
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{0_0_0_0_0_0_0_0_0_0_0}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{1_1_1_1_1_1_1_1_1_1_1}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{2_2_2_2_2_2_2_2_2_2_2}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{3_3_3_3_3_3_3_3_3_3_3}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{4_4_4_4_4_4_4_4_4_4_4}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{5_5_5_5_5_5_5_5_5_5_5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{6_6_6_6_6_6_6_5_6_6_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{7_7_7_7_6_7_7_5_7_7_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{8_8_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{9_9_8_8_6_8_8_5_8_8_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{20_20_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/A_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/A_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/AAAA_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== RUN   TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
=== PAUSE TestDNS_ServiceLookup_AnswerLimits/ANY_lookup_{30_30_8_8_6_8_8_5_8_-5_-5}
TestDNS_ServiceLookup_TTL - 2019/11/27 02:21:17.235139 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
=== CONT  TestDNS_ServiceLookup_LargeResponses
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.251115 [DEBUG] serf: messageJoinType: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:17.300235 [WARN] agent: Node name "Node cc5be898-7971-8573-8f8b-6a131f6f355a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:17.300788 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:17.300981 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:17.301330 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:17.301580 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:21:17.555627 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 975e7e65-a3c0-20b5-0590-acecff7cdae7.dc1 (Addr: tcp/127.0.0.1:11698) (DC: dc1)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.576312 [DEBUG] dns: request for name my-query.query.consul. type SRV class IN (took 8.041293ms) from client 127.0.0.1:40770 (udp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.576417 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.576489 [INFO] consul: shutting down server
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.576530 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.703862 [DEBUG] serf: messageJoinType: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.704485 [DEBUG] serf: messageJoinType: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.704581 [DEBUG] serf: messageJoinType: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:17.751042 [DEBUG] serf: messageJoinType: Node 77722f74-8595-b3ee-90e4-e60c9478590d.dc2
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.119964 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.331013 [INFO] manager: shutting down
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.331115 [INFO] manager: shutting down
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.333939 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.334203 [INFO] agent: consul server down
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.334259 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.334323 [INFO] agent: Stopping DNS server 127.0.0.1:11963 (tcp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.334541 [INFO] agent: Stopping DNS server 127.0.0.1:11963 (udp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.334752 [INFO] agent: Stopping HTTP server 127.0.0.1:11964 (tcp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.334982 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.335057 [INFO] agent: Endpoints down
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.335098 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.335144 [INFO] consul: shutting down server
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.335186 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.447394 [WARN] serf: Shutdown without a Leave
2019/11/27 02:21:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:edb206ab-5e46-0c38-e003-c29d0e26a0f7 Address:127.0.0.1:11974}]
2019/11/27 02:21:18 [INFO]  raft: Node at 127.0.0.1:11974 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.565517 [INFO] manager: shutting down
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.565568 [INFO] manager: shutting down
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.566464 [INFO] agent: consul server down
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.566530 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.566587 [INFO] agent: Stopping DNS server 127.0.0.1:11939 (tcp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.566808 [INFO] agent: Stopping DNS server 127.0.0.1:11939 (udp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.566986 [INFO] agent: Stopping HTTP server 127.0.0.1:11940 (tcp)
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.567212 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:18.567299 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_Failover (17.10s)
=== CONT  TestDNS_ServiceLookup_Truncate
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:18.580094 [INFO] serf: EventMemberJoin: Node edb206ab-5e46-0c38-e003-c29d0e26a0f7.dc1 127.0.0.1
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:18.587957 [INFO] serf: EventMemberJoin: Node edb206ab-5e46-0c38-e003-c29d0e26a0f7 127.0.0.1
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:18.588632 [INFO] consul: Handled member-join event for server "Node edb206ab-5e46-0c38-e003-c29d0e26a0f7.dc1" in area "wan"
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:18.588933 [INFO] consul: Adding LAN server Node edb206ab-5e46-0c38-e003-c29d0e26a0f7 (Addr: tcp/127.0.0.1:11974) (DC: dc1)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:18.589294 [INFO] agent: Started DNS server 127.0.0.1:11969 (udp)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:18.589469 [INFO] agent: Started DNS server 127.0.0.1:11969 (tcp)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:18.591337 [INFO] agent: Started HTTP server on 127.0.0.1:11970 (tcp)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:18.591426 [INFO] agent: started state syncer
2019/11/27 02:21:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:18 [INFO]  raft: Node at 127.0.0.1:11974 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:18.641243 [WARN] agent: Node name "Node b283eb2e-39d9-4c22-f094-158a4c34282d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:18.641626 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:18.641725 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:18.641914 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:18.642012 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
=== RUN   TestDNS_PreparedQuery_TTL/db.query.consul.
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.691566 [DEBUG] dns: request for name db.query.consul. type SRV class IN (took 821.363µs) from client 127.0.0.1:49003 (udp)
jones - 2019/11/27 02:21:18.692288 [DEBUG] consul: Skipping self join check for "Node 975e7e65-a3c0-20b5-0590-acecff7cdae7" since the cluster is too small
=== RUN   TestDNS_PreparedQuery_TTL/db-ttl.query.consul.
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.694669 [DEBUG] dns: request for name db-ttl.query.consul. type SRV class IN (took 600.355µs) from client 127.0.0.1:44980 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/dblb.query.consul.
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.696303 [DEBUG] dns: request for name dblb.query.consul. type SRV class IN (took 559.02µs) from client 127.0.0.1:37457 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/dblb-ttl.query.consul.
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.697916 [DEBUG] dns: request for name dblb-ttl.query.consul. type SRV class IN (took 597.021µs) from client 127.0.0.1:36661 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/dk.query.consul.
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.699498 [DEBUG] dns: request for name dk.query.consul. type SRV class IN (took 514.352µs) from client 127.0.0.1:59060 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/dk-ttl.query.consul.
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.700936 [DEBUG] dns: request for name dk-ttl.query.consul. type SRV class IN (took 535.02µs) from client 127.0.0.1:36236 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/api.query.consul.
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.702539 [DEBUG] dns: request for name api.query.consul. type SRV class IN (took 506.352µs) from client 127.0.0.1:37060 (udp)
=== RUN   TestDNS_PreparedQuery_TTL/api-ttl.query.consul.
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.704128 [DEBUG] dns: request for name api-ttl.query.consul. type SRV class IN (took 481.017µs) from client 127.0.0.1:55463 (udp)
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.704357 [INFO] agent: Requesting shutdown
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.704422 [INFO] consul: shutting down server
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.704469 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.808791 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.908766 [INFO] manager: shutting down
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.909656 [INFO] agent: consul server down
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.909724 [INFO] agent: shutdown complete
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.909786 [INFO] agent: Stopping DNS server 127.0.0.1:11951 (tcp)
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.909956 [INFO] agent: Stopping DNS server 127.0.0.1:11951 (udp)
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.910168 [INFO] agent: Stopping HTTP server 127.0.0.1:11952 (tcp)
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.910450 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQuery_TTL - 2019/11/27 02:21:18.910536 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQuery_TTL (11.27s)
    --- PASS: TestDNS_PreparedQuery_TTL/db.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/db-ttl.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/dblb.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/dblb-ttl.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/dk.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/dk-ttl.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/api.query.consul. (0.00s)
    --- PASS: TestDNS_PreparedQuery_TTL/api-ttl.query.consul. (0.00s)
=== CONT  TestBinarySearch
=== RUN   TestBinarySearch/binarySearch_12
2019/11/27 02:21:19 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cc5be898-7971-8573-8f8b-6a131f6f355a Address:127.0.0.1:11980}]
2019/11/27 02:21:19 [INFO]  raft: Node at 127.0.0.1:11980 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:19.043417 [INFO] serf: EventMemberJoin: Node cc5be898-7971-8573-8f8b-6a131f6f355a.dc1 127.0.0.1
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:19.049949 [INFO] serf: EventMemberJoin: Node cc5be898-7971-8573-8f8b-6a131f6f355a 127.0.0.1
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:19.051402 [INFO] agent: Started DNS server 127.0.0.1:11975 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:19.052094 [INFO] consul: Adding LAN server Node cc5be898-7971-8573-8f8b-6a131f6f355a (Addr: tcp/127.0.0.1:11980) (DC: dc1)
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:19.052364 [INFO] consul: Handled member-join event for server "Node cc5be898-7971-8573-8f8b-6a131f6f355a.dc1" in area "wan"
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:19.052937 [INFO] agent: Started DNS server 127.0.0.1:11975 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:19.056389 [INFO] agent: Started HTTP server on 127.0.0.1:11976 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:19.056477 [INFO] agent: started state syncer
=== RUN   TestBinarySearch/binarySearch_256
2019/11/27 02:21:19 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:19 [INFO]  raft: Node at 127.0.0.1:11980 [Candidate] entering Candidate state in term 2
=== RUN   TestBinarySearch/binarySearch_512
=== RUN   TestBinarySearch/binarySearch_8192
=== RUN   TestBinarySearch/binarySearch_65535
=== RUN   TestBinarySearch/binarySearch_12#01
=== RUN   TestBinarySearch/binarySearch_256#01
=== RUN   TestBinarySearch/binarySearch_512#01
=== RUN   TestBinarySearch/binarySearch_8192#01
2019/11/27 02:21:19 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:19 [INFO]  raft: Node at 127.0.0.1:11974 [Leader] entering Leader state
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:19.497420 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:19.497859 [INFO] consul: New leader elected: Node edb206ab-5e46-0c38-e003-c29d0e26a0f7
=== RUN   TestBinarySearch/binarySearch_65535#01
jones - 2019/11/27 02:21:19.570691 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 403133b7-b420-a957-4062-918c86f7ac39.dc1 (Addr: tcp/127.0.0.1:11704) (DC: dc1)
=== CONT  TestDNS_ServiceLookup_Randomize
--- PASS: TestBinarySearch (0.68s)
    --- PASS: TestBinarySearch/binarySearch_12 (0.11s)
    --- PASS: TestBinarySearch/binarySearch_256 (0.05s)
    --- PASS: TestBinarySearch/binarySearch_512 (0.05s)
    --- PASS: TestBinarySearch/binarySearch_8192 (0.05s)
    --- PASS: TestBinarySearch/binarySearch_65535 (0.08s)
    --- PASS: TestBinarySearch/binarySearch_12#01 (0.05s)
    --- PASS: TestBinarySearch/binarySearch_256#01 (0.05s)
    --- PASS: TestBinarySearch/binarySearch_512#01 (0.05s)
    --- PASS: TestBinarySearch/binarySearch_8192#01 (0.05s)
    --- PASS: TestBinarySearch/binarySearch_65535#01 (0.08s)
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:19.664976 [WARN] agent: Node name "Node 589994c0-b728-e303-5e94-c0d8ae44013a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:19.665427 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:19.665565 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:19.665809 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:19.665980 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:21:20 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:20 [INFO]  raft: Node at 127.0.0.1:11980 [Leader] entering Leader state
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:20.342318 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:20.342807 [INFO] consul: New leader elected: Node cc5be898-7971-8573-8f8b-6a131f6f355a
2019/11/27 02:21:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b283eb2e-39d9-4c22-f094-158a4c34282d Address:127.0.0.1:11986}]
2019/11/27 02:21:20 [INFO]  raft: Node at 127.0.0.1:11986 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:20.603349 [INFO] serf: EventMemberJoin: Node b283eb2e-39d9-4c22-f094-158a4c34282d.dc1 127.0.0.1
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:20.606471 [INFO] serf: EventMemberJoin: Node b283eb2e-39d9-4c22-f094-158a4c34282d 127.0.0.1
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:20.607781 [INFO] agent: Started DNS server 127.0.0.1:11981 (udp)
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:20.608258 [INFO] consul: Adding LAN server Node b283eb2e-39d9-4c22-f094-158a4c34282d (Addr: tcp/127.0.0.1:11986) (DC: dc1)
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:20.608521 [INFO] consul: Handled member-join event for server "Node b283eb2e-39d9-4c22-f094-158a4c34282d.dc1" in area "wan"
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:20.609001 [INFO] agent: Started DNS server 127.0.0.1:11981 (tcp)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:20.610046 [INFO] agent: Synced node info
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:20.611942 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:20.613202 [INFO] agent: Started HTTP server on 127.0.0.1:11982 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:20.613360 [INFO] agent: started state syncer
2019/11/27 02:21:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:20 [INFO]  raft: Node at 127.0.0.1:11986 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:21.021281 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:21.303600 [INFO] agent: Synced node info
jones - 2019/11/27 02:21:21.555332 [DEBUG] consul: Skipping self join check for "Node 403133b7-b420-a957-4062-918c86f7ac39" since the cluster is too small
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:21.981154 [DEBUG] consul: shutting down leader loop
TestDNS_PreparedQuery_Failover - 2019/11/27 02:21:21.981222 [INFO] consul: cluster leadership lost
2019/11/27 02:21:22 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:22 [INFO]  raft: Node at 127.0.0.1:11986 [Leader] entering Leader state
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:22.311262 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:22.311971 [INFO] consul: New leader elected: Node b283eb2e-39d9-4c22-f094-158a4c34282d
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:22.422383 [DEBUG] dns: request for name foo.node.consul. type ANY class IN (took 728.693µs) from client 127.0.0.1:56647 (udp)
2019/11/27 02:21:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:589994c0-b728-e303-5e94-c0d8ae44013a Address:127.0.0.1:11992}]
2019/11/27 02:21:22 [INFO]  raft: Node at 127.0.0.1:11992 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:22.605970 [INFO] serf: EventMemberJoin: Node 589994c0-b728-e303-5e94-c0d8ae44013a.dc1 127.0.0.1
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:22.610414 [INFO] serf: EventMemberJoin: Node 589994c0-b728-e303-5e94-c0d8ae44013a 127.0.0.1
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:22.611001 [INFO] consul: Adding LAN server Node 589994c0-b728-e303-5e94-c0d8ae44013a (Addr: tcp/127.0.0.1:11992) (DC: dc1)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:22.611417 [INFO] consul: Handled member-join event for server "Node 589994c0-b728-e303-5e94-c0d8ae44013a.dc1" in area "wan"
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:22.612634 [INFO] agent: Started DNS server 127.0.0.1:11987 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:22.612707 [INFO] agent: Started DNS server 127.0.0.1:11987 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:22.614638 [INFO] agent: Started HTTP server on 127.0.0.1:11988 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:22.614725 [INFO] agent: started state syncer
2019/11/27 02:21:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:22 [INFO]  raft: Node at 127.0.0.1:11992 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:22.999677 [DEBUG] dns: request for name bar.node.consul. type ANY class IN (took 461.683µs) from client 127.0.0.1:59036 (udp)
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:23.160480 [INFO] agent: Synced node info
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:23.178738 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:23.178866 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.314361 [DEBUG] dns: cname recurse RTT for www.google.com. (759.361µs)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.314901 [DEBUG] dns: request for name google.node.consul. type ANY class IN (took 2.47109ms) from client 127.0.0.1:38017 (udp)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.316845 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.317122 [INFO] consul: shutting down server
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.317258 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:23.354521 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:23.354619 [DEBUG] agent: Node info in sync
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.452793 [WARN] serf: Shutdown without a Leave
jones - 2019/11/27 02:21:23.567376 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6.dc1 (Addr: tcp/127.0.0.1:11710) (DC: dc1)
2019/11/27 02:21:23 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:23 [INFO]  raft: Node at 127.0.0.1:11992 [Leader] entering Leader state
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:23.588001 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:23.588426 [INFO] consul: New leader elected: Node 589994c0-b728-e303-5e94-c0d8ae44013a
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.591381 [WARN] consul: error getting server health from "Node edb206ab-5e46-0c38-e003-c29d0e26a0f7": rpc error making call: EOF
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.593329 [INFO] manager: shutting down
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.854503 [INFO] agent: consul server down
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.854616 [INFO] agent: shutdown complete
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.854695 [INFO] agent: Stopping DNS server 127.0.0.1:11969 (tcp)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.854906 [INFO] agent: Stopping DNS server 127.0.0.1:11969 (udp)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.855091 [INFO] agent: Stopping HTTP server 127.0.0.1:11970 (tcp)
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.855397 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.855483 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_TTL (7.18s)
=== CONT  TestDNS_ServiceLookup_OnlyPassing
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.857999 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:23.858078 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:23.938991 [WARN] agent: Node name "Node 70fdec8b-4226-1759-1f2a-87753cbfae19" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:23.939359 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:23.939420 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:23.939573 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:23.939667 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:24.242883 [INFO] agent: Synced node info
TestDNS_NodeLookup_TTL - 2019/11/27 02:21:24.586741 [WARN] consul: error getting server health from "Node edb206ab-5e46-0c38-e003-c29d0e26a0f7": context deadline exceeded
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:24.680867 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:24.680999 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:21:24.742388 [DEBUG] consul: Skipping self join check for "Node 2347fd59-3fd9-73da-16e7-50f89d3e62a6" since the cluster is too small
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:24.866082 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:24.866503 [DEBUG] consul: Skipping self join check for "Node cc5be898-7971-8573-8f8b-6a131f6f355a" since the cluster is too small
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:24.866732 [INFO] consul: member 'Node cc5be898-7971-8573-8f8b-6a131f6f355a' joined, marking health alive
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:24.868144 [DEBUG] dns: request for name _this-is-a-very-very-very-very-very-long-name-for-a-service._master.service.consul. type SRV class IN (took 1.083706ms) from client 127.0.0.1:56392 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:24.870261 [DEBUG] dns: request for name this-is-a-very-very-very-very-very-long-name-for-a-service.query.consul. type SRV class IN (took 969.369µs) from client 127.0.0.1:37917 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:24.870427 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:24.870494 [INFO] consul: shutting down server
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:24.870539 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:24.987892 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:25.130732 [INFO] manager: shutting down
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:25.134970 [ERR] consul: failed to reconcile member: {Node cc5be898-7971-8573-8f8b-6a131f6f355a 127.0.0.1 11978 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:cc5be898-7971-8573-8f8b-6a131f6f355a port:11980 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:11979] alive 1 5 2 2 5 4}: leadership lost while committing log
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:25.135302 [INFO] agent: consul server down
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:25.135363 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:25.135421 [INFO] agent: Stopping DNS server 127.0.0.1:11975 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:25.135566 [INFO] agent: Stopping DNS server 127.0.0.1:11975 (udp)
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:25.135757 [INFO] agent: Stopping HTTP server 127.0.0.1:11976 (tcp)
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:25.135998 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_LargeResponses - 2019/11/27 02:21:25.136072 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_LargeResponses (7.90s)
=== CONT  TestDNS_ServiceLookup_OnlyFailing
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:25.214900 [WARN] agent: Node name "Node 554e9f60-b8a5-62ed-5696-188e1f54ecd5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:25.215449 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:25.215594 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:25.215821 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:25.215994 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:21:25.360485 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:21:25.360579 [DEBUG] agent: Node info in sync
2019/11/27 02:21:25 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:70fdec8b-4226-1759-1f2a-87753cbfae19 Address:127.0.0.1:11998}]
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:25.437559 [INFO] serf: EventMemberJoin: Node 70fdec8b-4226-1759-1f2a-87753cbfae19.dc1 127.0.0.1
2019/11/27 02:21:25 [INFO]  raft: Node at 127.0.0.1:11998 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:25.444362 [INFO] serf: EventMemberJoin: Node 70fdec8b-4226-1759-1f2a-87753cbfae19 127.0.0.1
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:25.446376 [INFO] consul: Adding LAN server Node 70fdec8b-4226-1759-1f2a-87753cbfae19 (Addr: tcp/127.0.0.1:11998) (DC: dc1)
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:25.447469 [INFO] consul: Handled member-join event for server "Node 70fdec8b-4226-1759-1f2a-87753cbfae19.dc1" in area "wan"
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:25.449832 [INFO] agent: Started DNS server 127.0.0.1:11993 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:25.450561 [INFO] agent: Started DNS server 127.0.0.1:11993 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:25.453449 [INFO] agent: Started HTTP server on 127.0.0.1:11994 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:25.456784 [INFO] agent: started state syncer
2019/11/27 02:21:25 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:25 [INFO]  raft: Node at 127.0.0.1:11998 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:21:25.769987 [DEBUG] manager: Rebalanced 1 servers, next active server is Node ce219ef7-1c48-e006-7f93-81e0fe3f4967.dc1 (Addr: tcp/127.0.0.1:11716) (DC: dc1)
2019/11/27 02:21:26 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:26 [INFO]  raft: Node at 127.0.0.1:11998 [Leader] entering Leader state
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:26.401597 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:26.402093 [INFO] consul: New leader elected: Node 70fdec8b-4226-1759-1f2a-87753cbfae19
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:26.545960 [INFO] connect: initialized primary datacenter CA with provider "consul"
2019/11/27 02:21:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:554e9f60-b8a5-62ed-5696-188e1f54ecd5 Address:127.0.0.1:12004}]
2019/11/27 02:21:26 [INFO]  raft: Node at 127.0.0.1:12004 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:26.679048 [INFO] serf: EventMemberJoin: Node 554e9f60-b8a5-62ed-5696-188e1f54ecd5.dc1 127.0.0.1
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:26.684036 [INFO] serf: EventMemberJoin: Node 554e9f60-b8a5-62ed-5696-188e1f54ecd5 127.0.0.1
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:26.690304 [INFO] consul: Adding LAN server Node 554e9f60-b8a5-62ed-5696-188e1f54ecd5 (Addr: tcp/127.0.0.1:12004) (DC: dc1)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:26.691427 [INFO] agent: Started DNS server 127.0.0.1:11999 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:26.695449 [INFO] agent: Started DNS server 127.0.0.1:11999 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:26.691571 [INFO] consul: Handled member-join event for server "Node 554e9f60-b8a5-62ed-5696-188e1f54ecd5.dc1" in area "wan"
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:26.697551 [INFO] agent: Started HTTP server on 127.0.0.1:12000 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:26.697719 [INFO] agent: started state syncer
2019/11/27 02:21:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:26 [INFO]  raft: Node at 127.0.0.1:12004 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:26.844215 [DEBUG] consul: Skipping self join check for "Node b283eb2e-39d9-4c22-f094-158a4c34282d" since the cluster is too small
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:21:26.844401 [INFO] consul: member 'Node b283eb2e-39d9-4c22-f094-158a4c34282d' joined, marking health alive
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:27.151803 [INFO] connect: initialized primary datacenter CA with provider "consul"
jones - 2019/11/27 02:21:27.697680 [DEBUG] consul: Skipping self join check for "Node ce219ef7-1c48-e006-7f93-81e0fe3f4967" since the cluster is too small
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:28.044636 [DEBUG] consul: Skipping self join check for "Node 589994c0-b728-e303-5e94-c0d8ae44013a" since the cluster is too small
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:21:28.044947 [INFO] consul: member 'Node 589994c0-b728-e303-5e94-c0d8ae44013a' joined, marking health alive
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:28.051224 [INFO] agent: Synced node info
2019/11/27 02:21:28 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:28 [INFO]  raft: Node at 127.0.0.1:12004 [Leader] entering Leader state
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:28.328210 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:28.328671 [INFO] consul: New leader elected: Node 554e9f60-b8a5-62ed-5696-188e1f54ecd5
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:28.898199 [INFO] agent: Synced node info
jones - 2019/11/27 02:21:28.959524 [DEBUG] manager: Rebalanced 1 servers, next active server is Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22.dc1 (Addr: tcp/127.0.0.1:11722) (DC: dc1)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:29.220373 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:29.220507 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:30.285646 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:30.285780 [DEBUG] agent: Node info in sync
jones - 2019/11/27 02:21:30.668582 [DEBUG] consul: Skipping self join check for "Node 36e68f41-fcf1-fd4e-fd91-f1e92fff0f22" since the cluster is too small
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:30.849040 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 866.032µs) from client 127.0.0.1:57511 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:30.852002 [DEBUG] dns: request for name e47e62c8-678d-e4ac-1828-0eb75acc42a0.query.consul. type ANY class IN (took 953.368µs) from client 127.0.0.1:53545 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:30.852094 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:30.852182 [INFO] consul: shutting down server
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:30.852232 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:30.942032 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.041484 [INFO] manager: shutting down
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.042532 [INFO] agent: consul server down
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.042589 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.042640 [INFO] agent: Stopping DNS server 127.0.0.1:11993 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.042808 [INFO] agent: Stopping DNS server 127.0.0.1:11993 (udp)
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.042998 [INFO] agent: Stopping HTTP server 127.0.0.1:11994 (tcp)
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.043212 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.043304 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_OnlyPassing (7.19s)
=== CONT  TestDNS_ServiceLookup_FilterCritical
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.051946 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.054636 [ERR] consul: failed to get raft configuration: raft is already shutdown
TestDNS_ServiceLookup_OnlyPassing - 2019/11/27 02:21:31.055083 [ERR] consul: failed to reconcile member: {Node 70fdec8b-4226-1759-1f2a-87753cbfae19 127.0.0.1 11996 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:70fdec8b-4226-1759-1f2a-87753cbfae19 port:11998 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:11997] alive 1 5 2 2 5 4}: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:31.211231 [WARN] agent: Node name "Node 318db47b-c7ca-1b57-900c-50d092149e76" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:31.211857 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:31.211936 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:31.212306 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:31.212543 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.343980 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 729.693µs) from client 127.0.0.1:38315 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.346010 [DEBUG] dns: request for name 7168b484-7cf7-441d-e424-09f977528e25.query.consul. type ANY class IN (took 1.100707ms) from client 127.0.0.1:35668 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.346055 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.346145 [INFO] consul: shutting down server
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.346192 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.508071 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.663585 [INFO] manager: shutting down
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.664622 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.664876 [INFO] agent: consul server down
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.664929 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.664982 [INFO] agent: Stopping DNS server 127.0.0.1:11999 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.665135 [INFO] agent: Stopping DNS server 127.0.0.1:11999 (udp)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.665484 [INFO] agent: Stopping HTTP server 127.0.0.1:12000 (tcp)
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.665756 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_OnlyFailing - 2019/11/27 02:21:31.665829 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_OnlyFailing (6.53s)
=== CONT  TestDNS_RecursorTimeout
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_RecursorTimeout - 2019/11/27 02:21:31.778839 [WARN] agent: Node name "Node 85aa58c8-1613-8ca9-6af9-a83adc6fe96a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_RecursorTimeout - 2019/11/27 02:21:31.779254 [DEBUG] tlsutil: Update with version 1
TestDNS_RecursorTimeout - 2019/11/27 02:21:31.779324 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_RecursorTimeout - 2019/11/27 02:21:31.779499 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_RecursorTimeout - 2019/11/27 02:21:31.779611 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:21:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:318db47b-c7ca-1b57-900c-50d092149e76 Address:127.0.0.1:12010}]
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:33.212341 [INFO] serf: EventMemberJoin: Node 318db47b-c7ca-1b57-900c-50d092149e76.dc1 127.0.0.1
2019/11/27 02:21:33 [INFO]  raft: Node at 127.0.0.1:12010 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:33.215694 [INFO] serf: EventMemberJoin: Node 318db47b-c7ca-1b57-900c-50d092149e76 127.0.0.1
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:33.216491 [INFO] consul: Handled member-join event for server "Node 318db47b-c7ca-1b57-900c-50d092149e76.dc1" in area "wan"
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:33.216868 [INFO] consul: Adding LAN server Node 318db47b-c7ca-1b57-900c-50d092149e76 (Addr: tcp/127.0.0.1:12010) (DC: dc1)
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:33.220944 [INFO] agent: Started DNS server 127.0.0.1:12005 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:33.221036 [INFO] agent: Started DNS server 127.0.0.1:12005 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:33.223237 [INFO] agent: Started HTTP server on 127.0.0.1:12006 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:33.223336 [INFO] agent: started state syncer
2019/11/27 02:21:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:33 [INFO]  raft: Node at 127.0.0.1:12010 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:33 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:85aa58c8-1613-8ca9-6af9-a83adc6fe96a Address:127.0.0.1:12016}]
2019/11/27 02:21:33 [INFO]  raft: Node at 127.0.0.1:12016 [Follower] entering Follower state (Leader: "")
TestDNS_RecursorTimeout - 2019/11/27 02:21:33.670310 [INFO] serf: EventMemberJoin: Node 85aa58c8-1613-8ca9-6af9-a83adc6fe96a.dc1 127.0.0.1
TestDNS_RecursorTimeout - 2019/11/27 02:21:33.673641 [INFO] serf: EventMemberJoin: Node 85aa58c8-1613-8ca9-6af9-a83adc6fe96a 127.0.0.1
TestDNS_RecursorTimeout - 2019/11/27 02:21:33.674428 [INFO] consul: Handled member-join event for server "Node 85aa58c8-1613-8ca9-6af9-a83adc6fe96a.dc1" in area "wan"
TestDNS_RecursorTimeout - 2019/11/27 02:21:33.674744 [INFO] consul: Adding LAN server Node 85aa58c8-1613-8ca9-6af9-a83adc6fe96a (Addr: tcp/127.0.0.1:12016) (DC: dc1)
TestDNS_RecursorTimeout - 2019/11/27 02:21:33.674970 [INFO] agent: Started DNS server 127.0.0.1:12011 (udp)
TestDNS_RecursorTimeout - 2019/11/27 02:21:33.675329 [INFO] agent: Started DNS server 127.0.0.1:12011 (tcp)
TestDNS_RecursorTimeout - 2019/11/27 02:21:33.677390 [INFO] agent: Started HTTP server on 127.0.0.1:12012 (tcp)
TestDNS_RecursorTimeout - 2019/11/27 02:21:33.677489 [INFO] agent: started state syncer
2019/11/27 02:21:33 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:33 [INFO]  raft: Node at 127.0.0.1:12016 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:34 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:34 [INFO]  raft: Node at 127.0.0.1:12010 [Leader] entering Leader state
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:34.108912 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:34.109501 [INFO] consul: New leader elected: Node 318db47b-c7ca-1b57-900c-50d092149e76
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:34.610294 [INFO] agent: Synced node info
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:34.610432 [DEBUG] agent: Node info in sync
2019/11/27 02:21:34 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:34 [INFO]  raft: Node at 127.0.0.1:12016 [Leader] entering Leader state
TestDNS_RecursorTimeout - 2019/11/27 02:21:34.613091 [INFO] consul: cluster leadership acquired
TestDNS_RecursorTimeout - 2019/11/27 02:21:34.613642 [INFO] consul: New leader elected: Node 85aa58c8-1613-8ca9-6af9-a83adc6fe96a
TestDNS_RecursorTimeout - 2019/11/27 02:21:35.179618 [INFO] agent: Synced node info
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:37.538386 [DEBUG] agent: Node info in sync
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.048930 [DEBUG] agent: Node info in sync
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.049045 [DEBUG] agent: Node info in sync
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.191780 [ERR] dns: recurse failed: read udp 127.0.0.1:33051->127.0.0.1:58723: i/o timeout
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.191920 [ERR] dns: all resolvers failed for {apple.com. 255 1} from client 127.0.0.1:46581 (udp)
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.192251 [DEBUG] dns: request for {apple.com. 255 1} (udp) (3.001116161s) from client 127.0.0.1:46581 (udp)
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.192338 [INFO] agent: Requesting shutdown
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.192402 [INFO] consul: shutting down server
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.192449 [WARN] serf: Shutdown without a Leave
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.322378 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.322866 [DEBUG] consul: Skipping self join check for "Node 85aa58c8-1613-8ca9-6af9-a83adc6fe96a" since the cluster is too small
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.323079 [INFO] consul: member 'Node 85aa58c8-1613-8ca9-6af9-a83adc6fe96a' joined, marking health alive
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.440906 [WARN] serf: Shutdown without a Leave
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.522838 [INFO] manager: shutting down
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.678056 [ERR] consul: failed to reconcile member: {Node 85aa58c8-1613-8ca9-6af9-a83adc6fe96a 127.0.0.1 12014 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:85aa58c8-1613-8ca9-6af9-a83adc6fe96a port:12016 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:12015] alive 1 5 2 2 5 4}: leadership lost while committing log
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.678380 [INFO] agent: consul server down
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.678435 [INFO] agent: shutdown complete
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.678491 [INFO] agent: Stopping DNS server 127.0.0.1:12011 (tcp)
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.678665 [INFO] agent: Stopping DNS server 127.0.0.1:12011 (udp)
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.678825 [INFO] agent: Stopping HTTP server 127.0.0.1:12012 (tcp)
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.679067 [INFO] agent: Waiting for endpoints to shut down
TestDNS_RecursorTimeout - 2019/11/27 02:21:38.679147 [INFO] agent: Endpoints down
--- PASS: TestDNS_RecursorTimeout (7.01s)
=== CONT  TestDNS_Recurse_Truncation
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Recurse_Truncation - 2019/11/27 02:21:38.908557 [WARN] agent: Node name "Node 23e19d96-ab5d-d140-22a1-1b10b1ca8078" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Recurse_Truncation - 2019/11/27 02:21:38.909093 [DEBUG] tlsutil: Update with version 1
TestDNS_Recurse_Truncation - 2019/11/27 02:21:38.909172 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Recurse_Truncation - 2019/11/27 02:21:38.909367 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_Recurse_Truncation - 2019/11/27 02:21:38.909484 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.220962 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.221420 [DEBUG] consul: Skipping self join check for "Node 318db47b-c7ca-1b57-900c-50d092149e76" since the cluster is too small
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.221584 [INFO] consul: member 'Node 318db47b-c7ca-1b57-900c-50d092149e76' joined, marking health alive
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.224882 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 879.032µs) from client 127.0.0.1:40737 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.227013 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.227095 [INFO] consul: shutting down server
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.227141 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.234715 [DEBUG] dns: request for name 5b285771-a500-9973-5df2-ed074b9a136c.query.consul. type ANY class IN (took 813.696µs) from client 127.0.0.1:44875 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.341354 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.499015 [INFO] manager: shutting down
2019/11/27 02:21:39 [ERR] yamux: Failed to read stream data: read tcp 127.0.0.1:55485->127.0.0.1:12010: use of closed network connection
2019/11/27 02:21:39 [WARN] yamux: failed to send go away: session shutdown
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.499977 [INFO] agent: consul server down
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.500014 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.500059 [INFO] agent: Stopping DNS server 127.0.0.1:12005 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.500206 [INFO] agent: Stopping DNS server 127.0.0.1:12005 (udp)
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.500360 [INFO] agent: Stopping HTTP server 127.0.0.1:12006 (tcp)
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.500552 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.500627 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_FilterCritical (8.46s)
=== CONT  TestDNS_Recurse
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.503092 [WARN] consul: error getting server health from "Node 318db47b-c7ca-1b57-900c-50d092149e76": rpc error making call: EOF
2019/11/27 02:21:39 [ERR] yamux: Failed to write header: write tcp 127.0.0.1:12010->127.0.0.1:55485: write: broken pipe
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:39.504153 [ERR] consul.rpc: multiplex conn accept failed: write tcp 127.0.0.1:12010->127.0.0.1:55485: write: broken pipe from=127.0.0.1:55485
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_Recurse - 2019/11/27 02:21:39.566006 [WARN] agent: Node name "Node 1ae0bc5b-f7dd-d80c-16ac-30184f57fe74" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_Recurse - 2019/11/27 02:21:39.566429 [DEBUG] tlsutil: Update with version 1
TestDNS_Recurse - 2019/11/27 02:21:39.566612 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Recurse - 2019/11/27 02:21:39.566856 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_Recurse - 2019/11/27 02:21:39.566978 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_FilterCritical - 2019/11/27 02:21:40.222909 [WARN] consul: error getting server health from "Node 318db47b-c7ca-1b57-900c-50d092149e76": context deadline exceeded
2019/11/27 02:21:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:23e19d96-ab5d-d140-22a1-1b10b1ca8078 Address:127.0.0.1:12022}]
TestDNS_Recurse_Truncation - 2019/11/27 02:21:40.395925 [INFO] serf: EventMemberJoin: Node 23e19d96-ab5d-d140-22a1-1b10b1ca8078.dc1 127.0.0.1
2019/11/27 02:21:40 [INFO]  raft: Node at 127.0.0.1:12022 [Follower] entering Follower state (Leader: "")
TestDNS_Recurse_Truncation - 2019/11/27 02:21:40.399708 [INFO] serf: EventMemberJoin: Node 23e19d96-ab5d-d140-22a1-1b10b1ca8078 127.0.0.1
TestDNS_Recurse_Truncation - 2019/11/27 02:21:40.401349 [INFO] agent: Started DNS server 127.0.0.1:12017 (udp)
TestDNS_Recurse_Truncation - 2019/11/27 02:21:40.401963 [INFO] consul: Handled member-join event for server "Node 23e19d96-ab5d-d140-22a1-1b10b1ca8078.dc1" in area "wan"
TestDNS_Recurse_Truncation - 2019/11/27 02:21:40.402416 [INFO] consul: Adding LAN server Node 23e19d96-ab5d-d140-22a1-1b10b1ca8078 (Addr: tcp/127.0.0.1:12022) (DC: dc1)
TestDNS_Recurse_Truncation - 2019/11/27 02:21:40.403755 [INFO] agent: Started DNS server 127.0.0.1:12017 (tcp)
TestDNS_Recurse_Truncation - 2019/11/27 02:21:40.411506 [INFO] agent: Started HTTP server on 127.0.0.1:12018 (tcp)
TestDNS_Recurse_Truncation - 2019/11/27 02:21:40.412097 [INFO] agent: started state syncer
2019/11/27 02:21:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:40 [INFO]  raft: Node at 127.0.0.1:12022 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1ae0bc5b-f7dd-d80c-16ac-30184f57fe74 Address:127.0.0.1:12028}]
2019/11/27 02:21:40 [INFO]  raft: Node at 127.0.0.1:12028 [Follower] entering Follower state (Leader: "")
TestDNS_Recurse - 2019/11/27 02:21:40.945099 [INFO] serf: EventMemberJoin: Node 1ae0bc5b-f7dd-d80c-16ac-30184f57fe74.dc1 127.0.0.1
TestDNS_Recurse - 2019/11/27 02:21:40.950578 [INFO] serf: EventMemberJoin: Node 1ae0bc5b-f7dd-d80c-16ac-30184f57fe74 127.0.0.1
TestDNS_Recurse - 2019/11/27 02:21:40.951622 [INFO] consul: Adding LAN server Node 1ae0bc5b-f7dd-d80c-16ac-30184f57fe74 (Addr: tcp/127.0.0.1:12028) (DC: dc1)
TestDNS_Recurse - 2019/11/27 02:21:40.952341 [INFO] consul: Handled member-join event for server "Node 1ae0bc5b-f7dd-d80c-16ac-30184f57fe74.dc1" in area "wan"
TestDNS_Recurse - 2019/11/27 02:21:40.952350 [INFO] agent: Started DNS server 127.0.0.1:12023 (udp)
TestDNS_Recurse - 2019/11/27 02:21:40.952868 [INFO] agent: Started DNS server 127.0.0.1:12023 (tcp)
TestDNS_Recurse - 2019/11/27 02:21:40.954947 [INFO] agent: Started HTTP server on 127.0.0.1:12024 (tcp)
TestDNS_Recurse - 2019/11/27 02:21:40.955044 [INFO] agent: started state syncer
2019/11/27 02:21:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:41 [INFO]  raft: Node at 127.0.0.1:12028 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:41 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:41 [INFO]  raft: Node at 127.0.0.1:12022 [Leader] entering Leader state
TestDNS_Recurse_Truncation - 2019/11/27 02:21:41.401198 [INFO] consul: cluster leadership acquired
TestDNS_Recurse_Truncation - 2019/11/27 02:21:41.401754 [INFO] consul: New leader elected: Node 23e19d96-ab5d-d140-22a1-1b10b1ca8078
2019/11/27 02:21:41 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:41 [INFO]  raft: Node at 127.0.0.1:12028 [Leader] entering Leader state
TestDNS_Recurse - 2019/11/27 02:21:41.786953 [INFO] consul: cluster leadership acquired
TestDNS_Recurse - 2019/11/27 02:21:41.787469 [INFO] consul: New leader elected: Node 1ae0bc5b-f7dd-d80c-16ac-30184f57fe74
TestDNS_Recurse_Truncation - 2019/11/27 02:21:41.941755 [INFO] agent: Synced node info
TestDNS_Recurse_Truncation - 2019/11/27 02:21:41.941886 [DEBUG] agent: Node info in sync
TestDNS_Recurse_Truncation - 2019/11/27 02:21:41.964903 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (2.120744ms) Recursor queried: 127.0.0.1:45073
TestDNS_Recurse_Truncation - 2019/11/27 02:21:41.965154 [DEBUG] dns: request for {apple.com. 255 1} (udp) (2.837436ms) from client 127.0.0.1:54796 (udp)
TestDNS_Recurse_Truncation - 2019/11/27 02:21:41.967264 [INFO] agent: Requesting shutdown
TestDNS_Recurse_Truncation - 2019/11/27 02:21:41.967371 [INFO] consul: shutting down server
TestDNS_Recurse_Truncation - 2019/11/27 02:21:41.967427 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.388940 [DEBUG] agent: Node info in sync
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.674067 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.785194 [INFO] manager: shutting down
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.785389 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.785702 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.785733 [INFO] agent: consul server down
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.785865 [INFO] agent: shutdown complete
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.785914 [INFO] agent: Stopping DNS server 127.0.0.1:12017 (tcp)
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.786068 [INFO] agent: Stopping DNS server 127.0.0.1:12017 (udp)
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.786269 [INFO] agent: Stopping HTTP server 127.0.0.1:12018 (tcp)
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.786480 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Recurse_Truncation - 2019/11/27 02:21:42.786555 [INFO] agent: Endpoints down
--- PASS: TestDNS_Recurse_Truncation (4.11s)
=== CONT  TestDNS_ServiceLookup_Dedup_SRV
TestDNS_Recurse - 2019/11/27 02:21:42.789488 [INFO] agent: Synced node info
TestDNS_Recurse - 2019/11/27 02:21:42.789644 [DEBUG] agent: Node info in sync
TestDNS_Recurse - 2019/11/27 02:21:42.801626 [DEBUG] dns: recurse RTT for {apple.com. 255 1} (666.024µs) Recursor queried: 127.0.0.1:45897
TestDNS_Recurse - 2019/11/27 02:21:42.801932 [DEBUG] dns: request for {apple.com. 255 1} (udp) (1.363716ms) from client 127.0.0.1:50775 (udp)
TestDNS_Recurse - 2019/11/27 02:21:42.802107 [INFO] agent: Requesting shutdown
TestDNS_Recurse - 2019/11/27 02:21:42.802181 [INFO] consul: shutting down server
TestDNS_Recurse - 2019/11/27 02:21:42.802240 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:42.935107 [WARN] agent: Node name "Node 1652489f-5ee9-611d-8555-7c6cda4bcb0b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:42.936636 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:42.937120 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:42.937574 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:42.939860 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_Recurse - 2019/11/27 02:21:42.941457 [WARN] serf: Shutdown without a Leave
TestDNS_Recurse - 2019/11/27 02:21:43.082808 [INFO] manager: shutting down
TestDNS_Recurse - 2019/11/27 02:21:43.396550 [INFO] agent: consul server down
TestDNS_Recurse - 2019/11/27 02:21:43.396646 [INFO] agent: shutdown complete
TestDNS_Recurse - 2019/11/27 02:21:43.396781 [INFO] agent: Stopping DNS server 127.0.0.1:12023 (tcp)
TestDNS_Recurse - 2019/11/27 02:21:43.396938 [INFO] agent: Stopping DNS server 127.0.0.1:12023 (udp)
TestDNS_Recurse - 2019/11/27 02:21:43.397097 [INFO] agent: Stopping HTTP server 127.0.0.1:12024 (tcp)
TestDNS_Recurse - 2019/11/27 02:21:43.397296 [INFO] agent: Waiting for endpoints to shut down
TestDNS_Recurse - 2019/11/27 02:21:43.397358 [INFO] agent: Endpoints down
--- PASS: TestDNS_Recurse (3.90s)
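For reference, the behaviour exercised by TestDNS_Recurse above, a query for an external name (apple.com) that the agent forwards to its configured upstream recursor, can be reproduced against a running agent with a plain DNS client. A minimal sketch, assuming the github.com/miekg/dns library (which Consul's own DNS server builds on) and the agent DNS address 127.0.0.1:12023 taken from the log; both are illustrative and not part of this build:

package main

import (
	"fmt"
	"log"

	"github.com/miekg/dns"
)

func main() {
	// Ask the agent's DNS endpoint for an external name, so the agent has to
	// forward the query to its configured recursor (the "recurse RTT" lines above).
	m := new(dns.Msg)
	m.SetQuestion(dns.Fqdn("apple.com"), dns.TypeANY)

	c := new(dns.Client)
	in, rtt, err := c.Exchange(m, "127.0.0.1:12023") // address from the log; adjust as needed
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("recursed in %v, %d answers\n", rtt, len(in.Answer))
}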
=== CONT  TestDNS_ServiceLookup_Dedup
TestDNS_Recurse - 2019/11/27 02:21:43.404358 [ERR] consul: failed to establish leadership: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:43.461039 [WARN] agent: Node name "Node 5d4fea63-8d66-195c-e407-7abe0656c4d2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:43.461591 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:43.461662 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:43.461905 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:43.462023 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:21:43.675048 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:21:43.675140 [DEBUG] agent: Node info in sync
2019/11/27 02:21:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1652489f-5ee9-611d-8555-7c6cda4bcb0b Address:127.0.0.1:12034}]
2019/11/27 02:21:44 [INFO]  raft: Node at 127.0.0.1:12034 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:44.269233 [INFO] serf: EventMemberJoin: Node 1652489f-5ee9-611d-8555-7c6cda4bcb0b.dc1 127.0.0.1
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:44.273832 [INFO] serf: EventMemberJoin: Node 1652489f-5ee9-611d-8555-7c6cda4bcb0b 127.0.0.1
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:44.275013 [INFO] consul: Adding LAN server Node 1652489f-5ee9-611d-8555-7c6cda4bcb0b (Addr: tcp/127.0.0.1:12034) (DC: dc1)
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:44.275729 [INFO] consul: Handled member-join event for server "Node 1652489f-5ee9-611d-8555-7c6cda4bcb0b.dc1" in area "wan"
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:44.277728 [INFO] agent: Started DNS server 127.0.0.1:12029 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:44.278210 [INFO] agent: Started DNS server 127.0.0.1:12029 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:44.288415 [INFO] agent: Started HTTP server on 127.0.0.1:12030 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:44.288835 [INFO] agent: started state syncer
2019/11/27 02:21:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:44 [INFO]  raft: Node at 127.0.0.1:12034 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5d4fea63-8d66-195c-e407-7abe0656c4d2 Address:127.0.0.1:12040}]
2019/11/27 02:21:44 [INFO]  raft: Node at 127.0.0.1:12040 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:44.613189 [INFO] serf: EventMemberJoin: Node 5d4fea63-8d66-195c-e407-7abe0656c4d2.dc1 127.0.0.1
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:44.643233 [INFO] serf: EventMemberJoin: Node 5d4fea63-8d66-195c-e407-7abe0656c4d2 127.0.0.1
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:44.645062 [INFO] consul: Adding LAN server Node 5d4fea63-8d66-195c-e407-7abe0656c4d2 (Addr: tcp/127.0.0.1:12040) (DC: dc1)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:44.653676 [INFO] consul: Handled member-join event for server "Node 5d4fea63-8d66-195c-e407-7abe0656c4d2.dc1" in area "wan"
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:44.659850 [INFO] agent: Started DNS server 127.0.0.1:12035 (udp)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:44.670587 [INFO] agent: Started DNS server 127.0.0.1:12035 (tcp)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:44.676649 [INFO] agent: Started HTTP server on 127.0.0.1:12036 (tcp)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:44.682188 [INFO] agent: started state syncer
2019/11/27 02:21:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:44 [INFO]  raft: Node at 127.0.0.1:12040 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:45 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:45 [INFO]  raft: Node at 127.0.0.1:12034 [Leader] entering Leader state
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:45.275259 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:45.275801 [INFO] consul: New leader elected: Node 1652489f-5ee9-611d-8555-7c6cda4bcb0b
2019/11/27 02:21:45 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:45 [INFO]  raft: Node at 127.0.0.1:12040 [Leader] entering Leader state
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:45.531649 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:45.532122 [INFO] consul: New leader elected: Node 5d4fea63-8d66-195c-e407-7abe0656c4d2
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:45.742765 [INFO] agent: Synced node info
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:46.130245 [INFO] agent: Synced node info
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:46.130371 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:46.195196 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:46.195340 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:47.072849 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.431906 [DEBUG] dns: request for name db.service.consul. type ANY class IN (took 716.359µs) from client 127.0.0.1:32833 (udp)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.433744 [DEBUG] dns: request for name 8501b220-e98a-9d86-d98d-45f844450c7f.query.consul. type ANY class IN (took 697.358µs) from client 127.0.0.1:55794 (udp)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.433980 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.434047 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.434090 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.519017 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.523020 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 757.361µs) from client 127.0.0.1:46258 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.524662 [DEBUG] dns: request for name 01a62d06-f238-b692-6469-1025b4e0bf3f.query.consul. type SRV class IN (took 776.028µs) from client 127.0.0.1:35511 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.524944 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.525011 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.525057 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.665040 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.667141 [INFO] manager: shutting down
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.884946 [INFO] manager: shutting down
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.984742 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.984842 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.985014 [INFO] agent: consul server down
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.985068 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.985119 [INFO] agent: Stopping DNS server 127.0.0.1:12035 (tcp)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.985272 [INFO] agent: Stopping DNS server 127.0.0.1:12035 (udp)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.985437 [INFO] agent: Stopping HTTP server 127.0.0.1:12036 (tcp)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.985639 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.985710 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Dedup (5.59s)
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.985805 [ERR] connect: Apply failed leadership lost while committing log
=== CONT  TestDNS_ServiceLookup_PreparedQueryNamePeriod
TestDNS_ServiceLookup_Dedup - 2019/11/27 02:21:48.985849 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.986152 [INFO] agent: consul server down
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.986202 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.986258 [INFO] agent: Stopping DNS server 127.0.0.1:12029 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.986392 [INFO] agent: Stopping DNS server 127.0.0.1:12029 (udp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.986544 [INFO] agent: Stopping HTTP server 127.0.0.1:12030 (tcp)
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.986816 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Dedup_SRV - 2019/11/27 02:21:48.986897 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Dedup_SRV (6.20s)
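TestDNS_ServiceLookup_Dedup and TestDNS_ServiceLookup_Dedup_SRV above register several instances of the same service on one node and assert that the SRV answer is collapsed to a single record per target and port. A rough equivalent check against a live agent, again a sketch assuming the miekg/dns client and the DNS address 127.0.0.1:12029 seen in the log:

package main

import (
	"fmt"
	"log"

	"github.com/miekg/dns"
)

func main() {
	m := new(dns.Msg)
	m.SetQuestion("db.service.consul.", dns.TypeSRV)

	c := new(dns.Client)
	in, _, err := c.Exchange(m, "127.0.0.1:12029") // illustrative address from the log
	if err != nil {
		log.Fatal(err)
	}

	// Count distinct target:port pairs; duplicate registrations of the same
	// instance should show up as one SRV record, not several.
	seen := map[string]bool{}
	for _, rr := range in.Answer {
		if srv, ok := rr.(*dns.SRV); ok {
			seen[fmt.Sprintf("%s:%d", srv.Target, srv.Port)] = true
		}
	}
	fmt.Printf("%d SRV answers, %d distinct target:port pairs\n", len(in.Answer), len(seen))
}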
=== CONT  TestDNS_PreparedQueryNearIP
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:49.052414 [WARN] agent: Node name "Node c56ac6e2-d377-c192-dde8-1fbff6730970" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:49.053110 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:49.053338 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:49.053727 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:49.054024 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:49.078273 [WARN] agent: Node name "Node 6a0a47ea-d980-3757-8b6f-1e92321a7dac" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:49.078708 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:49.078774 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:49.078935 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:49.079041 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:21:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6a0a47ea-d980-3757-8b6f-1e92321a7dac Address:127.0.0.1:12052}]
2019/11/27 02:21:50 [INFO]  raft: Node at 127.0.0.1:12052 [Follower] entering Follower state (Leader: "")
2019/11/27 02:21:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c56ac6e2-d377-c192-dde8-1fbff6730970 Address:127.0.0.1:12046}]
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:50.622592 [INFO] serf: EventMemberJoin: Node c56ac6e2-d377-c192-dde8-1fbff6730970.dc1 127.0.0.1
2019/11/27 02:21:50 [INFO]  raft: Node at 127.0.0.1:12046 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:50.627847 [INFO] serf: EventMemberJoin: Node 6a0a47ea-d980-3757-8b6f-1e92321a7dac.dc1 127.0.0.1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:50.628050 [INFO] serf: EventMemberJoin: Node c56ac6e2-d377-c192-dde8-1fbff6730970 127.0.0.1
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:50.629619 [INFO] agent: Started DNS server 127.0.0.1:12041 (udp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:50.630092 [INFO] consul: Adding LAN server Node c56ac6e2-d377-c192-dde8-1fbff6730970 (Addr: tcp/127.0.0.1:12046) (DC: dc1)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:50.630155 [INFO] consul: Handled member-join event for server "Node c56ac6e2-d377-c192-dde8-1fbff6730970.dc1" in area "wan"
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:50.630515 [INFO] agent: Started DNS server 127.0.0.1:12041 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:50.635990 [INFO] agent: Started HTTP server on 127.0.0.1:12042 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:50.636110 [INFO] agent: started state syncer
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:50.639953 [INFO] serf: EventMemberJoin: Node 6a0a47ea-d980-3757-8b6f-1e92321a7dac 127.0.0.1
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:50.642357 [INFO] consul: Adding LAN server Node 6a0a47ea-d980-3757-8b6f-1e92321a7dac (Addr: tcp/127.0.0.1:12052) (DC: dc1)
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:50.643527 [INFO] consul: Handled member-join event for server "Node 6a0a47ea-d980-3757-8b6f-1e92321a7dac.dc1" in area "wan"
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:50.648463 [INFO] agent: Started DNS server 127.0.0.1:12047 (tcp)
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:50.649008 [INFO] agent: Started DNS server 127.0.0.1:12047 (udp)
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:50.651340 [INFO] agent: Started HTTP server on 127.0.0.1:12048 (tcp)
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:50.651507 [INFO] agent: started state syncer
2019/11/27 02:21:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:50 [INFO]  raft: Node at 127.0.0.1:12052 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:50 [INFO]  raft: Node at 127.0.0.1:12046 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:51 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:51 [INFO]  raft: Node at 127.0.0.1:12052 [Leader] entering Leader state
2019/11/27 02:21:51 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:51 [INFO]  raft: Node at 127.0.0.1:12046 [Leader] entering Leader state
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:51.390094 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:51.390566 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:51.390931 [INFO] consul: New leader elected: Node 6a0a47ea-d980-3757-8b6f-1e92321a7dac
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:51.391152 [INFO] consul: New leader elected: Node c56ac6e2-d377-c192-dde8-1fbff6730970
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:52.141248 [INFO] agent: Synced node info
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:52.141365 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:52.243706 [INFO] agent: Synced node info
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:52.243854 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:53.112037 [DEBUG] agent: Node info in sync
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:55.062996 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:55.901364 [DEBUG] dns: request for name some.query.we.like.query.consul. type SRV class IN (took 729.36µs) from client 127.0.0.1:38763 (udp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:55.901667 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:55.901804 [INFO] consul: shutting down server
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:55.901856 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.111945 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:56.182679 [WARN] consul: error getting server health from "Node 6a0a47ea-d980-3757-8b6f-1e92321a7dac": context deadline exceeded
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.576311 [INFO] manager: shutting down
Added 3 service nodes
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.582459 [INFO] agent: consul server down
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.582543 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.582626 [INFO] agent: Stopping DNS server 127.0.0.1:12041 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.582835 [INFO] agent: Stopping DNS server 127.0.0.1:12041 (udp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.583052 [INFO] agent: Stopping HTTP server 127.0.0.1:12042 (tcp)
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.583301 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.583391 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_PreparedQueryNamePeriod (7.60s)
=== CONT  TestDNS_PreparedQueryNearIPEDNS
TestDNS_ServiceLookup_PreparedQueryNamePeriod - 2019/11/27 02:21:56.585520 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:56.742472 [WARN] agent: Node name "Node 2220cb2e-7917-b120-91cb-d53f308171d7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:56.743315 [DEBUG] tlsutil: Update with version 1
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:56.743561 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:56.743994 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:56.744323 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:21:58.031215 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:21:58.031318 [DEBUG] agent: Service "api-proxy-sidecar" in sync
jones - 2019/11/27 02:21:58.031361 [DEBUG] agent: Node info in sync
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:58.987368 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 971.035µs) from client 127.0.0.1:51919 (udp)
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:58.988042 [INFO] agent: Requesting shutdown
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:58.988122 [INFO] consul: shutting down server
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:58.988171 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.084073 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.173103 [INFO] manager: shutting down
2019/11/27 02:21:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2220cb2e-7917-b120-91cb-d53f308171d7 Address:127.0.0.1:12058}]
2019/11/27 02:21:59 [INFO]  raft: Node at 127.0.0.1:12058 [Follower] entering Follower state (Leader: "")
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.177250 [INFO] serf: EventMemberJoin: Node 2220cb2e-7917-b120-91cb-d53f308171d7.dc1 127.0.0.1
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.179255 [INFO] agent: consul server down
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.179323 [INFO] agent: shutdown complete
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.179375 [INFO] agent: Stopping DNS server 127.0.0.1:12047 (tcp)
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.179504 [INFO] agent: Stopping DNS server 127.0.0.1:12047 (udp)
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.179641 [INFO] agent: Stopping HTTP server 127.0.0.1:12048 (tcp)
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.179825 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.179892 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQueryNearIP (10.19s)
=== CONT  TestDNS_ServiceLookup_TagPeriod
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.180584 [INFO] serf: EventMemberJoin: Node 2220cb2e-7917-b120-91cb-d53f308171d7 127.0.0.1
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.181648 [INFO] agent: Started DNS server 127.0.0.1:12053 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.182039 [INFO] consul: Handled member-join event for server "Node 2220cb2e-7917-b120-91cb-d53f308171d7.dc1" in area "wan"
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.182225 [INFO] consul: Adding LAN server Node 2220cb2e-7917-b120-91cb-d53f308171d7 (Addr: tcp/127.0.0.1:12058) (DC: dc1)
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.182513 [INFO] agent: Started DNS server 127.0.0.1:12053 (tcp)
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.182940 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.182990 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.183154 [WARN] consul: error getting server health from "Node 6a0a47ea-d980-3757-8b6f-1e92321a7dac": rpc error making call: EOF
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.192652 [INFO] agent: Started HTTP server on 127.0.0.1:12054 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.192755 [INFO] agent: started state syncer
2019/11/27 02:21:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:59 [INFO]  raft: Node at 127.0.0.1:12058 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:21:59.323204 [WARN] agent: Node name "Node b75e3479-f4a5-947e-d56b-10cc97aafa5e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:21:59.323600 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:21:59.323670 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:21:59.323819 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:21:59.323923 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:21:59 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:59 [INFO]  raft: Node at 127.0.0.1:12058 [Leader] entering Leader state
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.910463 [INFO] consul: cluster leadership acquired
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:21:59.910881 [INFO] consul: New leader elected: Node 2220cb2e-7917-b120-91cb-d53f308171d7
TestDNS_PreparedQueryNearIP - 2019/11/27 02:21:59.976751 [WARN] consul: error getting server health from "Node 6a0a47ea-d980-3757-8b6f-1e92321a7dac": context deadline exceeded
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:00.607324 [INFO] agent: Synced node info
2019/11/27 02:22:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b75e3479-f4a5-947e-d56b-10cc97aafa5e Address:127.0.0.1:12064}]
2019/11/27 02:22:00 [INFO]  raft: Node at 127.0.0.1:12064 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:00.909898 [INFO] serf: EventMemberJoin: Node b75e3479-f4a5-947e-d56b-10cc97aafa5e.dc1 127.0.0.1
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:00.915031 [INFO] serf: EventMemberJoin: Node b75e3479-f4a5-947e-d56b-10cc97aafa5e 127.0.0.1
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:00.915667 [INFO] consul: Adding LAN server Node b75e3479-f4a5-947e-d56b-10cc97aafa5e (Addr: tcp/127.0.0.1:12064) (DC: dc1)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:00.916167 [INFO] consul: Handled member-join event for server "Node b75e3479-f4a5-947e-d56b-10cc97aafa5e.dc1" in area "wan"
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:00.917327 [INFO] agent: Started DNS server 127.0.0.1:12059 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:00.917411 [INFO] agent: Started DNS server 127.0.0.1:12059 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:00.919505 [INFO] agent: Started HTTP server on 127.0.0.1:12060 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:00.919615 [INFO] agent: started state syncer
2019/11/27 02:22:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:00 [INFO]  raft: Node at 127.0.0.1:12064 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:02 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:02 [INFO]  raft: Node at 127.0.0.1:12064 [Leader] entering Leader state
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:02.224277 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:02.224672 [INFO] consul: New leader elected: Node b75e3479-f4a5-947e-d56b-10cc97aafa5e
Added 3 service nodes
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:02.686342 [INFO] agent: Synced node info
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:02.686464 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.693437 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 7.051921ms) from client 127.0.0.1:49379 (udp)
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.701237 [DEBUG] dns: request for name 49a0bb3e-c62f-218b-5226-e702ea27a751.query.consul. type ANY class IN (took 7.789615ms) from client 127.0.0.1:49840 (udp)
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.701279 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.701795 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.702032 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.703143 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.703641 [INFO] manager: shutting down
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.704263 [INFO] agent: consul server down
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.704316 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.704367 [INFO] agent: Stopping DNS server 127.0.0.1:11981 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.704496 [INFO] agent: Stopping DNS server 127.0.0.1:11981 (udp)
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.704631 [INFO] agent: Stopping HTTP server 127.0.0.1:11982 (tcp)
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.704808 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Truncate - 2019/11/27 02:22:02.704875 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Truncate (44.14s)
=== CONT  TestDNS_CaseInsensitiveServiceLookup
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:02.838395 [WARN] agent: Node name "Node a1e31202-6875-9b74-dffe-dc2a06b33864" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:02.841577 [DEBUG] tlsutil: Update with version 1
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:02.841668 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:02.841995 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:02.842105 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.405973 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.458146 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.259523ms) from client 127.0.0.1:36870 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.463887 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.019182ms) from client 127.0.0.1:42610 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.474208 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.424195ms) from client 127.0.0.1:42639 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.486042 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 7.136925ms) from client 127.0.0.1:48661 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.490553 [DEBUG] agent: Node info in sync
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.490656 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.492084 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.163186ms) from client 127.0.0.1:33365 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.497561 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 4.898511ms) from client 127.0.0.1:58723 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.503505 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.206188ms) from client 127.0.0.1:58221 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.514422 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.168187ms) from client 127.0.0.1:46638 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.524643 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.26019ms) from client 127.0.0.1:55396 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.530942 [DEBUG] dns: request for name web.service.consul. type ANY class IN (took 5.594535ms) from client 127.0.0.1:51442 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.536974 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 5.008514ms) from client 127.0.0.1:52360 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.542948 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 5.123185ms) from client 127.0.0.1:38308 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.560251 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 10.561714ms) from client 127.0.0.1:54922 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.560994 [DEBUG] dns: request for name v1.master2.db.service.consul. type SRV class IN (took 426.349µs) from client 127.0.0.1:38849 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.567804 [DEBUG] dns: request for name v1.master.db.service.consul. type SRV class IN (took 593.021µs) from client 127.0.0.1:58909 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.567947 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.568024 [INFO] consul: shutting down server
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.568071 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.570815 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 8.227297ms) from client 127.0.0.1:41081 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.576563 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 5.130851ms) from client 127.0.0.1:34286 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.598762 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 21.394106ms) from client 127.0.0.1:57759 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.604607 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 5.080183ms) from client 127.0.0.1:59366 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.617325 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 5.218855ms) from client 127.0.0.1:34471 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.632261 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 5.086851ms) from client 127.0.0.1:37283 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.638373 [DEBUG] dns: request for name f9bdb5fc-d650-aca8-7262-f1bb1098fd3e.query.consul. type ANY class IN (took 5.387528ms) from client 127.0.0.1:56030 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.638497 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.638574 [INFO] consul: shutting down server
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.638621 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.639159 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.639711 [INFO] manager: shutting down
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.640323 [INFO] agent: consul server down
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.640372 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.640422 [INFO] agent: Stopping DNS server 127.0.0.1:11987 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.640544 [INFO] agent: Stopping DNS server 127.0.0.1:11987 (udp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.640680 [INFO] agent: Stopping HTTP server 127.0.0.1:11988 (tcp)
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.640875 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_Randomize - 2019/11/27 02:22:03.640939 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_Randomize (44.05s)
=== CONT  TestDNS_ServiceLookup_ServiceAddressIPV6
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.672654 [WARN] serf: Shutdown without a Leave
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.677061 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.677644 [DEBUG] consul: Skipping self join check for "Node 2220cb2e-7917-b120-91cb-d53f308171d7" since the cluster is too small
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.677899 [INFO] consul: member 'Node 2220cb2e-7917-b120-91cb-d53f308171d7' joined, marking health alive
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.690239 [DEBUG] dns: request for name some.query.we.like.query.consul. type A class IN (took 953.367µs) from client 127.0.0.1:44178 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.690569 [INFO] agent: Requesting shutdown
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.690630 [INFO] consul: shutting down server
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.690674 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:03.752212 [WARN] agent: Node name "Node 39435977-7551-4546-c1d6-ab376711804c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:03.752564 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:03.752629 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:03.752786 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:03.752894 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.772798 [INFO] manager: shutting down
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.772858 [ERR] consul: failed to establish leadership: error configuring provider: raft is already shutdown
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.772947 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.775030 [INFO] agent: consul server down
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.775095 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.775155 [INFO] agent: Stopping DNS server 127.0.0.1:12059 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.775302 [INFO] agent: Stopping DNS server 127.0.0.1:12059 (udp)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.775479 [INFO] agent: Stopping HTTP server 127.0.0.1:12060 (tcp)
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.775711 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_TagPeriod - 2019/11/27 02:22:03.775791 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_TagPeriod (4.60s)
=== CONT  TestDNS_ServiceLookup_ServiceAddress_CNAME
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.885180 [INFO] manager: shutting down
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.887143 [INFO] agent: consul server down
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.887326 [INFO] agent: shutdown complete
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.887484 [INFO] agent: Stopping DNS server 127.0.0.1:12053 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.887738 [INFO] agent: Stopping DNS server 127.0.0.1:12053 (udp)
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.888008 [INFO] agent: Stopping HTTP server 127.0.0.1:12054 (tcp)
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.888329 [INFO] agent: Waiting for endpoints to shut down
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.888522 [INFO] agent: Endpoints down
--- PASS: TestDNS_PreparedQueryNearIPEDNS (7.31s)
=== CONT  TestDNS_ServiceLookup_ServiceAddress_A
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.890180 [ERR] consul.rpc: multiplex conn accept failed: read tcp 127.0.0.1:12058->127.0.0.1:45419: read: connection reset by peer from=127.0.0.1:45419
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:03.898142 [WARN] agent: Node name "Node 9f6dcc0b-67ae-e0e7-4b60-dbd54f5b65f7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:03.900692 [WARN] consul: error getting server health from "Node 2220cb2e-7917-b120-91cb-d53f308171d7": rpc error making call: EOF
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:03.901650 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:03.902004 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:03.902561 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:03.902963 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:04.060956 [WARN] agent: Node name "Node f6439fb6-7a3c-f1b3-1e5d-758f1d620c4f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:04.061333 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:04.061397 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:04.061550 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:04.061648 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
jones - 2019/11/27 02:22:04.266656 [DEBUG] consul: Skipping self join check for "Node 96ea3298-4984-8452-8dce-62bd7caf6d71" since the cluster is too small
2019/11/27 02:22:04 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a1e31202-6875-9b74-dffe-dc2a06b33864 Address:127.0.0.1:12070}]
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:04.355007 [INFO] serf: EventMemberJoin: Node a1e31202-6875-9b74-dffe-dc2a06b33864.dc1 127.0.0.1
2019/11/27 02:22:04 [INFO]  raft: Node at 127.0.0.1:12070 [Follower] entering Follower state (Leader: "")
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:04.359739 [INFO] serf: EventMemberJoin: Node a1e31202-6875-9b74-dffe-dc2a06b33864 127.0.0.1
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:04.360426 [INFO] consul: Adding LAN server Node a1e31202-6875-9b74-dffe-dc2a06b33864 (Addr: tcp/127.0.0.1:12070) (DC: dc1)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:04.360548 [INFO] consul: Handled member-join event for server "Node a1e31202-6875-9b74-dffe-dc2a06b33864.dc1" in area "wan"
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:04.361133 [INFO] agent: Started DNS server 127.0.0.1:12065 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:04.361218 [INFO] agent: Started DNS server 127.0.0.1:12065 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:04.363262 [INFO] agent: Started HTTP server on 127.0.0.1:12066 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:04.363369 [INFO] agent: started state syncer
2019/11/27 02:22:04 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:04 [INFO]  raft: Node at 127.0.0.1:12070 [Candidate] entering Candidate state in term 2
TestDNS_PreparedQueryNearIPEDNS - 2019/11/27 02:22:04.681090 [WARN] consul: error getting server health from "Node 2220cb2e-7917-b120-91cb-d53f308171d7": context deadline exceeded
2019/11/27 02:22:06 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:06 [INFO]  raft: Node at 127.0.0.1:12070 [Leader] entering Leader state
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:06.186077 [INFO] consul: cluster leadership acquired
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:06.186523 [INFO] consul: New leader elected: Node a1e31202-6875-9b74-dffe-dc2a06b33864
2019/11/27 02:22:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:39435977-7551-4546-c1d6-ab376711804c Address:127.0.0.1:12076}]
2019/11/27 02:22:06 [INFO]  raft: Node at 127.0.0.1:12076 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:06.192936 [INFO] serf: EventMemberJoin: Node 39435977-7551-4546-c1d6-ab376711804c.dc1 127.0.0.1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:06.196315 [INFO] serf: EventMemberJoin: Node 39435977-7551-4546-c1d6-ab376711804c 127.0.0.1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:06.197135 [INFO] consul: Adding LAN server Node 39435977-7551-4546-c1d6-ab376711804c (Addr: tcp/127.0.0.1:12076) (DC: dc1)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:06.197443 [INFO] consul: Handled member-join event for server "Node 39435977-7551-4546-c1d6-ab376711804c.dc1" in area "wan"
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:06.197570 [INFO] agent: Started DNS server 127.0.0.1:12071 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:06.197884 [INFO] agent: Started DNS server 127.0.0.1:12071 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:06.199818 [INFO] agent: Started HTTP server on 127.0.0.1:12072 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:06.199899 [INFO] agent: started state syncer
2019/11/27 02:22:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:06 [INFO]  raft: Node at 127.0.0.1:12076 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9f6dcc0b-67ae-e0e7-4b60-dbd54f5b65f7 Address:127.0.0.1:12082}]
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:06.866945 [INFO] serf: EventMemberJoin: Node 9f6dcc0b-67ae-e0e7-4b60-dbd54f5b65f7.dc1 127.0.0.1
2019/11/27 02:22:06 [INFO]  raft: Node at 127.0.0.1:12082 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:06.875608 [INFO] serf: EventMemberJoin: Node 9f6dcc0b-67ae-e0e7-4b60-dbd54f5b65f7 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:06.877221 [INFO] consul: Adding LAN server Node 9f6dcc0b-67ae-e0e7-4b60-dbd54f5b65f7 (Addr: tcp/127.0.0.1:12082) (DC: dc1)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:06.877324 [INFO] consul: Handled member-join event for server "Node 9f6dcc0b-67ae-e0e7-4b60-dbd54f5b65f7.dc1" in area "wan"
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:06.879345 [INFO] agent: Started DNS server 127.0.0.1:12077 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:06.879454 [INFO] agent: Started DNS server 127.0.0.1:12077 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:06.882327 [INFO] agent: Started HTTP server on 127.0.0.1:12078 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:06.882611 [INFO] agent: started state syncer
2019/11/27 02:22:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:06 [INFO]  raft: Node at 127.0.0.1:12082 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:22:06.988493 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:22:06.988564 [DEBUG] agent: Node info in sync
2019/11/27 02:22:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f6439fb6-7a3c-f1b3-1e5d-758f1d620c4f Address:127.0.0.1:12088}]
2019/11/27 02:22:07 [INFO]  raft: Node at 127.0.0.1:12088 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:07.444078 [INFO] serf: EventMemberJoin: Node f6439fb6-7a3c-f1b3-1e5d-758f1d620c4f.dc1 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:07.461933 [INFO] serf: EventMemberJoin: Node f6439fb6-7a3c-f1b3-1e5d-758f1d620c4f 127.0.0.1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:07.463286 [INFO] consul: Handled member-join event for server "Node f6439fb6-7a3c-f1b3-1e5d-758f1d620c4f.dc1" in area "wan"
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:07.463394 [INFO] consul: Adding LAN server Node f6439fb6-7a3c-f1b3-1e5d-758f1d620c4f (Addr: tcp/127.0.0.1:12088) (DC: dc1)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:07.463924 [INFO] agent: Started DNS server 127.0.0.1:12083 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:07.465001 [INFO] agent: Started DNS server 127.0.0.1:12083 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:07.467572 [INFO] agent: Started HTTP server on 127.0.0.1:12084 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:07.467682 [INFO] agent: started state syncer
2019/11/27 02:22:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:07 [INFO]  raft: Node at 127.0.0.1:12088 [Candidate] entering Candidate state in term 2
jones - 2019/11/27 02:22:07.672978 [DEBUG] consul: Skipping self join check for "Node 4c613484-61cd-f189-9fd4-637dea8a81e0" since the cluster is too small
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:07.683493 [INFO] agent: Synced node info
2019/11/27 02:22:07 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:07 [INFO]  raft: Node at 127.0.0.1:12076 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:07.943788 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:07.946299 [INFO] consul: New leader elected: Node 39435977-7551-4546-c1d6-ab376711804c
2019/11/27 02:22:08 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:08 [INFO]  raft: Node at 127.0.0.1:12082 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:08.153255 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:08.153736 [INFO] consul: New leader elected: Node 9f6dcc0b-67ae-e0e7-4b60-dbd54f5b65f7
2019/11/27 02:22:08 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:08 [INFO]  raft: Node at 127.0.0.1:12088 [Leader] entering Leader state
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:08.277055 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:08.277551 [INFO] consul: New leader elected: Node f6439fb6-7a3c-f1b3-1e5d-758f1d620c4f
jones - 2019/11/27 02:22:08.483177 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:22:08.483265 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:08.485595 [INFO] agent: Synced node info
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:08.485711 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:08.578681 [INFO] agent: Synced node info
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:08.578811 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:08.677595 [INFO] agent: Synced node info
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.488203 [DEBUG] agent: Node info in sync
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.488322 [DEBUG] agent: Node info in sync
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.611504 [DEBUG] dns: request for name master.db.service.consul. type SRV class IN (took 4.527163ms) from client 127.0.0.1:42214 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.614849 [DEBUG] dns: request for name mASTER.dB.service.consul. type SRV class IN (took 1.098707ms) from client 127.0.0.1:52091 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.616666 [DEBUG] dns: request for name MASTER.dB.service.consul. type SRV class IN (took 775.695µs) from client 127.0.0.1:38031 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.618583 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 729.693µs) from client 127.0.0.1:49468 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.620629 [DEBUG] dns: request for name DB.service.consul. type SRV class IN (took 844.364µs) from client 127.0.0.1:45807 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.622677 [DEBUG] dns: request for name Db.service.consul. type SRV class IN (took 850.031µs) from client 127.0.0.1:43564 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.624947 [DEBUG] dns: request for name somequery.query.consul. type SRV class IN (took 955.034µs) from client 127.0.0.1:47276 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.627203 [DEBUG] dns: request for name SomeQuery.query.consul. type SRV class IN (took 1.112707ms) from client 127.0.0.1:47674 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.634873 [DEBUG] dns: request for name SOMEQUERY.query.consul. type SRV class IN (took 2.451089ms) from client 127.0.0.1:55943 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.635024 [INFO] agent: Requesting shutdown
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.635136 [INFO] consul: shutting down server
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:09.635207 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.200149 [DEBUG] agent: Node info in sync
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.372280 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.506078 [INFO] manager: shutting down
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.510154 [DEBUG] agent: Node info in sync
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.510193 [INFO] agent: consul server down
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.510311 [INFO] agent: shutdown complete
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.510406 [INFO] agent: Stopping DNS server 127.0.0.1:12065 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.510599 [INFO] agent: Stopping DNS server 127.0.0.1:12065 (udp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.510786 [INFO] agent: Stopping HTTP server 127.0.0.1:12066 (tcp)
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.511028 [INFO] agent: Waiting for endpoints to shut down
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.511120 [INFO] agent: Endpoints down
--- PASS: TestDNS_CaseInsensitiveServiceLookup (7.81s)
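The TestDNS_CaseInsensitiveServiceLookup queries above (master.db.service.consul, mASTER.dB.service.consul, MASTER.dB.service.consul, and the prepared-query variants) all resolve to the same registration, since DNS name matching is case-insensitive. A minimal sketch of the same check, under the same assumptions as the earlier snippets and with the agent DNS address 127.0.0.1:12065 from the log:

package main

import (
	"log"

	"github.com/miekg/dns"
)

func main() {
	// Case variants of one tagged service name; each should return answers.
	variants := []string{
		"master.db.service.consul.",
		"mASTER.dB.service.consul.",
		"MASTER.dB.service.consul.",
	}
	c := new(dns.Client)
	for _, name := range variants {
		m := new(dns.Msg)
		m.SetQuestion(name, dns.TypeSRV)
		in, _, err := c.Exchange(m, "127.0.0.1:12065") // illustrative address from the log
		if err != nil {
			log.Fatalf("%s: %v", name, err)
		}
		if len(in.Answer) == 0 {
			log.Fatalf("%s: no SRV answers", name)
		}
		log.Printf("%s: %d SRV answers", name, len(in.Answer))
	}
}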
=== CONT  TestDNS_ExternalServiceToConsulCNAMENestedLookup
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.518792 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 707.358µs) from client 127.0.0.1:37087 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.522020 [DEBUG] dns: request for name b286630d-34ff-144e-69e1-22697f498054.query.consul. type SRV class IN (took 1.01037ms) from client 127.0.0.1:55564 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.523010 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.523098 [INFO] consul: shutting down server
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.523148 [WARN] serf: Shutdown without a Leave
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.510256 [ERR] connect: Apply failed leadership lost while committing log
TestDNS_CaseInsensitiveServiceLookup - 2019/11/27 02:22:10.523353 [ERR] consul: failed to establish leadership: leadership lost while committing log
jones - 2019/11/27 02:22:10.524650 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:22:10.524730 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:10.594273 [DEBUG] tlsutil: Update with version 1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:10.594410 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:10.594732 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:10.594942 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.695997 [WARN] serf: Shutdown without a Leave
jones - 2019/11/27 02:22:10.697341 [DEBUG] consul: Skipping self join check for "Node 005cb1c3-f8e5-2827-9833-9849ba78d405" since the cluster is too small
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.701397 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 787.028µs) from client 127.0.0.1:56566 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.703297 [DEBUG] dns: request for name 886087fb-53e3-9be5-8da8-b216b0232af0.query.consul. type SRV class IN (took 787.029µs) from client 127.0.0.1:44206 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.703412 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.703833 [INFO] consul: shutting down server
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.704040 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:10.759477 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:10.759597 [DEBUG] agent: Node info in sync
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.783406 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.791455 [INFO] manager: shutting down
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.792058 [INFO] agent: consul server down
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.792134 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.792188 [INFO] agent: Stopping DNS server 127.0.0.1:12071 (tcp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.792339 [INFO] agent: Stopping DNS server 127.0.0.1:12071 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.792522 [INFO] agent: Stopping HTTP server 127.0.0.1:12072 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:10.792635 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 719.359µs) from client 127.0.0.1:37697 (udp)
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.792743 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.792814 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_ServiceAddressIPV6 (7.15s)
=== CONT  TestDNS_NSRecords_IPV6
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:10.794272 [DEBUG] dns: request for name 0f3d1213-cac9-6e63-578d-f7b52a900a62.query.consul. type SRV class IN (took 780.028µs) from client 127.0.0.1:37890 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:10.794500 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:10.794564 [INFO] consul: shutting down server
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:10.794608 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.797939 [WARN] consul: error getting server health from "Node 39435977-7551-4546-c1d6-ab376711804c": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:12076: connect: connection refused
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:10.823691 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:10.883237 [DEBUG] tlsutil: Update with version 1
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:10.883408 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:10.883640 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:10.883825 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.906935 [INFO] manager: shutting down
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:10.907228 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.909384 [INFO] agent: consul server down
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.909440 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.909494 [INFO] agent: Stopping DNS server 127.0.0.1:12077 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.909619 [INFO] agent: Stopping DNS server 127.0.0.1:12077 (udp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.909760 [INFO] agent: Stopping HTTP server 127.0.0.1:12078 (tcp)
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.909944 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.910009 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_ServiceAddress_CNAME (7.13s)
=== CONT  TestDNS_ExternalServiceToConsulCNAMELookup
TestDNS_ServiceLookup_ServiceAddress_CNAME - 2019/11/27 02:22:10.912624 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:10.994625 [WARN] agent: Node name "test node" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:10.995168 [DEBUG] tlsutil: Update with version 1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:10.995236 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:10.995403 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:10.995513 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:11.073336 [INFO] manager: shutting down
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:11.074034 [INFO] agent: consul server down
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:11.074099 [INFO] agent: shutdown complete
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:11.074157 [INFO] agent: Stopping DNS server 127.0.0.1:12083 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:11.074312 [INFO] agent: Stopping DNS server 127.0.0.1:12083 (udp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:11.074528 [INFO] agent: Stopping HTTP server 127.0.0.1:12084 (tcp)
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:11.074768 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:11.074864 [INFO] agent: Endpoints down
--- PASS: TestDNS_ServiceLookup_ServiceAddress_A (7.19s)
=== CONT  TestDNS_InifiniteRecursion
TestDNS_ServiceLookup_ServiceAddress_A - 2019/11/27 02:22:11.076018 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_InifiniteRecursion - 2019/11/27 02:22:11.148717 [WARN] agent: Node name "test node" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_InifiniteRecursion - 2019/11/27 02:22:11.152962 [DEBUG] tlsutil: Update with version 1
TestDNS_InifiniteRecursion - 2019/11/27 02:22:11.153070 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_InifiniteRecursion - 2019/11/27 02:22:11.153240 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_InifiniteRecursion - 2019/11/27 02:22:11.153345 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:22:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:178329cb-b6e3-4fe7-c827-d8f04504458b Address:127.0.0.1:12094}]
2019/11/27 02:22:11 [INFO]  raft: Node at 127.0.0.1:12094 [Follower] entering Follower state (Leader: "")
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:11.635085 [INFO] serf: EventMemberJoin: test-node.dc1 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:11.641618 [INFO] serf: EventMemberJoin: test-node 127.0.0.1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:11.642290 [INFO] consul: Handled member-join event for server "test-node.dc1" in area "wan"
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:11.642620 [INFO] consul: Adding LAN server test-node (Addr: tcp/127.0.0.1:12094) (DC: dc1)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:11.642890 [INFO] agent: Started DNS server 127.0.0.1:12089 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:11.642971 [INFO] agent: Started DNS server 127.0.0.1:12089 (udp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:11.645376 [INFO] agent: Started HTTP server on 127.0.0.1:12090 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:11.645459 [INFO] agent: started state syncer
2019/11/27 02:22:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:11 [INFO]  raft: Node at 127.0.0.1:12094 [Candidate] entering Candidate state in term 2
TestDNS_ServiceLookup_ServiceAddressIPV6 - 2019/11/27 02:22:11.797106 [WARN] consul: error getting server health from "Node 39435977-7551-4546-c1d6-ab376711804c": context deadline exceeded
2019/11/27 02:22:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:667678b4-8b16-6b0f-746e-4f4f027da761 Address:127.0.0.1:12106}]
2019/11/27 02:22:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:40b58503-ccfa-9f48-0af4-79f6a6b02471 Address:[::1]:12100}]
2019/11/27 02:22:12 [INFO]  raft: Node at 127.0.0.1:12106 [Follower] entering Follower state (Leader: "")
2019/11/27 02:22:12 [INFO]  raft: Node at [::1]:12100 [Follower] entering Follower state (Leader: "")
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:12.099565 [INFO] serf: EventMemberJoin: server1.dc1 ::1
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:12.103259 [INFO] serf: EventMemberJoin: server1 ::1
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:12.103597 [INFO] serf: EventMemberJoin: test node.dc1 127.0.0.1
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:12.104433 [INFO] agent: Started DNS server 127.0.0.1:12095 (udp)
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:12.105310 [INFO] consul: Adding LAN server server1 (Addr: tcp/[::1]:12100) (DC: dc1)
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:12.105393 [INFO] agent: Started DNS server 127.0.0.1:12095 (tcp)
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:12.105727 [INFO] consul: Handled member-join event for server "server1.dc1" in area "wan"
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:12.107311 [INFO] serf: EventMemberJoin: test node 127.0.0.1
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:12.107400 [INFO] agent: Started HTTP server on 127.0.0.1:12096 (tcp)
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:12.107468 [INFO] agent: started state syncer
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:12.107872 [INFO] consul: Handled member-join event for server "test node.dc1" in area "wan"
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:12.108147 [INFO] consul: Adding LAN server test node (Addr: tcp/127.0.0.1:12106) (DC: dc1)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:12.108717 [INFO] agent: Started DNS server 127.0.0.1:12101 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:12.108795 [INFO] agent: Started DNS server 127.0.0.1:12101 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:12.110859 [INFO] agent: Started HTTP server on 127.0.0.1:12102 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:12.110942 [INFO] agent: started state syncer
2019/11/27 02:22:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:12 [INFO]  raft: Node at 127.0.0.1:12106 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:12 [INFO]  raft: Node at [::1]:12100 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a36d7bcb-6347-c11c-1f74-bcbd137a9287 Address:127.0.0.1:12112}]
TestDNS_InifiniteRecursion - 2019/11/27 02:22:12.320462 [INFO] serf: EventMemberJoin: test node.dc1 127.0.0.1
2019/11/27 02:22:12 [INFO]  raft: Node at 127.0.0.1:12112 [Follower] entering Follower state (Leader: "")
TestDNS_InifiniteRecursion - 2019/11/27 02:22:12.328158 [INFO] serf: EventMemberJoin: test node 127.0.0.1
TestDNS_InifiniteRecursion - 2019/11/27 02:22:12.329028 [INFO] consul: Adding LAN server test node (Addr: tcp/127.0.0.1:12112) (DC: dc1)
TestDNS_InifiniteRecursion - 2019/11/27 02:22:12.329328 [INFO] consul: Handled member-join event for server "test node.dc1" in area "wan"
TestDNS_InifiniteRecursion - 2019/11/27 02:22:12.329360 [INFO] agent: Started DNS server 127.0.0.1:12107 (udp)
TestDNS_InifiniteRecursion - 2019/11/27 02:22:12.329741 [INFO] agent: Started DNS server 127.0.0.1:12107 (tcp)
TestDNS_InifiniteRecursion - 2019/11/27 02:22:12.331605 [INFO] agent: Started HTTP server on 127.0.0.1:12108 (tcp)
TestDNS_InifiniteRecursion - 2019/11/27 02:22:12.331787 [INFO] agent: started state syncer
2019/11/27 02:22:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:12 [INFO]  raft: Node at 127.0.0.1:12112 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:12 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:12 [INFO]  raft: Node at 127.0.0.1:12094 [Leader] entering Leader state
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:12.541364 [INFO] consul: cluster leadership acquired
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:12.541845 [INFO] consul: New leader elected: test-node
jones - 2019/11/27 02:22:12.614176 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
jones - 2019/11/27 02:22:12.614275 [DEBUG] agent: Node info in sync
2019/11/27 02:22:13 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:13 [INFO]  raft: Node at 127.0.0.1:12106 [Leader] entering Leader state
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:13.150907 [INFO] agent: Synced node info
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:13.151038 [DEBUG] agent: Node info in sync
2019/11/27 02:22:13 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:13 [INFO]  raft: Node at [::1]:12100 [Leader] entering Leader state
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:13.158575 [INFO] consul: cluster leadership acquired
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:13.159013 [INFO] consul: New leader elected: test node
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:13.159273 [INFO] consul: cluster leadership acquired
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:13.159688 [INFO] consul: New leader elected: server1
2019/11/27 02:22:13 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:13 [INFO]  raft: Node at 127.0.0.1:12112 [Leader] entering Leader state
TestDNS_InifiniteRecursion - 2019/11/27 02:22:13.252165 [INFO] consul: cluster leadership acquired
TestDNS_InifiniteRecursion - 2019/11/27 02:22:13.252559 [INFO] consul: New leader elected: test node
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:13.625203 [INFO] agent: Synced node info
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:13.649798 [INFO] agent: Synced node info
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:13.649940 [DEBUG] agent: Node info in sync
TestDNS_InifiniteRecursion - 2019/11/27 02:22:13.695585 [INFO] agent: Synced node info
TestDNS_InifiniteRecursion - 2019/11/27 02:22:13.695841 [DEBUG] agent: Node info in sync
TestDNS_InifiniteRecursion - 2019/11/27 02:22:13.870041 [ERR] dns: Infinite recursion detected for web.service.consul., won't perform any CNAME resolution.
TestDNS_InifiniteRecursion - 2019/11/27 02:22:13.870378 [DEBUG] dns: request for name web.service.consul. type A class IN (took 1.348049ms) from client 127.0.0.1:50042 (udp)
TestDNS_InifiniteRecursion - 2019/11/27 02:22:13.870482 [INFO] agent: Requesting shutdown
TestDNS_InifiniteRecursion - 2019/11/27 02:22:13.870564 [INFO] consul: shutting down server
TestDNS_InifiniteRecursion - 2019/11/27 02:22:13.870614 [WARN] serf: Shutdown without a Leave
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.020426 [WARN] serf: Shutdown without a Leave
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.119190 [INFO] manager: shutting down
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.121619 [INFO] agent: consul server down
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.121754 [INFO] agent: shutdown complete
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.121832 [INFO] agent: Stopping DNS server 127.0.0.1:12107 (tcp)
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.122007 [INFO] agent: Stopping DNS server 127.0.0.1:12107 (udp)
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.122179 [INFO] agent: Stopping HTTP server 127.0.0.1:12108 (tcp)
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.122431 [INFO] agent: Waiting for endpoints to shut down
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.122509 [INFO] agent: Endpoints down
--- PASS: TestDNS_InifiniteRecursion (3.05s)
=== CONT  TestDNS_NodeLookup_CNAME
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.129186 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDNS_InifiniteRecursion - 2019/11/27 02:22:14.129556 [ERR] consul: failed to establish leadership: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:14.200965 [WARN] agent: Node name "Node ee455325-a2fd-d54a-2629-8b27df26f810" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:14.201391 [DEBUG] tlsutil: Update with version 1
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:14.201459 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:14.201621 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:14.201779 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:14.635717 [DEBUG] agent: Node info in sync
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:14.635827 [DEBUG] agent: Node info in sync
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:14.888312 [DEBUG] dns: request for name alias.service.consul. type SRV class IN (took 1.245045ms) from client 127.0.0.1:51359 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:14.891580 [DEBUG] dns: request for name alias.service.CoNsUl. type SRV class IN (took 1.048037ms) from client 127.0.0.1:54825 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:14.891740 [INFO] agent: Requesting shutdown
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:14.891828 [INFO] consul: shutting down server
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:14.891898 [WARN] serf: Shutdown without a Leave
jones - 2019/11/27 02:22:15.087710 [DEBUG] consul: Skipping self join check for "Node 3a0dee63-0112-ab1b-d438-213ed51c845e" since the cluster is too small
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.088126 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.098217 [DEBUG] dns: request for name alias2.service.consul. type SRV class IN (took 1.227378ms) from client 127.0.0.1:37574 (udp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.098494 [INFO] agent: Requesting shutdown
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.098555 [INFO] consul: shutting down server
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.098595 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.172045 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.173026 [INFO] manager: shutting down
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.173422 [INFO] agent: consul server down
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.173478 [INFO] agent: shutdown complete
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.173535 [INFO] agent: Stopping DNS server 127.0.0.1:12101 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.173691 [INFO] agent: Stopping DNS server 127.0.0.1:12101 (udp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.173857 [INFO] agent: Stopping HTTP server 127.0.0.1:12102 (tcp)
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.174088 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.174159 [INFO] agent: Endpoints down
--- PASS: TestDNS_ExternalServiceToConsulCNAMELookup (4.26s)
=== CONT  TestDNS_ExternalServiceLookup
TestDNS_ExternalServiceToConsulCNAMELookup - 2019/11/27 02:22:15.188905 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:15.257488 [WARN] agent: Node name "Node c086fdf2-d71a-7c0f-26c8-6e07ecaa4a18" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:15.257955 [DEBUG] tlsutil: Update with version 1
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:15.258026 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:15.258436 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:15.258543 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.262848 [INFO] manager: shutting down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.263345 [INFO] agent: consul server down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.263408 [INFO] agent: shutdown complete
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.263467 [INFO] agent: Stopping DNS server 127.0.0.1:12089 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.263604 [INFO] agent: Stopping DNS server 127.0.0.1:12089 (udp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.263766 [INFO] agent: Stopping HTTP server 127.0.0.1:12090 (tcp)
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.263994 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.264095 [INFO] agent: Endpoints down
--- PASS: TestDNS_ExternalServiceToConsulCNAMENestedLookup (4.75s)
=== CONT  TestDNS_ConnectServiceLookup
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.265119 [ERR] connect: Apply failed raft is already shutdown
TestDNS_ExternalServiceToConsulCNAMENestedLookup - 2019/11/27 02:22:15.265188 [ERR] consul: failed to establish leadership: raft is already shutdown
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.265606 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.266004 [DEBUG] consul: Skipping self join check for "server1" since the cluster is too small
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.266160 [INFO] consul: member 'server1' joined, marking health alive
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:15.339925 [WARN] agent: Node name "Node e9c0da27-055f-2471-5118-35cc9ec7918c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:15.340509 [DEBUG] tlsutil: Update with version 1
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:15.340603 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:15.340837 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:15.340954 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:22:15 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ee455325-a2fd-d54a-2629-8b27df26f810 Address:127.0.0.1:12118}]
2019/11/27 02:22:15 [INFO]  raft: Node at 127.0.0.1:12118 [Follower] entering Follower state (Leader: "")
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:15.445240 [INFO] serf: EventMemberJoin: Node ee455325-a2fd-d54a-2629-8b27df26f810.dc1 127.0.0.1
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.452231 [DEBUG] dns: request for name server1.node.dc1.consul. type NS class IN (took 601.022µs) from client 127.0.0.1:38463 (udp)
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.452430 [INFO] agent: Requesting shutdown
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.452504 [INFO] consul: shutting down server
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.452564 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:15.456306 [INFO] serf: EventMemberJoin: Node ee455325-a2fd-d54a-2629-8b27df26f810 127.0.0.1
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:15.457479 [INFO] consul: Adding LAN server Node ee455325-a2fd-d54a-2629-8b27df26f810 (Addr: tcp/127.0.0.1:12118) (DC: dc1)
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:15.458157 [INFO] consul: Handled member-join event for server "Node ee455325-a2fd-d54a-2629-8b27df26f810.dc1" in area "wan"
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:15.459761 [INFO] agent: Started DNS server 127.0.0.1:12113 (tcp)
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:15.460045 [INFO] agent: Started DNS server 127.0.0.1:12113 (udp)
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:15.462577 [INFO] agent: Started HTTP server on 127.0.0.1:12114 (tcp)
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:15.462692 [INFO] agent: started state syncer
2019/11/27 02:22:15 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:15 [INFO]  raft: Node at 127.0.0.1:12118 [Candidate] entering Candidate state in term 2
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.616470 [WARN] serf: Shutdown without a Leave
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.694317 [INFO] manager: shutting down
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.694900 [INFO] agent: consul server down
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.694966 [INFO] agent: shutdown complete
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.695022 [INFO] agent: Stopping DNS server 127.0.0.1:12095 (tcp)
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.697788 [INFO] agent: Stopping DNS server 127.0.0.1:12095 (udp)
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.698380 [INFO] agent: Stopping HTTP server 127.0.0.1:12096 (tcp)
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.698646 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NSRecords_IPV6 - 2019/11/27 02:22:15.698729 [INFO] agent: Endpoints down
--- PASS: TestDNS_NSRecords_IPV6 (4.91s)
=== CONT  TestDNS_ServiceLookupWithInternalServiceAddress
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:15.781132 [WARN] agent: Node name "my.test-node" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:15.781539 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:15.781608 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:15.781880 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:15.781997 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:22:16 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:12118 [Leader] entering Leader state
2019/11/27 02:22:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c086fdf2-d71a-7c0f-26c8-6e07ecaa4a18 Address:127.0.0.1:12124}]
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.109466 [INFO] serf: EventMemberJoin: Node c086fdf2-d71a-7c0f-26c8-6e07ecaa4a18.dc1 127.0.0.1
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:16.110952 [INFO] consul: cluster leadership acquired
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:16.111472 [INFO] consul: New leader elected: Node ee455325-a2fd-d54a-2629-8b27df26f810
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:12124 [Follower] entering Follower state (Leader: "")
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.120166 [INFO] serf: EventMemberJoin: Node c086fdf2-d71a-7c0f-26c8-6e07ecaa4a18 127.0.0.1
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.121738 [INFO] consul: Adding LAN server Node c086fdf2-d71a-7c0f-26c8-6e07ecaa4a18 (Addr: tcp/127.0.0.1:12124) (DC: dc1)
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.121993 [INFO] consul: Handled member-join event for server "Node c086fdf2-d71a-7c0f-26c8-6e07ecaa4a18.dc1" in area "wan"
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.124353 [INFO] agent: Started DNS server 127.0.0.1:12119 (tcp)
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.124942 [INFO] agent: Started DNS server 127.0.0.1:12119 (udp)
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.129867 [INFO] agent: Started HTTP server on 127.0.0.1:12120 (tcp)
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.129969 [INFO] agent: started state syncer
2019/11/27 02:22:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:12124 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e9c0da27-055f-2471-5118-35cc9ec7918c Address:127.0.0.1:12130}]
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:12130 [Follower] entering Follower state (Leader: "")
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:16.377095 [INFO] serf: EventMemberJoin: Node e9c0da27-055f-2471-5118-35cc9ec7918c.dc1 127.0.0.1
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:16.385388 [INFO] serf: EventMemberJoin: Node e9c0da27-055f-2471-5118-35cc9ec7918c 127.0.0.1
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:16.386255 [INFO] consul: Handled member-join event for server "Node e9c0da27-055f-2471-5118-35cc9ec7918c.dc1" in area "wan"
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:16.386541 [INFO] consul: Adding LAN server Node e9c0da27-055f-2471-5118-35cc9ec7918c (Addr: tcp/127.0.0.1:12130) (DC: dc1)
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:16.387273 [INFO] agent: Started DNS server 127.0.0.1:12125 (udp)
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:16.387462 [INFO] agent: Started DNS server 127.0.0.1:12125 (tcp)
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:16.389779 [INFO] agent: Started HTTP server on 127.0.0.1:12126 (tcp)
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:16.389883 [INFO] agent: started state syncer
2019/11/27 02:22:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:12130 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:16.543122 [INFO] agent: Synced node info
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:16.543266 [DEBUG] agent: Node info in sync
2019/11/27 02:22:16 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:12124 [Leader] entering Leader state
2019/11/27 02:22:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5d63da73-3d21-5ea8-13bd-015479ed2439 Address:127.0.0.1:12136}]
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.775357 [INFO] consul: cluster leadership acquired
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:16.775821 [INFO] consul: New leader elected: Node c086fdf2-d71a-7c0f-26c8-6e07ecaa4a18
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:12136 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:16.780164 [INFO] serf: EventMemberJoin: my.test-node.dc1 127.0.0.1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:16.783679 [INFO] serf: EventMemberJoin: my.test-node 127.0.0.1
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:16.784921 [INFO] consul: Adding LAN server my.test-node (Addr: tcp/127.0.0.1:12136) (DC: dc1)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:16.785017 [INFO] consul: Handled member-join event for server "my.test-node.dc1" in area "wan"
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:16.786254 [INFO] agent: Started DNS server 127.0.0.1:12131 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:16.786351 [INFO] agent: Started DNS server 127.0.0.1:12131 (udp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:16.788365 [INFO] agent: Started HTTP server on 127.0.0.1:12132 (tcp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:16.788466 [INFO] agent: started state syncer
2019/11/27 02:22:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:12136 [Candidate] entering Candidate state in term 2
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.020782 [DEBUG] dns: cname recurse RTT for www.google.com. (656.357µs)
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.021042 [DEBUG] dns: request for name google.node.consul. type ANY class IN (took 1.658393ms) from client 127.0.0.1:34168 (udp)
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.021224 [INFO] agent: Requesting shutdown
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.021288 [INFO] consul: shutting down server
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.021333 [WARN] serf: Shutdown without a Leave
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.095090 [WARN] serf: Shutdown without a Leave
2019/11/27 02:22:17 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:17 [INFO]  raft: Node at 127.0.0.1:12130 [Leader] entering Leader state
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:17.100454 [INFO] consul: cluster leadership acquired
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:17.100938 [INFO] consul: New leader elected: Node e9c0da27-055f-2471-5118-35cc9ec7918c
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:17.173055 [INFO] agent: Synced node info
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.175198 [INFO] manager: shutting down
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.249968 [INFO] agent: consul server down
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.250050 [INFO] agent: shutdown complete
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.250149 [INFO] agent: Stopping DNS server 127.0.0.1:12113 (tcp)
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.250336 [INFO] agent: Stopping DNS server 127.0.0.1:12113 (udp)
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.250525 [INFO] agent: Stopping HTTP server 127.0.0.1:12114 (tcp)
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.250763 [INFO] agent: Waiting for endpoints to shut down
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.250843 [INFO] agent: Endpoints down
--- PASS: TestDNS_NodeLookup_CNAME (3.13s)
=== CONT  TestDNS_ServiceLookup
TestDNS_NodeLookup_CNAME - 2019/11/27 02:22:17.254762 [ERR] consul: failed to establish leadership: leadership lost while committing log
2019/11/27 02:22:17 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:17 [INFO]  raft: Node at 127.0.0.1:12136 [Leader] entering Leader state
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:17.328685 [INFO] consul: cluster leadership acquired
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:17.329135 [INFO] consul: New leader elected: my.test-node
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookup - 2019/11/27 02:22:17.364851 [WARN] agent: Node name "Node b632957b-67b4-8a91-e324-5e2ddcefc5e5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookup - 2019/11/27 02:22:17.365261 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookup - 2019/11/27 02:22:17.365328 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookup - 2019/11/27 02:22:17.365520 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookup - 2019/11/27 02:22:17.365630 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:17.672961 [INFO] agent: Synced node info
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:17.828342 [INFO] agent: Synced node info
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:17.828490 [DEBUG] agent: Node info in sync
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:17.832954 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 654.357µs) from client 127.0.0.1:39341 (udp)
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:17.833380 [INFO] agent: Requesting shutdown
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:17.833465 [INFO] consul: shutting down server
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:17.833511 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:17.916471 [WARN] serf: Shutdown without a Leave
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:18.005369 [INFO] manager: shutting down
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:18.005979 [INFO] agent: consul server down
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:18.006042 [INFO] agent: shutdown complete
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:18.006103 [INFO] agent: Stopping DNS server 127.0.0.1:12119 (tcp)
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:18.006269 [INFO] agent: Stopping DNS server 127.0.0.1:12119 (udp)
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:18.006471 [INFO] agent: Stopping HTTP server 127.0.0.1:12120 (tcp)
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:18.006831 [INFO] agent: Waiting for endpoints to shut down
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:18.006918 [INFO] agent: Endpoints down
--- PASS: TestDNS_ExternalServiceLookup (2.83s)
=== CONT  TestDNS_ServiceLookupMultiAddrNoCNAME
TestDNS_ExternalServiceLookup - 2019/11/27 02:22:18.008866 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/11/27 02:22:18.127078 [WARN] agent: Node name "Node 84b4f1a8-606c-6f86-8be9-7753a72f814f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/11/27 02:22:18.129827 [DEBUG] tlsutil: Update with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/11/27 02:22:18.132114 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/11/27 02:22:18.133745 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDNS_ServiceLookupMultiAddrNoCNAME - 2019/11/27 02:22:18.133899 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:18.355591 [DEBUG] dns: request for name db.connect.consul. type SRV class IN (took 690.359µs) from client 127.0.0.1:38577 (udp)
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:18.356135 [INFO] agent: Requesting shutdown
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:18.356218 [INFO] consul: shutting down server
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:18.356266 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:18.361166 [DEBUG] dns: request for name db.service.consul. type SRV class IN (took 994.703µs) from client 127.0.0.1:58588 (udp)
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:18.361525 [INFO] agent: Requesting shutdown
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:18.361588 [INFO] consul: shutting down server
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:18.361632 [WARN] serf: Shutdown without a Leave
TestDNS_ConnectServiceLookup - 2019/11/27 02:22:18.460788 [WARN] serf: Shutdown without a Leave
TestDNS_ServiceLookupWithInternalServiceAddress - 2019/11/27 02:22:18.460788 [WARN] serf: Shutdown without a Leave
2019/11/27 02:22:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b632957b-67b4-8a91-e324-5e2ddcefc5e5 Address:127.0.0.1:12142}]
2019/11/27 02:22:18 [INFO]  raft: Node at 127.0.0.1:12142 [Follower] entering Follower state (Leader: "")
TestDNS_ServiceLookup - 2019/11/27 02:22:18.467144 [INFO] serf: EventMemberJoin: Node b632957b-67b4-8a91-e324-5e2ddcefc5e5.dc1 127.0.0.1
TestDNS_ServiceLookup - 2019/11/27 02:22:18.472443 [INFO] serf: EventMemberJoin: Node b632957b-67b4-8a91-e324-5e2ddcefc5e5 127.0.0.1
TestDNS_ServiceLookup - 2019/11/27 02:22:18.473362 [INFO] consul: Adding LAN server Node b632957b-67b4-8a91-e324-5e2ddcefc5e5 (Addr: tcp/127.0.0.1:12142) (DC: dc1)
TestDNS_ServiceLookup - 2019/11/27 02:22:18.473448 [INFO] consul: Handled member-join event for server "Node b632957b-67b4-8a91-e324-5e2ddcefc5e5.dc1" in area "wan"
TestDNS_ServiceLookup - 2019/11/27 02:22:18.474697 [INFO] agent: Started DNS server 127.0.0.1:12137 (tcp)
TestDNS_ServiceLookup - 2019/11/27 02:22:18.475119 [INFO] agent: Started DNS server 127.0.0.1:12137 (udp)
TestDNS_ServiceLookup - 2019/11/27 02:22:18.477240 [INFO] agent: Started HTTP server on 127.0.0.1:12138 (tcp)
TestDNS_ServiceLookup - 2019/11/27 02:22:18.477359 [INFO] agent: started state syncer
2019/11/27 02:22:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:18 [INFO]  raft: Node at 127.0.0.1:12142 [Candidate] entering Candidate state in term 2
panic: test timed out after 5m0s

goroutine 18292 [running]:
testing.(*M).startAlarm.func1()
	/usr/lib/go-1.13/src/testing/testing.go:1377 +0xbc
created by time.goFunc
	/usr/lib/go-1.13/src/time/sleep.go:168 +0x34

goroutine 1 [chan receive, 2 minutes]:
testing.tRunner.func1(0x466a140)
	/usr/lib/go-1.13/src/testing/testing.go:885 +0x1b4
testing.tRunner(0x466a140, 0x4765ed0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
testing.runTests(0x4536c30, 0x2651ac0, 0x1d9, 0x1d9, 0x0)
	/usr/lib/go-1.13/src/testing/testing.go:1200 +0x238
testing.(*M).Run(0x47a6e40, 0x0)
	/usr/lib/go-1.13/src/testing/testing.go:1117 +0x13c
main.main()
	_testmain.go:988 +0x120

goroutine 19 [syscall, 4 minutes]:
os/signal.signal_recv(0x0)
	/usr/lib/go-1.13/src/runtime/sigqueue.go:147 +0x130
os/signal.loop()
	/usr/lib/go-1.13/src/os/signal/signal_unix.go:23 +0x14
created by os/signal.init.0
	/usr/lib/go-1.13/src/os/signal/signal_unix.go:29 +0x30

goroutine 7032 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4550160)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 36 [select]:
go.opencensus.io/stats/view.(*worker).start(0x47a6700)
	/<<PKGBUILDDIR>>/_build/src/go.opencensus.io/stats/view/worker.go:154 +0xb0
created by go.opencensus.io/stats/view.init.0
	/<<PKGBUILDDIR>>/_build/src/go.opencensus.io/stats/view/worker.go:32 +0x48

goroutine 7092 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x4554400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4554400, 0x4b8a0d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 15117 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x5018a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x5018a00, 0x5586ec0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 38 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_Update(0x466a280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:73 +0x20
testing.tRunner(0x466a280, 0x15c1ed0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 39 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_UpdateUpsert(0x466a320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:103 +0x20
testing.tRunner(0x466a320, 0x15c1ecc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 40 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a3c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_Destroy(0x466a3c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:132 +0x20
testing.tRunner(0x466a3c0, 0x15c1eb4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 41 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_Clone(0x466a460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:164 +0x20
testing.tRunner(0x466a460, 0x15c1eb0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 42 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Legacy_Get(0x466a500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:208 +0x1c
testing.tRunner(0x466a500, 0x15c1ec4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 44 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACLReplicationStatus(0x466a640)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_legacy_test.go:283 +0x1c
testing.tRunner(0x466a640, 0x15c1e98)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 45 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a6e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Disabled_Response(0x466a6e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_test.go:21 +0x20
testing.tRunner(0x466a6e0, 0x15c1ea8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 46 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Bootstrap(0x466a780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_test.go:66 +0x20
testing.tRunner(0x466a780, 0x15c1ea4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 47 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_HTTP(0x466a820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_endpoint_test.go:114 +0x1c
testing.tRunner(0x466a820, 0x15c1eac)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 48 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a8c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_Version8(0x466a8c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:151 +0x1c
testing.tRunner(0x466a8c0, 0x15c1ee0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 49 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_AgentMasterToken(0x466a960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:185 +0x1c
testing.tRunner(0x466a960, 0x15c1ea0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 50 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466aa00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_RootAuthorizersDenied(0x466aa00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:205 +0x1c
testing.tRunner(0x466aa00, 0x15c1ed4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 51 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466aaa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_vetServiceRegister(0x466aaa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:272 +0x20
testing.tRunner(0x466aaa0, 0x15c1ef8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 52 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466ab40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_vetServiceUpdate(0x466ab40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:303 +0x1c
testing.tRunner(0x466ab40, 0x15c1efc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 53 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466abe0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_vetCheckRegister(0x466abe0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:326 +0x20
testing.tRunner(0x466abe0, 0x15c1ef0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 54 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466ac80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_vetCheckUpdate(0x466ac80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:392 +0x1c
testing.tRunner(0x466ac80, 0x15c1ef4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 55 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466ad20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_filterMembers(0x466ad20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:432 +0x1c
testing.tRunner(0x466ad20, 0x15c1ee8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 56 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466adc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_filterServices(0x466adc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:451 +0x1c
testing.tRunner(0x466adc0, 0x15c1eec)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 57 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466ae60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACL_filterChecks(0x466ae60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/acl_test.go:465 +0x1c
testing.tRunner(0x466ae60, 0x15c1ee4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 58 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466af00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Services(0x466af00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:57 +0x20
testing.tRunner(0x466af00, 0x15c2174)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 59 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466afa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Services_ExternalConnectProxy(0x466afa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:107 +0x20
testing.tRunner(0x466afa0, 0x15c216c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 60 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Services_Sidecar(0x466b040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:143 +0x20
testing.tRunner(0x466b040, 0x15c2170)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 61 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b0e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Services_ACLFilter(0x466b0e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:193 +0x1c
testing.tRunner(0x466b0e0, 0x15c2168)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 63 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b220)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Service_DeprecatedManagedProxy(0x466b220)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:529 +0x20
testing.tRunner(0x466b220, 0x15c2158)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 64 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b2c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Checks(0x466b2c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:606 +0x1c
testing.tRunner(0x466b2c0, 0x15c1fc0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 65 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b360)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_HealthServiceByID(0x466b360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:634 +0x1c
testing.tRunner(0x466b360, 0x15c2004)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 66 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b400)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_HealthServiceByName(0x466b400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:829 +0x1c
testing.tRunner(0x466b400, 0x15c2008)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 67 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b4a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Checks_ACLFilter(0x466b4a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1072 +0x1c
testing.tRunner(0x466b4a0, 0x15c1fbc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 68 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Self(0x466b540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1111 +0x20
testing.tRunner(0x466b540, 0x15c2144)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 69 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b5e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Self_ACLDeny(0x466b5e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1149 +0x1c
testing.tRunner(0x466b5e0, 0x15c2140)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 70 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b680)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Metrics_ACLDeny(0x466b680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1178 +0x1c
testing.tRunner(0x466b680, 0x15c2058)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 71 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b720)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Reload(0x466b720)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1207 +0x20
testing.tRunner(0x466b720, 0x15c212c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 72 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b7c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Reload_ACLDeny(0x466b7c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1280 +0x1c
testing.tRunner(0x466b7c0, 0x15c2128)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 73 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b860)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Members(0x466b860)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1307 +0x1c
testing.tRunner(0x466b860, 0x15c2054)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 74 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b900)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Members_WAN(0x466b900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1328 +0x1c
testing.tRunner(0x466b900, 0x15c2050)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 75 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b9a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Members_ACLFilter(0x466b9a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1349 +0x1c
testing.tRunner(0x466b9a0, 0x15c204c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 76 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466ba40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Join(0x466ba40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1380 +0x20
testing.tRunner(0x466ba40, 0x15c2038)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 77 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466bae0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Join_WAN(0x466bae0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1410 +0x20
testing.tRunner(0x466bae0, 0x15c2034)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 78 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466bb80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Join_ACLDeny(0x466bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1440 +0x20
testing.tRunner(0x466bb80, 0x15c2030)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 79 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466bc20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_JoinLANNotify(0x466bc20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1482 +0x20
testing.tRunner(0x466bc20, 0x15c202c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 81 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466bd60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Leave_ACLDeny(0x466bd60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1545 +0x1c
testing.tRunner(0x466bd60, 0x15c203c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 83 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466bea0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ForceLeave_ACLDeny(0x466bea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1620 +0x1c
testing.tRunner(0x466bea0, 0x15c1fe8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 84 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466bf40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck(0x466bf40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1649 +0x1c
testing.tRunner(0x466bf40, 0x15c20dc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 85 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802000)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_Scripts(0x4802000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1692 +0x20
testing.tRunner(0x4802000, 0x15c20d4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 86 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48020a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheckScriptsExecDisable(0x48020a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1777 +0x20
testing.tRunner(0x48020a0, 0x15c20c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 87 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheckScriptsExecRemoteDisable(0x4802140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1803 +0x20
testing.tRunner(0x4802140, 0x15c20c4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 88 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48021e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_Passing(0x48021e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1831 +0x1c
testing.tRunner(0x48021e0, 0x15c20d0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 89 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_BadStatus(0x4802280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1867 +0x1c
testing.tRunner(0x4802280, 0x15c20cc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 90 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_ACLDeny(0x4802320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1888 +0x1c
testing.tRunner(0x4802320, 0x15c20c8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 91 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48023c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterCheck(0x48023c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1914 +0x1c
testing.tRunner(0x48023c0, 0x15c1fcc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 92 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterCheckACLDeny(0x4802460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1940 +0x1c
testing.tRunner(0x4802460, 0x15c1fc8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 93 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PassCheck(0x4802500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1966 +0x1c
testing.tRunner(0x4802500, 0x15c2088)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 94 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48025a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PassCheck_ACLDeny(0x48025a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:1994 +0x1c
testing.tRunner(0x48025a0, 0x15c2084)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 95 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_WarnCheck(0x4802640)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2021 +0x1c
testing.tRunner(0x4802640, 0x15c219c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 96 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48026e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_WarnCheck_ACLDeny(0x48026e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2049 +0x1c
testing.tRunner(0x48026e0, 0x15c2198)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 97 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_FailCheck(0x4802780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2076 +0x1c
testing.tRunner(0x4802780, 0x15c1fe4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 98 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_FailCheck_ACLDeny(0x4802820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2104 +0x1c
testing.tRunner(0x4802820, 0x15c1fe0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 99 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48028c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_UpdateCheck(0x48028c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2131 +0x20
testing.tRunner(0x48028c0, 0x15c2194)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 100 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_UpdateCheck_ACLDeny(0x4802960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2215 +0x1c
testing.tRunner(0x4802960, 0x15c2190)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 101 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService(0x4802a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2244 +0x1c
testing.tRunner(0x4802a00, 0x15c2118)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 102 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802aa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_TranslateKeys(0x4802aa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2311 +0x20
testing.tRunner(0x4802aa0, 0x15c210c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 103 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802b40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ACLDeny(0x4802b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2514 +0x1c
testing.tRunner(0x4802b40, 0x15c20ec)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 104 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802be0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_InvalidAddress(0x4802be0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2552 +0x1c
testing.tRunner(0x4802be0, 0x15c20f4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 105 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802c80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ManagedConnectProxy(0x4802c80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2582 +0x20
testing.tRunner(0x4802c80, 0x15c2100)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 106 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802d20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ManagedConnectProxyDeprecated(0x4802d20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2653 +0x20
testing.tRunner(0x4802d20, 0x15c20f8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 107 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802dc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ManagedConnectProxy_Disabled(0x4802dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2751 +0x20
testing.tRunner(0x4802dc0, 0x15c20fc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 108 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802e60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_UnmanagedConnectProxy(0x4802e60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2790 +0x20
testing.tRunner(0x4802e60, 0x15c2114)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 109 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802f00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterServiceDeregisterService_Sidecar(0x4802f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:2913 +0x20
testing.tRunner(0x4802f00, 0x15c20e8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 110 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4802fa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_UnmanagedConnectProxyInvalid(0x4802fa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3394 +0x20
testing.tRunner(0x4802fa0, 0x15c2110)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 111 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ConnectNative(0x4803040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3427 +0x20
testing.tRunner(0x4803040, 0x15c20f0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 112 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48030e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ScriptCheck_ExecDisable(0x48030e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3461 +0x1c
testing.tRunner(0x48030e0, 0x15c2104)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 113 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterService_ScriptCheck_ExecRemoteDisable(0x4803180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3497 +0x1c
testing.tRunner(0x4803180, 0x15c2108)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 114 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822000)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterService(0x4822000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3535 +0x1c
testing.tRunner(0x4822000, 0x15c1fdc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 115 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48220a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterService_ACLDeny(0x48220a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3568 +0x1c
testing.tRunner(0x48220a0, 0x15c1fd0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 116 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterService_withManagedProxy(0x4822140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3597 +0x20
testing.tRunner(0x4822140, 0x15c1fd8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 117 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48221e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_DeregisterService_managedProxyDirect(0x48221e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3653 +0x20
testing.tRunner(0x48221e0, 0x15c1fd4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 118 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ServiceMaintenance_BadRequest(0x4822280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3701 +0x1c
testing.tRunner(0x4822280, 0x15c214c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 120 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48223c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ServiceMaintenance_Disable(0x48223c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3785 +0x20
testing.tRunner(0x48223c0, 0x15c2150)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 121 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ServiceMaintenance_ACLDeny(0x4822460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3822 +0x1c
testing.tRunner(0x4822460, 0x15c2148)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 122 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_NodeMaintenance_BadRequest(0x4822500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3852 +0x1c
testing.tRunner(0x4822500, 0x15c2078)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 123 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48225a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_NodeMaintenance_Enable(0x48225a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3869 +0x20
testing.tRunner(0x48225a0, 0x15c2080)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 124 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_NodeMaintenance_Disable(0x4822640)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3902 +0x1c
testing.tRunner(0x4822640, 0x15c207c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 125 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48226e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_NodeMaintenance_ACLDeny(0x48226e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3927 +0x1c
testing.tRunner(0x48226e0, 0x15c2074)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 126 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RegisterCheck_Service(0x4822780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:3948 +0x1c
testing.tRunner(0x4822780, 0x15c20d8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 128 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48228c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Monitor_ACLDeny(0x48228c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4063 +0x1c
testing.tRunner(0x48228c0, 0x15c2060)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 129 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Token(0x4822960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4081 +0x20
testing.tRunner(0x4822960, 0x15c218c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 130 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCARoots_empty(0x4822a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4339 +0x1c
testing.tRunner(0x4822a00, 0x15c1f4c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 131 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822aa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCARoots_list(0x4822aa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4354 +0x20
testing.tRunner(0x4822aa0, 0x15c1f50)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 132 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822b40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclDefaultDeny(0x4822b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4432 +0x20
testing.tRunner(0x4822b40, 0x15c1f2c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 133 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822be0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclProxyToken(0x4822be0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4469 +0x20
testing.tRunner(0x4822be0, 0x15c1f34)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 134 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822c80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclProxyTokenOther(0x4822c80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4515 +0x20
testing.tRunner(0x4822c80, 0x15c1f30)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 135 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822d20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclServiceWrite(0x4822d20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4580 +0x20
testing.tRunner(0x4822d20, 0x15c1f3c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 136 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822dc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_aclServiceReadDeny(0x4822dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4638 +0x20
testing.tRunner(0x4822dc0, 0x15c1f38)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 137 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822e60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectCALeafCert_good(0x4822e60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:4693 +0x20
testing.tRunner(0x4822e60, 0x15c1f48)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 140 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectProxyConfig_aclDefaultDeny(0x4823040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5139 +0x20
testing.tRunner(0x4823040, 0x15c1f5c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 141 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48230e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectProxyConfig_aclProxyToken(0x48230e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5175 +0x20
testing.tRunner(0x48230e0, 0x15c1f60)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 142 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectProxyConfig_aclServiceWrite(0x4823180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5223 +0x20
testing.tRunner(0x4823180, 0x15c1f68)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 143 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823220)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectProxyConfig_aclServiceReadDeny(0x4823220)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5282 +0x20
testing.tRunner(0x4823220, 0x15c1f64)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 145 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823360)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_badBody(0x4823360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5692 +0x20
testing.tRunner(0x4823360, 0x15c1f08)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 146 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823400)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_noTarget(0x4823400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5712 +0x20
testing.tRunner(0x4823400, 0x15c1f24)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 147 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48234a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_idInvalidFormat(0x48234a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5733 +0x20
testing.tRunner(0x48234a0, 0x15c1f1c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 148 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_idNotService(0x4823540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5757 +0x20
testing.tRunner(0x4823540, 0x15c1f20)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 149 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48235e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_allow(0x48235e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5781 +0x20
testing.tRunner(0x48235e0, 0x15c1f04)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 150 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823680)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_deny(0x4823680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5878 +0x20
testing.tRunner(0x4823680, 0x15c1f18)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 151 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823720)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_allowTrustDomain(0x4823720)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5927 +0x20
testing.tRunner(0x4823720, 0x15c1f00)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 152 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48237c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_denyWildcard(0x48237c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:5972 +0x20
testing.tRunner(0x48237c0, 0x15c1f14)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 153 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823860)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_serviceWrite(0x4823860)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6053 +0x20
testing.tRunner(0x4823860, 0x15c1f28)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 154 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823900)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_defaultDeny(0x4823900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6091 +0x20
testing.tRunner(0x4823900, 0x15c1f10)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 155 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48239a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentConnectAuthorize_defaultAllow(0x48239a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6115 +0x20
testing.tRunner(0x48239a0, 0x15c1f0c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 156 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823a40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Host(0x4823a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6161 +0x20
testing.tRunner(0x4823a40, 0x15c2010)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 157 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823ae0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_HostBadACL(0x4823ae0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_endpoint_test.go:6189 +0x20
testing.tRunner(0x4823ae0, 0x15c200c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 879 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_StartStop(0x45a4280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:114 +0x1c
testing.tRunner(0x45a4280, 0x15c2180)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1841 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0280)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0280, 0x15c2320)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac
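
A goroutine in "chan send" inside testing.tRunner.func1, like the one above, is the deferred part of tRunner trying to report a finished test on a channel that nothing is draining any more, as it would be after whatever failure produced this dump. A minimal, purely illustrative reproduction of that blocked-sender state:

package example

import (
	"fmt"
	"time"
)

// The anonymous goroutine parks in "chan send" because the channel has no
// receiver; the channel name here is illustrative, not testing's internals.
func main() {
	signal := make(chan bool)
	go func() {
		signal <- true // stays in "chan send" indefinitely
	}()
	time.Sleep(100 * time.Millisecond)
	fmt.Println("the sender goroutine is still blocked in chan send")
}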

goroutine 6953 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597fefc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a014, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x492a000, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x492a000, 0x482bab0, 0x0, 0x7a3ae4)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x49f8040, 0x48cf200, 0x12b58, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x49f8040, 0x0, 0x2c05c8, 0x4ab76a0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x44a8500, 0x49f8040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0
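
The "IO wait" goroutines, such as the one above, are memberlist's TCP transport sitting idle in Accept; a goroutine parked in the network poller is reported as "IO wait". A rough sketch of that accept-loop shape (illustrative only, not memberlist's actual code):

package example

import (
	"log"
	"net"
)

// While no connection arrives, AcceptTCP blocks in the poller and the
// goroutine shows up as "IO wait" in a dump like the one above.
func acceptLoop(ln *net.TCPListener, conns chan<- net.Conn) {
	for {
		conn, err := ln.AcceptTCP()
		if err != nil {
			log.Printf("accept failed: %v", err)
			return
		}
		conns <- conn // hand the connection off for processing
	}
}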

goroutine 972 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PersistService(0x45a5540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1439 +0x20
testing.tRunner(0x45a5540, 0x15c2094)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1701 [chan send]:
testing.tRunner.func1(0x45741e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x45741e0, 0x15c2364)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 880 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a43c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RPCPing(0x45a43c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:133 +0x1c
testing.tRunner(0x45a43c0, 0x15c20b0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1203 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823cc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadCheckState(0x4823cc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2877 +0x20
testing.tRunner(0x4823cc0, 0x15c21a4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1226 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803d60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogServices_NodeMetaFilter(0x4803d60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:446 +0x20
testing.tRunner(0x4803d60, 0x15c2254)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1231 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x46140a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogServiceNodes_ConnectProxy(0x46140a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:831 +0x20
testing.tRunner(0x46140a0, 0x15c2240)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1232 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogConnectServiceNodes_good(0x4614140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:861 +0x20
testing.tRunner(0x4614140, 0x15c220c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1202 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823b80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_persistCheckState(0x4823b80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2830 +0x20
testing.tRunner(0x4823b80, 0x15c21d4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 881 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_TokenStore(0x45a4460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:144 +0x1c
testing.tRunner(0x45a4460, 0x15c2188)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6943 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee160, 0x3b9aca00, 0x0, 0x4bf0c40, 0x4b73d40, 0x4b8a440)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 1706 [chan send]:
testing.tRunner.func1(0x4574500)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574500, 0x15c237c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1702 [chan send]:
testing.tRunner.func1(0x4574280)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574280, 0x15c2330)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7229 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x4973b90)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 6938 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597fd70, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca294, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x44ca280, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x44ca280, 0x45ee4b0, 0x48cf15c, 0x9ecfe07c)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4a74420, 0x48cf140, 0x2, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4a74420, 0x7229c, 0x48cf140, 0x156537e)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4bf0a80, 0x4a74420)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 1249 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614be0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_CaseInsensitiveNodeLookup(0x4614be0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:273 +0x20
testing.tRunner(0x4614be0, 0x15c229c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6949 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x4763980, 0x4811420, 0x0, 0x4afbfc0, 0x4a57b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x4c43280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 1688 [chan send]:
testing.tRunner.func1(0x4615540)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4615540, 0x15c22c8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1923 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f03c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f03c0, 0x15c2318)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1236 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x46143c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestConnectCARoots_empty(0x46143c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/connect_ca_endpoint_test.go:21 +0x1c
testing.tRunner(0x46143c0, 0x15c2260)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7013 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x45439e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 6945 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee160, 0xbebc200, 0x0, 0x4bf0c80, 0x4b73d40, 0x4b8a450)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 1963 [chan send, 2 minutes]:
testing.tRunner.func1(0x44f1cc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1cc0, 0x15c2418)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 962 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4f00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_ExecRemoteDisable(0x45a4f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1061 +0x1c
testing.tRunner(0x45a4f00, 0x15c1f84)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 986 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5e00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadServices_sidecarSeparateToken(0x45a5e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2179 +0x1c
testing.tRunner(0x45a5e00, 0x15c21c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 989 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_unloadServices(0x4822320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2289 +0x20
testing.tRunner(0x4822320, 0x15c21f8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 953 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RemoveService(0x45a4960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:556 +0x20
testing.tRunner(0x45a4960, 0x15c213c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6907 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4fbc140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionsForNode(0x4fbc140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:547 +0x1c
testing.tRunner(0x4fbc140, 0x15c2688)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7096 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa59497ec, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a2e4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x492a2d0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x492a2d0, 0x493a784, 0x493a774, 0x2)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x49f86d0, 0x10000, 0x10000, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x49f86d0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4bd4e00, 0x49f86d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 6545 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f60a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_Recurse(0x51f60a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:76 +0x20
testing.tRunner(0x51f60a0, 0x15c2524)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 947 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a45a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ReconnectConfigWanDisabled(0x45a45a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:201 +0x1c
testing.tRunner(0x45a45a0, 0x15c20bc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 961 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4e60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_ExecDisable(0x45a4e60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1024 +0x1c
testing.tRunner(0x45a4e60, 0x15c1f80)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6528 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba15e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsCheck_noSource(0x4ba15e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:244 +0x20
testing.tRunner(0x4ba15e0, 0x15c24d0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6984 [select, 1 minutes]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x4bf0cc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7139 [IO wait]:
internal/poll.runtime_pollWait(0xa597f848, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492aa64, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x492aa50, 0x4a28000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x492aa50, 0x4a28000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4b8a998, 0x4a28000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x44ab9b0, 0x49ab770, 0xc, 0xc, 0x4631730, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x44ab9b0, 0x49ab770, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x48ba460, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x48ba460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 1239 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x46145a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCoordinate_Disabled_Response(0x46145a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/coordinate_endpoint_test.go:18 +0x1c
testing.tRunner(0x46145a0, 0x15c2274)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 988 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5f40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadServices_sidecarOverrideMeta(0x45a5f40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2248 +0x1c
testing.tRunner(0x45a5f40, 0x15c21bc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6443 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4fbc500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestParseWait_InvalidIndex(0x4fbc500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:848 +0x1c
testing.tRunner(0x4fbc500, 0x15c2598)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1225 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803cc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogServices(0x4803cc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:410 +0x20
testing.tRunner(0x4803cc0, 0x15c2258)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 991 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822f00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadProxies_nilProxy(0x4822f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2363 +0x1c
testing.tRunner(0x4822f00, 0x15c21b0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6425 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba08c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestContentTypeIsJSON(0x4ba08c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:424 +0x1c
testing.tRunner(0x4ba08c0, 0x15c226c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1675 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_SOA_Settings(0x4614640)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:1121 +0x1c
testing.tRunner(0x4614640, 0x15c2344)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7008 [select, 3 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x44caaf0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 1690 [chan send]:
testing.tRunner.func1(0x4615680)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4615680, 0x15c239c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 982 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5b80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadChecks_token(0x45a5b80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2051 +0x1c
testing.tRunner(0x45a5b80, 0x15c21ac)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6608 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionDestroy(0x5202960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:309 +0x1c
testing.tRunner(0x5202960, 0x15c2660)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1104 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5400)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_checkStateSnapshot(0x45a5400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2738 +0x1c
testing.tRunner(0x45a5400, 0x15c21a0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1238 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestConnectCAConfig(0x4614500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/connect_ca_endpoint_test.go:65 +0x20
testing.tRunner(0x4614500, 0x15c225c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1103 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a41e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_NodeMaintenanceMode(0x45a41e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2693 +0x1c
testing.tRunner(0x45a41e0, 0x15c2070)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1670 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466a5a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ReverseLookup_CustomDomain(0x466a5a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:893 +0x20
testing.tRunner(0x466a5a0, 0x15c2338)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6955 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4764000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 1230 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614000)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogServiceNodes_DistanceSort(0x4614000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:741 +0x20
testing.tRunner(0x4614000, 0x15c2244)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6609 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCustomTTL(0x5202a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:328 +0x1c
testing.tRunner(0x5202a00, 0x15c2658)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6954 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f9d4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a064, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x492a050, 0x49d4000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x492a050, 0x49d4000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x4410178, 0x49d4000, 0x10000, 0x10000, 0x45b8734, 0x101, 0x45b8708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x4410178, 0x49d4000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x44a8500, 0x4410178)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 576 [chan receive, 4 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4b22180, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).raftApply(0x48bf500, 0x0, 0x1407d08, 0x4572190, 0x0, 0x0, 0x0, 0x11b50c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:349 +0x120
github.com/hashicorp/consul/agent/consul.(*Catalog).Register(0x4a39930, 0x4572190, 0x2666c9c, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/catalog_endpoint.go:141 +0x2e0
reflect.Value.call(0x4598740, 0x4a39990, 0x13, 0x1538345, 0x4, 0x4ac8d88, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598740, 0x4a39990, 0x13, 0x4ac8d88, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x464d5a0, 0x4a415c0, 0x46563a8, 0x0, 0x48502d0, 0x49ed860, 0x1407d08, 0x4572190, 0x16, 0x11b50c0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x4a415c0, 0x181be20, 0x494b860, 0x0, 0x4ac8ec4)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x48bf500, 0x154971f, 0x10, 0x1407d08, 0x4572140, 0x11b50c0, 0x2666c9c, 0x38, 0x4947640)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent/local.(*State).syncNodeInfo(0x4542c60, 0x464cdc0, 0x4ac8ef4)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/local/state.go:1457 +0x118
github.com/hashicorp/consul/agent/local.(*State).SyncChanges(0x4542c60, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/local/state.go:1254 +0x3a0
github.com/hashicorp/consul/agent/local.(*State).SyncFull(0x4542c60, 0x153de00, 0x8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/local/state.go:1197 +0x44
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x492b900, 0x153de26, 0x8, 0x1, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:167 +0x3c4
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x492b900, 0x153de26, 0x8, 0x4ac8fdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x492b900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 6546 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_DELETE_CAS(0x51f6140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:152 +0x20
testing.tRunner(0x51f6140, 0x15c250c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1205 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823e00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_GetCoordinate(0x4823e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2970 +0x1c
testing.tRunner(0x4823e00, 0x15c1ff0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7009 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x4acd2e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 6965 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4678000, 0x1539225, 0x5, 0x4770140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 946 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ReconnectConfigSettings(0x45a4500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:165 +0x1c
testing.tRunner(0x45a4500, 0x15c20b8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1672 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466bcc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceReverseLookup(0x466bcc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:975 +0x20
testing.tRunner(0x466bcc0, 0x15c23c4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6550 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f63c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_GET_Raw(0x51f63c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:402 +0x20
testing.tRunner(0x51f63c0, 0x15c2514)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6420 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHTTPAPI_BlockEndpoints(0x4ba0500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:288 +0x20
testing.tRunner(0x4ba0500, 0x15c2440)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1689 [chan send]:
testing.tRunner.func1(0x46155e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x46155e0, 0x15c2398)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 974 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5680)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PurgeService(0x45a5680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1573 +0x20
testing.tRunner(0x45a5680, 0x15c20ac)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1223 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803b80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogNodes_Blocking(0x4803b80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:255 +0x20
testing.tRunner(0x4803b80, 0x15c2228)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7019 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x45500b0, 0x2a05f200, 0x1, 0x4541fc0, 0x4bcd880, 0x50426d8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 1699 [chan send]:
testing.tRunner.func1(0x45740a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x45740a0, 0x15c2384)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7101 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee420, 0x2a05f200, 0x1, 0x4bd4f80, 0x49f47c0, 0x4b8a530)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 6522 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1220)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsList_values(0x4ba1220)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:33 +0x20
testing.tRunner(0x4ba1220, 0x15c24e0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1708 [chan send]:
testing.tRunner.func1(0x4574640)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574640, 0x15c2388)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6996 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x482b6c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 6418 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba03c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSetLastContact(0x4ba03c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:241 +0x20
testing.tRunner(0x4ba03c0, 0x15c2698)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 966 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_Alias_userToken(0x45a5180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1190 +0x1c
testing.tRunner(0x45a5180, 0x15c1f78)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1254 [chan send]:
testing.tRunner.func1(0x4614f00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4614f00, 0x15c22ec)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1714 [chan receive, 1 minutes]:
testing.tRunner.func1(0x4574a00)
	/usr/lib/go-1.13/src/testing/testing.go:885 +0x1b4
testing.tRunner(0x4574a00, 0x15c235c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7075 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x49529a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 6525 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1400)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsMatch_byInvalid(0x4ba1400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:154 +0x20
testing.tRunner(0x4ba1400, 0x15c24e8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6542 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1ea0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_InitKeyring(0x4ba1ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/keyring_test.go:236 +0x20
testing.tRunner(0x4ba1ea0, 0x15c2020)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 950 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddService(0x45a4780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:328 +0x20
testing.tRunner(0x45a4780, 0x15c1fb8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1962 [chan send, 2 minutes]:
testing.tRunner.func1(0x44f1c20)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1c20, 0x15c248c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 4537 [select, 3 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x181bb80, 0x504d420, 0x4733a14, 0x20, 0x20, 0x50429a8, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x504d380, 0x181bb80, 0x504d420, 0x50429b0, 0x181bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x504d380, 0x50a8540, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x48bfa40, 0x47b479c, 0x50a851c, 0x4733b78, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:430 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x454cff0, 0x47b4770, 0x50a8500, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4598980, 0x4799650, 0x13, 0x1538345, 0x4, 0x4733d54, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598980, 0x4799650, 0x13, 0x49a1d54, 0x3, 0x3, 0x1284901, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x4a69a80, 0x48ae330, 0x50be028, 0x0, 0x454d0e0, 0x49f71e0, 0x1480eb8, 0x47b4770, 0x16, 0x1199a78, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x48ae330, 0x181be20, 0x504d2c0, 0x44e40d8, 0xffffffff)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x48bfa40, 0x1547bdf, 0xf, 0x1480eb8, 0x47b4690, 0x1199a78, 0x50a84c0, 0x7c0510, 0x40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent.(*Agent).RPC(0x44e4000, 0x1547bdf, 0xf, 0x1480eb8, 0x47b4690, 0x1199a78, 0x50a84c0, 0x4ff52c4, 0x485a960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1412 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x4a38838, 0xe, 0x0, 0xb2c97000, 0x8b, 0x504d2a0, 0x1807978, 0x47b4690, 0x0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6c8, 0x4a38838, 0x48c53c0, 0x1199a78, 0x50a8440, 0x0, 0x0, 0x0, 0x0, 0xe, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:467 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 6415 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba01e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHTTPServer_H2(0x4ba01e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:131 +0x20
testing.tRunner(0x4ba01e0, 0x15c245c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6552 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_DELETE_ConflictingFlags(0x51f6500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:451 +0x20
testing.tRunner(0x51f6500, 0x15c2510)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7100 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x44ee420)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 1698 [chan send]:
testing.tRunner.func1(0x4574000)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574000, 0x15c2314)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6611 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202b40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionGet(0x5202b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:453 +0x1c
testing.tRunner(0x5202b40, 0x15c266c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1957 [chan send, 2 minutes]:
testing.tRunner.func1(0x44f1900)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1900, 0x15c24a8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1687 [chan send]:
testing.tRunner.func1(0x46154a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x46154a0, 0x15c22d4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7167 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x5064cf0, 0x0, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x4928380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 6532 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1860)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsSpecificGet_good(0x4ba1860)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:323 +0x20
testing.tRunner(0x4ba1860, 0x15c24f8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 981 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5ae0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PurgeCheckOnDuplicate(0x45a5ae0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1985 +0x20
testing.tRunner(0x45a5ae0, 0x15c2098)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6413 [chan send, 2 minutes]:
testing.tRunner.func1(0x4ba00a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4ba00a0, 0x15c2464)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7014 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f740, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a474, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x492a460, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x492a460, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4a225e0, 0x1, 0x0, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4a225e0, 0x23caed8, 0x10000, 0x10000)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4541b40, 0x4a225e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 1712 [chan send]:
testing.tRunner.func1(0x45748c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x45748c0, 0x15c2374)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1211 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803400)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_SetupProxyManager(0x4803400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:3401 +0x1c
testing.tRunner(0x4803400, 0x15c217c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1674 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceReverseLookup_CustomDomain(0x4614320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:1071 +0x20
testing.tRunner(0x4614320, 0x15c23bc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1669 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803f40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ReverseLookup(0x4803f40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:853 +0x20
testing.tRunner(0x4803f40, 0x15c2340)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 975 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5720)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PurgeServiceOnDuplicate(0x45a5720)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1612 +0x20
testing.tRunner(0x45a5720, 0x15c20a8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1222 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803ae0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogNodes_WanTranslation(0x4803ae0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:157 +0x20
testing.tRunner(0x4803ae0, 0x15c2234)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6918 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f950, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x454c064, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x454c050, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x454c050, 0x1449b10, 0x445a5a0, 0xb6d49a00)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4780000, 0xa4, 0x18, 0x478b460)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4780000, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x4780000, 0x20, 0x1449b10, 0x2f0001, 0x478b460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x4542b40, 0x1813fc0, 0x46500d0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x44e4f00, 0x49f00c0, 0x4932340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 1673 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466be00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceReverseLookup_IPV6(0x466be00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:1023 +0x20
testing.tRunner(0x466be00, 0x15c23c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6969 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a721c0, 0x473c080)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:210 +0x2a8
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4970390, 0x4a721c0, 0x473c080)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 6601 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHandleRemoteExec(0x5202500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:329 +0x1c
testing.tRunner(0x5202500, 0x15c2474)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6960 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4764000, 0x1dcd6500, 0x0, 0x44a8800, 0x4a70440, 0x4410768)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 6921 [select, 1 minutes]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x44cbd60, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x44cbd60, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x44cbd60, 0x153de26, 0x8, 0x4ac6fdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x44cbd60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 6978 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x46786c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7089 [IO wait]:
internal/poll.runtime_pollWait(0xa59494d4, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x48515a4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4851590, 0x4a33000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4851590, 0x4a33000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4798068, 0x4a33000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x4439a40, 0x49130a0, 0xc, 0xc, 0x482b960, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x4439a40, 0x49130a0, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4800c40, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4800c40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 1925 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0500)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0500, 0x15c23d8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 955 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4aa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_IndexChurn(0x45a4aa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:738 +0x1c
testing.tRunner(0x45a4aa0, 0x15c201c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 971 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a54a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_updateTTLCheck(0x45a54a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1405 +0x1c
testing.tRunner(0x45a54a0, 0x15c21fc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 978 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5900)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PurgeProxyOnDuplicate(0x45a5900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1784 +0x20
testing.tRunner(0x45a5900, 0x15c20a0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 963 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4fa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_GRPC(0x45a4fa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1091 +0x1c
testing.tRunner(0x45a4fa0, 0x15c1f88)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 990 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadProxies(0x4822820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2325 +0x1c
testing.tRunner(0x4822820, 0x15c21b4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1951 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f1540)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1540, 0x15c2480)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1247 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614aa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_Over_TCP(0x4614aa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:144 +0x20
testing.tRunner(0x4614aa0, 0x15c230c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 958 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4c80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_MinInterval(0x45a4c80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:923 +0x1c
testing.tRunner(0x45a4c80, 0x15c1f8c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1955 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f17c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f17c0, 0x15c2498)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6428 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0aa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPrettyPrintBare(0x4ba0aa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:504 +0x1c
testing.tRunner(0x4ba0aa0, 0x15c2618)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1676 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceReverseLookupNodeAddress(0x4614a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:1154 +0x20
testing.tRunner(0x4614a00, 0x15c23b8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 878 [select, 4 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x455c870)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 1956 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f1860)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1860, 0x15c24b4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1707 [chan send]:
testing.tRunner.func1(0x45745a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x45745a0, 0x15c2380)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6419 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSetMeta(0x4ba0460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:265 +0x1c
testing.tRunner(0x4ba0460, 0x15c269c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6529 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1680)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsCheck_noDestination(0x4ba1680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:261 +0x20
testing.tRunner(0x4ba1680, 0x15c24cc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6983 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x44afc00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 1839 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0140)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0140, 0x15c2308)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1642 [chan send]:
testing.tRunner.func1(0x44fa000)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44fa000, 0x15c2310)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7242 [select, 3 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x4e35340, 0x15c2894, 0x4a73340, 0x1824300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x4a73340, 0x1824420, 0x49b8e98)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:133 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x4a73340, 0x1824420, 0x49b8e98, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:112 +0x434
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 6544 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6000)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_PUT_GET_DELETE(0x51f6000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:17 +0x20
testing.tRunner(0x51f6000, 0x15c2520)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6963 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4678000, 0x153a3b9, 0x6, 0x4770100)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 1210 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803360)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ReLoadProxiesFromConfig(0x4803360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:3336 +0x24
testing.tRunner(0x4803360, 0x15c20b4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1244 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x46148c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCoordinate_Update_ACLDeny(0x46148c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/coordinate_endpoint_test.go:338 +0x20
testing.tRunner(0x46148c0, 0x15c2280)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1233 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x46141e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogNodeServices(0x46141e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:889 +0x20
testing.tRunner(0x46141e0, 0x15c2220)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7079 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x44adab0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 6426 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHTTP_wrap_obfuscateLog(0x4ba0960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:445 +0x20
testing.tRunner(0x4ba0960, 0x15c246c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1245 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRecursorAddr(0x4614960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:104 +0x1c
testing.tRunner(0x4614960, 0x15c2620)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1667 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823c20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_EDNS0(0x4823c20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:721 +0x20
testing.tRunner(0x4823c20, 0x15c22bc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1700 [chan send]:
testing.tRunner.func1(0x4574140)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574140, 0x15c2368)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 977 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5860)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PurgeProxy(0x45a5860)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1744 +0x20
testing.tRunner(0x45a5860, 0x15c20a4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 956 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4b40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck(0x45a4b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:845 +0x1c
testing.tRunner(0x45a4b40, 0x15c1fa0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1234 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogNodeServices_ConnectProxy(0x4614280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:934 +0x20
testing.tRunner(0x4614280, 0x15c2218)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1668 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803220)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_EDNS0_ECS(0x4803220)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:761 +0x20
testing.tRunner(0x4803220, 0x15c22b8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1250 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614c80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_NodeLookup_PeriodName(0x4614c80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:305 +0x20
testing.tRunner(0x4614c80, 0x15c22f0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 567 [semacquire, 4 minutes]:
sync.runtime_SemacquireMutex(0x4542c64, 0xa5b00, 0x1)
	/usr/lib/go-1.13/src/runtime/sema.go:71 +0x34
sync.(*Mutex).lockSlow(0x4542c60)
	/usr/lib/go-1.13/src/sync/mutex.go:138 +0x218
sync.(*Mutex).Lock(0x4542c60)
	/usr/lib/go-1.13/src/sync/mutex.go:81 +0x4c
sync.(*RWMutex).Lock(0x4542c60)
	/usr/lib/go-1.13/src/sync/rwmutex.go:98 +0x20
github.com/hashicorp/consul/agent/local.(*State).StopNotify(0x4542c60, 0x45e4fc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/local/state.go:939 +0x20
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x4535290, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:120 +0x104
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x48ee000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 992 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4822fa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_unloadProxies(0x4822fa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2384 +0x1c
testing.tRunner(0x4822fa0, 0x15c21f4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6964 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4678000, 0x1539045, 0x5, 0x4770120)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 1709 [chan send]:
testing.tRunner.func1(0x45746e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x45746e0, 0x15c2204)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6417 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSetKnownLeader(0x4ba0320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:225 +0x1c
testing.tRunner(0x4ba0320, 0x15c2694)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 985 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5d60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadServices_sidecar(0x45a5d60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2145 +0x1c
testing.tRunner(0x45a5d60, 0x15c21c4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6967 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x44afc00, 0x4a70700)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:210 +0x2a8
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4441a50, 0x44afc00, 0x4a70700)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 1242 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCoordinate_Node(0x4614780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/coordinate_endpoint_test.go:187 +0x20
testing.tRunner(0x4614780, 0x15c227c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1102 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_restoresSnapshot(0x45a4140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2650 +0x1c
testing.tRunner(0x45a4140, 0x15c1f9c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1682 [chan receive]:
github.com/hashicorp/serf/serf.(*Snapshotter).Wait(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:190
github.com/hashicorp/serf/serf.(*Serf).Shutdown(0x4ce5e60, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:863 +0xf4
github.com/hashicorp/consul/agent/consul.(*Server).Shutdown(0x5592540, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:737 +0x210
github.com/hashicorp/consul/agent.(*Agent).ShutdownAgent(0x48eef00, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1501 +0x578
github.com/hashicorp/consul/agent.(*TestAgent).Shutdown(0x4f7b130, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/testagent.go:245 +0x60
github.com/hashicorp/consul/agent.TestDNS_ConnectServiceLookup(0x4615180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:1615 +0x8c0
testing.tRunner(0x4615180, 0x15c22b4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1926 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f05a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f05a0, 0x15c23e0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1683 [chan send]:
testing.tRunner.func1(0x4615220)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4615220, 0x15c22c0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 976 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a57c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PersistProxy(0x45a57c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1664 +0x20
testing.tRunner(0x45a57c0, 0x15c2090)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7154 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x4a41b60, 0x0, 0x0, 0x48b0be8, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x4928700)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 1243 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCoordinate_Update(0x4614820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/coordinate_endpoint_test.go:291 +0x20
testing.tRunner(0x4614820, 0x15c2284)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 4538 [select, 3 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x50a8540, 0x50429b0, 0x181bb80, 0x504d420)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 1215 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803680)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ReloadConfigTLSConfigFailure(0x4803680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:3654 +0x20
testing.tRunner(0x4803680, 0x15c2124)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6966 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x47f3800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 952 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a48c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddServiceNoRemoteExec(0x45a48c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:530 +0x20
testing.tRunner(0x45a48c0, 0x15c1fb0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1214 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48035e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ReloadConfigIncomingRPCConfig(0x48035e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:3614 +0x20
testing.tRunner(0x48035e0, 0x15c211c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1224 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803c20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogNodes_DistanceSort(0x4803c20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:325 +0x20
testing.tRunner(0x4803c20, 0x15c222c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 957 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4be0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_StartPassing(0x45a4be0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:884 +0x1c
testing.tRunner(0x45a4be0, 0x15c1f98)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7234 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4800c40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 6736 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x4a34200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x4a34200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x4a34200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4a34200, 0x50423b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 1659 [chan send]:
testing.tRunner.func1(0x44f0000)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0000, 0x15c23a8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1685 [chan send]:
testing.tRunner.func1(0x4615360)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4615360, 0x15c22c4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1209 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48032c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RemoveProxy(0x48032c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:3297 +0x1c
testing.tRunner(0x48032c0, 0x15c2134)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 987 [chan send, 1 minutes]:
testing.tRunner.func1(0x45a5ea0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x45a5ea0, 0x15c21b8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1216 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803720)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestBlacklist(0x4803720)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/blacklist_test.go:8 +0x20
testing.tRunner(0x4803720, 0x15c2208)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6516 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0e60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestParseConsistency_Invalid(0x4ba0e60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:951 +0x20
testing.tRunner(0x4ba0e60, 0x15c2584)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6535 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1a40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsSpecificDelete_good(0x4ba1a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:422 +0x20
testing.tRunner(0x4ba1a40, 0x15c24f4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1213 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_ReloadConfigOutgoingRPCConfig(0x4803540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:3579 +0x20
testing.tRunner(0x4803540, 0x15c2120)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1207 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823f40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_reloadWatchesHTTPS(0x4823f40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:3049 +0x20
testing.tRunner(0x4823f40, 0x15c21e0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 979 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a59a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_PersistCheck(0x45a59a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1845 +0x20
testing.tRunner(0x45a59a0, 0x15c208c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7102 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x44ee420, 0x49f47c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 1237 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestConnectCARoots_list(0x4614460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/connect_ca_endpoint_test.go:36 +0x20
testing.tRunner(0x4614460, 0x15c2264)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6599 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x52023c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecWrites_ACLAgentToken(0x52023c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:188 +0x1c
testing.tRunner(0x52023c0, 0x15c2634)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6518 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0fa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestEnableWebUI(0x4ba0fa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:1106 +0x1c
testing.tRunner(0x4ba0fa0, 0x15c23f0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1703 [chan send]:
testing.tRunner.func1(0x4574320)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574320, 0x15c232c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 965 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a50e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_Alias_setToken(0x45a50e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1162 +0x1c
testing.tRunner(0x45a50e0, 0x15c1f70)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1671 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x466b180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ReverseLookup_IPV6(0x466b180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:935 +0x20
testing.tRunner(0x466b180, 0x15c233c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 967 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5220)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_Alias_userAndSetToken(0x45a5220)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1220 +0x1c
testing.tRunner(0x45a5220, 0x15c1f74)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1705 [chan send]:
testing.tRunner.func1(0x4574460)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574460, 0x15c2370)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 951 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddServiceNoExec(0x45a4820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:500 +0x20
testing.tRunner(0x45a4820, 0x15c1fac)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1105 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5a40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadChecks_checkFails(0x45a5a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2795 +0x1c
testing.tRunner(0x45a5a40, 0x15c21a8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6414 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHTTPServer_UnixSocket_FileExists(0x4ba0140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:92 +0x20
testing.tRunner(0x4ba0140, 0x15c2460)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1681 [chan receive]:
github.com/hashicorp/serf/serf.(*Snapshotter).Wait(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:190
github.com/hashicorp/serf/serf.(*Serf).Shutdown(0x5b29680, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:863 +0xf4
github.com/hashicorp/consul/agent/consul.(*Server).Shutdown(0x5a08380, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:737 +0x210
github.com/hashicorp/consul/agent.(*Agent).ShutdownAgent(0x4576c80, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1501 +0x578
github.com/hashicorp/consul/agent.(*TestAgent).Shutdown(0x5581860, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/testagent.go:245 +0x60
github.com/hashicorp/consul/agent.TestDNS_ServiceLookupWithInternalServiceAddress(0x46150e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:1569 +0x698
testing.tRunner(0x46150e0, 0x15c2354)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1228 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803ea0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogServiceNodes_NodeMetaFilter(0x4803ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:600 +0x20
testing.tRunner(0x4803ea0, 0x15c2248)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6547 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f61e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_CAS(0x51f61e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:218 +0x20
testing.tRunner(0x51f61e0, 0x15c2508)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1679 [runnable]:
syscall.Syscall(0x94, 0x7b, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fdatasync(0x7b, 0x1000, 0x0)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:429 +0x30
github.com/boltdb/bolt.fdatasync(0x5e46a20, 0x1000, 0x1000)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/bolt_linux.go:9 +0x40
github.com/boltdb/bolt.(*Tx).writeMeta(0x59be080, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:556 +0xfc
github.com/boltdb/bolt.(*Tx).Commit(0x59be080, 0x23c0274, 0x4)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:221 +0x3e8
github.com/hashicorp/raft-boltdb.(*BoltStore).initialize(0x5f0e040, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:105 +0x124
github.com/hashicorp/raft-boltdb.New(0x5cfc460, 0x46, 0x0, 0x5cfc400, 0x46, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:81 +0xd8
github.com/hashicorp/raft-boltdb.NewBoltStore(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:60
github.com/hashicorp/consul/agent/consul.(*Server).setupRaft(0x6238380, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:552 +0x778
github.com/hashicorp/consul/agent/consul.NewServerLogger(0x5733a00, 0x5df0870, 0x624a000, 0x5df08d0, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:384 +0x680
github.com/hashicorp/consul/agent.(*Agent).Start(0x45772c0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:404 +0x44c
github.com/hashicorp/consul/agent.(*TestAgent).Start(0x5dc1a90, 0x4614fa0, 0x67f7c)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/testagent.go:162 +0x6e0
github.com/hashicorp/consul/agent.NewTestAgent(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/testagent.go:101
github.com/hashicorp/consul/agent.TestDNS_ServiceLookupMultiAddrNoCNAME(0x4614fa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:1317 +0xa0
testing.tRunner(0x4614fa0, 0x15c2348)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1924 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0460)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0460, 0x15c23d4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 960 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4dc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_RestoreState(0x45a4dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:981 +0x20
testing.tRunner(0x45a4dc0, 0x15c1f94)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 969 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5360)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_HTTPCheck_TLSSkipVerify(0x45a5360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1298 +0x20
testing.tRunner(0x45a5360, 0x15c2000)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6982 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x46786c0, 0x1539225, 0x5, 0x51bce40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 1691 [chan send]:
testing.tRunner.func1(0x4615720)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4615720, 0x15c2394)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1684 [chan send]:
testing.tRunner.func1(0x46152c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x46152c0, 0x15c22cc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1932 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0960)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0960, 0x15c22a8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1678 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614e60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookupPreferNoCNAME(0x4614e60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:1258 +0x20
testing.tRunner(0x4614e60, 0x15c2350)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7021 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x45500b0, 0x1dcd6500, 0x0, 0x4afa000, 0x4bcd880, 0x50426e8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 1680 [semacquire]:
sync.runtime_Semacquire(0x594f8d8)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x594f8d0)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/testutil/retry.run(0x1807a38, 0x57bb290, 0x180f430, 0x4615040, 0x5f0eb20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/testutil/retry/retry.go:132 +0x108
github.com/hashicorp/consul/testutil/retry.Run(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/testutil/retry/retry.go:90
github.com/hashicorp/consul/agent.(*TestAgent).Start(0x57ef040, 0x4615040, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/testagent.go:194 +0xb48
github.com/hashicorp/consul/agent.NewTestAgent(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/testagent.go:101
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup(0x4615040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:1391 +0xa0
testing.tRunner(0x4615040, 0x15c23b4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6530 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1720)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsCreate_good(0x4ba1720)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:278 +0x20
testing.tRunner(0x4ba1720, 0x15c24d4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1927 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0640)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0640, 0x15c23dc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 983 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5c20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_unloadChecks(0x45a5c20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2072 +0x20
testing.tRunner(0x45a5c20, 0x15c21f0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7078 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x44adab0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 1218 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803860)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogDeregister(0x4803860)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:44 +0x1c
testing.tRunner(0x4803860, 0x15c2214)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 964 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_Alias(0x45a5040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1128 +0x1c
testing.tRunner(0x45a5040, 0x15c1f7c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1217 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48037c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogRegister_Service_InvalidAddress(0x48037c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:19 +0x1c
testing.tRunner(0x48037c0, 0x15c223c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 954 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RemoveServiceRemovesAllChecks(0x45a4a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:685 +0x20
testing.tRunner(0x45a4a00, 0x15c2138)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1252 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614dc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNSCycleRecursorCheck(0x4614dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:388 +0x20
testing.tRunner(0x4614dc0, 0x15c228c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1946 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f1220)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1220, 0x15c2408)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 949 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a46e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_makeNodeID(0x45a46e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:282 +0x1c
testing.tRunner(0x45a46e0, 0x15c21d0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6560 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_KeyringUse(0x51f6a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:223 +0x20
testing.tRunner(0x51f6a00, 0x15c2550)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1704 [chan send]:
testing.tRunner.func1(0x45743c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x45743c0, 0x15c2334)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1101 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a40a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddService_restoresSnapshot(0x45a40a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2607 +0x1c
testing.tRunner(0x45a40a0, 0x15c1fb4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7341 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x4d62400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4d62400, 0x50422b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 1212 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48034a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadTokens(0x48034a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:3422 +0x1c
testing.tRunner(0x48034a0, 0x15c21cc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6993 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x44e4f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 993 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48232c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_Service_MaintenanceMode(0x48232c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2415 +0x20
testing.tRunner(0x48232c0, 0x15c215c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7282 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x50a5b90)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7236 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x45a1dc0, 0x4ad5980)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:210 +0x2a8
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x498a780, 0x45a1dc0, 0x4ad5980)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 7011 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x49ec500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 6531 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba17c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsCreate_noBody(0x4ba17c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:310 +0x20
testing.tRunner(0x4ba17c0, 0x15c24d8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 973 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a55e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_persistedService_compat(0x45a55e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1529 +0x20
testing.tRunner(0x45a55e0, 0x15c21d8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 984 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a5cc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_loadServices_token(0x45a5cc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2124 +0x1c
testing.tRunner(0x45a5cc0, 0x15c21c8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1961 [chan send, 2 minutes]:
testing.tRunner.func1(0x44f1b80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1b80, 0x15c2490)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1943 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f1040)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1040, 0x15c2414)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1950 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f14a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f14a0, 0x15c2484)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6933 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x492e600)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x492e600)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x492e600)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x492e600, 0x4b8a0e8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 1204 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823d60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_purgeCheckState(0x4823d60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2938 +0x20
testing.tRunner(0x4823d60, 0x15c21dc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1960 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f1ae0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1ae0, 0x15c24b0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1206 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4823ea0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_reloadWatches(0x4823ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:2991 +0x20
testing.tRunner(0x4823ea0, 0x15c21e4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6424 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHTTPAPIResponseHeaders(0x4ba0820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:393 +0x20
testing.tRunner(0x4ba0820, 0x15c242c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6737 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x4a34200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4a34200, 0x50423b8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 968 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a52c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_RemoveCheck(0x45a52c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:1250 +0x1c
testing.tRunner(0x45a52c0, 0x15c2130)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6928 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f194, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca064, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x44ca050, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x44ca050, 0x1449b10, 0x2657440, 0xb6d49a00)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x440e430, 0x77, 0x18, 0x4932ec0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x440e430, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x440e430, 0x20, 0x1449b10, 0x2f0001, 0x4932ec0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x495c6c0, 0x1813fc0, 0x49b81c0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x44e48c0, 0x4949e40, 0x461b1a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 948 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_setupNodeID(0x45a4640)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:214 +0x20
testing.tRunner(0x45a4640, 0x15c21e8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7322 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x45a1dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 1954 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f1720)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1720, 0x15c249c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1928 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f06e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f06e0, 0x15c23d0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1958 [chan send, 2 minutes]:
testing.tRunner.func1(0x44f19a0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f19a0, 0x15c24a4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7076 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x4952a10, 0x4984000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x48afb00)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x48afb00, 0x59d2d40, 0x0, 0x4952a14)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x51bc4e0, 0x476d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x440f4f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x48afb90, 0x11b15b0, 0x478b540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x48afb90, 0x11b15b0, 0x478b540, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x48afad0, 0x11b15b0, 0x478b540, 0x478b540, 0x4763968)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x48afad0, 0x478b540, 0x44a2474, 0x346944)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x4763950, 0x181cc00, 0x48afad0, 0x472ef90, 0x44a2464, 0x181cc01, 0x48afad0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x4763950, 0x181cc00, 0x48afad0, 0x53f3c50, 0x4a6f4f0, 0x76dc8, 0x346c6c, 0x2653bb4, 0x44a23f0, 0x53f3c50, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x4763950, 0x181cc00, 0x48afad0, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x44afc00, 0x1824300, 0x4952a10)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 7168 [select, 3 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x181bb80, 0x5078540, 0x4c24a14, 0x20, 0x20, 0x47aa090, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x50784a0, 0x181bb80, 0x5078540, 0x47aa098, 0x181bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x50784a0, 0x4bd4540, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x45a1dc0, 0x463017c, 0x4bd451c, 0x4c24b78, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:430 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x44cb4f0, 0x4630150, 0x4bd4500, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4598980, 0x451f8f8, 0x13, 0x1538345, 0x4, 0x4c24d54, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598980, 0x451f8f8, 0x13, 0x4e57554, 0x3, 0x3, 0xb6d49001, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x4718ac0, 0x4a41b30, 0x49707c8, 0x0, 0x44cb5e0, 0x4a4e2a0, 0x1480eb8, 0x4630150, 0x16, 0x1199a78, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x4a41b30, 0x181be20, 0x50783c0, 0x48ee858, 0xffffffff)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x45a1dc0, 0x1547bdf, 0xf, 0x1480eb8, 0x4952cb0, 0x1199a78, 0x4bd44c0, 0x7c0510, 0x40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent.(*Agent).RPC(0x48ee780, 0x1547bdf, 0xf, 0x1480eb8, 0x4952cb0, 0x1199a78, 0x4bd44c0, 0x48ce2a4, 0x4484000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1412 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x449d920, 0xa, 0x0, 0xb2c97000, 0x8b, 0x50783a0, 0x1807978, 0x4952cb0, 0x0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6c8, 0x449d920, 0x443f700, 0x1199a78, 0x4b7d200, 0x0, 0x0, 0x0, 0x0, 0xa, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:467 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 1241 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x46146e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCoordinate_Nodes(0x46146e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/coordinate_endpoint_test.go:76 +0x20
testing.tRunner(0x46146e0, 0x15c2278)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1944 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f10e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f10e0, 0x15c2410)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6944 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x44ee160, 0x4b73d40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 1221 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803a40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogNodes_MetaFilter(0x4803a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:116 +0x20
testing.tRunner(0x4803a40, 0x15c2230)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1220 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x48039a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogNodes(0x48039a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:82 +0x20
testing.tRunner(0x48039a0, 0x15c2238)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1711 [chan send]:
testing.tRunner.func1(0x4574820)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574820, 0x15c23ac)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1219 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803900)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogDatacenters(0x4803900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:63 +0x1c
testing.tRunner(0x4803900, 0x15c2210)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6594 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x52020a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecGetSpec_ACLToken(0x52020a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:105 +0x1c
testing.tRunner(0x52020a0, 0x15c262c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 959 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x45a4d20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_AddCheck_MissingService(0x45a4d20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent_test.go:958 +0x1c
testing.tRunner(0x45a4d20, 0x15c1f90)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6561 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6aa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_Keyring_InvalidRelayFactor(0x51f6aa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:267 +0x20
testing.tRunner(0x51f6aa0, 0x15c2554)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1693 [chan send]:
testing.tRunner.func1(0x4615860)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4615860, 0x15c22a0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1227 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4803e00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestCatalogServiceNodes(0x4803e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/catalog_endpoint_test.go:487 +0x20
testing.tRunner(0x4803e00, 0x15c2250)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7146 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x48be380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 1251 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4614d20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_NodeLookup_AAAA(0x4614d20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:345 +0x20
testing.tRunner(0x4614d20, 0x15c22dc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6527 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsCheck_basic(0x4ba1540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:188 +0x20
testing.tRunner(0x4ba1540, 0x15c24c8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1840 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f01e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f01e0, 0x15c2304)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1942 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0fa0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0fa0, 0x15c23f8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1941 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0f00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0f00, 0x15c23fc)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1930 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0820)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0820, 0x15c22a4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1929 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0780)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0780, 0x15c22b0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1720 [chan send, 1 minutes]:
testing.tRunner.func1(0x4574dc0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574dc0, 0x15c2390)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1719 [chan send, 1 minutes]:
testing.tRunner.func1(0x4574d20)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574d20, 0x15c2324)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6602 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x52025a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHandleRemoteExecFailed(0x52025a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:334 +0x1c
testing.tRunner(0x52025a0, 0x15c2470)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1952 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f15e0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f15e0, 0x15c2494)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6934 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x492e600)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x492e600, 0x4b8a0f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 1947 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f12c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f12c0, 0x15c240c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6523 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba12c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsMatch_basic(0x4ba12c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:72 +0x20
testing.tRunner(0x4ba12c0, 0x15c24e4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1948 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f1360)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1360, 0x15c26d8)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7020 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x45500b0, 0x4bcd880)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7012 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x45439e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7143 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x48ba310)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 1953 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f1680)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1680, 0x15c24a0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1721 [chan send, 1 minutes]:
testing.tRunner.func1(0x4574e60)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574e60, 0x15c238c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1945 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f1180)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f1180, 0x15c2400)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1837 [chan send, 1 minutes]:
testing.tRunner.func1(0x45a4000)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x45a4000, 0x15c2298)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1722 [chan send, 1 minutes]:
testing.tRunner.func1(0x4574f00)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574f00, 0x15c236c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6596 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x52021e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecGetSpec_ACLDeny(0x52021e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:127 +0x1c
testing.tRunner(0x52021e0, 0x15c2628)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6995 [IO wait]:
internal/poll.runtime_pollWait(0xa597f29c, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca474, 0x72, 0xff00, 0xffff, 0x5d6bec0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x44ca460, 0x60a2000, 0xffff, 0xffff, 0x5d6bec0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x44ca460, 0x60a2000, 0xffff, 0xffff, 0x5d6bec0, 0x28, 0x28, 0x0, 0x1, 0xa0804, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x4b8a4f0, 0x60a2000, 0xffff, 0xffff, 0x5d6bec0, 0x28, 0x28, 0xb6d496d0, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x4b8a4f0, 0x60a2000, 0xffff, 0xffff, 0x5d6bec0, 0x28, 0x28, 0xb6d496d0, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x4b8a4f0, 0x60a2000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x4a58800, 0x4b8a4f0, 0x77359400, 0x0, 0x1808601, 0x2666a98, 0xa5989970, 0x2666a98, 0xffffff01, 0xa5989950)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x4b8a518, 0x4b8a4f0, 0x77359400, 0x0, 0x512ae70, 0x1, 0x0, 0x0, 0x1808770, 0x512ae70)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x4a58800, 0x4b8a4f0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x4a58800, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4bf0e40, 0x1537fd1, 0x3, 0x48acb80, 0xf, 0x4a74c70, 0x1, 0x5d721c)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e4f00, 0x4bf0e40, 0x4bf0d80, 0x4bf0dc0, 0x180f040, 0x4770f80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 1931 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f08c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f08c0, 0x15c22ac)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6533 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1900)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsSpecificGet_invalidId(0x4ba1900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:359 +0x1c
testing.tRunner(0x4ba1900, 0x15c24fc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6595 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecGetSpec_ACLAgentToken(0x5202140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:116 +0x1c
testing.tRunner(0x5202140, 0x15c2624)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1922 [chan send, 1 minutes]:
testing.tRunner.func1(0x44f0320)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x44f0320, 0x15c22d0)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6423 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestHTTPAPI_TranslateAddrHeader(0x4ba0780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:350 +0x20
testing.tRunner(0x4ba0780, 0x15c2454)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7094 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x498e750)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 1718 [chan send, 1 minutes]:
testing.tRunner.func1(0x4574c80)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574c80, 0x15c2328)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 1717 [chan send, 1 minutes]:
testing.tRunner.func1(0x4574be0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574be0, 0x15c23a4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6517 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0f00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestACLResolution(0x4ba0f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:969 +0x20
testing.tRunner(0x4ba0f00, 0x15c1e9c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7029 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x4fb8000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 1716 [chan send]:
testing.tRunner.func1(0x4574b40)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x4574b40, 0x15c22f4)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7040 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4565680, 0x153a3b9, 0x6, 0x478bb60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 6908 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4fbc320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionDeleteDestroy(0x4fbc320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:595 +0x20
testing.tRunner(0x4fbc320, 0x15c265c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6947 [select, 3 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x44cbdb0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 6416 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSetIndex(0x4ba0280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:211 +0x1c
testing.tRunner(0x4ba0280, 0x15c2690)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6948 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x4a22e80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7082 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x455c460, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x455c460, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x455c460, 0x153de26, 0x8, 0x49acfdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x455c460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 6526 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba14a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsMatch_noName(0x4ba14a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:171 +0x20
testing.tRunner(0x4ba14a0, 0x15c24f0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6558 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f68c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_KeyringList(0x51f68c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:117 +0x20
testing.tRunner(0x51f68c0, 0x15c2548)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7077 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x4439e90, 0x4afbe00, 0x7229c, 0x49d0fac, 0x49d0f70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x4928280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 6992 [select, 3 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x44e4f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 7157 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x44caaa0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x44caaa0, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x44caaa0, 0x153de26, 0x8, 0x4b85fdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x44caaa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 6438 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4fbc1e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestParseWait(0x4fbc1e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:716 +0x20
testing.tRunner(0x4fbc1e0, 0x15c25a0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6952 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x4fb8090)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 6980 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x46786c0, 0x153a3b9, 0x6, 0x51bce00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7283 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa59493cc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4572ce4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4572cd0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4572cd0, 0x3, 0x3, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4b71b40, 0x4b24f94, 0x4d01434, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4b71b40, 0x12b94, 0x1ba48, 0x4a67170)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4da1b80, 0x4b71b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 6429 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0b40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestParseSource(0x4ba0b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:536 +0x20
testing.tRunner(0x4ba0b40, 0x15c2590)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7166 [runnable]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x4cf58d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7138 [select, 3 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x48ba460, 0x15c2894, 0x4a721c0, 0x1824300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x4a721c0, 0x1824420, 0x4b8a998)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:133 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x4a721c0, 0x1824420, 0x4b8a998, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:112 +0x434
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7334 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a73a40, 0x4c92a80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:210 +0x2a8
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x44a6d00, 0x4a73a40, 0x4c92a80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 7010 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x4a34200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4a34200, 0x50423c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 6548 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_ListKeys(0x51f6280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:294 +0x20
testing.tRunner(0x51f6280, 0x15c2518)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7023 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4564ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7025 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4564ea0, 0x1539045, 0x5, 0x49ec5c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 6994 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597fb60, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a1a4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x492a190, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x492a190, 0x450a5a0, 0xb6d496d0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x49f8bd0, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x49f8bd0, 0x4410820, 0x492a190, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x44c2300, 0x1816180, 0x49f8bd0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x44c2300, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4bf0e00, 0x1537fb3, 0x3, 0x4441a70, 0xf, 0x49f8bc0, 0x1, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e4f00, 0x4bf0e00, 0x4bf0d80, 0x4bf0dc0, 0x180f028, 0x4770f40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 6981 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x46786c0, 0x1539045, 0x5, 0x51bce20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7235 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4800c40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7165 [select, 3 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x492b3b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7095 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x498e750)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7284 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5949348, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4572d34, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4572d20, 0x4e16000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4572d20, 0x4e16000, 0x10000, 0x10000, 0x0, 0x1ba01, 0x2657101, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x451ef98, 0x4e16000, 0x10000, 0x10000, 0x4aaef34, 0x101, 0x4aaef08, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x451ef98, 0x4e16000, 0x10000, 0x10000, 0x3, 0x0, 0x0, 0x4d00800, 0x4b25dc0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4da1b80, 0x451ef98)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 6961 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4678000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 6950 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x4770060)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 6604 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x52026e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCreate_Delete(0x52026e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:98 +0x1c
testing.tRunner(0x52026e0, 0x15c264c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6551 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestKVSEndpoint_PUT_ConflictingFlags(0x51f6460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/kvs_endpoint_test.go:432 +0x20
testing.tRunner(0x51f6460, 0x15c251c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6979 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x46786c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7099 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x44ee420)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7340 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x4d62400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4d62400, 0x50422a8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 6959 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4764000, 0x4a70440)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 6521 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsList_empty(0x4ba1180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:15 +0x20
testing.tRunner(0x4ba1180, 0x15c24dc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6598 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202320)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecWrites_ACLToken(0x5202320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:177 +0x1c
testing.tRunner(0x5202320, 0x15c263c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6557 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_KeyringInstall(0x51f6820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:81 +0x20
testing.tRunner(0x51f6820, 0x15c2544)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6541 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1e00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_InmemKeyrings(0x4ba1e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/keyring_test.go:110 +0x1c
testing.tRunner(0x4ba1e00, 0x15c2028)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7159 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x482b960, 0x4a3a000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x48ae1e0)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x48ae1e0, 0x5b5b340, 0x0, 0x482b964)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x47c7d00, 0x476d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x4537c60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x48ae270, 0x11b15b0, 0x500c020)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x48ae270, 0x11b15b0, 0x500c020, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x48ae150, 0x11b15b0, 0x500c020, 0x500c020, 0x4439e78)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x48ae150, 0x500c020, 0x44a2474, 0x346944)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x4439e60, 0x181cc00, 0x48ae150, 0x4b82f90, 0x44a2400, 0x181cc01, 0x0, 0x0, 0x52)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x4439e60, 0x181cc00, 0x48ae150, 0x605bca0, 0x455dc70, 0x76dc8, 0x346c6c, 0x2653bb4, 0x44a23f0, 0x605bca0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x4439e60, 0x181cc00, 0x48ae150, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x48be380, 0x1824300, 0x482b960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 6439 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4fbc280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPProfHandlers_EnableDebug(0x4fbc280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:733 +0x1c
testing.tRunner(0x4fbc280, 0x15c2578)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6519 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestParseToken_ProxyTokenResolve(0x4ba1040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:1121 +0x20
testing.tRunner(0x4ba1040, 0x15c2594)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6958 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4764000, 0x2a05f200, 0x1, 0x44a87c0, 0x4a70440, 0x4410758)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7339 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x4d62400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x4d62400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x4d62400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4d62400, 0x50422a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 6427 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0a00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPrettyPrint(0x4ba0a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:499 +0x1c
testing.tRunner(0x4ba0a00, 0x15c261c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6605 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCreate_DefaultCheck(0x5202780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:154 +0x20
testing.tRunner(0x5202780, 0x15c2648)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7103 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee420, 0x1dcd6500, 0x0, 0x4bd4fc0, 0x49f47c0, 0x4b8a540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7024 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4564ea0, 0x153a3b9, 0x6, 0x49ec5a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 6607 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x52028c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestFixupLockDelay(0x52028c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:227 +0x1c
testing.tRunner(0x52028c0, 0x15c2420)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6444 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4fbc5a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestParseConsistency(0x4fbc5a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:863 +0x20
testing.tRunner(0x4fbc5a0, 0x15c2588)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6941 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x44ee160)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7039 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4565680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7017 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x45500b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7090 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x4554400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x4554400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x4554400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4554400, 0x4b8a0c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7038 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4565680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7037 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4550160, 0xbebc200, 0x0, 0x4afa5c0, 0x4bde340, 0x5042a20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7016 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x45500b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7036 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4550160, 0x4bde340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7035 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4550160, 0x3b9aca00, 0x0, 0x4afa580, 0x4bde340, 0x5042a10)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7018 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x45500b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7033 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4550160)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7030 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5949870, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a604, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x492a5f0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x492a5f0, 0x1b874, 0x4ff3f70, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4a23350, 0x4819f78, 0x6f128, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4a23350, 0x7b79c, 0x5060340, 0x4)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4afa280, 0x4a23350)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7034 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4550160)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7031 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597fadc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a654, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x492a640, 0x4b5e000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x492a640, 0x4b5e000, 0x10000, 0x10000, 0x306600, 0x4a00001, 0x4a0ae01, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5042748, 0x4b5e000, 0x10000, 0x10000, 0x4a0af34, 0x101, 0x4a0af08, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5042748, 0x4b5e000, 0x10000, 0x10000, 0x2659558, 0x3065f0, 0x5064214, 0x1549401, 0x7229c)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4afa280, 0x5042748)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7028 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x4fb8000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 6936 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x475e5a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7027 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x478bac0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7026 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4564ea0, 0x1539225, 0x5, 0x49ec5e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 6549 [chan send, 2 minutes]:
testing.tRunner.func1(0x51f6320)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x51f6320, 0x15c2504)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6735 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x4652780, 0x457ae40, 0x4bcd170, 0x488160f, 0x3)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x4c42100)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7703 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x44fa960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestStatusPeers(0x44fa960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/status_endpoint_test.go:29 +0x1c
testing.tRunner(0x44fa960, 0x15c26b0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6606 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCreate_NoCheck(0x5202820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:190 +0x20
testing.tRunner(0x5202820, 0x15c2650)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6600 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecWrites_ACLDeny(0x5202460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:199 +0x1c
testing.tRunner(0x5202460, 0x15c2638)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6534 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba19a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsSpecificUpdate_good(0x4ba19a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:376 +0x20
testing.tRunner(0x4ba19a0, 0x15c2500)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6543 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1f40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgentKeyring_ACL(0x4ba1f40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/keyring_test.go:275 +0x1c
testing.tRunner(0x4ba1f40, 0x15c1f6c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6957 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4764000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7093 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x47f2cc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 6997 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x482b6c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7097 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597fc68, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a334, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x492a320, 0x4b3e000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x492a320, 0x4b3e000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x2659558)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x4b8a1d8, 0x4b3e000, 0x10000, 0x10000, 0x4b0a734, 0x101, 0x4b0a708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x4b8a1d8, 0x4b3e000, 0x10000, 0x10000, 0x4b0a7cc, 0x4bd4c40, 0x0, 0x0, 0x1807528)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4bd4e00, 0x4b8a1d8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7015 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f7c4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a4c4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x492a4b0, 0x4b4e000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x492a4b0, 0x4b4e000, 0x10000, 0x10000, 0x0, 0x1, 0x12b01, 0x0, 0x4659000)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5042430, 0x4b4e000, 0x10000, 0x10000, 0x505bf34, 0x101, 0x505bf08, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5042430, 0x4b4e000, 0x10000, 0x10000, 0x0, 0x180a288, 0x4b8a4e0, 0x4b8a4d0, 0x4659040)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4541b40, 0x5042430)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 6597 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecWrites(0x5202280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:172 +0x1c
testing.tRunner(0x5202280, 0x15c2640)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6555 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f66e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_RaftConfiguration(0x51f66e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:20 +0x20
testing.tRunner(0x51f66e0, 0x15c2558)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6556 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_RaftPeer(0x51f6780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:46 +0x1c
testing.tRunner(0x51f6780, 0x15c2564)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6559 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_KeyringRemove(0x51f6960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:165 +0x20
testing.tRunner(0x51f6960, 0x15c254c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7085 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x4631730, 0x4a30000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x48cf680)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x48cf680, 0x5b30920, 0x0, 0x4631734)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x503eae0, 0x476d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x4a74a80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x48cf6e0, 0x11b15b0, 0x47afb00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x48cf6e0, 0x11b15b0, 0x47afb00, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x48cf650, 0x11b15b0, 0x47afb00, 0x47afb00, 0x4652768)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x48cf650, 0x47afb00, 0x44a2474, 0x346944)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x4652750, 0x181cc00, 0x48cf650, 0x4b86f90, 0x44a2464, 0x181cc01, 0x48cf650, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x4652750, 0x181cc00, 0x48cf650, 0x44b48d0, 0x4a5bea0, 0x76dc8, 0x346c6c, 0x2653bb4, 0x44a23f0, 0x44b48d0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x4652750, 0x181cc00, 0x48cf650, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x4a721c0, 0x1824300, 0x4631730)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 6540 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1d60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestAgent_LoadKeyrings(0x4ba1d60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/keyring_test.go:31 +0x1c
testing.tRunner(0x4ba1d60, 0x15c2048)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7022 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4564ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 6603 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSessionCreate(0x5202640)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/session_endpoint_test.go:41 +0x1c
testing.tRunner(0x5202640, 0x15c2654)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6442 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4fbc460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestParseWait_InvalidTime(0x4fbc460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:833 +0x1c
testing.tRunner(0x4fbc460, 0x15c259c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6441 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4fbc3c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPProfHandlers_ACLs(0x4fbc3c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/http_test.go:761 +0x20
testing.tRunner(0x4fbc3c0, 0x15c2570)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6524 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba1360)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIntentionsMatch_noBy(0x4ba1360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/intentions_endpoint_test.go:137 +0x20
testing.tRunner(0x4ba1360, 0x15c24ec)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7091 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x4554400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4554400, 0x4b8a0c8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 6562 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6b40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_AutopilotGetConfiguration(0x51f6b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:293 +0x20
testing.tRunner(0x51f6b40, 0x15c253c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6563 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6be0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_AutopilotSetConfiguration(0x51f6be0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:318 +0x20
testing.tRunner(0x51f6be0, 0x15c2540)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6564 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6c80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_AutopilotCASConfiguration(0x51f6c80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:346 +0x20
testing.tRunner(0x51f6c80, 0x15c2538)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6565 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6d20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_ServerHealth(0x51f6d20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:413 +0x1c
testing.tRunner(0x51f6d20, 0x15c256c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6566 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6dc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestOperator_ServerHealth_Unhealthy(0x51f6dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/operator_endpoint_test.go:445 +0x1c
testing.tRunner(0x51f6dc0, 0x15c2568)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6567 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6e60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Create(0x51f6e60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:77 +0x20
testing.tRunner(0x51f6e60, 0x15c25a4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6568 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6f00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_List(0x51f6f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:165 +0x1c
testing.tRunner(0x51f6f00, 0x15c260c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6569 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f6fa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Execute(0x51f6fa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:248 +0x1c
testing.tRunner(0x51f6fa0, 0x15c25dc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6570 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7040)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_ExecuteCached(0x51f7040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:617 +0x20
testing.tRunner(0x51f7040, 0x15c25d8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6571 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f70e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Explain(0x51f70e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:674 +0x1c
testing.tRunner(0x51f70e0, 0x15c25ec)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6572 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7180)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Get(0x51f7180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:769 +0x1c
testing.tRunner(0x51f7180, 0x15c25f8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6573 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7220)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Update(0x51f7220)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:835 +0x20
testing.tRunner(0x51f7220, 0x15c2610)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6574 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f72c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_Delete(0x51f72c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:913 +0x20
testing.tRunner(0x51f72c0, 0x15c25a8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6575 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7360)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestPreparedQuery_parseLimit(0x51f7360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/prepared_query_endpoint_test.go:961 +0x1c
testing.tRunner(0x51f7360, 0x15c2614)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6321 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5202000)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestRemoteExecGetSpec(0x5202000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/remote_exec_test.go:100 +0x1c
testing.tRunner(0x5202000, 0x15c2630)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 6940 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x44ee160)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 6989 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x44afc00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 6942 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x44ee160)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 6935 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x492e600)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x492e600, 0x4b8a0f8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7144 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x48ba310)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7002 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x48be380, 0x4af3c40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:210 +0x2a8
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x45204c0, 0x48be380, 0x4af3c40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 6733 [select, 3 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x4a5a780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 6970 [select, 3 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x49529a0, 0x15c2894, 0x44afc00, 0x1824300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x44afc00, 0x1824420, 0x4798680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:133 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x44afc00, 0x1824420, 0x4798680, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:112 +0x434
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 6991 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x44e4f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 6951 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x4fb8090)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7098 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x44ee420)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7701 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x44fa820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSnapshot_Options(0x44fa820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/snapshot_endpoint_test.go:57 +0x1c
testing.tRunner(0x44fa820, 0x15c26a4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7000 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x4a5a730, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x4a5a730, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x4a5a730, 0x153de26, 0x8, 0x49b1fdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x4a5a730)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 6977 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x4b70fc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 6929 [IO wait]:
internal/poll.runtime_pollWait(0xa597fcec, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6fa04, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4a6f9f0, 0x496f000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4a6f9f0, 0x496f000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4798680, 0x496f000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x48afa10, 0x44a7d30, 0xc, 0xc, 0x4952a10, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x48afa10, 0x44a7d30, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x49529a0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x49529a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 6987 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x44afc00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990

goroutine 7062 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597fe78, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6f874, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4a6f860, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4a6f860, 0x1449b10, 0x485ab40, 0xb6d49300)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4989950, 0xa1, 0x18, 0x4ff7400)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4989950, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x4989950, 0x20, 0x1449b10, 0x2f0001, 0x4ff7400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x4b8c360, 0x1813fc0, 0x47869a0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x44e5180, 0x4945f40, 0x49b4020)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 7063 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x4a721c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 6962 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4678000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 6956 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4764000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 6937 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x475e5a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 6976 [select, 3 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x455c4b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 6939 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597fdf4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca2e4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x44ca2d0, 0x4a76000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x44ca2d0, 0x4a76000, 0x10000, 0x10000, 0x0, 0x2659501, 0x1, 0x0, 0x1)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x4b8a180, 0x4a76000, 0x10000, 0x10000, 0x51fb734, 0x101, 0x51fb708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x4b8a180, 0x4a76000, 0x10000, 0x10000, 0x2, 0x1, 0x0, 0x47aa080, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4bf0a80, 0x4b8a180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 6968 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x44afc00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 6734 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x49882c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7041 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4565680, 0x1539045, 0x5, 0x478bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 6990 [chan receive, 3 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x501e960, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x44e4f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 6986 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x44afc00, 0x0, 0x15c28d8, 0x4678000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 7074 [select]:
github.com/hashicorp/yamux.(*Session).send(0x49529a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 6988 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597fbe4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6f5f4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4a6f5e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4a6f5e0, 0x346c6c, 0x2653bb4, 0x44a23f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4a22e90, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4a22e90, 0x2, 0x2, 0x3f800000, 0x4798680)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x44afc00, 0x1816180, 0x4a22e90)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 6985 [select, 3 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x4763890, 0x4bf1980, 0x1537ffe, 0x3, 0x4a70000, 0x4bf1900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 7523 [runnable]:
syscall.Syscall(0x94, 0x5f, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fdatasync(0x5f, 0x1000, 0x0)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:429 +0x30
github.com/boltdb/bolt.fdatasync(0x4b1fc20, 0x1000, 0xfffffff)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/bolt_linux.go:9 +0x40
github.com/boltdb/bolt.(*Tx).write(0x5624180, 0xbf6f9556, 0xd6d1087e)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:519 +0x3d0
github.com/boltdb/bolt.(*Tx).Commit(0x5624180, 0x50beb08, 0x8)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:198 +0x29c
github.com/hashicorp/raft-boltdb.(*BoltStore).StoreLogs(0x4537200, 0x58e2580, 0x1, 0x1, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:187 +0x228
github.com/hashicorp/raft.(*LogCache).StoreLogs(0x4baf080, 0x58e2580, 0x1, 0x1, 0x2656cc8, 0x4a573b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/log_cache.go:61 +0x110
github.com/hashicorp/raft.(*Raft).dispatchLogs(0x480ea00, 0x4ca1d1c, 0x1, 0x1)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:1061 +0x284
github.com/hashicorp/raft.(*Raft).leaderLoop(0x480ea00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:746 +0x5ac
github.com/hashicorp/raft.(*Raft).runLeader(0x480ea00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x480ea00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x480ea00, 0x50436a8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7042 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4565680, 0x1539225, 0x5, 0x478bbc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7043 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x4a721c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 7044 [select, 1 minutes]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x4afa600)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7045 [select, 3 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x4652660, 0x5186740, 0x1537ffe, 0x3, 0x4bccc80, 0x51866c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 7046 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x4a721c0, 0x0, 0x15c28d8, 0x4564ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 7047 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x4a721c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990

goroutine 7048 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597fa58, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a5bfa4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4a5bf90, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4a5bf90, 0x346c6c, 0x2653bb4, 0x44a23f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x49882d0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x49882d0, 0x2, 0x2, 0x3f800000, 0x4b8a998)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x4a721c0, 0x1816180, 0x49882d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 7049 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x4a721c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 7050 [chan receive, 3 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x4535440, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x44e48c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 7051 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x44e48c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 7052 [select, 3 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x44e48c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 7053 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x44e48c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 7054 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f218, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x454c244, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x454c230, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x454c230, 0x445b680, 0xb6d49008, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4acc5f0, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4acc5f0, 0x451e350, 0x454c230, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x44c2780, 0x1816180, 0x4acc5f0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x44c2780, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4afa800, 0x1537fb3, 0x3, 0x498a470, 0xf, 0x4acc5e0, 0x5186d10, 0x1)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e48c0, 0x4afa800, 0x4afa740, 0x4afa780, 0x180f028, 0x47854e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7055 [IO wait]:
internal/poll.runtime_pollWait(0xa597f530, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a794, 0x72, 0xff00, 0xffff, 0x44a4180)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x492a780, 0x4dca000, 0xffff, 0xffff, 0x44a4180, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x492a780, 0x4dca000, 0xffff, 0xffff, 0x44a4180, 0x28, 0x28, 0x8, 0x0, 0x1d2c0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x5042ac0, 0x4dca000, 0xffff, 0xffff, 0x44a4180, 0x28, 0x28, 0xb6d496d0, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x5042ac0, 0x4dca000, 0xffff, 0xffff, 0x44a4180, 0x28, 0x28, 0xb6d496d0, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x5042ac0, 0x4dca000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x4c42580, 0x5042ac0, 0x77359400, 0x0, 0x1808601, 0x2666a98, 0xa5989970, 0x2666a98, 0xffffff01, 0xa5989950)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x5042ae8, 0x5042ac0, 0x77359400, 0x0, 0x51f2030, 0x1, 0x0, 0x0, 0x1808770, 0x51f2030)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x4c42580, 0x5042ac0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x4c42580, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4afa840, 0x1537fd1, 0x3, 0x46568c0, 0xf, 0x4a23bb0, 0x1, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e48c0, 0x4afa840, 0x4afa740, 0x4afa780, 0x180f040, 0x4785540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7088 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4800a80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7057 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x44cbe00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:448 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7059 [IO wait]:
internal/poll.runtime_pollWait(0xa597f4ac, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492ab54, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x492ab40, 0x4825000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x492ab40, 0x4825000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x5042c18, 0x4825000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x4535c50, 0x4970894, 0xc, 0xc, 0x44adf10, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x4535c50, 0x4970894, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x44ade30, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x44ade30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7060 [select]:
github.com/hashicorp/yamux.(*Session).send(0x44ade30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7061 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x44ade30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7104 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4ce4360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7105 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4ce4360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7106 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4360, 0x153a3b9, 0x6, 0x47f2d80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7107 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4360, 0x1539045, 0x5, 0x47f2da0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7108 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4360, 0x1539225, 0x5, 0x47f2e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7109 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x47c7280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7110 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x498ed80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7111 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x498ed80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7112 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f6bc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a7e4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x492a7d0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x492a7d0, 0x3, 0x3, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x49f8f80, 0x49954d4, 0x457b174, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x49f8f80, 0x5d721c, 0x4bcc940, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4bd5180, 0x49f8f80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7113 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f638, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a834, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x492a820, 0x4c2e000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x492a820, 0x4c2e000, 0x10000, 0x10000, 0x126b100, 0x49b8101, 0x1269501, 0x0, 0x181be20)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x4b8a5a8, 0x4c2e000, 0x10000, 0x10000, 0x51fdf34, 0x101, 0x51fdf08, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x4b8a5a8, 0x4c2e000, 0x10000, 0x10000, 0x2020501, 0x405, 0x7229c, 0x4652660, 0x51fdf94)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4bd5180, 0x4b8a5a8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7114 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x44ee4d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7115 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x44ee4d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7116 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x44ee4d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7117 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee4d0, 0x3b9aca00, 0x0, 0x4bd5300, 0x4948600, 0x4b8a898)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7118 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x44ee4d0, 0x4948600)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7119 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee4d0, 0xbebc200, 0x0, 0x4bd5340, 0x4948600, 0x4b8a8a8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7120 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4ce4900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7121 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4ce4900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7122 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4900, 0x153a3b9, 0x6, 0x47c73e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7123 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4900, 0x1539045, 0x5, 0x47c7400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7124 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4900, 0x1539225, 0x5, 0x47c7480)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7125 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x48be380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 7126 [select, 1 minutes]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x4bd5380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7127 [select, 3 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x4439da0, 0x4d041c0, 0x1537ffe, 0x3, 0x4748540, 0x4d04140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 7128 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x48be380, 0x0, 0x15c28d8, 0x4ce4360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 7129 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x48be380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990

goroutine 7130 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f8cc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x455dd74, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x455dd60, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x455dd60, 0x346c6c, 0x2653bb4, 0x44a23f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4b70fd0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4b70fd0, 0x2, 0x2, 0x3f800000, 0x4798068)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x48be380, 0x1816180, 0x4b70fd0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 7131 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x48be380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 7132 [chan receive, 3 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x44ab800, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x44e5180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 7133 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x44e5180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 7134 [select, 3 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x44e5180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 7135 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x44e5180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 7136 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f5b4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6f824, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4a6f810, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4a6f810, 0x485ab40, 0xb6d4936c, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4989940, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4989940, 0x4786898, 0x4a6f810, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x453a180, 0x1816180, 0x4989940, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x453a180, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4bd54c0, 0x1537fb3, 0x3, 0x45204e0, 0xf, 0x4989930, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e5180, 0x4bd54c0, 0x4bd5440, 0x4bd5480, 0x180f028, 0x494ad20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7137 [IO wait]:
internal/poll.runtime_pollWait(0xa597f008, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492a974, 0x72, 0xff00, 0xffff, 0x5522630)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x492a960, 0x60c8000, 0xffff, 0xffff, 0x5522630, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x492a960, 0x60c8000, 0xffff, 0xffff, 0x5522630, 0x28, 0x28, 0x0, 0x1, 0xa0804, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x4b8a948, 0x60c8000, 0xffff, 0xffff, 0x5522630, 0x28, 0x28, 0xb6d49008, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x4b8a948, 0x60c8000, 0xffff, 0xffff, 0x5522630, 0x28, 0x28, 0xb6d49008, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x4b8a948, 0x60c8000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x453ea80, 0x4b8a948, 0x77359400, 0x0, 0x1808601, 0x2666a98, 0xa5989970, 0x2666a98, 0xffffff01, 0xa5989950)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x4b8a970, 0x4b8a948, 0x77359400, 0x0, 0x4fdab70, 0x1, 0x0, 0x0, 0x1808770, 0x4fdab70)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x453ea80, 0x4b8a948, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x453ea80, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4bd5500, 0x1537fd1, 0x3, 0x49ab6f0, 0xf, 0x49f97a0, 0x5dc, 0x1)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e5180, 0x4bd5500, 0x4bd5440, 0x4bd5480, 0x180f040, 0x494ad60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7065 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x4a5a7d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:448 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7067 [IO wait]:
internal/poll.runtime_pollWait(0xa597f428, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6fcd4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4a6fcc0, 0x4a27000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4a6fcc0, 0x4a27000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4787548, 0x4a27000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x4baec30, 0x49ab780, 0xc, 0xc, 0x4897110, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x4baec30, 0x49ab780, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4897030, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4897030)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7068 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4897030)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7069 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4897030)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7140 [select]:
github.com/hashicorp/yamux.(*Session).send(0x48ba460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7141 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x48ba460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7070 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x480e200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x480e200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x480e200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x480e200, 0x449c738)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7071 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x480e200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x480e200, 0x449c748)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7072 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x480e200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x480e200, 0x449c758)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7073 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x4be2ae0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7170 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x50a4240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7171 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x50a4240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7172 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f08c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a5a1f4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4a5a1e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4a5a1e0, 0x4a00000, 0x4a0ef28, 0x5ebc2c)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4acc310, 0x0, 0x4a0ef28, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4acc310, 0x5d721c, 0x4a71140, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4bf12c0, 0x4acc310)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7173 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f110, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a5a514, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4a5a500, 0x4ae0000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4a5a500, 0x4ae0000, 0x10000, 0x10000, 0x4b7c300, 0x4a74a01, 0x1827f01, 0x0, 0xb6d49a34)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x449c9b8, 0x4ae0000, 0x10000, 0x10000, 0x45f1f34, 0x101, 0x45f1f08, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x449c9b8, 0x4ae0000, 0x10000, 0x10000, 0x0, 0x7a3ae4, 0x4bcc500, 0x4cfcad8, 0xffffff00)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4bf12c0, 0x449c9b8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7174 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x47ce580)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7175 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x47ce580)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7176 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x47ce580)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7177 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47ce580, 0x2a05f200, 0x1, 0x4bf1540, 0x4bfe580, 0x449d5c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7178 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x47ce580, 0x4bfe580)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7179 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47ce580, 0x1dcd6500, 0x0, 0x4bf1580, 0x4bfe580, 0x449d5d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7180 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4565320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7181 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4565320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7182 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4565320, 0x153a3b9, 0x6, 0x4be2ba0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7183 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4565320, 0x1539045, 0x5, 0x4be2bc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7184 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4565320, 0x1539225, 0x5, 0x4be2be0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7185 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x49f72e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7186 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x50a4c60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7187 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x50a4c60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7188 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f3a4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4850014, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4850000, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4850000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4accb20, 0x0, 0x0, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4accb20, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4bf1740, 0x4accb20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7189 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5949768, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4850064, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4850050, 0x4b8e000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4850050, 0x4b8e000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x449d630, 0x4b8e000, 0x10000, 0x10000, 0x4bf7734, 0x101, 0x4bf7708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x449d630, 0x4b8e000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4bf1740, 0x449d630)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7190 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x47ce630)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7191 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x47ce630)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7192 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x47ce630)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7193 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47ce630, 0x3b9aca00, 0x0, 0x4bf1bc0, 0x4bff3c0, 0x449d8d8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7194 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x47ce630, 0x4bff3c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7195 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47ce630, 0xbebc200, 0x0, 0x4bf1c80, 0x4bff3c0, 0x449d8e8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7196 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4678a20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7197 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4678a20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7198 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4678a20, 0x153a3b9, 0x6, 0x49f7380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7199 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4678a20, 0x1539045, 0x5, 0x49f73a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7200 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4678a20, 0x1539225, 0x5, 0x49f73c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7201 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x45a1dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 7202 [select, 1 minutes]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x4bf1cc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7203 [select, 3 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x4a41a70, 0x5141fc0, 0x1537ffe, 0x3, 0x4bfe040, 0x5141f40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 7204 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x45a1dc0, 0x0, 0x15c28d8, 0x4565320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 7205 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x45a1dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990

goroutine 7206 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa597f320, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x45726f4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x45726e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x45726e0, 0x346c6c, 0x2653bb4, 0x44a23f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4acd2f0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4acd2f0, 0x2, 0x2, 0x3f800000, 0x47864d0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x45a1dc0, 0x1816180, 0x4acd2f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 7207 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x45a1dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 7208 [chan receive, 3 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x4a41170, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x48ee780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 7209 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x48ee780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 7210 [select, 3 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x48ee780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 7211 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x48ee780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 7212 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa59496e4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x454de64, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x454de50, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x454de50, 0x445ad20, 0xb6d49a34, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x476e950, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x476e950, 0x47aa360, 0x454de50, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x4a58580, 0x1816180, 0x476e950, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x4a58580, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4bf1e00, 0x1537fb3, 0x3, 0x498a7a0, 0xf, 0x476e940, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48ee780, 0x4bf1e00, 0x4bf1d80, 0x4bf1dc0, 0x180f028, 0x49f75e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7213 [IO wait]:
internal/poll.runtime_pollWait(0xa5949660, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca384, 0x72, 0xff00, 0xffff, 0x4a2b740)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x44ca370, 0x5fde000, 0xffff, 0xffff, 0x4a2b740, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x44ca370, 0x5fde000, 0xffff, 0xffff, 0x4a2b740, 0x28, 0x28, 0x0, 0x1, 0xa0804, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x451e358, 0x5fde000, 0xffff, 0xffff, 0x4a2b740, 0x28, 0x28, 0xb6d49a34, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x451e358, 0x5fde000, 0xffff, 0xffff, 0x4a2b740, 0x28, 0x28, 0xb6d49a34, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x451e358, 0x5fde000, 0xffff, 0xffff, 0x45, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x44c2380, 0x451e358, 0x77359400, 0x0, 0x1808601, 0x2666a98, 0xa5989970, 0x2666a98, 0xffffff01, 0xa5989950)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x451e460, 0x451e358, 0x77359400, 0x0, 0x5615fb0, 0x1, 0x0, 0x0, 0x1808770, 0x5615fb0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x44c2380, 0x451e358, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x44c2380, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4bf1e40, 0x1537fd1, 0x3, 0x4656290, 0xf, 0x4b701d0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48ee780, 0x4bf1e40, 0x4bf1d80, 0x4bf1dc0, 0x180f040, 0x49f7620)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7214 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa59495dc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4851414, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4851400, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4851400, 0x1449b10, 0x445ab40, 0xb6d49300)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4acd8d0, 0xb8, 0x18, 0x4ff76e0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4acd8d0, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x4acd8d0, 0x20, 0x1449b10, 0x2f0001, 0x4ff76e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x50a5170, 0x1813fc0, 0x449d990, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x48ee780, 0x4bff740, 0x489a1c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 7087 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4800a80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7150 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x44caaf0, 0x181bb80, 0x49323a0, 0x1548869, 0xf, 0x1807870, 0x48cf260, 0x1538981, 0x4, 0x4b7d040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:122 +0x284
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7216 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x455c500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:448 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7086 [IO wait]:
internal/poll.runtime_pollWait(0xa5949558, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4851554, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4851540, 0x4a29000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4851540, 0x4a29000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x47aa3b0, 0x4a29000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x44398f0, 0x498aa30, 0xc, 0xc, 0x4800bd0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x44398f0, 0x498aa30, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4800a80, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4800a80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7218 [select, 3 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x4800c40, 0x15c2894, 0x48be380, 0x1824300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x48be380, 0x1824420, 0x4798068)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:133 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x48be380, 0x1824420, 0x4798068, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:112 +0x434
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7149 [select, 3 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x44caaf0, 0x1548878, 0xf, 0x1807978, 0x4952cb0, 0xa, 0x0, 0x1199a78, 0x4b7d200, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:362 +0x4e0
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x44caaf0, 0x181bb80, 0x49323a0, 0x1548878, 0xf, 0x1807978, 0x4952cb0, 0x1539c2f, 0x5, 0x4b7d040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7151 [select, 3 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x44caaf0, 0x1548afd, 0xf, 0x1807990, 0x4572910, 0x1, 0x0, 0x1199b40, 0x48cf590, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:362 +0x4e0
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x44caaf0, 0x181bb80, 0x49323a0, 0x1548afd, 0xf, 0x1807990, 0x4572910, 0x1540ed6, 0xa, 0x4b7d040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7152 [select, 3 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*state).run(0x4952c40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/state.go:217 +0x1c0
created by github.com/hashicorp/consul/agent/proxycfg.(*state).Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/state.go:106 +0xbc

goroutine 7153 [chan receive, 3 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).ensureProxyServiceLocked.func1(0x4a41170, 0x4b7d080)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:195 +0x68
created by github.com/hashicorp/consul/agent/proxycfg.(*Manager).ensureProxyServiceLocked
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:193 +0x164

goroutine 7332 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x4e34310)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7169 [select, 3 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x4bd4540, 0x47aa098, 0x181bb80, 0x5078540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 7259 [select, 3 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x4b7d380, 0x451eaf8, 0x181bb80, 0x51bd020)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 7510 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x48009a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7258 [select, 3 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x181bb80, 0x51bd020, 0x4c209fc, 0x20, 0x20, 0x1, 0x4b70e20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x51bccc0, 0x181bb80, 0x51bd020, 0x451eaf8, 0x181bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x51bccc0, 0x4b7d380, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x45a1dc0, 0x4572a18, 0x48cf74c, 0x4c20b84, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:430 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x451f988, 0x4572a00, 0x48cf740, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4598d40, 0x451f9d8, 0x13, 0x1538345, 0x4, 0x4c20d54, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598d40, 0x451f9d8, 0x13, 0x48d1554, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x4718c40, 0x4a41b30, 0x4657060, 0x0, 0x44cb900, 0x4a4e1e0, 0x143b020, 0x4572a00, 0x16, 0x1199b40, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x4a41b30, 0x181be20, 0x51bcbe0, 0x48ee858, 0xffffffff)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x45a1dc0, 0x1547f18, 0xf, 0x143b020, 0x4572910, 0x1199b40, 0x48cf710, 0x7c08f8, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent.(*Agent).RPC(0x48ee780, 0x1547f18, 0xf, 0x143b020, 0x4572910, 0x1199b40, 0x48cf710, 0x48cf624, 0x485ab40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1412 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x449d928, 0x1, 0x0, 0xb2c97000, 0x8b, 0x51bcb00, 0x1807990, 0x4572910, 0x0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6f8, 0x449d928, 0x443f820, 0x1199b40, 0x48cf590, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:467 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 7228 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x4973b90)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7358 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x4a35800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x4a35800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x4a35800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4a35800, 0x451e9b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7394 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47cedc0, 0x3b9aca00, 0x0, 0x4da0a00, 0x4a66d40, 0x4799838)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7265 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x50a5b90)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7231 [runnable]:
syscall.Syscall(0x94, 0x38, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fdatasync(0x38, 0x1000, 0x0)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:429 +0x30
github.com/boltdb/bolt.fdatasync(0x4b1e7e0, 0x1000, 0xfffffff)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/bolt_linux.go:9 +0x40
github.com/boltdb/bolt.(*Tx).write(0x59be180, 0xbf6f9556, 0xd60353bd)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:519 +0x3d0
github.com/boltdb/bolt.(*Tx).Commit(0x59be180, 0x594e530, 0x8)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:198 +0x29c
github.com/hashicorp/raft-boltdb.(*BoltStore).StoreLogs(0x4b71490, 0x5226040, 0x1, 0x1, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:187 +0x228
github.com/hashicorp/raft.(*LogCache).StoreLogs(0x51302d0, 0x5226040, 0x1, 0x1, 0x2656cc8, 0x4bdea30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/log_cache.go:61 +0x110
github.com/hashicorp/raft.(*Raft).dispatchLogs(0x492ee00, 0x4c22d1c, 0x1, 0x1)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:1061 +0x284
github.com/hashicorp/raft.(*Raft).leaderLoop(0x492ee00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:746 +0x5ac
github.com/hashicorp/raft.(*Raft).runLeader(0x492ee00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x492ee00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x492ee00, 0x47984f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7232 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x492ee00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x492ee00, 0x47984f8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7233 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x492ee00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x492ee00, 0x4798500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7266 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x50f1f20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7285 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x47644d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7286 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x47644d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7287 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x47644d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7288 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47644d0, 0x2a05f200, 0x1, 0x4da1d00, 0x4c54700, 0x451f228)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7289 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x47644d0, 0x4c54700)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7290 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47644d0, 0x1dcd6500, 0x0, 0x4da1d40, 0x4c54700, 0x451f238)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7291 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x46790e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7292 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x46790e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7293 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x46790e0, 0x153a3b9, 0x6, 0x5079a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7294 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x46790e0, 0x1539045, 0x5, 0x5079a60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7295 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x46790e0, 0x1539225, 0x5, 0x5079a80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7296 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x51ed8c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7297 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x495cbd0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7298 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x495cbd0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7299 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa59492c4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4572e74, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4572e60, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4572e60, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4e30300, 0x0, 0x0, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4e30300, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4da1f00, 0x4e30300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7300 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5949240, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4572ec4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4572eb0, 0x4e58000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4572eb0, 0x4e58000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x451f298, 0x4e58000, 0x10000, 0x10000, 0x4e13734, 0x101, 0x4e13708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x451f298, 0x4e58000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4da1f00, 0x451f298)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7301 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4764580)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7302 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4764580)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7303 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4764580)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7304 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4764580, 0x3b9aca00, 0x0, 0x4e4e080, 0x4c54c00, 0x451f548)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7305 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4764580, 0x4c54c00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7306 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4764580, 0xbebc200, 0x0, 0x4e4e0c0, 0x4c54c00, 0x451f560)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7307 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4ce58c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7308 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4ce58c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7309 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce58c0, 0x153a3b9, 0x6, 0x51ed960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7310 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce58c0, 0x1539045, 0x5, 0x51ed980)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7311 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce58c0, 0x1539225, 0x5, 0x51ed9a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7312 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x4a73340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 7313 [select, 1 minutes]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x4e4e100)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7314 [select, 3 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x5064bd0, 0x4d01700, 0x1537ffe, 0x3, 0x4c2a980, 0x4d01680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 7315 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x4a73340, 0x0, 0x15c28d8, 0x46790e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 7316 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x4a73340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990

goroutine 7317 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5949450, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4d7f2d4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4d7f2c0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4d7f2c0, 0x346c6c, 0x2653bb4, 0x44a23f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4cf58e0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4cf58e0, 0x2, 0x2, 0x3f800000, 0x49b8e98)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x4a73340, 0x1816180, 0x4cf58e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 7318 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x4a73340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 7267 [chan receive, 3 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x51315f0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x44e52c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 7268 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x44e52c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 7269 [select, 3 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x44e52c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 7270 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x44e52c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 7319 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5949138, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6f644, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4a6f630, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4a6f630, 0x445b680, 0xb6d496d0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x476e020, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x476e020, 0x4798030, 0x4a6f630, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x4a58000, 0x1816180, 0x476e020, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x4a58000, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4e4e640, 0x1537fb3, 0x3, 0x498a010, 0xf, 0x476e010, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e52c0, 0x4e4e640, 0x4e4e5c0, 0x4e4e600, 0x180f028, 0x4f9e520)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7320 [IO wait]:
internal/poll.runtime_pollWait(0xa59491bc, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4d7f504, 0x72, 0xff00, 0xffff, 0x56bcf60)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x4d7f4f0, 0x6178000, 0xffff, 0xffff, 0x56bcf60, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x4d7f4f0, 0x6178000, 0xffff, 0xffff, 0x56bcf60, 0x28, 0x28, 0x0, 0x1, 0xa0804, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x49b9b78, 0x6178000, 0xffff, 0xffff, 0x56bcf60, 0x28, 0x28, 0xb6d4936c, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x49b9b78, 0x6178000, 0xffff, 0xffff, 0x56bcf60, 0x28, 0x28, 0xb6d4936c, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x49b9b78, 0x6178000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x44c2c00, 0x49b9b78, 0x77359400, 0x0, 0x1808601, 0x2666a98, 0xa5989970, 0x2666a98, 0xffffff01, 0xa5989950)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x47aa010, 0x49b9b78, 0x77359400, 0x0, 0x4f95980, 0x1, 0x0, 0x0, 0x1808770, 0x4f95980)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x44c2c00, 0x49b9b78, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x44c2c00, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4e4e680, 0x1537fd1, 0x3, 0x4fc2710, 0xf, 0x4e30b10, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e52c0, 0x4e4e680, 0x4e4e5c0, 0x4e4e600, 0x180f040, 0x4f9e560)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7321 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa59490b4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6f6e4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4a6f6d0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4a6f6d0, 0x1449b10, 0x445b680, 0xb6d49600)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x476e030, 0xcd, 0x18, 0x50f1980)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x476e030, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x476e030, 0x20, 0x1449b10, 0x2f0001, 0x50f1980)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x4542fc0, 0x1813fc0, 0x4798050, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x44e52c0, 0x4c2a0c0, 0x50fe140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 7240 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x492b360, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x492b360, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x492b360, 0x153de26, 0x8, 0x4e28fdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x492b360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 7618 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x4894400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4894400, 0x4786998)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7277 [select]:
github.com/hashicorp/yamux.(*Session).send(0x44ad030)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7509 [select]:
github.com/hashicorp/yamux.(*Session).send(0x48009a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7325 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x44cab40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:448 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7276 [IO wait]:
internal/poll.runtime_pollWait(0xa5949030, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6fdc4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4a6fdb0, 0x4d88000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4a6fdb0, 0x4d88000, 0x1000, 0x1000, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x47864b0, 0x4d88000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x4d02db0, 0x4f9a210, 0xc, 0xc, 0x15c2f1c, 0x5, 0x44ad030)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x4d02db0, 0x4f9a210, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x44ad030, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x44ad030)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7327 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x5f1c060, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a73340, 0x4a66740)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:162 +0x3dc
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4fc26f0, 0x4a73340, 0x4a66740)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 7278 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x44ad030)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7279 [select, 3 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x4973dc0, 0x15c2894, 0x45a1dc0, 0x1824300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x45a1dc0, 0x1824420, 0x47864d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:133 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x45a1dc0, 0x1824420, 0x47864d0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:112 +0x434
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7280 [IO wait]:
internal/poll.runtime_pollWait(0xa5948fac, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x454deb4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x454dea0, 0x4dc7000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x454dea0, 0x4dc7000, 0x1000, 0x1000, 0x1, 0x12b94, 0x1ba48)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x47864d0, 0x4dc7000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x4d02f00, 0x4f9a240, 0xc, 0xc, 0x4784460, 0x5, 0x4c6a258)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x4d02f00, 0x4f9a240, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4973dc0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4973dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7281 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4973dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7346 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4973dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7347 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x4973e30, 0x4dda000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x4d02f90)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x4d02f90, 0x5bdaea0, 0x0, 0x4973e34)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x4784740, 0x476d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x4a742b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x4d02fc0, 0x11b15b0, 0x50f1440)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x4d02fc0, 0x11b15b0, 0x50f1440, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x4d02f60, 0x11b15b0, 0x50f1440, 0x50f1440, 0x4a41b48)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x4d02f60, 0x50f1440, 0x44a2474, 0x346944)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x4a41b30, 0x181cc00, 0x4d02f60, 0x4c07f90, 0x44a2400, 0x13001, 0x0, 0xa3d3a568, 0x34)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x4a41b30, 0x181cc00, 0x4d02f60, 0x60e8910, 0x45725f0, 0x76dc8, 0x346c6c, 0x2653bb4, 0x44a23f0, 0x60e8910, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x4a41b30, 0x181cc00, 0x4d02f60, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x45a1dc0, 0x1824300, 0x4973e30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 7331 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x4e34310)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7246 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x4e353b0, 0x4ef9000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x512a2d0)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x512a2d0, 0x5759f80, 0x0, 0x4e353b4)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x4ee8620, 0x476d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x4bb4f60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x512a300, 0x11b15b0, 0x50f1a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x512a300, 0x11b15b0, 0x50f1a00, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x512a2a0, 0x11b15b0, 0x50f1a00, 0x50f1a00, 0x5064cd8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x512a2a0, 0x50f1a00, 0x44a2474, 0x346944)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x5064cc0, 0x181cc00, 0x512a2a0, 0x4e9ef90, 0x44a2400, 0x181cc01, 0x0, 0x0, 0xb8)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x5064cc0, 0x181cc00, 0x512a2a0, 0x53f3a60, 0x4d7f1d0, 0x76dc8, 0x346c6c, 0x2653bb4, 0x44a23f0, 0x53f3a60, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x5064cc0, 0x181cc00, 0x512a2a0, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x4a73340, 0x1824300, 0x4e353b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 7353 [select, 3 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x4851630)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7354 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x4b70070)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7355 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x5065c50, 0x0, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x44c2f00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7524 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x480ea00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x480ea00, 0x50436b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7503 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x49528c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7359 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x4a35800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4a35800, 0x451e9b8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7360 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x4a35800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4a35800, 0x451e9c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7361 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x49f68a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7362 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x4559290)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7363 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x4559290)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7364 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5948ea4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x45730f4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x45730e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x45730e0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4acc4e0, 0x0, 0x0, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4acc4e0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4d80640, 0x4acc4e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7365 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5948e20, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4573144, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4573130, 0x4d4c000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4573130, 0x4d4c000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x451ea58, 0x4d4c000, 0x10000, 0x10000, 0x4c6e734, 0x101, 0x4c6e708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x451ea58, 0x4d4c000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4d80640, 0x451ea58)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7366 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x47ced10)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7367 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x47ced10)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7368 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x47ced10)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7369 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47ced10, 0x2a05f200, 0x1, 0x4d807c0, 0x4c55540, 0x451ee28)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7370 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x47ced10, 0x4c55540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7371 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47ced10, 0x1dcd6500, 0x0, 0x4d80800, 0x4c55540, 0x451ee38)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7372 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4b1ec60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7373 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4b1ec60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7374 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1ec60, 0x153a3b9, 0x6, 0x49f6960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7375 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1ec60, 0x1539045, 0x5, 0x49f6980)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7376 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1ec60, 0x1539225, 0x5, 0x49f69a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7377 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x49eeda0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7378 [select, 3 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x45599e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7379 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x45599e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7380 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5948d9c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4573284, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4573270, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4573270, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4acccf0, 0x0, 0x0, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4acccf0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4d809c0, 0x4acccf0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7381 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5948d18, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x45732d4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x45732c0, 0x4e68000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x45732c0, 0x4e68000, 0x10000, 0x10000, 0x0, 0x10001, 0x8001, 0x0, 0xff7b1f02)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x451ee98, 0x4e68000, 0x10000, 0x10000, 0x4d66734, 0x101, 0x4d66708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x451ee98, 0x4e68000, 0x10000, 0x10000, 0x880, 0x14, 0x20003, 0x1, 0x7e9e)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4d809c0, 0x451ee98)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7382 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x47cedc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7383 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x47cedc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7384 [select, 3 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x47cedc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7395 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x47cedc0, 0x4a66d40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7396 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47cedc0, 0xbebc200, 0x0, 0x4da0a40, 0x4a66d40, 0x4799858)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7397 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4b1f200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7398 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4b1f200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7399 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1f200, 0x153a3b9, 0x6, 0x49eee40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7400 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1f200, 0x1539045, 0x5, 0x49eee60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7401 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1f200, 0x1539225, 0x5, 0x49eee80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7402 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x4a73a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 7403 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x4da0b00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7404 [select, 3 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x5065b60, 0x476c480, 0x1537ffe, 0x3, 0x4c550c0, 0x476c000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 7405 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x4a73a40, 0x0, 0x15c28d8, 0x4b1ec60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 7406 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x4a73a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990

goroutine 7407 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa5948f28, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4572b54, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4572b40, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4572b40, 0x346c6c, 0x2653bb4, 0x44a23f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4b70080, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4b70080, 0x2, 0x2, 0x3f800000, 0x451e700)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x4a73a40, 0x1816180, 0x4b70080)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 7408 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x4a73a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 7409 [chan receive, 3 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x4bafbf0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x44e5040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 7410 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x44e5040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 7411 [select, 3 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x44e5040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 7412 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x44e5040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 7413 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5948c10, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492ad84, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x492ad70, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x492ad70, 0x445b680, 0xb6d49a34, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4cf47a0, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4cf47a0, 0x49b8318, 0x492ad70, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x453a280, 0x1816180, 0x4cf47a0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x453a280, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4da0c40, 0x1537fb3, 0x3, 0x44a6d20, 0xf, 0x4cf4790, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e5040, 0x4da0c40, 0x4da0bc0, 0x4da0c00, 0x180f028, 0x4716e80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7414 [IO wait]:
internal/poll.runtime_pollWait(0xa5948c94, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6ff54, 0x72, 0xff00, 0xffff, 0x4a2b920)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x4a6ff40, 0x60f6000, 0xffff, 0xffff, 0x4a2b920, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x4a6ff40, 0x60f6000, 0xffff, 0xffff, 0x4a2b920, 0x28, 0x28, 0x0, 0x1, 0xa0804, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x4799ab0, 0x60f6000, 0xffff, 0xffff, 0x4a2b920, 0x28, 0x28, 0xb6d49a34, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x4799ab0, 0x60f6000, 0xffff, 0xffff, 0x4a2b920, 0x28, 0x28, 0xb6d49a34, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x4799ab0, 0x60f6000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x4a58900, 0x4799ab0, 0x77359400, 0x0, 0x1808601, 0x2666a98, 0xa5989970, 0x2666a98, 0xffffff01, 0xa5989950)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x4799ad8, 0x4799ab0, 0x77359400, 0x0, 0x51c64e0, 0x1, 0x0, 0x0, 0x1808770, 0x51c64e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x4a58900, 0x4799ab0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x4a58900, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4da0c80, 0x1537fd1, 0x3, 0x498bc90, 0xf, 0x476fe90, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e5040, 0x4da0c80, 0x4da0bc0, 0x4da0c00, 0x180f040, 0x4716ee0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7415 [IO wait, 3 minutes]:
internal/poll.runtime_pollWait(0xa5948b8c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6ffa4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4a6ff90, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4a6ff90, 0x1449b10, 0x4484000, 0xb6d49000)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x476fee0, 0x7d278, 0x18, 0x4b1d7e0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x476fee0, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x476fee0, 0x20, 0x1449b10, 0x2f0001, 0x4b1d7e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x44a2f30, 0x1813fc0, 0x4799b28, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x44e5040, 0x4a66f40, 0x503ff00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 7245 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4e35340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7244 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4e35340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7418 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x48515e0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x48515e0, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x48515e0, 0x153de26, 0x8, 0x4e29fdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x48515e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 7508 [IO wait]:
internal/poll.runtime_pollWait(0xa598f6dc, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x492abf4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x492abe0, 0x5001000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x492abe0, 0x5001000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x47aacb0, 0x5001000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x4cf19e0, 0x4fc2510, 0xc, 0xc, 0x4800cb0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x4cf19e0, 0x4fc2510, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x48009a0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x48009a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7521 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x4894400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4894400, 0x47868b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7243 [IO wait]:
internal/poll.runtime_pollWait(0xa5948a84, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede104, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4ede0f0, 0x4ef8000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4ede0f0, 0x4ef8000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x49b8e98, 0x4ef8000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x512a1e0, 0x4fd0c84, 0xc, 0xc, 0x4e353b0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x512a1e0, 0x4fd0c84, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4e35340, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4e35340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7421 [select, 3 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x4a73340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 7387 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x492b400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:448 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7389 [IO wait]:
internal/poll.runtime_pollWait(0xa5948b08, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4573874, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4573860, 0x4eed000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4573860, 0x4eed000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x451ffa8, 0x4eed000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x51ee000, 0x4fd0c90, 0xc, 0xc, 0x4ea8460, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x51ee000, 0x4fd0c90, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4ea8380, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4ea8380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7390 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4ea8380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7391 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4ea8380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7506 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x4851680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:448 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7426 [select, 2 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x4ec24b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7427 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x4ee2d60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7428 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x51c7710, 0x0, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x4a59c00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7392 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x47b52d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7393 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x47b52d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7342 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x4b1d320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7343 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x50a4b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7344 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x50a4b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7345 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa594897c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca654, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x44ca640, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x44ca640, 0x4c73714, 0x4c7370c, 0x5e73d8)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x476ede0, 0x1, 0x0, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x476ede0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4bd4e40, 0x476ede0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7458 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa59488f8, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca6a4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x44ca690, 0x4d28000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x44ca690, 0x4d28000, 0x10000, 0x10000, 0x4fd0c00, 0x4bb4f01, 0x1827f01, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5042320, 0x4d28000, 0x10000, 0x10000, 0x4c70f34, 0x101, 0x4c70f08, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5042320, 0x4d28000, 0x10000, 0x10000, 0x0, 0x7a3ae4, 0x4a67b00, 0x4fd0cc0, 0xffffff00)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4bd4e40, 0x5042320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7459 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x44ee6e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7460 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x44ee6e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7461 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x44ee6e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7462 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee6e0, 0x2a05f200, 0x1, 0x4bd51c0, 0x4c2aec0, 0x5042688)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7463 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x44ee6e0, 0x4c2aec0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7464 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee6e0, 0x1dcd6500, 0x0, 0x4bd5200, 0x4c2aec0, 0x50427b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7465 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4679200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7466 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4679200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7467 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4679200, 0x153a3b9, 0x6, 0x4b1d3c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7468 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4679200, 0x1539045, 0x5, 0x4b1d3e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7469 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4679200, 0x1539225, 0x5, 0x4b1d400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7470 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x47c7e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7471 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x50a5290)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7472 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x50a5290)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7473 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f970, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca7e4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x44ca7d0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x44ca7d0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x476f6a0, 0x0, 0x0, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x476f6a0, 0x0, 0x4af45c, 0x440f330)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4bd5700, 0x476f6a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7474 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f8ec, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca834, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x44ca820, 0x4d8a000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x44ca820, 0x4d8a000, 0x10000, 0x10000, 0x4992200, 0x1, 0x1539801, 0x0, 0x1537e66)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5042820, 0x4d8a000, 0x10000, 0x10000, 0x4c7df34, 0x101, 0x4c7df08, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5042820, 0x4d8a000, 0x10000, 0x10000, 0x5ddddd29, 0x0, 0x1b1d30f5, 0x47444b0, 0xa5c0e)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4bd5700, 0x5042820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7475 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x44ee790)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7476 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x44ee790)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7477 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x44ee790)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7478 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee790, 0x3b9aca00, 0x0, 0x4bd5880, 0x4c2b7c0, 0x5042bf0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7479 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x44ee790, 0x4c2b7c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7480 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ee790, 0xbebc200, 0x0, 0x4bd58c0, 0x4c2b7c0, 0x5042c00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7481 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x46797a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7482 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x46797a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7483 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x46797a0, 0x153a3b9, 0x6, 0x47c7f80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7484 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x46797a0, 0x1539045, 0x5, 0x461b660)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7485 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x46797a0, 0x1539225, 0x5, 0x461bd40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7486 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x5013180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 7487 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x4bd5900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7488 [select, 2 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x51c7620, 0x502c980, 0x1537ffe, 0x3, 0x4c2a540, 0x502c900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 7489 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x5013180, 0x0, 0x15c28d8, 0x4679200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 7490 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x5013180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990

goroutine 7491 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa5948a00, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ec3cd4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4ec3cc0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4ec3cc0, 0x346c6c, 0x2653bb4, 0x44a23f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4ee2d70, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4ee2d70, 0x2, 0x2, 0x3f800000, 0x451f8f0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x5013180, 0x1816180, 0x4ee2d70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 7492 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x5013180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 7493 [chan receive, 2 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x4653800, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x5046000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 7494 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x5046000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 7495 [select, 2 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x5046000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 7496 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x5046000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 7497 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f7e4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede014, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4ede000, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4ede000, 0x445ad20, 0xb6d49008, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4a22150, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4a22150, 0x5042d30, 0x4ede000, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x4c42600, 0x1816180, 0x4a22150, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x4c42600, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4bd5a40, 0x1537fb3, 0x3, 0x49aa030, 0xf, 0x4a22130, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x5046000, 0x4bd5a40, 0x4bd59c0, 0x4bd5a00, 0x180f028, 0x5030a20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7498 [IO wait]:
internal/poll.runtime_pollWait(0xa598f868, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca9c4, 0x72, 0xff00, 0xffff, 0x4c1a2a0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x44ca9b0, 0x6188000, 0xffff, 0xffff, 0x4c1a2a0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x44ca9b0, 0x6188000, 0xffff, 0xffff, 0x4c1a2a0, 0x28, 0x28, 0x0, 0x1, 0xa0804, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x5042cc0, 0x6188000, 0xffff, 0xffff, 0x4c1a2a0, 0x28, 0x28, 0xb6d496d0, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x5042cc0, 0x6188000, 0xffff, 0xffff, 0x4c1a2a0, 0x28, 0x28, 0xb6d496d0, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x5042cc0, 0x6188000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x4c42480, 0x5042cc0, 0x77359400, 0x0, 0x1808601, 0x2666a98, 0xa5989970, 0x2666a98, 0xffffff01, 0xa5989950)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x5042ce8, 0x5042cc0, 0x77359400, 0x0, 0x51f2d50, 0x1, 0x0, 0x0, 0x1808770, 0x51f2d50)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x4c42480, 0x5042cc0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x4c42480, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4bd5a80, 0x1537fd1, 0x3, 0x4913fa0, 0xf, 0x4a220d0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x5046000, 0x4bd5a80, 0x4bd59c0, 0x4bd5a00, 0x180f040, 0x5030a60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7499 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f760, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede064, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4ede050, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4ede050, 0x1449b10, 0x445ad20, 0xb6d49000)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4a22160, 0xb, 0x18, 0x504c140)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4a22160, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x4a22160, 0x20, 0x1449b10, 0x2f0001, 0x504c140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x50a5a70, 0x1813fc0, 0x5042d50, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x5046000, 0x4c2bf40, 0x4cce220)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 7449 [IO wait]:
internal/poll.runtime_pollWait(0xa598f658, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x455dfa4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x455df90, 0x4f0a000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x455df90, 0x4f0a000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x451e700, 0x4f0a000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5131fb0, 0x4fc2504, 0xc, 0xc, 0x465ca80, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x5131fb0, 0x4fc2504, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x465ca10, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x465ca10)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7448 [select, 2 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x465ca10, 0x15c2894, 0x4a73a40, 0x1824300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x4a73a40, 0x1824420, 0x451e700)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:133 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x4a73a40, 0x1824420, 0x451e700, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:112 +0x434
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7446 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x4ec2460, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x4ec2460, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x4ec2460, 0x153de26, 0x8, 0x4f38fdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x4ec2460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 7504 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x49528c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7429 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x4a73a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 7450 [select]:
github.com/hashicorp/yamux.(*Session).send(0x465ca10)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7451 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x465ca10)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7452 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x465ca80, 0x4a92000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x48ae090)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x48ae090, 0x5b01680, 0x0, 0x465ca84)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x49f6f20, 0x476d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x440f8d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x48ae0c0, 0x11b15b0, 0x473a1c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x48ae0c0, 0x11b15b0, 0x473a1c0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x48ae060, 0x11b15b0, 0x473a1c0, 0x473a1c0, 0x5065c38)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x48ae060, 0x473a1c0, 0x44a2474, 0x346944)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x5065c20, 0x181cc00, 0x48ae060, 0x4f39f90, 0x44a2464, 0x181cc01, 0x48ae060, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x5065c20, 0x181cc00, 0x48ae060, 0x5e44810, 0x45729b0, 0x76dc8, 0x346c6c, 0x2653bb4, 0x44a23f0, 0x5e44810, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x5065c20, 0x181cc00, 0x48ae060, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x4a73a40, 0x1824300, 0x465ca80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 7457 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x495d830)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7520 [select]:
github.com/hashicorp/raft.(*Raft).leaderLoop(0x4894400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:539 +0x238
github.com/hashicorp/raft.(*Raft).runLeader(0x4894400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x4894400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4894400, 0x47868a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7513 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5013180, 0x4c7ffc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:210 +0x2a8
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x44a7040, 0x5013180, 0x4c7ffc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 7436 [select, 2 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x4cbc780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7437 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x5b8a2a0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).raftApply(0x4a728c0, 0x86, 0x1282178, 0x5e44780, 0x0, 0x1, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:349 +0x120
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchApplyUpdates(0x44b5e50, 0x5410fa4, 0x2)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:102 +0x2cc
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x44b5e50)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:46 +0xb0
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7438 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x4f946f0, 0x0, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x4ea4a80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7525 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x480ea00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x480ea00, 0x50436b8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7526 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5008100)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7538 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x495d830)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7539 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f550, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4573414, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4573400, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4573400, 0x465ca80, 0x0, 0x7a3ae4)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4e306d0, 0x4bf1d40, 0x443f760, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4e306d0, 0x451e808, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4ffe400, 0x4e306d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7540 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f4cc, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4573464, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4573450, 0x51d8000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4573450, 0x51d8000, 0x10000, 0x10000, 0x5064b00, 0x1573601, 0x1, 0x0, 0x1)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x451eec8, 0x51d8000, 0x10000, 0x10000, 0x4e47734, 0x101, 0x4e47708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x451eec8, 0x51d8000, 0x10000, 0x10000, 0x512ba40, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4ffe400, 0x451eec8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7541 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4764840)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7542 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4764840)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7543 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4764840)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7544 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4764840, 0x2a05f200, 0x1, 0x4ffe580, 0x4cfb4c0, 0x451f1d8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7545 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4764840, 0x4cfb4c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7546 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4764840, 0x1dcd6500, 0x0, 0x4ffe5c0, 0x4cfb4c0, 0x451f1e8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7547 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4ce50e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7548 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4ce50e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7549 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce50e0, 0x153a3b9, 0x6, 0x4fcef60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7550 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce50e0, 0x1539045, 0x5, 0x4fcef80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7551 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce50e0, 0x1539225, 0x5, 0x4fcefa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7552 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x509cd60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7527 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x4eea480)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7528 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x4eea480)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7529 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f448, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede4c4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4ede4b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4ede4b0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4a23c70, 0x0, 0x0, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4a23c70, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5004300, 0x4a23c70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7530 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f3c4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede514, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4ede500, 0x5204000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4ede500, 0x5204000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x5043728, 0x5204000, 0x10000, 0x10000, 0x4dc1f34, 0x101, 0x4dc1f08, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x5043728, 0x5204000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5004300, 0x5043728)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7531 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x44ef130)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7532 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x44ef130)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7533 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x44ef130)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7534 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ef130, 0x3b9aca00, 0x0, 0x5004480, 0x48989c0, 0x5043ae0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7535 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x44ef130, 0x48989c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7536 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44ef130, 0xbebc200, 0x0, 0x50044c0, 0x48989c0, 0x5043b00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7537 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x50a2240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7554 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x50a2240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7555 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x50a2240, 0x153a3b9, 0x6, 0x5008200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7556 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x50a2240, 0x1539045, 0x5, 0x5008220)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7557 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x50a2240, 0x1539225, 0x5, 0x5008240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7558 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x4a728c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 7559 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x5004500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7560 [select, 2 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x4f94600, 0x4f0ee80, 0x1537ffe, 0x3, 0x4898480, 0x4f0ee00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 7561 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x4a728c0, 0x0, 0x15c28d8, 0x4ce50e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 7562 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x4a728c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990

goroutine 7563 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f5d4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4cbdfa4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4cbdf90, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4cbdf90, 0x346c6c, 0x2653bb4, 0x44a23f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x44b5e60, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x44b5e60, 0x2, 0x2, 0x3f800000, 0x47872f8)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x4a728c0, 0x1816180, 0x44b5e60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 7564 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x4a728c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 7565 [chan receive, 2 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x4fe4b70, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x44e5400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 7566 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x44e5400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 7567 [select, 2 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x44e5400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 7568 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x44e5400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 7569 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f340, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x45735a4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4573590, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4573590, 0x4484000, 0xb6d4936c, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4e30f70, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4e30f70, 0x451f2f0, 0x4573590, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x4928880, 0x1816180, 0x4e30f70, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x4928880, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5004640, 0x1537fb3, 0x3, 0x4971c70, 0xf, 0x4e30f60, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e5400, 0x5004640, 0x50045c0, 0x5004600, 0x180f028, 0x4ec0800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7570 [IO wait]:
internal/poll.runtime_pollWait(0xa598f2bc, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede654, 0x72, 0xff00, 0xffff, 0x4a2bcb0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x4ede640, 0x6260000, 0xffff, 0xffff, 0x4a2bcb0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x4ede640, 0x6260000, 0xffff, 0xffff, 0x4a2bcb0, 0x28, 0x28, 0xa3d31300, 0x1d3a8, 0xa3d31303, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x5043c98, 0x6260000, 0xffff, 0xffff, 0x4a2bcb0, 0x28, 0x28, 0xb6d49a34, 0x50ca540, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x5043c98, 0x6260000, 0xffff, 0xffff, 0x4a2bcb0, 0x28, 0x28, 0xb6d49a34, 0x50ca540, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x5043c98, 0x6260000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x4c42980, 0x5043c98, 0x77359400, 0x0, 0x1808601, 0x2666a98, 0xa5989970, 0x2666a98, 0xffffff01, 0xa5989950)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x5043d08, 0x5043c98, 0x77359400, 0x0, 0x51c7aa0, 0x1, 0x0, 0x0, 0x1808770, 0x51c7aa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x4c42980, 0x5043c98, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x4c42980, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5004680, 0x1537fd1, 0x3, 0x45d87d0, 0xf, 0x507a5e0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x44e5400, 0x5004680, 0x50045c0, 0x5004600, 0x180f040, 0x4ec0840)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7516 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f238, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4fee1f4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4fee1e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4fee1e0, 0x1449b10, 0x485ab40, 0xb6d49a00)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4fe2940, 0xa5, 0x18, 0x5103480)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4fe2940, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x4fe2940, 0x20, 0x1449b10, 0x2f0001, 0x5103480)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x44a3290, 0x1813fc0, 0x47ab620, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x44e5400, 0x4cc2a80, 0x4fed260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 7619 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x4fc7780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7441 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x4cbc730, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x4cbc730, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x4cbc730, 0x153de26, 0x8, 0x4f36fdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x4cbc730)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 7583 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x4cbc7d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:448 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7553 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x5013180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 7594 [select, 2 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x4ec22d0, 0x1548afd, 0xf, 0x1807990, 0x4fee870, 0x1, 0x0, 0x1199b40, 0x48cf4d0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:362 +0x4e0
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x4ec22d0, 0x181bb80, 0x473a2e0, 0x1548afd, 0xf, 0x1807990, 0x4fee870, 0x1540ed6, 0xa, 0x4e4e6c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7572 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4a728c0, 0x4898bc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:210 +0x2a8
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4fd0200, 0x4a728c0, 0x4898bc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 7585 [IO wait]:
internal/poll.runtime_pollWait(0xa598ec8c, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4573374, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4573360, 0x4c67000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4573360, 0x4c67000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4f88248, 0x4c67000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x5064ea0, 0x4970730, 0xc, 0xc, 0x46315e0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x5064ea0, 0x4970730, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4631500, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4631500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7574 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x4ec2500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:448 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 7602 [select, 2 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x4f3c7e0, 0x15c2894, 0x5013180, 0x1824300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x5013180, 0x1824420, 0x451f8f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:133 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x5013180, 0x1824420, 0x451f8f0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:112 +0x434
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7576 [IO wait]:
internal/poll.runtime_pollWait(0xa598f130, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4edea64, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4edea50, 0x5250000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4edea50, 0x5250000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4b8a028, 0x5250000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x51f2ae0, 0x45d8b40, 0xc, 0xc, 0x5063960, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x51f2ae0, 0x45d8b40, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x5063880, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x5063880)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7603 [IO wait]:
internal/poll.runtime_pollWait(0xa598f1b4, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x45737d4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x45737c0, 0x5099000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x45737c0, 0x5099000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x451f8f0, 0x5099000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x514d5f0, 0x513e804, 0xc, 0xc, 0x4f3c850, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x514d5f0, 0x513e804, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4f3c7e0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4f3c7e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7604 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4f3c7e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7605 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4f3c7e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7577 [select]:
github.com/hashicorp/yamux.(*Session).send(0x5063880)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7578 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x5063880)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7606 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x4f3c850, 0x5256000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x514d680)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x514d680, 0x56599e0, 0x0, 0x4f3c854)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x51c4980, 0x476d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x4e31970)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x514d6b0, 0x11b15b0, 0x49f6820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x514d6b0, 0x11b15b0, 0x49f6820, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x514d650, 0x11b15b0, 0x49f6820, 0x49f6820, 0x51c76f8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x514d650, 0x49f6820, 0x44a2474, 0x346944)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x51c76e0, 0x181cc00, 0x514d650, 0x4f45f90, 0x44a2400, 0x181cc01, 0x0, 0x0, 0x117)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x51c76e0, 0x181cc00, 0x514d650, 0x4c84ce0, 0x4ec3bd0, 0x76dc8, 0x346c6c, 0x2653bb4, 0x44a23f0, 0x4c84ce0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x51c76e0, 0x181cc00, 0x514d650, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x5013180, 0x1824300, 0x4f3c850)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 18037 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x4a35000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4a35000, 0x4d3a1d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 7592 [select, 2 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x4ec22d0, 0x1548878, 0xf, 0x1807978, 0x5106000, 0x9, 0x0, 0x1199a78, 0x4da7d40, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:362 +0x4e0
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x4ec22d0, 0x181bb80, 0x473a2e0, 0x1548878, 0xf, 0x1807978, 0x5106000, 0x1539c2f, 0x5, 0x4e4e6c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7612 [select, 2 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x4ec22d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 7613 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x4e307e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 7614 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x4d03380, 0x0, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x453bd00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 7615 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x49538f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7616 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x49538f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 7666 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598ed10, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4a6fe14, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4a6fe00, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4a6fe00, 0x1449b10, 0x485a960, 0xb6d49300)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4a75470, 0x23, 0x18, 0x4716440)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4a75470, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x4a75470, 0x20, 0x1449b10, 0x2f0001, 0x4716440)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x4f92bd0, 0x1813fc0, 0x4fabc80, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x48ee140, 0x4c2a200, 0x503fba0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 7620 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x498f950)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7621 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x498f950)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7622 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f028, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede474, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4ede460, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4ede460, 0x112120, 0x5ddddd2f, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4cf4ba0, 0xf6a081ce, 0x4573950, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4cf4ba0, 0x513e8ec, 0x7229c, 0x44fa5a0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4b7d900, 0x4cf4ba0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7623 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598efa4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede5b4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4ede5a0, 0x4ece000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4ede5a0, 0x4ece000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x4786a28, 0x4ece000, 0x10000, 0x10000, 0x50c3734, 0x101, 0x50c3708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x4786a28, 0x4ece000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4b7d900, 0x4786a28)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7624 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x47cea50)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7625 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x47cea50)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7626 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x47cea50)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7627 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47cea50, 0x2a05f200, 0x1, 0x4b7db80, 0x4c7ec80, 0x4786cc8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7628 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x47cea50, 0x4c7ec80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7629 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47cea50, 0x1dcd6500, 0x0, 0x4b7dbc0, 0x4c7ec80, 0x4786cd8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7630 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4b1f320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7631 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4b1f320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7632 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1f320, 0x153a3b9, 0x6, 0x4fc7840)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7633 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1f320, 0x1539045, 0x5, 0x4fc7860)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7634 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1f320, 0x1539225, 0x5, 0x4fc7880)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7635 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x4d07ba0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 7636 [select, 2 minutes]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x498fef0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 7637 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x498fef0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 7638 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598ef20, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede794, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4ede780, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4ede780, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4cf5360, 0x0, 0x0, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x4cf5360, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x4b7dd80, 0x4cf5360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 7639 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598ee9c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede7e4, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x4ede7d0, 0x4f22000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x4ede7d0, 0x4f22000, 0x10000, 0x10000, 0x0, 0x1, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x4786d38, 0x4f22000, 0x10000, 0x10000, 0x4c5a734, 0x101, 0x4c5a708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x4786d38, 0x4f22000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x4b7dd80, 0x4786d38)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 7640 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x47ceb00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 7641 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x47ceb00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 7642 [select, 2 minutes]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x47ceb00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 7643 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47ceb00, 0x3b9aca00, 0x0, 0x4b7df00, 0x4c7f380, 0x4787048)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7644 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x47ceb00, 0x4c7f380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:161 +0x19c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 7645 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x47ceb00, 0xbebc200, 0x0, 0x4b7dfc0, 0x4c7f380, 0x4787058)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:134 +0x13c
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 7646 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4b1f9e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 7647 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4b1f9e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 7648 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1f9e0, 0x153a3b9, 0x6, 0x4d07cc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 7649 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1f9e0, 0x1539045, 0x5, 0x4d07d40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 7650 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4b1f9e0, 0x1539225, 0x5, 0x4d07da0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 7651 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x45a1500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 7652 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x50a8000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 7653 [select, 2 minutes]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x4d03290, 0x502cc00, 0x1537ffe, 0x3, 0x4c7e240, 0x502cb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 7654 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x45a1500, 0x0, 0x15c28d8, 0x4b1f320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 7655 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x45a1500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990

goroutine 7656 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598f0ac, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x44ca924, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x44ca910, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x44ca910, 0x346c6c, 0x2653bb4, 0x44a23f0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4e307f0, 0x2, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4e307f0, 0x2, 0x2, 0x3f800000, 0x4cdf9a8)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x45a1500, 0x1816180, 0x4e307f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 7657 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x45a1500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 7658 [chan receive, 2 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x4ff5d70, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x48ee140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 7659 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x48ee140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 7660 [select, 2 minutes]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x48ee140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 7661 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x48ee140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 7662 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0xa598ed94, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4fee4c4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x4fee4b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x4fee4b0, 0x2657440, 0xb6d496d0, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4bb4c20, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4bb4c20, 0x4411430, 0x4fee4b0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x4a58a80, 0x1816180, 0x4bb4c20, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x4a58a80, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x50a8240, 0x1537fb3, 0x3, 0x513e900, 0xf, 0x4bb4c10, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48ee140, 0x50a8240, 0x50a8140, 0x50a8180, 0x180f028, 0x506e6a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7663 [IO wait]:
internal/poll.runtime_pollWait(0xa598ee18, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4ede924, 0x72, 0xff00, 0xffff, 0x5d6bdd0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x4ede910, 0x5fee000, 0xffff, 0xffff, 0x5d6bdd0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x4ede910, 0x5fee000, 0xffff, 0xffff, 0x5d6bdd0, 0x28, 0x28, 0x0, 0x1, 0xa0804, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x4787130, 0x5fee000, 0xffff, 0xffff, 0x5d6bdd0, 0x28, 0x28, 0xb6d496d0, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x4787130, 0x5fee000, 0xffff, 0xffff, 0x5d6bdd0, 0x28, 0x28, 0xb6d496d0, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x4787130, 0x5fee000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x44c2980, 0x4787130, 0x77359400, 0x0, 0x1808601, 0x2666a98, 0xa5989970, 0x2666a98, 0xffffff01, 0xa5989950)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x4787158, 0x4787130, 0x77359400, 0x0, 0x56c78f0, 0x1, 0x0, 0x0, 0x1808770, 0x56c78f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x44c2980, 0x4787130, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x44c2980, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x50a8280, 0x1537fd1, 0x3, 0x44419a0, 0xf, 0x4988140, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48ee140, 0x50a8280, 0x50a8140, 0x50a8180, 0x180f040, 0x506e6e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 7682 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x45a1500, 0x4c7f740)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:210 +0x2a8
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x51278d0, 0x45a1500, 0x4c7f740)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 7593 [select, 1 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x4ec22d0, 0x181bb80, 0x473a2e0, 0x1548869, 0xf, 0x1807870, 0x48cee70, 0x1538981, 0x4, 0x4e4e6c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:122 +0x284
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 7669 [select, 1 minutes]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x4ec2280, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:258 +0xe8
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x4ec2280, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x4ec2280, 0x153de26, 0x8, 0x4f5cfdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x4ec2280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 8411 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019680)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_Bad_JSON(0x5019680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:18 +0x20
testing.tRunner(0x5019680, 0x15c26bc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7671 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x4a728c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 7698 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4631500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7699 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4631500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7683 [select, 2 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x4a42e70, 0x15c2894, 0x4a728c0, 0x1824300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x4a728c0, 0x1824420, 0x47872f8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:133 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x4a728c0, 0x1824420, 0x47872f8, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:112 +0x434
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 7684 [IO wait]:
internal/poll.runtime_pollWait(0xa598ec08, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4edece4, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4edecd0, 0x4bc7000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4edecd0, 0x4bc7000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x47872f8, 0x4bc7000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x4886a50, 0x4441f40, 0xc, 0xc, 0x4a42ee0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x4886a50, 0x4441f40, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4a42e70, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4a42e70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 7685 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4a42e70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 7686 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4a42e70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 7687 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x4a42ee0, 0x4d6c000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x4886ae0)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x4886ae0, 0x5b5a8a0, 0x0, 0x4a42ee4)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x4a4f920, 0x476d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x4988db0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x4886b40, 0x11b15b0, 0x50735c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x4886b40, 0x11b15b0, 0x50735c0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x4886ab0, 0x11b15b0, 0x50735c0, 0x50735c0, 0x4f946d8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x4886ab0, 0x50735c0, 0x44a2474, 0x346944)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x4f946c0, 0x181cc00, 0x4886ab0, 0x4f34f90, 0x44a2464, 0x181cc01, 0x4886ab0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x4f946c0, 0x181cc00, 0x4886ab0, 0x605bb30, 0x4cbdea0, 0x76dc8, 0x346c6c, 0x2653bb4, 0x44a23f0, 0x605bb30, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x4f946c0, 0x181cc00, 0x4886ab0, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x4a728c0, 0x1824300, 0x4a42ee0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 16787 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x4e4f000, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/raft.(*Raft).Stats(0x4895c00, 0x47193c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/api.go:1015 +0x778
github.com/hashicorp/consul/agent/consul.(*Status).RaftStats(0x47864f8, 0x5312a80, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/status_endpoint.go:46 +0x24
reflect.Value.call(0x4599580, 0x4786548, 0x13, 0x1538345, 0x4, 0x58f2efc, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4599580, 0x4786548, 0x13, 0x58f2efc, 0x3, 0x3, 0x5a5ac01, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5529be0, 0x55a3d10, 0x498a318, 0x0, 0x4edff40, 0x55604a0, 0x126b1f8, 0x2666c9c, 0x199, 0x1198e20, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x55a3d10, 0x181cc00, 0x5a5acc0, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x58e9880, 0x1824300, 0x4f3c8c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 8077 [select, 2 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5129880, 0x52267a8, 0x181bb80, 0x4b1d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 18259 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).lanEventHandler(0x6054540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server_serf.go:133 +0x88
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:430 +0x868

goroutine 15845 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x6b1fda0c, 0xf)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0x9, 0x1548869, 0xf, 0x5994060, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x57ca148, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 18249 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x4aa0370)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 15897 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x32243932, 0x16)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x4ec22d0, 0x48c4600, 0x9, 0x1548869, 0xf, 0x55b9b60, 0x18, 0x1807870, 0x48cee70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x50a8100, 0x48c4600, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x57caf90, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 17952 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x59d7200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x59d7200, 0x57cbd50)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 18188 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x6027570)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 12293 [select, 1 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x5933500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 18234 [select]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x5c61e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 18228 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x5e3cb60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 17920 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x3a69cd42, 0x6)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0x7, 0x1548869, 0xf, 0x55d9820, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x590fdd0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 14043 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x4bf1b40, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x5276d20, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x5276d20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 9116 [semacquire, 2 minutes]:
sync.runtime_Semacquire(0x553b4ec)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x553b4ec)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x553b490)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x4ab7880, 0x158f311, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:301 +0xa4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x4ab7880, 0x5688680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:174 +0x5dc
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x46561d0, 0x4ab7880, 0x5688680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 18262 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).Flood(0x6054540, 0x0, 0x15c28d8, 0x4ce4b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/flood.go:49 +0x1d8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:450 +0xc24

goroutine 15102 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7d60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x51f7d60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x51f7d60, 0x5586900)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 15916 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x4cb6895a, 0x18)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x4ec22d0, 0x48c4600, 0x9, 0x1548869, 0xf, 0x5825060, 0x18, 0x1807870, 0x48cee70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x50a8100, 0x48c4600, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x57cb3c8, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 12078 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x526b040, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*AutopilotDelegate).PromoteNonVoters(0x577f0e0, 0x50bd088, 0x0, 0x0, 0x0, 0x0, 0x0, 0x55b70c0, 0x487cfb4, 0x525c240, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot.go:69 +0x40
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).promoteServers(0x47b59d0, 0x487cf44, 0x3)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:140 +0x120
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x47b59d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:112 +0x198
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 7595 [select, 2 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*state).run(0x47b5f80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/state.go:217 +0x1c0
created by github.com/hashicorp/consul/agent/proxycfg.(*state).Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/state.go:106 +0xbc

goroutine 7596 [chan receive, 2 minutes]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).ensureProxyServiceLocked.func1(0x4ff5d70, 0x4e4e840)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:195 +0x68
created by github.com/hashicorp/consul/agent/proxycfg.(*Manager).ensureProxyServiceLocked
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:193 +0x164

goroutine 15116 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018960)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x5018960)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x5018960, 0x5586e80)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 7783 [select, 2 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x4e4eb40, 0x48de118, 0x181bb80, 0x473b0a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 12212 [select, 1 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x0, 0x5460310, 0x181bb80, 0x56399e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 18201 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x571c3f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 14954 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x50d8dd5c, 0x16)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x4ec22d0, 0x48c4600, 0x9, 0x1548869, 0xf, 0x45d7fe0, 0x18, 0x1807870, 0x48cee70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x50a8100, 0x48c4600, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x4cdf458, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 15111 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018640)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x5018640)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x5018640, 0x5586d00)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 17340 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x4ecc080, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x465d260, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x465d260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 11369 [select, 1 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x181bb80, 0x51037e0, 0x58539fc, 0x20, 0x20, 0x100, 0xc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5103700, 0x181bb80, 0x51037e0, 0x451e688, 0x181bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5103700, 0x4da1bc0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5013dc0, 0x547b378, 0x563024c, 0x5853b84, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:430 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x48de018, 0x547b360, 0x5630240, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4598d40, 0x48de068, 0x13, 0x1538345, 0x4, 0x5853d54, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598d40, 0x48de068, 0x13, 0x57fa554, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5a04320, 0x58b7470, 0x56737f8, 0x0, 0x51a94f0, 0x538d920, 0x143b020, 0x547b360, 0x16, 0x1199b40, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x58b7470, 0x181be20, 0x51036a0, 0x5046498, 0xffffffff)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5013dc0, 0x1547f18, 0xf, 0x143b020, 0x547a7d0, 0x1199b40, 0x5630210, 0x7c08f8, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent.(*Agent).RPC(0x50463c0, 0x1547f18, 0xf, 0x143b020, 0x547a7d0, 0x1199b40, 0x5630210, 0x56301e4, 0x445b4a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1412 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x4cd5978, 0x1, 0x0, 0xb2c97000, 0x8b, 0x5103680, 0x1807990, 0x547a7d0, 0x0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6f8, 0x4cd5978, 0x54f8240, 0x1199b40, 0x5739140, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:467 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 15094 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7860)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x51f7860)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x51f7860, 0x55866c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18240 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4b40, 0x153a3b9, 0x6, 0x5bcce80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 15095 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7900)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x51f7900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x51f7900, 0x5586700)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18251 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4aa0370, 0x3b9aca00, 0x0, 0x5a463c0, 0x58fa080, 0x58e2968)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:128 +0xc8
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 7782 [select, 2 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x181bb80, 0x473b0a0, 0x4f7d9fc, 0x20, 0x20, 0x1, 0x4bb5700)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x473af80, 0x181bb80, 0x473b0a0, 0x48de118, 0x181bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x473af80, 0x4e4eb40, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x45a1500, 0x4fee978, 0x48cf6bc, 0x4f7db84, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:430 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Intention).Match(0x4fab0e8, 0x4fee960, 0x48cf6b0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/intention_endpoint.go:250 +0x168
reflect.Value.call(0x4598d40, 0x4fab138, 0x13, 0x1538345, 0x4, 0x4f7dd54, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598d40, 0x4fab138, 0x13, 0x4c60d54, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x506fac0, 0x4d03350, 0x513f660, 0x0, 0x492af00, 0x473a960, 0x143b020, 0x4fee960, 0x16, 0x1199b40, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x4d03350, 0x181be20, 0x473ae60, 0x48ee218, 0xffffffff)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x45a1500, 0x1547f18, 0xf, 0x143b020, 0x4fee870, 0x1199b40, 0x48cf5c0, 0x7c08f8, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent.(*Agent).RPC(0x48ee140, 0x1547f18, 0xf, 0x143b020, 0x4fee870, 0x1199b40, 0x48cf5c0, 0x48cf564, 0x445b680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1412 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*IntentionMatch).Fetch(0x47870c0, 0x1, 0x0, 0xb2c97000, 0x8b, 0x473ae20, 0x1807990, 0x4fee870, 0x0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache-types/intention_match.go:34 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6f8, 0x47870c0, 0x48c4620, 0x1199b40, 0x48cf4d0, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:467 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 8235 [select, 2 minutes]:
github.com/hashicorp/yamux.(*Session).AcceptStream(0x4fe0850, 0x15c2894, 0x45a1500, 0x1824300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:212 +0x7c
github.com/hashicorp/yamux.(*Session).Accept(...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:202
github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2(0x45a1500, 0x1824420, 0x4cdf9a8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:133 +0x158
github.com/hashicorp/consul/agent/consul.(*Server).handleConn(0x45a1500, 0x1824420, 0x4cdf9a8, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:112 +0x434
created by github.com/hashicorp/consul/agent/consul.(*Server).listen
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:61 +0xdc

goroutine 18246 [IO wait]:
internal/poll.runtime_pollWait(0xa599ba7c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57ee4c4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x57ee4b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x57ee4b0, 0x0, 0x154b546, 0x476ec00)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5e44990, 0x455c500, 0x12b58, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5e44990, 0x0, 0x2c05c8, 0x4c841b0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x5a46240, 0x5e44990)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 15100 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7c20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x51f7c20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x51f7c20, 0x5586880)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 16475 [semacquire]:
sync.runtime_Semacquire(0x5988d0c)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x5988d0c)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x5988cb0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x58e88c0, 0x158f311, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:301 +0xa4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x58e88c0, 0x5c5c400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:174 +0x5dc
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5ceac60, 0x58e88c0, 0x5c5c400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 18258 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4ea0, 0x1539225, 0x5, 0x5584180)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 15112 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x50186e0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x50186e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x50186e0, 0x5586d80)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 15103 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7e00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x51f7e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x51f7e00, 0x5586940)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18036 [runnable]:
syscall.Syscall(0x94, 0x7f, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fdatasync(0x7f, 0x1000, 0x0)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:429 +0x30
github.com/boltdb/bolt.fdatasync(0x5b28240, 0x1000, 0x1000)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/bolt_linux.go:9 +0x40
github.com/boltdb/bolt.(*Tx).writeMeta(0x5bd2200, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:556 +0xfc
github.com/boltdb/bolt.(*Tx).Commit(0x5bd2200, 0x45d9610, 0x8)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:221 +0x3e8
github.com/hashicorp/raft-boltdb.(*BoltStore).StoreLogs(0x56dd630, 0x4d3a2f8, 0x1, 0x1, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:187 +0x228
github.com/hashicorp/raft.(*LogCache).StoreLogs(0x4a412c0, 0x4d3a2f8, 0x1, 0x1, 0x2656cc8, 0x4c1c0b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/log_cache.go:61 +0x110
github.com/hashicorp/raft.(*Raft).dispatchLogs(0x4a35000, 0x5800d1c, 0x1, 0x1)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:1061 +0x284
github.com/hashicorp/raft.(*Raft).leaderLoop(0x4a35000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:746 +0x5ac
github.com/hashicorp/raft.(*Raft).runLeader(0x4a35000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x4a35000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4a35000, 0x4d3a1c8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 18160 [IO wait]:
internal/poll.runtime_pollWait(0xa4716c10, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5eedd74, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5eedd60, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5eedd60, 0x48adeba, 0x3, 0x180f028)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4781400, 0x11b50c0, 0x2666c9c, 0x1198e20)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4781400, 0x0, 0x453f780, 0x180f040, 0x5b8cca0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x5a08380, 0x1816180, 0x4781400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 18231 [runnable]:
syscall.Syscall(0x94, 0x7d, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fdatasync(0x7d, 0x1000, 0x0)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:429 +0x30
github.com/boltdb/bolt.fdatasync(0x5e465a0, 0x1000, 0xfffffff)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/bolt_linux.go:9 +0x40
github.com/boltdb/bolt.(*Tx).write(0x59be380, 0xbf6f9556, 0xdd8bfc47)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:519 +0x3d0
github.com/boltdb/bolt.(*Tx).Commit(0x59be380, 0x23c0818, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:198 +0x29c
github.com/hashicorp/raft-boltdb.(*BoltStore).Set(0x605b570, 0x23c0818, 0xb, 0xb, 0x594f8f8, 0x8, 0x8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:229 +0x11c
github.com/hashicorp/raft-boltdb.(*BoltStore).SetUint64(0x605b570, 0x23c0818, 0xb, 0xb, 0x2, 0x0, 0x2668570, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:251 +0xb0
github.com/hashicorp/raft.(*Raft).setCurrentTerm(0x548c200, 0x2, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:1674 +0x5c
github.com/hashicorp/raft.(*Raft).electSelf(0x548c200, 0x3)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:1605 +0x64
github.com/hashicorp/raft.(*Raft).runCandidate(0x548c200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:252 +0x120
github.com/hashicorp/raft.(*Raft).run(0x548c200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:140 +0x7c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x548c200, 0x58e25d8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 15104 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7ea0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x51f7ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x51f7ea0, 0x5586980)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 15110 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x50185a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x50185a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x50185a0, 0x5586b00)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18095 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x48eef00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 18164 [select]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x4576c80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 11389 [select, 1 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x4fcaf80, 0x449d858, 0x181bb80, 0x51c5560)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 18245 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x510d0e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 17951 [runnable]:
syscall.Syscall(0x94, 0x81, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fdatasync(0x81, 0x1000, 0x0)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:429 +0x30
github.com/boltdb/bolt.fdatasync(0x44d7e60, 0x1000, 0xfffffff)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/bolt_linux.go:9 +0x40
github.com/boltdb/bolt.(*Tx).write(0x59be100, 0xbf6f9556, 0xdb0b4423)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:519 +0x3d0
github.com/boltdb/bolt.(*Tx).Commit(0x59be100, 0x594e418, 0x8)
	/<<PKGBUILDDIR>>/_build/src/github.com/boltdb/bolt/tx.go:198 +0x29c
github.com/hashicorp/raft-boltdb.(*BoltStore).StoreLogs(0x6099030, 0x5226018, 0x1, 0x1, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft-boltdb/bolt_store.go:187 +0x228
github.com/hashicorp/raft.(*LogCache).StoreLogs(0x59c4cf0, 0x5226018, 0x1, 0x1, 0x2656cc8, 0x17480)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/log_cache.go:61 +0x110
github.com/hashicorp/raft.(*Raft).dispatchLogs(0x59d7200, 0x59e9d1c, 0x1, 0x1)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:1061 +0x284
github.com/hashicorp/raft.(*Raft).leaderLoop(0x59d7200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:746 +0x5ac
github.com/hashicorp/raft.(*Raft).runLeader(0x59d7200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:455 +0x190
github.com/hashicorp/raft.(*Raft).run(0x59d7200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/raft.go:142 +0x6c
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x59d7200, 0x57cbd48)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 18179 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x605a150)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 14992 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7400)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x51f7400)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x51f7400, 0x5586540)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18242 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4b40, 0x1539225, 0x5, 0x5bccec0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:414 +0x920

goroutine 17984 [select]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x58a0e10, 0x2, 0x53cf7b0, 0x4ddd7a0, 0x180ea70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x5625380)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 18236 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x44eebb0, 0x58c3300)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:152 +0xe0
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 14993 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f74a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x51f74a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x51f74a0, 0x5586580)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 11745 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x5767bc0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).joinConsulServer(0x5933500, 0x59cd771, 0x29, 0x5a2b40c, 0x4, 0x4, 0x2e46, 0x56e1140, 0x1, 0x2020501, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1456 +0x110
github.com/hashicorp/consul/agent/consul.(*Server).handleAliveMember(0x5933500, 0x59cd771, 0x29, 0x5a2b40c, 0x4, 0x4, 0x2e46, 0x56e1140, 0x1, 0x2020501, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1276 +0x660
github.com/hashicorp/consul/agent/consul.(*Server).reconcileMember(0x5933500, 0x59cd771, 0x29, 0x5a2b40c, 0x4, 0x4, 0x2e46, 0x56e1140, 0x1, 0x2020501, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1230 +0x1f8
github.com/hashicorp/consul/agent/consul.(*Server).reconcile(0x5933500, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/segment_oss.go:66 +0x138
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5933500, 0x4bde800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:188 +0x4e8
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x55bfda0, 0x5933500, 0x4bde800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 11506 [select, 1 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x4da1bc0, 0x451e688, 0x181bb80, 0x51037e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 15604 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xb1023ac1, 0x17)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0x9, 0x1548869, 0xf, 0x58c43c0, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x48df3f8, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 18187 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x6027570)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 18248 [select]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x4aa0370)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 18167 [IO wait]:
internal/poll.runtime_pollWait(0xa4716c94, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x510f694, 0x72, 0xff00, 0xffff, 0x5e4a150)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x510f680, 0x4f48000, 0xffff, 0xffff, 0x5e4a150, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x510f680, 0x4f48000, 0xffff, 0xffff, 0x5e4a150, 0x28, 0x28, 0x0, 0x1, 0xa0804, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x47ab850, 0x4f48000, 0xffff, 0xffff, 0x5e4a150, 0x28, 0x28, 0xb6d49a34, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x47ab850, 0x4f48000, 0xffff, 0xffff, 0x5e4a150, 0x28, 0x28, 0xb6d49a34, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x47ab850, 0x4f48000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x4ea5d80, 0x47ab850, 0x77359400, 0x0, 0x6f7ec, 0x5726e38, 0x5726e3c, 0x30, 0x4ea21c0, 0x411c20)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x47ab878, 0x47ab850, 0x77359400, 0x0, 0x5c606a0, 0x1807ff0, 0x47ab830, 0x5fb6000, 0x23, 0xffff)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x4ea5d80, 0x47ab850, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x4ea5d80, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4fe9140, 0x1537fd1, 0x3, 0x594e030, 0xf, 0x53f3480, 0x57380c0, 0x4fad700)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4576c80, 0x4fe9140, 0x4fe8b00, 0x4fe8c00, 0x180f040, 0x560ab80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 18270 [IO wait]:
internal/poll.runtime_pollWait(0xa4716a84, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5dc02e4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5dc02d0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5dc02d0, 0x485ab40, 0xb6d4936c, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5f0eb10, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x5f0eb10, 0x52263b8, 0x5dc02d0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x59be300, 0x1816180, 0x5f0eb10, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x59be300, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5a46580, 0x1537fb3, 0x3, 0x594f870, 0xf, 0x5f0eb00, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48ef7c0, 0x5a46580, 0x5a46500, 0x5a46540, 0x180f028, 0x5c9cd00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 18265 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).sessionStats(0x6054540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/session_ttl.go:134 +0xe8
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:476 +0xac4

goroutine 18206 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetListen(0x44eebb0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:301 +0xec
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:196 +0x35c

goroutine 17444 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x52a3500, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x5063b20, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x5063b20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 18108 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x50ce1e0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).raftApply(0x5592540, 0x584bb0d, 0x13c9c00, 0x54aebc0, 0x40, 0x14ad320, 0x154e001, 0x54aebc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:349 +0x120
github.com/hashicorp/consul/agent/consul.(*consulCADelegate).ApplyCARequest(0x4d3a188, 0x54aebc0, 0x1, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/consul_ca_delegate.go:19 +0x38
github.com/hashicorp/consul/agent/connect/ca.(*ConsulProvider).Configure(0x54aeb40, 0x4cc9e31, 0x24, 0x599dd01, 0x522cb00, 0x5595480, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/connect/ca/provider_consul.go:106 +0x648
github.com/hashicorp/consul/agent/consul.(*Server).initializeRootCA(0x5592540, 0x1826750, 0x54aeb40, 0x51b9a10, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:912 +0x48
github.com/hashicorp/consul/agent/consul.(*Server).initializeCA(0x5592540, 0x569d5c0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader_oss.go:24 +0x90
github.com/hashicorp/consul/agent/consul.(*Server).establishLeadership(0x5592540, 0x2, 0x2)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:268 +0xfc
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5592540, 0x5afd800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:170 +0x56c
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5672d20, 0x5592540, 0x5afd800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 11288 [select, 1 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x51a85a0, 0x1548878, 0xf, 0x1807978, 0x5683490, 0x1, 0x0, 0x1199a78, 0x52e3f00, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:362 +0x4e0
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x51a85a0, 0x181bb80, 0x538d6a0, 0x1548878, 0xf, 0x1807978, 0x5683490, 0x1539c2f, 0x5, 0x52e3e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 18065 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x577d080, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul.(*Server).raftApply(0x5a08380, 0x5953d0d, 0x14ad320, 0x56fd240, 0x0, 0x0, 0x7d278, 0x5e3cbb4)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:349 +0x120
github.com/hashicorp/consul/agent/consul.(*Server).initializeCAConfig(0x5a08380, 0x0, 0x0, 0x5953fbc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:903 +0xac
github.com/hashicorp/consul/agent/consul.(*Server).initializeCA(0x5a08380, 0x45ad400, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader_oss.go:13 +0x2c
github.com/hashicorp/consul/agent/consul.(*Server).establishLeadership(0x5a08380, 0x2, 0x2)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:268 +0xfc
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5a08380, 0x562c700)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:170 +0x56c
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x50be3a4, 0x5a08380, 0x562c700)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 15097 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7a40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x51f7a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x51f7a40, 0x55867c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18166 [IO wait]:
internal/poll.runtime_pollWait(0xa599bc8c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57eeab4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x57eeaa0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x57eeaa0, 0x450a000, 0xb6d49008, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x51190b0, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x51190b0, 0x58e2198, 0x57eeaa0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x5f6ac80, 0x1816180, 0x51190b0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x5f6ac80, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x4fe8dc0, 0x1537fb3, 0x3, 0x5f63a00, 0xf, 0x51190a0, 0x5c30100, 0x5ac59b0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x4576c80, 0x4fe8dc0, 0x4fe8b00, 0x4fe8c00, 0x180f028, 0x560ab40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 12191 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xe5e55259, 0x1a)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0x9, 0x1548869, 0xf, 0x50b2880, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x4dae640, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 18131 [IO wait]:
internal/poll.runtime_pollWait(0xa599c0ac, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57eeb04, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x57eeaf0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x57eeaf0, 0x1449b10, 0x450a000, 0xb6d49000)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x51190c0, 0x57, 0x18, 0x5be8ac0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x51190c0, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x51190c0, 0x20, 0x1449b10, 0x2f0001, 0x5be8ac0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x510c630, 0x1813fc0, 0x58e2200, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x4576c80, 0x582c700, 0x5777500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 18243 [select]:
github.com/hashicorp/serf/serf.(*serfQueries).stream(0x55840a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:88 +0x84
created by github.com/hashicorp/serf/serf.newSerfQueries
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/internal_query.go:81 +0x8c

goroutine 15106 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018140)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x5018140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x5018140, 0x5586a00)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 17175 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x211c2ad5, 0xc)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x4ec22d0, 0x48c4600, 0x8, 0x1548869, 0xf, 0x4ffda80, 0x18, 0x1807870, 0x48cee70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x50a8100, 0x48c4600, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x50424f8, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 15108 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018460)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x5018460)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x5018460, 0x5586a80)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 16373 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x78fe92dc, 0x13)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0xa, 0x1548869, 0xf, 0x46584c0, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x54053f0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 15109 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018500)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x5018500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x5018500, 0x5586ac0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 15101 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7cc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x51f7cc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x51f7cc0, 0x55868c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 15115 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x50188c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x50188c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x50188c0, 0x5586e40)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18250 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x4aa0370)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 16771 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x56c8040, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/raft.(*Raft).Stats(0x5af0200, 0x5af0200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/api.go:1015 +0x778
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateServerHealth(0x580bf80, 0x58f1e80, 0x5730e60, 0x5ab4340, 0x5317808, 0x7, 0x0, 0x181bbc0, 0x57f2480)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:486 +0x100
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x580bf80, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:431 +0x7a0
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x580bf80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 18274 [IO wait]:
internal/poll.runtime_pollWait(0xa599c448, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5eec654, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5eec640, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5eec640, 0x1449b10, 0x485af00, 0xb6d49600)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x6235ac0, 0xbf, 0x18, 0x5a3f7c0)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x6235ac0, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x6235ac0, 0x20, 0x1449b10, 0x2f0001, 0x5a3f7c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x5a62360, 0x1813fc0, 0x4d3a3a8, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x48ef7c0, 0x58aa180, 0x5c489a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 15090 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7540)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x51f7540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x51f7540, 0x55865c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 17101 [semacquire]:
sync.runtime_Semacquire(0x465d2bc)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x465d2bc)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x465d260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x5013880, 0x158f311, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:301 +0xa4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5013880, 0x4cc3900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:174 +0x5dc
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x4fc1ea0, 0x5013880, 0x4cc3900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 15076 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xc6292d74, 0x1b)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x4ec22d0, 0x48c4600, 0x9, 0x1548869, 0xf, 0x5094660, 0x18, 0x1807870, 0x48cee70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x50a8100, 0x48c4600, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x571a4d8, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 18096 [select]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x48eef00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 18254 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4ce4ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 9198 [chan receive, 2 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x55f9340, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x553b490, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x553b490)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 18227 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x5e3cb60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 12079 [chan receive, 1 minutes]:
github.com/hashicorp/raft.(*deferError).Error(0x53c6940, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x47b59d0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x47b59d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 18271 [IO wait]:
internal/poll.runtime_pollWait(0xa599b8f0, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57ee654, 0x72, 0xff00, 0xffff, 0x5e4a5d0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x57ee640, 0x5214000, 0xffff, 0xffff, 0x5e4a5d0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x57ee640, 0x5214000, 0xffff, 0xffff, 0x5e4a5d0, 0x28, 0x28, 0x0, 0x1, 0xa0804, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x58e2a38, 0x5214000, 0xffff, 0xffff, 0x5e4a5d0, 0x28, 0x28, 0xb6d49a34, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x58e2a38, 0x5214000, 0xffff, 0xffff, 0x5e4a5d0, 0x28, 0x28, 0xb6d49a34, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x58e2a38, 0x5214000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x5624300, 0x58e2a38, 0x77359400, 0x0, 0x5aba001, 0x411a78, 0x8, 0x14b98a8, 0x180f001, 0x58e2a60)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x58e2a60, 0x58e2a38, 0x77359400, 0x0, 0x40000000, 0x0, 0x0, 0x4b32e58, 0x3d4210, 0x5aba044)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x5624300, 0x58e2a38, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x5624300, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5a465c0, 0x1537fd1, 0x3, 0x50bff40, 0xf, 0x5e451c0, 0x7229c, 0x4b32fb4)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48ef7c0, 0x5a465c0, 0x5a46500, 0x5a46540, 0x180f040, 0x5c9cd40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 15091 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7680)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x51f7680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x51f7680, 0x5586600)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 14656 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x2584f5d6, 0x1a)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x4ec22d0, 0x48c4600, 0x9, 0x1548869, 0xf, 0x50945a0, 0x18, 0x1807870, 0x48cee70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x50a8100, 0x48c4600, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x5718cb0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 18267 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x48ef7c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 17466 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x670b70e2, 0xe)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0xa, 0x1548869, 0xf, 0x46586a0, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x5754f38, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 15118 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018aa0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x5018aa0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x5018aa0, 0x5586f80)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18268 [select]:
github.com/hashicorp/consul/agent.(*Agent).handleEvents(0x48ef7c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event.go:111 +0x80
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:481 +0x7d8

goroutine 18100 [IO wait]:
internal/poll.runtime_pollWait(0xa4716b08, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5dc0794, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5dc0780, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5dc0780, 0x1449b10, 0x450a000, 0xb6d49600)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5df8680, 0xc1, 0x18, 0x5521800)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5df8680, 0x0, 0xffffffff, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/consul/agent.tcpKeepAliveListener.Accept(0x5df8680, 0x20, 0x1449b10, 0x2f0001, 0x5521800)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:721 +0x1c
net/http.(*Server).Serve(0x5f29b00, 0x1813fc0, 0x4d3aa80, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/http/server.go:2896 +0x220
github.com/hashicorp/consul/agent.(*Agent).serveHTTP.func1(0x48eef00, 0x5afc580, 0x5cd8bc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:760 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).serveHTTP
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:757 +0x78

goroutine 18203 [IO wait]:
internal/poll.runtime_pollWait(0xa4716d9c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5dc0104, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5dc00f0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5dc00f0, 0x4592fb4, 0x4592f94, 0x2)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x5f0e360, 0x10001, 0x20000, 0x4c53fc)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).AcceptTCP(0x5f0e360, 0x4fc1ea0, 0x1, 0x10000)
	/usr/lib/go-1.13/src/net/tcpsock.go:248 +0x3c
github.com/hashicorp/memberlist.(*NetTransport).tcpListen(0x590c340, 0x5f0e360)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:235 +0x60
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:109 +0x6d0

goroutine 17467 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xa8dac60b, 0xf)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0x9, 0x1548869, 0xf, 0x58c4560, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x590f4d0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 18178 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x57ef720)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 18088 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x5129240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 15590 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xb3f1a3d5, 0x13)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x4ec22d0, 0x48c4600, 0x9, 0x1548869, 0xf, 0x5094780, 0x18, 0x1807870, 0x48cee70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x50a8100, 0x48c4600, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x50142b8, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 18156 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x4fe8a00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 15113 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018780)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x5018780)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x5018780, 0x5586dc0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 17897 [select]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x51b9b00, 0x5afe8c0, 0x7229c, 0x441d7ac, 0x441d770)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x5f6a580)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 18256 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4ea0, 0x153a3b9, 0x6, 0x5584140)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:412 +0x8b8

goroutine 18092 [IO wait]:
internal/poll.runtime_pollWait(0xa599c1b4, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5d81f04, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x5d81ef0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x5d81ef0, 0x10000, 0x10000, 0x4)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x56dc350, 0x0, 0x5938d50, 0x5938d50)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x56dc350, 0x1, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x5592540, 0x1816180, 0x56dc350)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 18244 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).teeStream(0x510d0e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:223 +0xc4
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:152 +0x428

goroutine 18261 [select]:
github.com/hashicorp/consul/agent/router.HandleSerfEvents(0x51efbf0, 0x542ad80, 0x1537ffe, 0x3, 0x58c3180, 0x542ad00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/serf_adapter.go:47 +0x80
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:441 +0xbf0

goroutine 18239 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4ce4b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 18238 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReap(0x4ce4b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1514 +0xd8
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:410 +0x868

goroutine 16087 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x59cf840, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/raft.(*Raft).Stats(0x54a4200, 0x54a4200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/api.go:1015 +0x778
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateServerHealth(0x57ecaf0, 0x5950e80, 0x57ee050, 0x5759860, 0x4adf988, 0x8, 0x0, 0x181bbc0, 0x47a6ac0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:486 +0x100
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x57ecaf0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:431 +0x7a0
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x57ecaf0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 17385 [semacquire]:
sync.runtime_Semacquire(0x5063b7c)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x5063b7c)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x5063b20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x5012a80, 0x158f311, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:301 +0xa4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x5012a80, 0x4749f80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:174 +0x5dc
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x5df77c0, 0x5012a80, 0x4749f80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 15105 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7f40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x51f7f40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x51f7f40, 0x55869c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 11290 [select, 1 minutes]:
github.com/hashicorp/consul/agent/cache.(*Cache).getWithIndex(0x51a85a0, 0x1548afd, 0xf, 0x1807990, 0x547a7d0, 0x1, 0x0, 0x1199b40, 0x5739140, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:362 +0x4e0
github.com/hashicorp/consul/agent/cache.(*Cache).notifyBlockingQuery(0x51a85a0, 0x181bb80, 0x538d6a0, 0x1548afd, 0xf, 0x1807990, 0x547a7d0, 0x1540ed6, 0xa, 0x52e3e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:89 +0x98
created by github.com/hashicorp/consul/agent/cache.(*Cache).Notify
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/watch.go:64 +0x104

goroutine 15092 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7720)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x51f7720)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x51f7720, 0x5586640)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18232 [select]:
github.com/hashicorp/raft.(*Raft).runFSM(0x548c200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/fsm.go:132 +0x140
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x548c200, 0x58e25e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 18041 [runnable]:
syscall.Syscall(0x76, 0x94, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fsync(0x94, 0x12b01, 0x12ecc)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:449 +0x30
internal/poll.(*FD).Fsync(0x5128640, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_fsync_posix.go:17 +0x88
os.(*File).Sync(0x4d3a1f0, 0x0, 0x0)
	/usr/lib/go-1.13/src/os/file_posix.go:113 +0x40
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5f28ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:311 +0x400
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 18260 [select]:
github.com/hashicorp/consul/agent/router.(*Manager).Start(0x5a46440)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/manager.go:471 +0x8c
created by github.com/hashicorp/consul/agent/router.(*Router).addServer
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/router/router.go:224 +0x284

goroutine 18202 [select]:
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x571c3f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:266 +0x114
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 15099 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7b80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x51f7b80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x51f7b80, 0x5586840)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18055 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x5d80230, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:263 +0x1e4
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x5d80230, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x5d80230, 0x153de26, 0x8, 0x4f6dfdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x5d80230)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 18252 [select]:
github.com/hashicorp/memberlist.(*Memberlist).pushPullTrigger(0x4aa0370, 0x58fa080)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:152 +0xe0
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:106 +0x2f8

goroutine 18237 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44eebb0, 0x1dcd6500, 0x0, 0x5a46040, 0x58c3300, 0x58e2610)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:128 +0xc8
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 18264 [IO wait]:
internal/poll.runtime_pollWait(0xa599bb00, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x6076fb4, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x6076fa0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x6076fa0, 0x55a0000, 0x7229c, 0x4f1b7ac)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x605a160, 0x10000, 0x4e62c0, 0x571d170)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x605a160, 0x1, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/hashicorp/consul/agent/consul.(*Server).listen(0x6054540, 0x1816180, 0x605a160)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:52 +0x24
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:468 +0x9bc

goroutine 18116 [runnable]:
syscall.Syscall(0x76, 0x9e, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/syscall/asm_linux_arm.s:14 +0x8
syscall.Fsync(0x9e, 0x12b01, 0x12ecc)
	/usr/lib/go-1.13/src/syscall/zsyscall_linux_arm.go:449 +0x30
internal/poll.(*FD).Fsync(0x4b80380, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_fsync_posix.go:17 +0x88
os.(*File).Sync(0x57cbd70, 0x0, 0x0)
	/usr/lib/go-1.13/src/os/file_posix.go:113 +0x40
github.com/hashicorp/serf/serf.(*Snapshotter).stream(0x5263b90)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:311 +0x400
created by github.com/hashicorp/serf/serf.NewSnapshotter
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/snapshot.go:153 +0x444

goroutine 18224 [select]:
github.com/hashicorp/consul/agent/consul.(*Coordinate).batchUpdate(0x6234340)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:44 +0x98
created by github.com/hashicorp/consul/agent/consul.NewCoordinate
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/coordinate_endpoint.go:36 +0x68

goroutine 18098 [IO wait]:
internal/poll.runtime_pollWait(0xa4716b8c, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x53f1144, 0x72, 0x0, 0x0, 0x153c0e6)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Accept(0x53f1130, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:384 +0x1a8
net.(*netFD).accept(0x53f1130, 0x4484000, 0xb6d4936c, 0x0)
	/usr/lib/go-1.13/src/net/fd_unix.go:238 +0x20
net.(*TCPListener).accept(0x4c84750, 0x411738, 0x8, 0x12f3898)
	/usr/lib/go-1.13/src/net/tcpsock_posix.go:139 +0x20
net.(*TCPListener).Accept(0x4c84750, 0x55b5428, 0x53f1130, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/tcpsock.go:261 +0x3c
github.com/miekg/dns.(*Server).serveTCP(0x453ad00, 0x1816180, 0x4c84750, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:462 +0x15c
github.com/miekg/dns.(*Server).ListenAndServe(0x453ad00, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:332 +0x1a8
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x5129380, 0x1537fb3, 0x3, 0x5672d50, 0xf, 0x4c84740, 0x5ae0220, 0x2)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48eef00, 0x5129380, 0x5129300, 0x5129340, 0x180f028, 0x522c040)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 18038 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x4a35000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x4a35000, 0x4d3a1d8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 15107 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018280)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x5018280)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x5018280, 0x5586a40)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 15098 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f7ae0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x51f7ae0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x51f7ae0, 0x5586800)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18189 [select]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x5df0960, 0x4946d00, 0x5934d70, 0x4b0160f, 0x3)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x607e580)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 15114 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018820)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x5018820)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x5018820, 0x5586e00)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18291 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).forward(0x6054540, 0x154b546, 0x11, 0x181c0c0, 0x5f7a7e0, 0x1480eb8, 0x5f7a7e0, 0x1199c08, 0x57bb2c0, 0x55df520, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:246 +0x30c
github.com/hashicorp/consul/agent/consul.(*Catalog).ListNodes(0x5405658, 0x5f7a7e0, 0x57bb2c0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/catalog_endpoint.go:218 +0x5c
reflect.Value.call(0x4598640, 0x5405688, 0x13, 0x1538345, 0x4, 0x5537d84, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598640, 0x5405688, 0x13, 0x4938584, 0x3, 0x3, 0x444f901, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5c9d8c0, 0x51efcb0, 0x594f8b8, 0x0, 0x57efea0, 0x55df500, 0x1480eb8, 0x5f7a7e0, 0x16, 0x1199c08, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x51efcb0, 0x181be20, 0x55df4e0, 0x48ef898, 0xffffffff)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x6054540, 0x154b546, 0x11, 0x1480eb8, 0x5f7a770, 0x1199c08, 0x57bb260, 0x10aa344, 0x70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent.(*Agent).RPC(0x48ef7c0, 0x154b546, 0x11, 0x1480eb8, 0x5f7a770, 0x1199c08, 0x57bb260, 0x5dc0b0c, 0x1d3dc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1412 +0xa8
github.com/hashicorp/consul/agent.(*TestAgent).Start.func2(0x5f0eb30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/testagent.go:207 +0x288
github.com/hashicorp/consul/testutil/retry.run.func2(0x594f8d0, 0x5f0eb20, 0x5f0eb30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/testutil/retry/retry.go:130 +0x50
created by github.com/hashicorp/consul/testutil/retry.run
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/testutil/retry/retry.go:128 +0xfc

goroutine 17953 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x59d7200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x59d7200, 0x57cbd58)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 16707 [semacquire]:
sync.runtime_Semacquire(0x580bfdc)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x580bfdc)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x580bf80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x575ae00, 0x158f311, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:301 +0xa4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x575ae00, 0x59e3e40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:174 +0x5dc
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x50eec50, 0x575ae00, 0x59e3e40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 15122 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018d20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x5018d20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x5018d20, 0x5587080)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 14108 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0xd81e90a2, 0x15)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0x9, 0x1548869, 0xf, 0x44d24e0, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x4cde078, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 15119 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018b40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func2(0x5018b40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4448 +0x20
testing.tRunner(0x5018b40, 0x5586fc0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 11388 [select, 1 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x181bb80, 0x51c5560, 0x5905a14, 0x20, 0x20, 0x5829c70, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x51c54c0, 0x181bb80, 0x51c5560, 0x449d858, 0x181bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x51c54c0, 0x4fcaf80, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x5013dc0, 0x465cfec, 0x4fcaf5c, 0x5905b78, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:430 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x51a90e0, 0x465cfc0, 0x4fcaf40, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4598980, 0x585ff88, 0x13, 0x1538345, 0x4, 0x5905d54, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598980, 0x585ff88, 0x13, 0x531c554, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5a04180, 0x58b7470, 0x5a2a278, 0x0, 0x51a91d0, 0x538d760, 0x1480eb8, 0x465cfc0, 0x16, 0x1199a78, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x58b7470, 0x181be20, 0x51c5400, 0x5046498, 0xffffffff)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x5013dc0, 0x1547bdf, 0xf, 0x1480eb8, 0x5683490, 0x1199a78, 0x4fcae80, 0x7c0510, 0x40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent.(*Agent).RPC(0x50463c0, 0x1547bdf, 0xf, 0x1480eb8, 0x5683490, 0x1199a78, 0x4fcae80, 0x4887b64, 0x485ab40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1412 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x4cd5970, 0x1, 0x0, 0xb2c97000, 0x8b, 0x51c53e0, 0x1807978, 0x5683490, 0x0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6c8, 0x4cd5970, 0x54f8200, 0x1199a78, 0x52e3f00, 0x0, 0x0, 0x0, 0x0, 0x1, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:467 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 15096 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f79a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x51f79a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x51f79a0, 0x5586740)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18180 [select]:
github.com/hashicorp/consul/agent/consul.(*RaftLayer).Accept(0x51efce0, 0x4, 0x2f55, 0x5313c20, 0x1)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/raft_rpc.go:68 +0x84
github.com/hashicorp/raft.(*NetworkTransport).listen(0x607e000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:476 +0x3c
created by github.com/hashicorp/raft.NewNetworkTransportWithConfig
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/net_transport.go:167 +0x18c

goroutine 16052 [semacquire]:
sync.runtime_Semacquire(0x57ecb4c)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x57ecb4c)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x57ecaf0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x575bdc0, 0x15c28a4, 0x53afdd8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:301 +0xa4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop.func2(0x575bdc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:181 +0x1c
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x575bdc0, 0x57ff7c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:214 +0x470
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x59cab90, 0x575bdc0, 0x57ff7c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 18163 [select]:
github.com/hashicorp/consul/agent.(*Agent).reapServices(0x4576c80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1780 +0xa0
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:478 +0x7bc

goroutine 18134 [select]:
github.com/hashicorp/consul/agent/ae.(*StateSyncer).syncChangesEventFn(0x5eec0f0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:263 +0x1e4
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x5eec0f0, 0x1542794, 0xb, 0x1542794, 0xb)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:187 +0x174
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x5eec0f0, 0x153de26, 0x8, 0x5a7ffdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x5eec0f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 18223 [select]:
github.com/hashicorp/consul/agent/cache.(*Cache).runExpiryLoop(0x624a0f0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:683 +0x11c
created by github.com/hashicorp/consul/agent/cache.New
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:141 +0xf8

goroutine 15120 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018be0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x5018be0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x5018be0, 0x5587000)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 13728 [semacquire, 1 minutes]:
sync.runtime_Semacquire(0x5276d7c)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x5276d7c)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x5276d20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x48befc0, 0x158f311, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:301 +0xa4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x48befc0, 0x50a0940)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:174 +0x5dc
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x58e0c60, 0x48befc0, 0x50a0940)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 8450 [chan send, 2 minutes]:
testing.tRunner.func1(0x57f1360)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x57f1360, 0x15c2528)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8401 [chan send, 2 minutes]:
testing.tRunner.func1(0x57f12c0)
	/usr/lib/go-1.13/src/testing/testing.go:904 +0x230
testing.tRunner(0x57f12c0, 0x15c252c)
	/usr/lib/go-1.13/src/testing/testing.go:913 +0xb8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 12031 [select, 1 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x181bb80, 0x56399e0, 0x5a6ae58, 0x20, 0x20, 0x0, 0x550fae0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x5a6af94, 0x181bb80, 0x56399e0, 0x5460310, 0x181bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x5a6af94, 0x0, 0x58c0000)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).startACLUpgrade.func1(0x181bb80, 0x55114e0, 0x5933500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:615 +0x6fc
created by github.com/hashicorp/consul/agent/consul.(*Server).startACLUpgrade
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:594 +0xb0

goroutine 15288 [sleep]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x249a6fb6, 0x13)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x4ec22d0, 0x48c4600, 0x9, 0x1548869, 0xf, 0x4bb7820, 0x18, 0x1807870, 0x48cee70)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x50a8100, 0x48c4600, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x577f678, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 8425 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019f40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestStringHash(0x5019f40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/util_test.go:18 +0x1c
testing.tRunner(0x5019f40, 0x15c26b4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8424 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019ea0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUserEventToken(0x5019ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:186 +0x20
testing.tRunner(0x5019ea0, 0x15c26e8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8426 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x4ba0dc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSetFilePermissions(0x4ba0dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/util_test.go:28 +0x1c
testing.tRunner(0x4ba0dc0, 0x15c268c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18235 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x44eebb0, 0x2a05f200, 0x1, 0x5a46000, 0x58c3300, 0x58e2600)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:128 +0xc8
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:100 +0x390

goroutine 8423 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019e00)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestFireReceiveEvent(0x5019e00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:150 +0x1c
testing.tRunner(0x5019e00, 0x15c241c)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8422 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019d60)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestIngestUserEvent(0x5019d60)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:119 +0x20
testing.tRunner(0x5019d60, 0x15c24c4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 15093 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x51f77c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x51f77c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x51f77c0, 0x5586680)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18266 [chan receive]:
github.com/hashicorp/consul/agent/proxycfg.(*Manager).Run(0x51c7f80, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/proxycfg/manager.go:117 +0xe4
github.com/hashicorp/consul/agent.(*Agent).Start.func2(0x48ef7c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:471 +0x20
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:470 +0x7a0

goroutine 18269 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x48ef7c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 18255 [select]:
github.com/hashicorp/serf/serf.(*Serf).handleReconnect(0x4ce4ea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1532 +0xa0
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:411 +0x884

goroutine 18097 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x48eef00)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 18165 [select]:
github.com/hashicorp/consul/agent.(*Agent).sendCoordinate(0x4576c80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1695 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:485 +0xacc

goroutine 18204 [IO wait]:
internal/poll.runtime_pollWait(0xa599b9f8, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5dc0154, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x5dc0140, 0x5196000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x5dc0140, 0x5196000, 0x10000, 0x10000, 0x0, 0x2659501, 0x1, 0x0, 0x1)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x52260d8, 0x5196000, 0x10000, 0x10000, 0x4f70f34, 0x101, 0x4f70f08, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x52260d8, 0x5196000, 0x10000, 0x10000, 0x2, 0x1, 0x7ea900, 0x15c0000, 0x4f70f50)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x590c340, 0x52260d8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 8421 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019cc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestShouldProcessUserEvent(0x5019cc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:50 +0x20
testing.tRunner(0x5019cc0, 0x15c26a0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8419 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019b80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestSummarizeServices(0x5019b80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ui_endpoint_test.go:156 +0x1c
testing.tRunner(0x5019b80, 0x15c26b8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8420 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019c20)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestValidateUserEventParams(0x5019c20)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/user_event_test.go:13 +0x1c
testing.tRunner(0x5019c20, 0x15c26ec)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8417 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019a40)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUiNodes(0x5019a40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ui_endpoint_test.go:68 +0x20
testing.tRunner(0x5019a40, 0x15c26e4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8416 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x50199a0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUiIndex(0x50199a0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ui_endpoint_test.go:24 +0x20
testing.tRunner(0x50199a0, 0x15c26dc)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18253 [select]:
github.com/hashicorp/memberlist.(*Memberlist).triggerFunc(0x4aa0370, 0xbebc200, 0x0, 0x5a46400, 0x58fa080, 0x58e2978)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:128 +0xc8
created by github.com/hashicorp/memberlist.(*Memberlist).schedule
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/state.go:112 +0x210

goroutine 8418 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019ae0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestUiNodeInfo(0x5019ae0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ui_endpoint_test.go:106 +0x20
testing.tRunner(0x5019ae0, 0x15c26e0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8414 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019860)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_Bad_Size_Ops(0x5019860)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:104 +0x20
testing.tRunner(0x5019860, 0x15c26c8)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8413 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x50197c0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_Bad_Size_Net(0x50197c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:63 +0x20
testing.tRunner(0x50197c0, 0x15c26c4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8415 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019900)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_KV_Actions(0x5019900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:131 +0x1c
testing.tRunner(0x5019900, 0x15c26d4)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 10156 [select, 1 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x181bb80, 0x5557920, 0x5a7d980, 0x20, 0x20, 0x13, 0x48ded88)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x55576c0, 0x181bb80, 0x5557920, 0x48dee78, 0x181bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x55576c0, 0x5a475c0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x4ab7180, 0x4b8d138, 0x5938f9c, 0x5a7db5c, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:430 +0x1e0
github.com/hashicorp/consul/agent/consul.(*Health).ServiceNodes(0x551fd08, 0x4b8d0e0, 0x5938f90, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/health_endpoint.go:141 +0x114
reflect.Value.call(0x4598b80, 0x551fd48, 0x13, 0x1538345, 0x4, 0x5a7dd54, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598b80, 0x551fd48, 0x13, 0x529ad54, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x50c8880, 0x4887e90, 0x57a3b30, 0x0, 0x4c6b0e0, 0x584b240, 0x1480f60, 0x4b8d0e0, 0x16, 0x1199aa0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x4887e90, 0x181be20, 0x5557660, 0x48ef258, 0xffffffff)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x4ab7180, 0x154f56e, 0x13, 0x1480f60, 0x495d4d0, 0x1199aa0, 0x5938f60, 0x7c0708, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent.(*Agent).RPC(0x48ef180, 0x154f56e, 0x13, 0x1480f60, 0x495d4d0, 0x1199aa0, 0x5938f60, 0x5938f34, 0x450a3c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1412 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*HealthServices).Fetch(0x5588cd8, 0xc, 0x0, 0xb2c97000, 0x8b, 0x5557640, 0x18079d8, 0x495d4d0, 0x0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache-types/health_services.go:41 +0x144
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6e0, 0x5588cd8, 0x58c4140, 0x1199aa0, 0x4439b90, 0x0, 0x0, 0x0, 0x0, 0xc, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:467 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 15123 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018dc0)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func3(0x5018dc0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4456 +0x20
testing.tRunner(0x5018dc0, 0x55870c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 11734 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x821425d2, 0x1a)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0x9, 0x1548869, 0xf, 0x46582c0, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x4cd5200, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 8412 [chan receive, 2 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5019720)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestTxnEndpoint_Bad_Size_Item(0x5019720)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/txn_endpoint_test.go:37 +0x20
testing.tRunner(0x5019720, 0x15c26c0)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 8451 [chan receive, 2 minutes]:
testing.runTests.func1.1(0x466a140)
	/usr/lib/go-1.13/src/testing/testing.go:1207 +0x28
created by testing.runTests.func1
	/usr/lib/go-1.13/src/testing/testing.go:1207 +0x98

goroutine 14144 [sleep, 1 minutes]:
runtime.goparkunlock(...)
	/usr/lib/go-1.13/src/runtime/proc.go:310
time.Sleep(0x330a1542, 0x19)
	/usr/lib/go-1.13/src/runtime/time.go:105 +0x158
github.com/hashicorp/consul/agent/cache.(*Cache).refresh(0x44caaf0, 0x443f760, 0x9, 0x1548869, 0xf, 0x50b2700, 0x18, 0x1807870, 0x48cf260)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:648 +0x12c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6b0, 0x4bf1d40, 0x443f760, 0x0, 0x0, 0x0, 0x0, 0x1807528, 0x58e26d0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:588 +0x49c
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 18257 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4ea0, 0x1539045, 0x5, 0x5584160)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 10157 [select, 1 minutes]:
github.com/hashicorp/go-memdb.WatchSet.Watch.func1(0x5a475c0, 0x48dee78, 0x181bb80, 0x5557920)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:57 +0x80
created by github.com/hashicorp/go-memdb.WatchSet.Watch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:56 +0xa8

goroutine 18205 [select]:
github.com/hashicorp/memberlist.(*Memberlist).streamListen(0x44eebb0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:198 +0xd0
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:195 +0x340

goroutine 16520 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x4fa9880, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x5988cb0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x5988cb0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 18233 [select]:
github.com/hashicorp/raft.(*Raft).runSnapshots(0x548c200)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/snapshot.go:71 +0xa8
github.com/hashicorp/raft.(*raftState).goFunc.func1(0x548c200, 0x58e25e8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:146 +0x50
created by github.com/hashicorp/raft.(*raftState).goFunc
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/state.go:144 +0x50

goroutine 18247 [IO wait]:
internal/poll.runtime_pollWait(0xa599bd10, 0x72, 0x0)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x57ee514, 0x72, 0x10000, 0x10000, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadFrom(0x57ee500, 0x51c8000, 0x10000, 0x10000, 0x0, 0x0, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:219 +0x164
net.(*netFD).readFrom(0x57ee500, 0x51c8000, 0x10000, 0x10000, 0x54800, 0x4f0e401, 0x1, 0x0, 0xf80ed8)
	/usr/lib/go-1.13/src/net/fd_unix.go:208 +0x38
net.(*UDPConn).readFrom(0x58e2670, 0x51c8000, 0x10000, 0x10000, 0x4968734, 0x101, 0x4968708, 0x4c5640)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:47 +0x38
net.(*UDPConn).ReadFrom(0x58e2670, 0x51c8000, 0x10000, 0x10000, 0x5809d00, 0x5809d00, 0x4f0e480, 0x2659518, 0x0)
	/usr/lib/go-1.13/src/net/udpsock.go:121 +0x40
github.com/hashicorp/memberlist.(*NetTransport).udpListen(0x5a46240, 0x58e2670)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:270 +0x84
created by github.com/hashicorp/memberlist.NewNetTransport
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net_transport.go:110 +0x664

goroutine 15847 [chan receive]:
github.com/hashicorp/raft.(*deferError).Error(0x55bae40, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/raft/future.go:106 +0x54
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).updateClusterHealth(0x580a7e0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:390 +0x180
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x580a7e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:344 +0x110
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 15121 [chan receive, 1 minutes]:
testing.(*testContext).waitParallel(0x47c6760)
	/usr/lib/go-1.13/src/testing/testing.go:1008 +0x84
testing.(*T).Parallel(0x5018c80)
	/usr/lib/go-1.13/src/testing/testing.go:815 +0x19c
github.com/hashicorp/consul/agent.TestDNS_ServiceLookup_AnswerLimits.func1(0x5018c80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns_test.go:4440 +0x20
testing.tRunner(0x5018c80, 0x5587040)
	/usr/lib/go-1.13/src/testing/testing.go:909 +0xa8
created by testing.(*T).Run
	/usr/lib/go-1.13/src/testing/testing.go:960 +0x2ac

goroutine 18099 [IO wait]:
internal/poll.runtime_pollWait(0xa599c4cc, 0x72, 0x28)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5dc0744, 0x72, 0xff00, 0xffff, 0x451c0c0)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).ReadMsg(0x5dc0730, 0x4eb0000, 0xffff, 0xffff, 0x451c0c0, 0x28, 0x28, 0x0, 0x0, 0x0, ...)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:243 +0x198
net.(*netFD).readMsg(0x5dc0730, 0x4eb0000, 0xffff, 0xffff, 0x451c0c0, 0x28, 0x28, 0x4, 0x1, 0x1d2c0, ...)
	/usr/lib/go-1.13/src/net/fd_unix.go:214 +0x50
net.(*UDPConn).readMsg(0x4d3aa28, 0x4eb0000, 0xffff, 0xffff, 0x451c0c0, 0x28, 0x28, 0xb6d49008, 0x0, 0x5a254, ...)
	/usr/lib/go-1.13/src/net/udpsock_posix.go:59 +0x50
net.(*UDPConn).ReadMsgUDP(0x4d3aa28, 0x4eb0000, 0xffff, 0xffff, 0x451c0c0, 0x28, 0x28, 0xb6d49008, 0x0, 0x72, ...)
	/usr/lib/go-1.13/src/net/udpsock.go:142 +0x58
github.com/miekg/dns.ReadFromSessionUDP(0x4d3aa28, 0x4eb0000, 0xffff, 0xffff, 0x46, 0x2656cc8, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/udp.go:26 +0x64
github.com/miekg/dns.(*Server).readUDP(0x5625a00, 0x4d3aa28, 0x77359400, 0x0, 0x6f7ec, 0x59e8e38, 0x59e8e3c, 0x30, 0x5039ea0, 0x411c20)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:637 +0xb0
github.com/miekg/dns.(*defaultReader).ReadUDP(0x4d3aa50, 0x4d3aa28, 0x77359400, 0x0, 0x5b30a60, 0x1807ff0, 0x4d3aa08, 0x5f86000, 0x23, 0xffff)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:254 +0x38
github.com/miekg/dns.(*Server).serveUDP(0x5625a00, 0x4d3aa28, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:507 +0x120
github.com/miekg/dns.(*Server).ListenAndServe(0x5625a00, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/miekg/dns/server.go:368 +0x308
github.com/hashicorp/consul/agent.(*DNSServer).ListenAndServe(0x51293c0, 0x1537fd1, 0x3, 0x513f8a0, 0xf, 0x5df8640, 0x5bdee20, 0x2)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/dns.go:190 +0x1e8
github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS.func1(0x48eef00, 0x51293c0, 0x5129300, 0x5129340, 0x180f040, 0x522c100)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:588 +0x10c
created by github.com/hashicorp/consul/agent.(*Agent).listenAndServeDNS
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:586 +0xd4

goroutine 18207 [select]:
github.com/hashicorp/memberlist.(*Memberlist).packetHandler(0x44eebb0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/net.go:420 +0x230
created by github.com/hashicorp/memberlist.newMemberlist
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/memberlist/memberlist.go:197 +0x378

goroutine 8075 [select, 2 minutes]:
github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning.func1(0x45a1500)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1064 +0xbc
created by github.com/hashicorp/consul/agent/consul.(*Server).startCARootPruning
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:1059 +0xac

goroutine 8134 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).run(0x482bab0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:108 +0xfc
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:80 +0xec

goroutine 8135 [select]:
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).serverHealthLoop(0x482bab0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:340 +0xf0
created by github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Start
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:81 +0x108

goroutine 8076 [select, 2 minutes]:
github.com/hashicorp/go-memdb.watchFew(0x181bb80, 0x4b1d240, 0x4f81a14, 0x20, 0x20, 0x52267a0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch_few.go:16 +0x490
github.com/hashicorp/go-memdb.WatchSet.WatchCtx(0x4b1d160, 0x181bb80, 0x4b1d240, 0x52267a8, 0x181bb80)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:82 +0xc4
github.com/hashicorp/go-memdb.WatchSet.Watch(0x4b1d160, 0x5129880, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-memdb/watch.go:64 +0xc4
github.com/hashicorp/consul/agent/consul.(*Server).blockingQuery(0x45a1500, 0x4800f0c, 0x512985c, 0x4f81b78, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:430 +0x1e0
github.com/hashicorp/consul/agent/consul.(*ConnectCA).Roots(0x4cbc640, 0x4800ee0, 0x5129840, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/connect_ca_endpoint.go:345 +0x110
reflect.Value.call(0x4598980, 0x4fab050, 0x13, 0x1538345, 0x4, 0x4f81d54, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598980, 0x4fab050, 0x13, 0x53bfd54, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x506f8e0, 0x4d03350, 0x5675180, 0x0, 0x492a550, 0x4716480, 0x1480eb8, 0x4800ee0, 0x16, 0x1199a78, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x4d03350, 0x181be20, 0x4b1d080, 0x48ee218, 0xffffffff)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x45a1500, 0x1547bdf, 0xf, 0x1480eb8, 0x5106000, 0x1199a78, 0x5129800, 0x7c0510, 0x40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent.(*Agent).RPC(0x48ee140, 0x1547bdf, 0xf, 0x1480eb8, 0x5106000, 0x1199a78, 0x5129800, 0x51b9564, 0x445b680)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1412 +0xa8
github.com/hashicorp/consul/agent/cache-types.(*ConnectCARoot).Fetch(0x47870b8, 0x9, 0x0, 0xb2c97000, 0x8b, 0x4b1d060, 0x1807978, 0x5106000, 0x0, 0x0, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache-types/connect_ca_root.go:36 +0x13c
github.com/hashicorp/consul/agent/cache.(*Cache).fetch.func1(0x180e6c8, 0x47870b8, 0x48c45e0, 0x1199a78, 0x4da7d40, 0x0, 0x0, 0x0, 0x0, 0x9, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:467 +0x138
created by github.com/hashicorp/consul/agent/cache.(*Cache).fetch
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/cache/cache.go:430 +0x298

goroutine 18241 [select]:
github.com/hashicorp/serf/serf.(*Serf).checkQueueDepth(0x4ce4b40, 0x1539045, 0x5, 0x5bccea0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:1630 +0x94
created by github.com/hashicorp/serf/serf.Create
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/serf/serf/serf.go:413 +0x8ec

goroutine 8080 [select]:
github.com/hashicorp/consul/agent/pool.(*ConnPool).reap(0x4ec2320)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:448 +0x2b4
created by github.com/hashicorp/consul/agent/pool.(*ConnPool).init
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/pool/pool.go:169 +0xd8

goroutine 8242 [IO wait]:
internal/poll.runtime_pollWait(0xa598eb84, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x4edef14, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x4edef00, 0x5098000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x4edef00, 0x5098000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x49a8728, 0x5098000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x587bd40, 0x56c4900, 0xc, 0xc, 0x5062bd0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x587bd40, 0x56c4900, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x5062af0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x5062af0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 8243 [select]:
github.com/hashicorp/yamux.(*Session).send(0x5062af0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 8244 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x5062af0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 8236 [IO wait]:
internal/poll.runtime_pollWait(0xa598eb00, 0x72, 0xffffffff)
	/usr/lib/go-1.13/src/runtime/netpoll.go:184 +0x44
internal/poll.(*pollDesc).wait(0x5876744, 0x72, 0x1000, 0x1000, 0xffffffff)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:87 +0x30
internal/poll.(*pollDesc).waitRead(...)
	/usr/lib/go-1.13/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0x5876730, 0x4dc6000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/internal/poll/fd_unix.go:169 +0x178
net.(*netFD).Read(0x5876730, 0x4dc6000, 0x1000, 0x1000, 0x7ce01, 0x0, 0x3a0cdc)
	/usr/lib/go-1.13/src/net/fd_unix.go:202 +0x38
net.(*conn).Read(0x4cdf9a8, 0x4dc6000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/usr/lib/go-1.13/src/net/net.go:184 +0x58
bufio.(*Reader).Read(0x59c5080, 0x578b550, 0xc, 0xc, 0x4fe08c0, 0x0, 0x0)
	/usr/lib/go-1.13/src/bufio/bufio.go:226 +0x248
io.ReadAtLeast(0x1807270, 0x59c5080, 0x578b550, 0xc, 0xc, 0xc, 0xc, 0x0, 0x0)
	/usr/lib/go-1.13/src/io/io.go:310 +0x6c
io.ReadFull(...)
	/usr/lib/go-1.13/src/io/io.go:329
github.com/hashicorp/yamux.(*Session).recvLoop(0x4fe0850, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:458 +0xa4
github.com/hashicorp/yamux.(*Session).recv(0x4fe0850)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:437 +0x1c
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:113 +0x1f8

goroutine 8237 [select]:
github.com/hashicorp/yamux.(*Session).send(0x4fe0850)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:399 +0x1c0
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:114 +0x214

goroutine 8238 [select]:
github.com/hashicorp/yamux.(*Session).keepalive(0x4fe0850)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:313 +0x90
created by github.com/hashicorp/yamux.newSession
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/session.go:116 +0x24c

goroutine 8239 [select]:
github.com/hashicorp/yamux.(*Stream).Read(0x4fe08c0, 0x5002000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/yamux/stream.go:133 +0x24c
bufio.(*Reader).fill(0x59c5140)
	/usr/lib/go-1.13/src/bufio/bufio.go:100 +0x108
bufio.(*Reader).ReadByte(0x59c5140, 0x55d7660, 0x0, 0x4fe08c4)
	/usr/lib/go-1.13/src/bufio/bufio.go:252 +0x28
github.com/hashicorp/go-msgpack/codec.(*ioDecReader).readn1(0x5009ba0, 0x476d240)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:90 +0x30
github.com/hashicorp/go-msgpack/codec.(*msgpackDecDriver).initReadNext(0x4c85900)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/msgpack.go:540 +0x38
github.com/hashicorp/go-msgpack/codec.(*Decoder).decode(0x59c5170, 0x11b15b0, 0x4de3940)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:635 +0x2c
github.com/hashicorp/go-msgpack/codec.(*Decoder).Decode(0x59c5170, 0x11b15b0, 0x4de3940, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/go-msgpack/codec/decode.go:630 +0x68
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).read(0x59c5110, 0x11b15b0, 0x4de3940, 0x4de3940, 0x4d03368)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:121 +0x44
github.com/hashicorp/net-rpc-msgpackrpc.(*MsgpackCodec).ReadRequestHeader(0x59c5110, 0x4de3940, 0x44a2474, 0x346944)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/net-rpc-msgpackrpc/codec.go:60 +0x2c
net/rpc.(*Server).readRequestHeader(0x4d03350, 0x181cc00, 0x59c5110, 0x487bf90, 0x44a2400, 0x181cc01, 0x0, 0x0, 0x35)
	/usr/lib/go-1.13/src/net/rpc/server.go:583 +0x38
net/rpc.(*Server).readRequest(0x4d03350, 0x181cc00, 0x59c5110, 0x5df9a90, 0x44ca5a0, 0x76dc8, 0x346c6c, 0x2653bb4, 0x44a23f0, 0x5df9a90, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:543 +0x2c
net/rpc.(*Server).ServeRequest(0x4d03350, 0x181cc00, 0x59c5110, 0x3f800000, 0x0)
	/usr/lib/go-1.13/src/net/rpc/server.go:486 +0x40
github.com/hashicorp/consul/agent/consul.(*Server).handleConsulConn(0x45a1500, 0x1824300, 0x4fe08c0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:155 +0x120
created by github.com/hashicorp/consul/agent/consul.(*Server).handleMultiplexV2
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:140 +0x14c

goroutine 18290 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).forward(0x6054540, 0x1551208, 0x14, 0x181c1c0, 0x508a190, 0x143b0b8, 0x508a190, 0x1199be0, 0x5b312e0, 0x5b31300, ...)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/rpc.go:246 +0x30c
github.com/hashicorp/consul/agent/consul.(*Catalog).NodeServices(0x5405658, 0x508a190, 0x5b312e0, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/catalog_endpoint.go:394 +0x5c
reflect.Value.call(0x4598700, 0x54056a8, 0x13, 0x1538345, 0x4, 0x487eae0, 0x3, 0x3, 0x7ce01, 0x0, ...)
	/usr/lib/go-1.13/src/reflect/value.go:460 +0x49c
reflect.Value.Call(0x4598700, 0x54056a8, 0x13, 0x487eae0, 0x3, 0x3, 0x1, 0x0, 0x0)
	/usr/lib/go-1.13/src/reflect/value.go:321 +0x78
net/rpc.(*service).call(0x5c9d8c0, 0x51efcb0, 0x5af6390, 0x0, 0x57eff40, 0x5b312a0, 0x143b0b8, 0x508a190, 0x16, 0x1199be0, ...)
	/usr/lib/go-1.13/src/net/rpc/server.go:377 +0xd8
net/rpc.(*Server).ServeRequest(0x51efcb0, 0x181be20, 0x5b31240, 0x1343c98, 0x1)
	/usr/lib/go-1.13/src/net/rpc/server.go:498 +0x174
github.com/hashicorp/consul/agent/consul.(*Server).RPC(0x6054540, 0x1551208, 0x14, 0x143b0b8, 0x508a140, 0x1199be0, 0x5b31200, 0x3, 0x530f456)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:1012 +0xa0
github.com/hashicorp/consul/agent/local.(*State).updateSyncState(0x5cc6120, 0x0, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/local/state.go:1040 +0x104
github.com/hashicorp/consul/agent/local.(*State).SyncFull(0x5cc6120, 0x153de00, 0x8)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/local/state.go:1194 +0x1c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).nextFSMState(0x57ef6d0, 0x153de26, 0x8, 0x1, 0x0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:167 +0x3c4
github.com/hashicorp/consul/agent/ae.(*StateSyncer).runFSM(0x57ef6d0, 0x153de26, 0x8, 0x487efdc)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:153 +0x2c
github.com/hashicorp/consul/agent/ae.(*StateSyncer).Run(0x57ef6d0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/ae/ae.go:147 +0x5c
created by github.com/hashicorp/consul/agent.(*Agent).StartSync
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/agent.go:1626 +0x30

goroutine 15786 [semacquire]:
sync.runtime_Semacquire(0x580a83c)
	/usr/lib/go-1.13/src/runtime/sema.go:56 +0x34
sync.(*WaitGroup).Wait(0x580a83c)
	/usr/lib/go-1.13/src/sync/waitgroup.go:130 +0x84
github.com/hashicorp/consul/agent/consul/autopilot.(*Autopilot).Stop(0x580a7e0)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/autopilot/autopilot.go:95 +0x80
github.com/hashicorp/consul/agent/consul.(*Server).revokeLeadership(0x575a1c0, 0x158f311, 0x30)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:301 +0xa4
github.com/hashicorp/consul/agent/consul.(*Server).leaderLoop(0x575a1c0, 0x5afcc40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:174 +0x5dc
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership.func1(0x57a3140, 0x575a1c0, 0x5afcc40)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:76 +0x50
created by github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:74 +0x1b8

goroutine 18263 [select]:
github.com/hashicorp/consul/agent/consul.(*Server).monitorLeadership(0x6054540)
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/leader.go:63 +0xf0
created by github.com/hashicorp/consul/agent/consul.NewServerLogger
	/<<PKGBUILDDIR>>/_build/src/github.com/hashicorp/consul/agent/consul/server.go:465 +0x990
FAIL	github.com/hashicorp/consul/agent	300.568s
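
The goroutine dump above, ending in this FAIL line at roughly the five-minute mark, is consistent with the agent package hitting the go test timeout while parallel subtests were still queued: goroutine 15121, for example, is parked in testing.(*testContext).waitParallel, which is where a subtest that has called t.Parallel() waits for a slot under the -parallel limit (GOMAXPROCS by default). A minimal sketch of that pattern follows; the test name is hypothetical and not taken from the consul sources.

package sketch

import "testing"

// TestParallelSketch mirrors the table-driven, parallel-subtest shape of
// TestDNS_ServiceLookup_AnswerLimits above: each subtest calls t.Parallel()
// and then blocks in testing.(*testContext).waitParallel until a slot under
// the -parallel limit frees up.
func TestParallelSketch(t *testing.T) {
	for _, name := range []string{"a", "b", "c"} {
		name := name // capture the loop variable for the parallel closure
		t.Run(name, func(t *testing.T) {
			t.Parallel() // queue this subtest; it may wait here for a free slot
			_ = name     // ... test body would go here ...
		})
	}
}

To rerun only the blocked test with more headroom, the usual invocation would be along the lines of "go test -run TestDNS_ServiceLookup_AnswerLimits -timeout 20m ./agent" from the source tree; -run and -timeout are standard go test flags, and the timeout actually used by this build is not shown in this part of the log.
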
=== RUN   TestAE_scaleFactor
=== PAUSE TestAE_scaleFactor
=== RUN   TestAE_Pause_nestedPauseResume
=== PAUSE TestAE_Pause_nestedPauseResume
=== RUN   TestAE_Pause_ResumeTriggersSyncChanges
--- PASS: TestAE_Pause_ResumeTriggersSyncChanges (0.00s)
=== RUN   TestAE_staggerDependsOnClusterSize
--- PASS: TestAE_staggerDependsOnClusterSize (0.00s)
=== RUN   TestAE_Run_SyncFullBeforeChanges
--- PASS: TestAE_Run_SyncFullBeforeChanges (0.00s)
=== RUN   TestAE_Run_Quit
=== RUN   TestAE_Run_Quit/Run_panics_without_ClusterSize
=== RUN   TestAE_Run_Quit/runFSM_quits
--- PASS: TestAE_Run_Quit (0.00s)
    --- PASS: TestAE_Run_Quit/Run_panics_without_ClusterSize (0.00s)
    --- PASS: TestAE_Run_Quit/runFSM_quits (0.00s)
=== RUN   TestAE_FSM
=== RUN   TestAE_FSM/fullSyncState
=== RUN   TestAE_FSM/fullSyncState/Paused_->_retryFullSyncState
=== RUN   TestAE_FSM/fullSyncState/SyncFull()_error_->_retryFullSyncState
[ERR] agent: failed to sync remote state: boom
=== RUN   TestAE_FSM/fullSyncState/SyncFull()_OK_->_partialSyncState
=== RUN   TestAE_FSM/retryFullSyncState
=== RUN   TestAE_FSM/retryFullSyncState/shutdownEvent_->_doneState
=== RUN   TestAE_FSM/retryFullSyncState/syncFullNotifEvent_->_fullSyncState
=== RUN   TestAE_FSM/retryFullSyncState/syncFullTimerEvent_->_fullSyncState
=== RUN   TestAE_FSM/retryFullSyncState/invalid_event_->_panic_
=== RUN   TestAE_FSM/partialSyncState
=== RUN   TestAE_FSM/partialSyncState/shutdownEvent_->_doneState
=== RUN   TestAE_FSM/partialSyncState/syncFullNotifEvent_->_fullSyncState
=== RUN   TestAE_FSM/partialSyncState/syncFullTimerEvent_->_fullSyncState
=== RUN   TestAE_FSM/partialSyncState/syncChangesEvent+Paused_->_partialSyncState
=== RUN   TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_error_->_partialSyncState
[ERR] agent: failed to sync changes: boom
=== RUN   TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_OK_->_partialSyncState
=== RUN   TestAE_FSM/partialSyncState/invalid_event_->_panic_
=== RUN   TestAE_FSM/invalid_state_->_panic_
--- PASS: TestAE_FSM (0.00s)
    --- PASS: TestAE_FSM/fullSyncState (0.00s)
        --- PASS: TestAE_FSM/fullSyncState/Paused_->_retryFullSyncState (0.00s)
        --- PASS: TestAE_FSM/fullSyncState/SyncFull()_error_->_retryFullSyncState (0.00s)
        --- PASS: TestAE_FSM/fullSyncState/SyncFull()_OK_->_partialSyncState (0.00s)
    --- PASS: TestAE_FSM/retryFullSyncState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/shutdownEvent_->_doneState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/syncFullNotifEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/syncFullTimerEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/retryFullSyncState/invalid_event_->_panic_ (0.00s)
    --- PASS: TestAE_FSM/partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/shutdownEvent_->_doneState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncFullNotifEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncFullTimerEvent_->_fullSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncChangesEvent+Paused_->_partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_error_->_partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/syncChangesEvent+SyncChanges()_OK_->_partialSyncState (0.00s)
        --- PASS: TestAE_FSM/partialSyncState/invalid_event_->_panic_ (0.00s)
    --- PASS: TestAE_FSM/invalid_state_->_panic_ (0.00s)
=== RUN   TestAE_RetrySyncFullEvent
=== RUN   TestAE_RetrySyncFullEvent/trigger_shutdownEvent
=== RUN   TestAE_RetrySyncFullEvent/trigger_shutdownEvent_during_FullNotif
=== RUN   TestAE_RetrySyncFullEvent/trigger_syncFullNotifEvent
=== RUN   TestAE_RetrySyncFullEvent/trigger_syncFullTimerEvent
--- PASS: TestAE_RetrySyncFullEvent (0.13s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_shutdownEvent (0.00s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_shutdownEvent_during_FullNotif (0.10s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_syncFullNotifEvent (0.01s)
    --- PASS: TestAE_RetrySyncFullEvent/trigger_syncFullTimerEvent (0.02s)
=== RUN   TestAE_SyncChangesEvent
=== RUN   TestAE_SyncChangesEvent/trigger_shutdownEvent
=== RUN   TestAE_SyncChangesEvent/trigger_shutdownEvent_during_FullNotif
=== RUN   TestAE_SyncChangesEvent/trigger_syncFullNotifEvent
=== RUN   TestAE_SyncChangesEvent/trigger_syncFullTimerEvent
=== RUN   TestAE_SyncChangesEvent/trigger_syncChangesNotifEvent
--- PASS: TestAE_SyncChangesEvent (0.13s)
    --- PASS: TestAE_SyncChangesEvent/trigger_shutdownEvent (0.00s)
    --- PASS: TestAE_SyncChangesEvent/trigger_shutdownEvent_during_FullNotif (0.10s)
    --- PASS: TestAE_SyncChangesEvent/trigger_syncFullNotifEvent (0.01s)
    --- PASS: TestAE_SyncChangesEvent/trigger_syncFullTimerEvent (0.02s)
    --- PASS: TestAE_SyncChangesEvent/trigger_syncChangesNotifEvent (0.00s)
=== CONT  TestAE_scaleFactor
=== RUN   TestAE_scaleFactor/100_nodes
=== RUN   TestAE_scaleFactor/200_nodes
=== RUN   TestAE_scaleFactor/1000_nodes
=== RUN   TestAE_scaleFactor/10000_nodes
--- PASS: TestAE_scaleFactor (0.00s)
    --- PASS: TestAE_scaleFactor/100_nodes (0.00s)
    --- PASS: TestAE_scaleFactor/200_nodes (0.00s)
    --- PASS: TestAE_scaleFactor/1000_nodes (0.00s)
    --- PASS: TestAE_scaleFactor/10000_nodes (0.00s)
=== CONT  TestAE_Pause_nestedPauseResume
--- PASS: TestAE_Pause_nestedPauseResume (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/ae	0.340s
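
Subtest names in the ae output above, such as TestAE_FSM/retryFullSyncState/SyncFull()_error_->_retryFullSyncState, are not corrupted: go test rewrites spaces in names passed to t.Run as underscores when reporting them. A minimal sketch, with a hypothetical test not taken from the consul sources:

package sketch

import "testing"

// TestNameSketch registers a subtest whose name contains spaces; under
// go test -v it is reported as
// "--- PASS: TestNameSketch/SyncFull()_error_->_retryFullSyncState"
// because the testing package substitutes underscores for spaces.
func TestNameSketch(t *testing.T) {
	t.Run("SyncFull() error -> retryFullSyncState", func(t *testing.T) {
		// body omitted; only the reported name matters for this sketch
	})
}
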
=== RUN   TestCacheGet_noIndex
=== PAUSE TestCacheGet_noIndex
=== RUN   TestCacheGet_initError
=== PAUSE TestCacheGet_initError
=== RUN   TestCacheGet_cachedErrorsDontStick
=== PAUSE TestCacheGet_cachedErrorsDontStick
=== RUN   TestCacheGet_blankCacheKey
=== PAUSE TestCacheGet_blankCacheKey
=== RUN   TestCacheGet_blockingInitSameKey
=== PAUSE TestCacheGet_blockingInitSameKey
=== RUN   TestCacheGet_blockingInitDiffKeys
=== PAUSE TestCacheGet_blockingInitDiffKeys
=== RUN   TestCacheGet_blockingIndex
=== PAUSE TestCacheGet_blockingIndex
=== RUN   TestCacheGet_blockingIndexTimeout
=== PAUSE TestCacheGet_blockingIndexTimeout
=== RUN   TestCacheGet_blockingIndexError
=== PAUSE TestCacheGet_blockingIndexError
=== RUN   TestCacheGet_emptyFetchResult
=== PAUSE TestCacheGet_emptyFetchResult
=== RUN   TestCacheGet_periodicRefresh
--- SKIP: TestCacheGet_periodicRefresh (0.00s)
    cache_test.go:441: DM-skipped
=== RUN   TestCacheGet_periodicRefreshMultiple
=== PAUSE TestCacheGet_periodicRefreshMultiple
=== RUN   TestCacheGet_periodicRefreshErrorBackoff
--- SKIP: TestCacheGet_periodicRefreshErrorBackoff (0.00s)
    cache_test.go:529: DM-skipped
=== RUN   TestCacheGet_periodicRefreshBadRPCZeroIndexErrorBackoff
--- SKIP: TestCacheGet_periodicRefreshBadRPCZeroIndexErrorBackoff (0.00s)
    cache_test.go:571: DM-skipped
=== RUN   TestCacheGet_noIndexSetsOne
--- SKIP: TestCacheGet_noIndexSetsOne (0.00s)
    cache_test.go:615: DM-skipped
=== RUN   TestCacheGet_fetchTimeout
=== PAUSE TestCacheGet_fetchTimeout
=== RUN   TestCacheGet_expire
=== PAUSE TestCacheGet_expire
=== RUN   TestCacheGet_expireResetGet
=== PAUSE TestCacheGet_expireResetGet
=== RUN   TestCacheGet_duplicateKeyDifferentType
=== PAUSE TestCacheGet_duplicateKeyDifferentType
=== RUN   TestCacheGet_partitionDC
=== PAUSE TestCacheGet_partitionDC
=== RUN   TestCacheGet_partitionToken
=== PAUSE TestCacheGet_partitionToken
=== RUN   TestCacheGet_refreshAge
--- SKIP: TestCacheGet_refreshAge (0.00s)
    cache_test.go:921: DM-skipped
=== RUN   TestCacheGet_nonRefreshAge
--- SKIP: TestCacheGet_nonRefreshAge (0.00s)
    cache_test.go:1042: DM-skipped
=== RUN   TestCacheGet_nonBlockingType
=== PAUSE TestCacheGet_nonBlockingType
=== RUN   TestExpiryHeap_impl
--- PASS: TestExpiryHeap_impl (0.00s)
=== RUN   TestExpiryHeap
--- PASS: TestExpiryHeap (0.00s)
=== RUN   TestCacheNotify
--- SKIP: TestCacheNotify (0.00s)
    watch_test.go:16: DM-skipped
=== RUN   TestCacheNotifyPolling
--- SKIP: TestCacheNotifyPolling (0.00s)
    watch_test.go:149: DM-skipped
=== RUN   TestCacheWatch_ErrorBackoff
=== PAUSE TestCacheWatch_ErrorBackoff
=== RUN   TestCacheWatch_ErrorBackoffNonBlocking
=== PAUSE TestCacheWatch_ErrorBackoffNonBlocking
=== CONT  TestCacheGet_fetchTimeout
=== CONT  TestCacheGet_partitionToken
--- PASS: TestCacheGet_fetchTimeout (0.01s)
    cache_test.go:704: PASS:	SupportsBlocking()
    cache_test.go:704: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_partitionDC
=== CONT  TestCacheGet_periodicRefreshMultiple
=== CONT  TestCacheGet_noIndex
--- PASS: TestCacheGet_partitionToken (0.01s)
=== CONT  TestCacheWatch_ErrorBackoffNonBlocking
--- PASS: TestCacheGet_partitionDC (0.01s)
=== CONT  TestCacheWatch_ErrorBackoff
--- PASS: TestCacheGet_noIndex (0.03s)
    cache_test.go:47: PASS:	SupportsBlocking()
    cache_test.go:47: PASS:	Fetch(string,string)
    cache_test.go:48: PASS:	SupportsBlocking()
    cache_test.go:48: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_nonBlockingType
--- PASS: TestCacheGet_nonBlockingType (0.05s)
    cache_test.go:1194: PASS:	SupportsBlocking()
    cache_test.go:1194: PASS:	Fetch(string,string)
    cache_test.go:1194: PASS:	Fetch(string,string)
    cache_test.go:1195: PASS:	SupportsBlocking()
    cache_test.go:1195: PASS:	Fetch(string,string)
    cache_test.go:1195: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_emptyFetchResult
--- PASS: TestCacheGet_periodicRefreshMultiple (0.22s)
    cache_test.go:525: PASS:	SupportsBlocking()
    cache_test.go:525: PASS:	Fetch(string,string)
    cache_test.go:525: PASS:	Fetch(string,string)
    cache_test.go:525: PASS:	Fetch(string,string)
    cache_test.go:525: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_blockingIndexError
--- PASS: TestCacheGet_emptyFetchResult (0.23s)
    cache_test.go:435: PASS:	SupportsBlocking()
    cache_test.go:435: PASS:	Fetch(string,string)
    cache_test.go:435: PASS:	Fetch(string,string)
    cache_test.go:436: PASS:	SupportsBlocking()
    cache_test.go:436: PASS:	Fetch(string,string)
    cache_test.go:436: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_blockingIndexTimeout
--- PASS: TestCacheGet_blockingIndexError (0.11s)
    testing.go:26: Error: test fetch error
    cache_test.go:368: PASS:	SupportsBlocking()
    cache_test.go:368: PASS:	Fetch(string,string)
    cache_test.go:368: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_blockingIndex
--- PASS: TestCacheGet_blockingIndex (0.05s)
    cache_test.go:297: PASS:	SupportsBlocking()
    cache_test.go:297: PASS:	Fetch(string,string)
    cache_test.go:297: PASS:	Fetch(string,string)
    cache_test.go:297: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_blockingInitDiffKeys
--- PASS: TestCacheGet_blockingInitDiffKeys (0.05s)
    cache_test.go:263: PASS:	SupportsBlocking()
    cache_test.go:263: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_blockingInitSameKey
--- PASS: TestCacheGet_blockingInitSameKey (0.05s)
    cache_test.go:211: PASS:	SupportsBlocking()
    cache_test.go:211: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_blankCacheKey
--- PASS: TestCacheGet_blankCacheKey (0.02s)
    cache_test.go:176: PASS:	SupportsBlocking()
    cache_test.go:176: PASS:	Fetch(string,string)
    cache_test.go:177: PASS:	SupportsBlocking()
    cache_test.go:177: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_cachedErrorsDontStick
--- PASS: TestCacheWatch_ErrorBackoff (0.51s)
    watch_test.go:324: PASS:	SupportsBlocking()
    watch_test.go:324: PASS:	Fetch(string,string)
    watch_test.go:324: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_initError
--- PASS: TestCacheGet_blockingIndexTimeout (0.20s)
    cache_test.go:333: PASS:	SupportsBlocking()
    cache_test.go:333: PASS:	Fetch(string,string)
    cache_test.go:333: PASS:	Fetch(string,string)
    cache_test.go:333: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_duplicateKeyDifferentType
--- PASS: TestCacheGet_initError (0.03s)
    cache_test.go:81: PASS:	SupportsBlocking()
    cache_test.go:81: PASS:	Fetch(string,string)
    cache_test.go:82: PASS:	SupportsBlocking()
    cache_test.go:82: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_expireResetGet
--- PASS: TestCacheGet_duplicateKeyDifferentType (0.04s)
    cache_test.go:859: PASS:	SupportsBlocking()
    cache_test.go:859: PASS:	Fetch(string,string)
    cache_test.go:860: PASS:	SupportsBlocking()
    cache_test.go:860: PASS:	Fetch(string,string)
    cache_test.go:861: PASS:	SupportsBlocking()
    cache_test.go:861: PASS:	Fetch(string,string)
    cache_test.go:861: PASS:	SupportsBlocking()
    cache_test.go:861: PASS:	Fetch(string,string)
=== CONT  TestCacheGet_expire
--- PASS: TestCacheWatch_ErrorBackoffNonBlocking (0.61s)
    watch_test.go:390: PASS:	SupportsBlocking()
    watch_test.go:390: PASS:	Fetch(string,string)
    watch_test.go:390: PASS:	Fetch(string,string)
--- PASS: TestCacheGet_cachedErrorsDontStick (0.14s)
    cache_test.go:142: PASS:	SupportsBlocking()
    cache_test.go:142: PASS:	Fetch(string,string)
    cache_test.go:142: PASS:	Fetch(string,string)
    cache_test.go:142: PASS:	Fetch(string,string)
    cache_test.go:143: PASS:	SupportsBlocking()
    cache_test.go:143: PASS:	Fetch(string,string)
    cache_test.go:143: PASS:	Fetch(string,string)
    cache_test.go:143: PASS:	Fetch(string,string)
--- PASS: TestCacheGet_expireResetGet (0.48s)
    cache_test.go:812: PASS:	SupportsBlocking()
    cache_test.go:812: PASS:	Fetch(string,string)
    cache_test.go:813: PASS:	SupportsBlocking()
    cache_test.go:813: PASS:	Fetch(string,string)
--- PASS: TestCacheGet_expire (0.54s)
    cache_test.go:758: PASS:	SupportsBlocking()
    cache_test.go:758: PASS:	Fetch(string,string)
    cache_test.go:759: PASS:	SupportsBlocking()
    cache_test.go:759: PASS:	Fetch(string,string)
PASS
ok  	github.com/hashicorp/consul/agent/cache	1.185s
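
The "DM-skipped" entries in the cache package above (and in cache-types below) carry a file and line such as cache_test.go:441, which is the location of the skip call; that layout is what t.Skip produces under go test -v. A minimal sketch, assuming the Debian packaging disables these tests with a plain t.Skip("DM-skipped"):

package sketch

import "testing"

// TestSkipSketch shows the SKIP format seen above: go test -v prints
// "--- SKIP: TestSkipSketch (0.00s)" followed by an indented
// "<file>:<line>: DM-skipped" line pointing at the t.Skip call.
func TestSkipSketch(t *testing.T) {
	t.Skip("DM-skipped") // assumption: the packaging short-circuits flaky tests this way
}
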
=== RUN   TestCatalogServices
--- PASS: TestCatalogServices (0.00s)
    catalog_services_test.go:52: PASS:	RPC(string,string,string)
=== RUN   TestCatalogServices_badReqType
--- PASS: TestCatalogServices_badReqType (0.00s)
=== RUN   TestCalculateSoftExpire
=== RUN   TestCalculateSoftExpire/72h_just_issued
=== RUN   TestCalculateSoftExpire/72h_in_renew_range
=== RUN   TestCalculateSoftExpire/72h_in_hard_renew
=== RUN   TestCalculateSoftExpire/72h_expired
=== RUN   TestCalculateSoftExpire/1h_just_issued
=== RUN   TestCalculateSoftExpire/1h_in_renew_range
=== RUN   TestCalculateSoftExpire/1h_in_hard_renew
=== RUN   TestCalculateSoftExpire/1h_expired
=== RUN   TestCalculateSoftExpire/too_short_lifetime
--- PASS: TestCalculateSoftExpire (0.01s)
    --- PASS: TestCalculateSoftExpire/72h_just_issued (0.00s)
    --- PASS: TestCalculateSoftExpire/72h_in_renew_range (0.00s)
    --- PASS: TestCalculateSoftExpire/72h_in_hard_renew (0.00s)
    --- PASS: TestCalculateSoftExpire/72h_expired (0.00s)
    --- PASS: TestCalculateSoftExpire/1h_just_issued (0.00s)
    --- PASS: TestCalculateSoftExpire/1h_in_renew_range (0.00s)
    --- PASS: TestCalculateSoftExpire/1h_in_hard_renew (0.00s)
    --- PASS: TestCalculateSoftExpire/1h_expired (0.00s)
    --- PASS: TestCalculateSoftExpire/too_short_lifetime (0.00s)
=== RUN   TestConnectCALeaf_changingRoots
--- SKIP: TestConnectCALeaf_changingRoots (0.00s)
    connect_ca_leaf_test.go:146: DM-skipped
=== RUN   TestConnectCALeaf_changingRootsJitterBetweenCalls
--- SKIP: TestConnectCALeaf_changingRootsJitterBetweenCalls (0.00s)
    connect_ca_leaf_test.go:258: DM-skipped
=== RUN   TestConnectCALeaf_changingRootsBetweenBlockingCalls
--- SKIP: TestConnectCALeaf_changingRootsBetweenBlockingCalls (0.00s)
    connect_ca_leaf_test.go:408: DM-skipped
=== RUN   TestConnectCALeaf_CSRRateLimiting
--- SKIP: TestConnectCALeaf_CSRRateLimiting (0.00s)
    connect_ca_leaf_test.go:514: DM-skipped
=== RUN   TestConnectCALeaf_watchRootsDedupingMultipleCallers
--- SKIP: TestConnectCALeaf_watchRootsDedupingMultipleCallers (0.00s)
    connect_ca_leaf_test.go:696: DM-skipped
=== RUN   TestConnectCALeaf_expiringLeaf
--- SKIP: TestConnectCALeaf_expiringLeaf (0.00s)
    connect_ca_leaf_test.go:878: DM-skipped
=== RUN   TestConnectCARoot
--- PASS: TestConnectCARoot (0.00s)
    connect_ca_root_test.go:43: PASS:	RPC(string,string,string)
=== RUN   TestConnectCARoot_badReqType
--- PASS: TestConnectCARoot_badReqType (0.00s)
=== RUN   TestHealthServices
--- PASS: TestHealthServices (0.00s)
    health_services_test.go:52: PASS:	RPC(string,string,string)
=== RUN   TestHealthServices_badReqType
--- PASS: TestHealthServices_badReqType (0.00s)
=== RUN   TestIntentionMatch
--- PASS: TestIntentionMatch (0.00s)
    intention_match_test.go:43: PASS:	RPC(string,string,string)
=== RUN   TestIntentionMatch_badReqType
--- PASS: TestIntentionMatch_badReqType (0.00s)
=== RUN   TestNodeServices
--- PASS: TestNodeServices (0.00s)
    node_services_test.go:57: PASS:	RPC(string,string,string)
=== RUN   TestNodeServices_badReqType
--- PASS: TestNodeServices_badReqType (0.00s)
=== RUN   TestPreparedQuery
--- PASS: TestPreparedQuery (0.01s)
    prepared_query_test.go:44: PASS:	RPC(string,string,string)
=== RUN   TestPreparedQuery_badReqType
--- PASS: TestPreparedQuery_badReqType (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/cache-types	0.188s
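
The indented "PASS:	RPC(string,string,string)" and "PASS:	Fetch(string,string)" lines in the cache and cache-types packages are consistent with testify-style mocks: Mock.AssertExpectations logs one PASS line per satisfied expectation, naming the method and its argument types. A minimal sketch under that assumption; the mockRPC type and TestMockSketch are hypothetical and not part of the consul sources:

package sketch

import (
	"testing"

	"github.com/stretchr/testify/mock"
)

// mockRPC is a hypothetical mock with a three-string RPC method, so a
// satisfied expectation would be reported as RPC(string,string,string).
type mockRPC struct{ mock.Mock }

func (m *mockRPC) RPC(method, dc, token string) error {
	return m.Called(method, dc, token).Error(0)
}

func TestMockSketch(t *testing.T) {
	m := &mockRPC{}
	m.On("RPC", "Catalog.ListServices", "dc1", "token").Return(nil)

	_ = m.RPC("Catalog.ListServices", "dc1", "token")

	// Under -v this logs a PASS: line for the satisfied RPC expectation.
	m.AssertExpectations(t)
}
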
=== RUN   TestParseFlags
=== RUN   TestParseFlags/#00
=== RUN   TestParseFlags/-bind_a
=== RUN   TestParseFlags/-bootstrap
=== RUN   TestParseFlags/-bootstrap=true
=== RUN   TestParseFlags/-bootstrap=false
=== RUN   TestParseFlags/-config-file_a_-config-dir_b_-config-file_c_-config-dir_d
=== RUN   TestParseFlags/-datacenter_a
=== RUN   TestParseFlags/-dns-port_1
=== RUN   TestParseFlags/-grpc-port_1
=== RUN   TestParseFlags/-serf-lan-port_1
=== RUN   TestParseFlags/-serf-wan-port_1
=== RUN   TestParseFlags/-server-port_1
=== RUN   TestParseFlags/-join_a_-join_b
=== RUN   TestParseFlags/-node-meta_a:b_-node-meta_c:d
=== RUN   TestParseFlags/-bootstrap_true
--- PASS: TestParseFlags (0.03s)
    --- PASS: TestParseFlags/#00 (0.00s)
    --- PASS: TestParseFlags/-bind_a (0.00s)
    --- PASS: TestParseFlags/-bootstrap (0.00s)
    --- PASS: TestParseFlags/-bootstrap=true (0.00s)
    --- PASS: TestParseFlags/-bootstrap=false (0.00s)
    --- PASS: TestParseFlags/-config-file_a_-config-dir_b_-config-file_c_-config-dir_d (0.00s)
    --- PASS: TestParseFlags/-datacenter_a (0.00s)
    --- PASS: TestParseFlags/-dns-port_1 (0.00s)
    --- PASS: TestParseFlags/-grpc-port_1 (0.00s)
    --- PASS: TestParseFlags/-serf-lan-port_1 (0.00s)
    --- PASS: TestParseFlags/-serf-wan-port_1 (0.00s)
    --- PASS: TestParseFlags/-server-port_1 (0.00s)
    --- PASS: TestParseFlags/-join_a_-join_b (0.00s)
    --- PASS: TestParseFlags/-node-meta_a:b_-node-meta_c:d (0.00s)
    --- PASS: TestParseFlags/-bootstrap_true (0.00s)
=== RUN   TestMerge
=== RUN   TestMerge/top_level_fields
--- PASS: TestMerge (0.01s)
    --- PASS: TestMerge/top_level_fields (0.01s)
=== RUN   TestPatchSliceOfMaps
=== RUN   TestPatchSliceOfMaps/00:_{"a":{"b":"c"}}_->_{"a":{"b":"c"}}_skip:_[]
=== RUN   TestPatchSliceOfMaps/01:_{"a":[{"b":"c"}]}_->_{"a":{"b":"c"}}_skip:_[]
=== RUN   TestPatchSliceOfMaps/02:_{"a":[{"b":[{"c":"d"}]}]}_->_{"a":{"b":{"c":"d"}}}_skip:_[]
=== RUN   TestPatchSliceOfMaps/03:_{"a":[{"b":"c"}]}_->_{"a":[{"b":"c"}]}_skip:_[a]
=== RUN   TestPatchSliceOfMaps/04:_{_____"services":_[______{_______"checks":_[________{_________"header":_[__________{"a":"b"}_________]________}_______]______}_____]____}_->_{_____"services":_[______{_______"checks":_[________{_________"header":_{"a":"b"}________}_______]______}_____]____}_skip:_[services_services.checks]
--- PASS: TestPatchSliceOfMaps (0.00s)
    --- PASS: TestPatchSliceOfMaps/00:_{"a":{"b":"c"}}_->_{"a":{"b":"c"}}_skip:_[] (0.00s)
    --- PASS: TestPatchSliceOfMaps/01:_{"a":[{"b":"c"}]}_->_{"a":{"b":"c"}}_skip:_[] (0.00s)
    --- PASS: TestPatchSliceOfMaps/02:_{"a":[{"b":[{"c":"d"}]}]}_->_{"a":{"b":{"c":"d"}}}_skip:_[] (0.00s)
    --- PASS: TestPatchSliceOfMaps/03:_{"a":[{"b":"c"}]}_->_{"a":[{"b":"c"}]}_skip:_[a] (0.00s)
    --- PASS: TestPatchSliceOfMaps/04:_{_____"services":_[______{_______"checks":_[________{_________"header":_[__________{"a":"b"}_________]________}_______]______}_____]____}_->_{_____"services":_[______{_______"checks":_[________{_________"header":_{"a":"b"}________}_______]______}_____]____}_skip:_[services_services.checks] (0.00s)
=== RUN   TestConfigFlagsAndEdgecases
=== RUN   TestConfigFlagsAndEdgecases/-advertise
=== RUN   TestConfigFlagsAndEdgecases/-advertise-wan
=== RUN   TestConfigFlagsAndEdgecases/-advertise_and_-advertise-wan
=== RUN   TestConfigFlagsAndEdgecases/-bind
=== RUN   TestConfigFlagsAndEdgecases/-bootstrap
=== RUN   TestConfigFlagsAndEdgecases/-bootstrap-expect
=== RUN   TestConfigFlagsAndEdgecases/-client
=== RUN   TestConfigFlagsAndEdgecases/-config-dir
=== RUN   TestConfigFlagsAndEdgecases/-config-file_json
=== RUN   TestConfigFlagsAndEdgecases/-config-file_hcl_and_json
=== RUN   TestConfigFlagsAndEdgecases/-data-dir_empty
=== RUN   TestConfigFlagsAndEdgecases/-data-dir_non-directory
=== RUN   TestConfigFlagsAndEdgecases/-datacenter
=== RUN   TestConfigFlagsAndEdgecases/-datacenter_empty
=== RUN   TestConfigFlagsAndEdgecases/-dev
=== RUN   TestConfigFlagsAndEdgecases/-disable-host-node-id
=== RUN   TestConfigFlagsAndEdgecases/-disable-keyring-file
=== RUN   TestConfigFlagsAndEdgecases/-dns-port
=== RUN   TestConfigFlagsAndEdgecases/-domain
=== RUN   TestConfigFlagsAndEdgecases/-enable-script-checks
=== RUN   TestConfigFlagsAndEdgecases/-encrypt
=== RUN   TestConfigFlagsAndEdgecases/-config-format_disabled,_skip_unknown_files
=== RUN   TestConfigFlagsAndEdgecases/-config-format=json
=== RUN   TestConfigFlagsAndEdgecases/-config-format=hcl
=== RUN   TestConfigFlagsAndEdgecases/-config-format_invalid
=== RUN   TestConfigFlagsAndEdgecases/-http-port
=== RUN   TestConfigFlagsAndEdgecases/-join
=== RUN   TestConfigFlagsAndEdgecases/-join-wan
=== RUN   TestConfigFlagsAndEdgecases/-log-level
=== RUN   TestConfigFlagsAndEdgecases/-node
=== RUN   TestConfigFlagsAndEdgecases/-node-id
=== RUN   TestConfigFlagsAndEdgecases/-node-meta
=== RUN   TestConfigFlagsAndEdgecases/-non-voting-server
=== RUN   TestConfigFlagsAndEdgecases/-pid-file
=== RUN   TestConfigFlagsAndEdgecases/-protocol
=== RUN   TestConfigFlagsAndEdgecases/-raft-protocol
=== RUN   TestConfigFlagsAndEdgecases/-recursor
=== RUN   TestConfigFlagsAndEdgecases/-rejoin
=== RUN   TestConfigFlagsAndEdgecases/-retry-interval
=== RUN   TestConfigFlagsAndEdgecases/-retry-interval-wan
=== RUN   TestConfigFlagsAndEdgecases/-retry-join
=== RUN   TestConfigFlagsAndEdgecases/-retry-join-wan
=== RUN   TestConfigFlagsAndEdgecases/-retry-max
=== RUN   TestConfigFlagsAndEdgecases/-retry-max-wan
=== RUN   TestConfigFlagsAndEdgecases/-serf-lan-bind
=== RUN   TestConfigFlagsAndEdgecases/-serf-lan-port
=== RUN   TestConfigFlagsAndEdgecases/-serf-wan-bind
=== RUN   TestConfigFlagsAndEdgecases/-serf-wan-port
=== RUN   TestConfigFlagsAndEdgecases/-server
=== RUN   TestConfigFlagsAndEdgecases/-server-port
=== RUN   TestConfigFlagsAndEdgecases/-syslog
=== RUN   TestConfigFlagsAndEdgecases/-ui
=== RUN   TestConfigFlagsAndEdgecases/-ui-dir
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_any_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v4
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_any_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v6
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_any_and_advertise_set_should_not_detect
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_any_and_advertise_set_should_not_detect
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr_and_ports_>_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_>_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_==_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports
=== RUN   TestConfigFlagsAndEdgecases/json:client_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/json:client,_address_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:client,_address_template_and_ports
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_lan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_wan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_with_ports
=== RUN   TestConfigFlagsAndEdgecases/json:allow_disabling_serf_wan_port
=== RUN   TestConfigFlagsAndEdgecases/hcl:allow_disabling_serf_wan_port
=== RUN   TestConfigFlagsAndEdgecases/json:serf_bind_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:serf_bind_address_lan_template
=== RUN   TestConfigFlagsAndEdgecases/json:serf_bind_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:serf_bind_address_wan_template
=== RUN   TestConfigFlagsAndEdgecases/json:dns_recursor_templates_with_deduplication
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_recursor_templates_with_deduplication
=== RUN   TestConfigFlagsAndEdgecases/json:start_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:start_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:start_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:start_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:retry_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:retry_join_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:retry_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/hcl:retry_join_wan_address_template
=== RUN   TestConfigFlagsAndEdgecases/json:precedence:_merge_order
=== RUN   TestConfigFlagsAndEdgecases/hcl:precedence:_merge_order
=== RUN   TestConfigFlagsAndEdgecases/json:precedence:_flag_before_file
=== RUN   TestConfigFlagsAndEdgecases/hcl:precedence:_flag_before_file
=== RUN   TestConfigFlagsAndEdgecases/json:raft_performance_scaling
=== RUN   TestConfigFlagsAndEdgecases/hcl:raft_performance_scaling
=== RUN   TestConfigFlagsAndEdgecases/json:invalid_input
=== RUN   TestConfigFlagsAndEdgecases/hcl:invalid_input
=== RUN   TestConfigFlagsAndEdgecases/json:datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/hcl:datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/json:acl_datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/hcl:acl_datacenter_is_lower-cased
=== RUN   TestConfigFlagsAndEdgecases/json:acl_replication_token_enables_acl_replication
=== RUN   TestConfigFlagsAndEdgecases/hcl:acl_replication_token_enables_acl_replication
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v4
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v4
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v4
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v4
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v6
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v6
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v6
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v6
=== RUN   TestConfigFlagsAndEdgecases/ae_interval_invalid_==_0
=== RUN   TestConfigFlagsAndEdgecases/ae_interval_invalid_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:acl_datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:acl_datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:autopilot.max_trailing_logs_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:autopilot.max_trailing_logs_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_does_not_allow_multiple_addresses
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_does_not_allow_multiple_addresses
=== RUN   TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_a_unix_socket
=== RUN   TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_a_unix_socket
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap_without_server
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap_without_server
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_without_server
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_without_server
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_dev_mode
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_dev_mode
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect=1_equals_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=1_equals_bootstrap
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect=2_warning
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=2_warning
=== RUN   TestConfigFlagsAndEdgecases/json:bootstrap-expect_>_2_but_even_warning
=== RUN   TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_>_2_but_even_warning
=== RUN   TestConfigFlagsAndEdgecases/json:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly
=== RUN   TestConfigFlagsAndEdgecases/json:client_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/hcl:client_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/json:datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:datacenter_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:dns_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_does_not_allow_socket
=== RUN   TestConfigFlagsAndEdgecases/json:ui_and_ui_dir
=== RUN   TestConfigFlagsAndEdgecases/hcl:ui_and_ui_dir
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_addr_any
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_addr_any
=== RUN   TestConfigFlagsAndEdgecases/json:advertise_addr_wan_any
=== RUN   TestConfigFlagsAndEdgecases/hcl:advertise_addr_wan_any
=== RUN   TestConfigFlagsAndEdgecases/json:recursors_any
=== RUN   TestConfigFlagsAndEdgecases/hcl:recursors_any
=== RUN   TestConfigFlagsAndEdgecases/json:dns_config.udp_answer_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_config.udp_answer_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:dns_config.a_record_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/hcl:dns_config.a_record_limit_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_<_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_<_0
=== RUN   TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_==_0
=== RUN   TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_==_0
=== RUN   TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_>_10
=== RUN   TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_>_10
=== RUN   TestConfigFlagsAndEdgecases/node_name_invalid
=== RUN   TestConfigFlagsAndEdgecases/json:node_meta_key_too_long
=== RUN   TestConfigFlagsAndEdgecases/hcl:node_meta_key_too_long
=== RUN   TestConfigFlagsAndEdgecases/json:node_meta_value_too_long
=== RUN   TestConfigFlagsAndEdgecases/hcl:node_meta_value_too_long
=== RUN   TestConfigFlagsAndEdgecases/json:node_meta_too_many_keys
=== RUN   TestConfigFlagsAndEdgecases/hcl:node_meta_too_many_keys
=== RUN   TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_http
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_http
=== RUN   TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_https
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_https
=== RUN   TestConfigFlagsAndEdgecases/json:unique_listeners_http_vs_https
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_listeners_http_vs_https
=== RUN   TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_HTTP_vs_RPC
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_HTTP_vs_RPC
=== RUN   TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_LAN
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_LAN
=== RUN   TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_WAN
=== RUN   TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_WAN
=== RUN   TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_ID
=== RUN   TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_ID
=== RUN   TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_nested_sidecar
=== RUN   TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_nested_sidecar
=== RUN   TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_managed_proxy
=== RUN   TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_managed_proxy
=== RUN   TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_cannot_be_empty
=== RUN   TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_must_start_with_+_or_-
=== RUN   TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_must_start_with_+_or_-
=== RUN   TestConfigFlagsAndEdgecases/json:encrypt_has_invalid_key
=== RUN   TestConfigFlagsAndEdgecases/hcl:encrypt_has_invalid_key
=== RUN   TestConfigFlagsAndEdgecases/json:encrypt_given_but_LAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_LAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/json:encrypt_given_but_WAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_WAN_keyring_exists
=== RUN   TestConfigFlagsAndEdgecases/json:multiple_check_files
=== RUN   TestConfigFlagsAndEdgecases/hcl:multiple_check_files
=== RUN   TestConfigFlagsAndEdgecases/json:grpc_check
=== RUN   TestConfigFlagsAndEdgecases/hcl:grpc_check
=== RUN   TestConfigFlagsAndEdgecases/json:alias_check_with_no_node
=== RUN   TestConfigFlagsAndEdgecases/hcl:alias_check_with_no_node
=== RUN   TestConfigFlagsAndEdgecases/json:multiple_service_files
=== RUN   TestConfigFlagsAndEdgecases/hcl:multiple_service_files
=== RUN   TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_key
=== RUN   TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_key
=== RUN   TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_value
=== RUN   TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_value
=== RUN   TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_many_meta
=== RUN   TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_many_meta
=== RUN   TestConfigFlagsAndEdgecases/json:translated_keys
=== RUN   TestConfigFlagsAndEdgecases/hcl:translated_keys
=== RUN   TestConfigFlagsAndEdgecases/json:ignore_snapshot_agent_sub-object
=== RUN   TestConfigFlagsAndEdgecases/hcl:ignore_snapshot_agent_sub-object
=== RUN   TestConfigFlagsAndEdgecases/json:Service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/hcl:Service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/json:Multiple_service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/hcl:Multiple_service_managed_proxy_'upstreams'
=== RUN   TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_root
=== RUN   TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_root
=== RUN   TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_api_registration
=== RUN   TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_api_registration
=== RUN   TestConfigFlagsAndEdgecases/json:service.connectsidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/hcl:service.connectsidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/json:services.connect.sidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/hcl:services.connect.sidecar_service_with_checks_and_upstreams
=== RUN   TestConfigFlagsAndEdgecases/json:verify_server_hostname_implies_verify_outgoing
=== RUN   TestConfigFlagsAndEdgecases/hcl:verify_server_hostname_implies_verify_outgoing
=== RUN   TestConfigFlagsAndEdgecases/json:test_connect_vault_provider_configuration
=== RUN   TestConfigFlagsAndEdgecases/hcl:test_connect_vault_provider_configuration
--- PASS: TestConfigFlagsAndEdgecases (32.05s)
    --- PASS: TestConfigFlagsAndEdgecases/-advertise (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/-advertise-wan (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-advertise_and_-advertise-wan (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-bind (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/-bootstrap (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/-bootstrap-expect (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/-client (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-dir (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-file_json (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-file_hcl_and_json (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/-data-dir_empty (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/-data-dir_non-directory (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/-datacenter (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/-datacenter_empty (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-dev (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-disable-host-node-id (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/-disable-keyring-file (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-dns-port (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/-domain (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/-enable-script-checks (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/-encrypt (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format_disabled,_skip_unknown_files (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format=json (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format=hcl (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/-config-format_invalid (0.00s)
    --- PASS: TestConfigFlagsAndEdgecases/-http-port (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-join (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-join-wan (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/-log-level (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-node (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-node-id (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-node-meta (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-non-voting-server (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/-pid-file (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/-protocol (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-raft-protocol (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-recursor (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-rejoin (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-interval (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-interval-wan (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-join (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-join-wan (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-max (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/-retry-max-wan (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-lan-bind (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-lan-port (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-wan-bind (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/-serf-wan-port (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-server (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/-server-port (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/-syslog (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/-ui (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/-ui-dir (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_any_v4 (0.22s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v4 (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_any_v6 (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_any_v6 (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_any_and_advertise_set_should_not_detect (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_any_and_advertise_set_should_not_detect (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr_and_ports_==_0 (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_==_0 (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr_and_ports_<_0 (0.22s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_<_0 (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr_and_ports_>_0 (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr_and_ports_>_0 (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_==_0 (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_==_0 (0.22s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports_<_0 (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports_<_0 (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_addr,_addresses_and_ports (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_addr,_addresses_and_ports (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_template_and_ports (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_template_and_ports (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client,_address_template_and_ports (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client,_address_template_and_ports (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_lan_template (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_template (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_wan_template (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_template (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_lan_with_ports (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_lan_with_ports (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_wan_with_ports (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_wan_with_ports (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:allow_disabling_serf_wan_port (0.19s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:allow_disabling_serf_wan_port (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:serf_bind_address_lan_template (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:serf_bind_address_lan_template (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:serf_bind_address_wan_template (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:serf_bind_address_wan_template (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_recursor_templates_with_deduplication (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_recursor_templates_with_deduplication (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:start_join_address_template (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:start_join_address_template (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/json:start_join_wan_address_template (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:start_join_wan_address_template (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/json:retry_join_address_template (0.25s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:retry_join_address_template (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/json:retry_join_wan_address_template (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:retry_join_wan_address_template (0.31s)
    --- PASS: TestConfigFlagsAndEdgecases/json:precedence:_merge_order (0.24s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:precedence:_merge_order (0.26s)
    --- PASS: TestConfigFlagsAndEdgecases/json:precedence:_flag_before_file (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:precedence:_flag_before_file (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:raft_performance_scaling (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:raft_performance_scaling (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:invalid_input (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:invalid_input (0.03s)
    --- PASS: TestConfigFlagsAndEdgecases/json:datacenter_is_lower-cased (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:datacenter_is_lower-cased (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/json:acl_datacenter_is_lower-cased (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:acl_datacenter_is_lower-cased (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:acl_replication_token_enables_acl_replication (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:acl_replication_token_enables_acl_replication (0.18s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v4 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v4 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v4 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v4 (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v4 (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v4 (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_fails_v6 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_fails_v6 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_none_v6 (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_none_v6 (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_address_detect_multiple_v6 (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_address_detect_multiple_v6 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/ae_interval_invalid_==_0 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/ae_interval_invalid_<_0 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:acl_datacenter_invalid (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:acl_datacenter_invalid (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:autopilot.max_trailing_logs_invalid (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:autopilot.max_trailing_logs_invalid (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_empty (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_empty (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_does_not_allow_multiple_addresses (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_does_not_allow_multiple_addresses (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bind_addr_cannot_be_a_unix_socket (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bind_addr_cannot_be_a_unix_socket (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap_without_server (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap_without_server (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_without_server (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_without_server (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_invalid (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_invalid (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_dev_mode (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_dev_mode (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_and_bootstrap (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_and_bootstrap (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect=1_equals_bootstrap (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=1_equals_bootstrap (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect=2_warning (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect=2_warning (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:bootstrap-expect_>_2_but_even_warning (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:bootstrap-expect_>_2_but_even_warning (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_mode_sets_LeaveOnTerm_and_SkipLeaveOnInt_correctly (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/json:client_does_not_allow_socket (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:client_does_not_allow_socket (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:datacenter_invalid (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:datacenter_invalid (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_does_not_allow_socket (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_does_not_allow_socket (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ui_and_ui_dir (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ui_and_ui_dir (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_addr_any (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_addr_any (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:advertise_addr_wan_any (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:advertise_addr_wan_any (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:recursors_any (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:recursors_any (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_config.udp_answer_limit_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_config.udp_answer_limit_invalid (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:dns_config.a_record_limit_invalid (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:dns_config.a_record_limit_invalid (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_<_0 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_<_0 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_==_0 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_==_0 (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:performance.raft_multiplier_>_10 (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:performance.raft_multiplier_>_10 (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/node_name_invalid (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/json:node_meta_key_too_long (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:node_meta_key_too_long (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/json:node_meta_value_too_long (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:node_meta_value_too_long (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:node_meta_too_many_keys (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:node_meta_too_many_keys (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_http (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_http (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_listeners_dns_vs_https (0.11s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_listeners_dns_vs_https (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_listeners_http_vs_https (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_listeners_http_vs_https (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_HTTP_vs_RPC (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_HTTP_vs_RPC (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_LAN (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_LAN (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:unique_advertise_addresses_RPC_vs_Serf_WAN (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:unique_advertise_addresses_RPC_vs_Serf_WAN (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_ID (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_ID (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_nested_sidecar (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_nested_sidecar (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:sidecar_service_can't_have_managed_proxy (0.13s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:sidecar_service_can't_have_managed_proxy (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_cannot_be_empty (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_cannot_be_empty (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/json:telemetry.prefix_filter_must_start_with_+_or_- (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:telemetry.prefix_filter_must_start_with_+_or_- (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:encrypt_has_invalid_key (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:encrypt_has_invalid_key (0.05s)
    --- PASS: TestConfigFlagsAndEdgecases/json:encrypt_given_but_LAN_keyring_exists (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_LAN_keyring_exists (0.12s)
    --- PASS: TestConfigFlagsAndEdgecases/json:encrypt_given_but_WAN_keyring_exists (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:encrypt_given_but_WAN_keyring_exists (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:multiple_check_files (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:multiple_check_files (0.17s)
    --- PASS: TestConfigFlagsAndEdgecases/json:grpc_check (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:grpc_check (0.25s)
    --- PASS: TestConfigFlagsAndEdgecases/json:alias_check_with_no_node (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:alias_check_with_no_node (0.14s)
    --- PASS: TestConfigFlagsAndEdgecases/json:multiple_service_files (0.15s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:multiple_service_files (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_key (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_key (0.04s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_long_value (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_long_value (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service_with_wrong_meta:_too_many_meta (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service_with_wrong_meta:_too_many_meta (0.06s)
    --- PASS: TestConfigFlagsAndEdgecases/json:translated_keys (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:translated_keys (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/json:ignore_snapshot_agent_sub-object (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:ignore_snapshot_agent_sub-object (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:Service_managed_proxy_'upstreams' (0.09s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:Service_managed_proxy_'upstreams' (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:Multiple_service_managed_proxy_'upstreams' (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:Multiple_service_managed_proxy_'upstreams' (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_root (0.10s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_root (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:enabling_Connect_allow_managed_api_registration (0.07s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:enabling_Connect_allow_managed_api_registration (0.08s)
    --- PASS: TestConfigFlagsAndEdgecases/json:service.connectsidecar_service_with_checks_and_upstreams (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:service.connectsidecar_service_with_checks_and_upstreams (0.16s)
    --- PASS: TestConfigFlagsAndEdgecases/json:services.connect.sidecar_service_with_checks_and_upstreams (0.21s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:services.connect.sidecar_service_with_checks_and_upstreams (0.20s)
    --- PASS: TestConfigFlagsAndEdgecases/json:verify_server_hostname_implies_verify_outgoing (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:verify_server_hostname_implies_verify_outgoing (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/json:test_connect_vault_provider_configuration (0.23s)
    --- PASS: TestConfigFlagsAndEdgecases/hcl:test_connect_vault_provider_configuration (0.22s)
=== RUN   TestFullConfig
=== RUN   TestFullConfig/hcl
=== RUN   TestFullConfig/json
--- PASS: TestFullConfig (0.79s)
    runtime_test.go:4653: "RuntimeConfig.ACLEnableKeyListPolicy" is zero value
    --- PASS: TestFullConfig/hcl (0.47s)
    --- PASS: TestFullConfig/json (0.31s)
=== RUN   TestNonZero
=== RUN   TestNonZero/nil
=== RUN   TestNonZero/zero_bool
=== RUN   TestNonZero/zero_string
=== RUN   TestNonZero/zero_int
=== RUN   TestNonZero/zero_int8
=== RUN   TestNonZero/zero_int16
=== RUN   TestNonZero/zero_int32
=== RUN   TestNonZero/zero_int64
=== RUN   TestNonZero/zero_uint
=== RUN   TestNonZero/zero_uint8
=== RUN   TestNonZero/zero_uint16
=== RUN   TestNonZero/zero_uint32
=== RUN   TestNonZero/zero_uint64
=== RUN   TestNonZero/zero_float32
=== RUN   TestNonZero/zero_float64
=== RUN   TestNonZero/ptr_to_zero_value
=== RUN   TestNonZero/empty_slice
=== RUN   TestNonZero/slice_with_zero_value
=== RUN   TestNonZero/empty_map
=== RUN   TestNonZero/map_with_zero_value_key
=== RUN   TestNonZero/map_with_zero_value_elem
=== RUN   TestNonZero/struct_with_nil_field
=== RUN   TestNonZero/struct_with_zero_value_field
=== RUN   TestNonZero/struct_with_empty_array
--- PASS: TestNonZero (0.02s)
    --- PASS: TestNonZero/nil (0.00s)
    --- PASS: TestNonZero/zero_bool (0.00s)
    --- PASS: TestNonZero/zero_string (0.00s)
    --- PASS: TestNonZero/zero_int (0.00s)
    --- PASS: TestNonZero/zero_int8 (0.00s)
    --- PASS: TestNonZero/zero_int16 (0.00s)
    --- PASS: TestNonZero/zero_int32 (0.00s)
    --- PASS: TestNonZero/zero_int64 (0.00s)
    --- PASS: TestNonZero/zero_uint (0.00s)
    --- PASS: TestNonZero/zero_uint8 (0.00s)
    --- PASS: TestNonZero/zero_uint16 (0.00s)
    --- PASS: TestNonZero/zero_uint32 (0.00s)
    --- PASS: TestNonZero/zero_uint64 (0.00s)
    --- PASS: TestNonZero/zero_float32 (0.00s)
    --- PASS: TestNonZero/zero_float64 (0.00s)
    --- PASS: TestNonZero/ptr_to_zero_value (0.00s)
    --- PASS: TestNonZero/empty_slice (0.00s)
    --- PASS: TestNonZero/slice_with_zero_value (0.00s)
    --- PASS: TestNonZero/empty_map (0.00s)
    --- PASS: TestNonZero/map_with_zero_value_key (0.00s)
    --- PASS: TestNonZero/map_with_zero_value_elem (0.00s)
    --- PASS: TestNonZero/struct_with_nil_field (0.00s)
    --- PASS: TestNonZero/struct_with_zero_value_field (0.00s)
    --- PASS: TestNonZero/struct_with_empty_array (0.00s)
=== RUN   TestConfigDecodeBytes
=== PAUSE TestConfigDecodeBytes
=== RUN   TestSanitize
--- PASS: TestSanitize (0.01s)
=== RUN   TestRuntime_apiAddresses
--- PASS: TestRuntime_apiAddresses (0.00s)
=== RUN   TestRuntime_APIConfigHTTPS
--- PASS: TestRuntime_APIConfigHTTPS (0.00s)
=== RUN   TestRuntime_APIConfigHTTP
--- PASS: TestRuntime_APIConfigHTTP (0.00s)
=== RUN   TestRuntime_APIConfigUNIX
--- PASS: TestRuntime_APIConfigUNIX (0.00s)
=== RUN   TestRuntime_APIConfigANYAddrV4
--- PASS: TestRuntime_APIConfigANYAddrV4 (0.00s)
=== RUN   TestRuntime_APIConfigANYAddrV6
--- PASS: TestRuntime_APIConfigANYAddrV6 (0.00s)
=== RUN   TestRuntime_ClientAddress
--- PASS: TestRuntime_ClientAddress (0.00s)
=== RUN   TestRuntime_ClientAddressAnyV4
--- PASS: TestRuntime_ClientAddressAnyV4 (0.00s)
=== RUN   TestRuntime_ClientAddressAnyV6
--- PASS: TestRuntime_ClientAddressAnyV6 (0.00s)
=== RUN   TestRuntime_ToTLSUtilConfig
--- PASS: TestRuntime_ToTLSUtilConfig (0.00s)
=== RUN   TestSegments
=== RUN   TestSegments/json:segment_name_not_in_OSS
=== RUN   TestSegments/hcl:segment_name_not_in_OSS
=== RUN   TestSegments/json:segment_port_must_be_set
=== RUN   TestSegments/hcl:segment_port_must_be_set
=== RUN   TestSegments/json:segments_not_in_OSS
=== RUN   TestSegments/hcl:segments_not_in_OSS
--- PASS: TestSegments (0.27s)
    --- PASS: TestSegments/json:segment_name_not_in_OSS (0.04s)
    --- PASS: TestSegments/hcl:segment_name_not_in_OSS (0.04s)
    --- PASS: TestSegments/json:segment_port_must_be_set (0.04s)
    --- PASS: TestSegments/hcl:segment_port_must_be_set (0.06s)
    --- PASS: TestSegments/json:segments_not_in_OSS (0.04s)
    --- PASS: TestSegments/hcl:segments_not_in_OSS (0.04s)
=== RUN   TestTranslateKeys
=== RUN   TestTranslateKeys/x->y
=== RUN   TestTranslateKeys/discard_x
=== RUN   TestTranslateKeys/b.x->b.y
=== RUN   TestTranslateKeys/json:_x->y
=== RUN   TestTranslateKeys/json:_X->y
=== RUN   TestTranslateKeys/json:_discard_x
=== RUN   TestTranslateKeys/json:_b.x->b.y
=== RUN   TestTranslateKeys/json:_b[0].x->b[0].y
--- PASS: TestTranslateKeys (0.01s)
    --- PASS: TestTranslateKeys/x->y (0.00s)
    --- PASS: TestTranslateKeys/discard_x (0.00s)
    --- PASS: TestTranslateKeys/b.x->b.y (0.00s)
    --- PASS: TestTranslateKeys/json:_x->y (0.00s)
    --- PASS: TestTranslateKeys/json:_X->y (0.00s)
    --- PASS: TestTranslateKeys/json:_discard_x (0.00s)
    --- PASS: TestTranslateKeys/json:_b.x->b.y (0.00s)
    --- PASS: TestTranslateKeys/json:_b[0].x->b[0].y (0.00s)
=== CONT  TestConfigDecodeBytes
--- PASS: TestConfigDecodeBytes (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/config	33.336s
=== RUN   TestCollectHostInfo
--- PASS: TestCollectHostInfo (0.39s)
PASS
ok  	github.com/hashicorp/consul/agent/debug	0.559s
?   	github.com/hashicorp/consul/agent/exec	[no test files]
=== RUN   TestAgentAntiEntropy_Services
--- SKIP: TestAgentAntiEntropy_Services (0.00s)
    state_test.go:30: DM-skipped
=== RUN   TestAgentAntiEntropy_Services_ConnectProxy
=== PAUSE TestAgentAntiEntropy_Services_ConnectProxy
=== RUN   TestAgent_ServiceWatchCh
=== PAUSE TestAgent_ServiceWatchCh
=== RUN   TestAgentAntiEntropy_EnableTagOverride
--- SKIP: TestAgentAntiEntropy_EnableTagOverride (0.00s)
    state_test.go:507: DM-skipped
=== RUN   TestAgentAntiEntropy_Services_WithChecks
=== PAUSE TestAgentAntiEntropy_Services_WithChecks
=== RUN   TestAgentAntiEntropy_Services_ACLDeny
=== PAUSE TestAgentAntiEntropy_Services_ACLDeny
=== RUN   TestAgentAntiEntropy_Checks
=== PAUSE TestAgentAntiEntropy_Checks
=== RUN   TestAgentAntiEntropy_Checks_ACLDeny
=== PAUSE TestAgentAntiEntropy_Checks_ACLDeny
=== RUN   TestAgent_UpdateCheck_DiscardOutput
--- SKIP: TestAgent_UpdateCheck_DiscardOutput (0.00s)
    state_test.go:1336: DM-skipped
=== RUN   TestAgentAntiEntropy_Check_DeferSync
=== PAUSE TestAgentAntiEntropy_Check_DeferSync
=== RUN   TestAgentAntiEntropy_NodeInfo
=== PAUSE TestAgentAntiEntropy_NodeInfo
=== RUN   TestAgent_ServiceTokens
=== PAUSE TestAgent_ServiceTokens
=== RUN   TestAgent_CheckTokens
=== PAUSE TestAgent_CheckTokens
=== RUN   TestAgent_CheckCriticalTime
--- SKIP: TestAgent_CheckCriticalTime (0.00s)
    state_test.go:1706: DM-skipped
=== RUN   TestAgent_AddCheckFailure
=== PAUSE TestAgent_AddCheckFailure
=== RUN   TestAgent_AliasCheck
=== PAUSE TestAgent_AliasCheck
=== RUN   TestAgent_sendCoordinate
=== PAUSE TestAgent_sendCoordinate
=== RUN   TestState_Notify
=== PAUSE TestState_Notify
=== RUN   TestStateProxyManagement
=== PAUSE TestStateProxyManagement
=== RUN   TestStateProxyRestore
=== PAUSE TestStateProxyRestore
=== CONT  TestAgentAntiEntropy_Services_ConnectProxy
=== CONT  TestState_Notify
--- PASS: TestState_Notify (0.00s)
=== CONT  TestStateProxyRestore
=== CONT  TestAgent_CheckTokens
--- PASS: TestStateProxyRestore (0.01s)
=== CONT  TestStateProxyManagement
=== CONT  TestAgentAntiEntropy_Checks_ACLDeny
--- PASS: TestStateProxyManagement (0.01s)
=== CONT  TestAgentAntiEntropy_Checks
--- PASS: TestAgent_CheckTokens (0.18s)
=== CONT  TestAgentAntiEntropy_Services_ACLDeny
WARNING: bootstrap = true: do not enable unless necessary
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:12.503813 [WARN] agent: Node name "Node db37f41c-ed6b-1eea-4c0a-49d910c31f68" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:12.546235 [WARN] agent: Node name "Node de3ed3e8-147a-befd-f6dd-b1b4aba33873" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:12.584848 [WARN] agent: Node name "Node 6d0b618e-989f-819e-8a41-88b32b098845" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:12.736599 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:12.736746 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:12.736663 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:12.736904 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:12.736925 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:12.736962 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:12.736999 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:12.737124 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:12.737197 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:12.737302 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:12.737441 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:12.737637 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: The 'acl_datacenter' field is deprecated. Use the 'primary_datacenter' field instead.
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:12.811896 [WARN] agent: Node name "Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:12.812272 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:12.812335 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:12.812472 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:12.812566 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:de3ed3e8-147a-befd-f6dd-b1b4aba33873 Address:127.0.0.1:40012}]
2019/11/27 02:17:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6d0b618e-989f-819e-8a41-88b32b098845 Address:127.0.0.1:40018}]
2019/11/27 02:17:14 [INFO]  raft: Node at 127.0.0.1:40012 [Follower] entering Follower state (Leader: "")
2019/11/27 02:17:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:db37f41c-ed6b-1eea-4c0a-49d910c31f68 Address:127.0.0.1:40006}]
2019/11/27 02:17:14 [INFO]  raft: Node at 127.0.0.1:40018 [Follower] entering Follower state (Leader: "")
2019/11/27 02:17:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:354c1c83-caba-e6e7-b2e4-487bb52f56e7 Address:127.0.0.1:40024}]
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:14.691256 [INFO] serf: EventMemberJoin: Node de3ed3e8-147a-befd-f6dd-b1b4aba33873.dc1 127.0.0.1
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:14.693641 [INFO] serf: EventMemberJoin: Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7.dc1 127.0.0.1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:14.695530 [INFO] serf: EventMemberJoin: Node de3ed3e8-147a-befd-f6dd-b1b4aba33873 127.0.0.1
2019/11/27 02:17:14 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:14.705275 [INFO] agent: Started DNS server 127.0.0.1:40007 (udp)
2019/11/27 02:17:14 [INFO]  raft: Node at 127.0.0.1:40024 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:14.701552 [INFO] serf: EventMemberJoin: Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7 127.0.0.1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:14.712498 [INFO] serf: EventMemberJoin: Node db37f41c-ed6b-1eea-4c0a-49d910c31f68.dc1 127.0.0.1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:14.713189 [INFO] consul: Handled member-join event for server "Node de3ed3e8-147a-befd-f6dd-b1b4aba33873.dc1" in area "wan"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:14.713614 [INFO] consul: Adding LAN server Node de3ed3e8-147a-befd-f6dd-b1b4aba33873 (Addr: tcp/127.0.0.1:40012) (DC: dc1)
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:14.714334 [INFO] consul: Adding LAN server Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7 (Addr: tcp/127.0.0.1:40024) (DC: dc1)
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:14.715121 [INFO] consul: Handled member-join event for server "Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7.dc1" in area "wan"
2019/11/27 02:17:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:14 [INFO]  raft: Node at 127.0.0.1:40012 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:14 [INFO]  raft: Node at 127.0.0.1:40018 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:14.748956 [INFO] serf: EventMemberJoin: Node 6d0b618e-989f-819e-8a41-88b32b098845.dc1 127.0.0.1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:14.752113 [INFO] agent: Started DNS server 127.0.0.1:40007 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:14.752983 [INFO] agent: Started DNS server 127.0.0.1:40019 (udp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:14.753440 [INFO] agent: Started DNS server 127.0.0.1:40019 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:14.754219 [INFO] agent: Started HTTP server on 127.0.0.1:40008 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:14.754313 [INFO] agent: started state syncer
2019/11/27 02:17:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:14.755541 [INFO] agent: Started HTTP server on 127.0.0.1:40020 (tcp)
2019/11/27 02:17:14 [INFO]  raft: Node at 127.0.0.1:40024 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:14.755719 [INFO] agent: started state syncer
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:14.757867 [INFO] serf: EventMemberJoin: Node db37f41c-ed6b-1eea-4c0a-49d910c31f68 127.0.0.1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:14.759187 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:14.759205 [INFO] consul: Adding LAN server Node db37f41c-ed6b-1eea-4c0a-49d910c31f68 (Addr: tcp/127.0.0.1:40006) (DC: dc1)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:14.759273 [INFO] consul: Handled member-join event for server "Node db37f41c-ed6b-1eea-4c0a-49d910c31f68.dc1" in area "wan"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:14.759566 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:14.761424 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:14.761527 [INFO] agent: started state syncer
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:14.762904 [INFO] serf: EventMemberJoin: Node 6d0b618e-989f-819e-8a41-88b32b098845 127.0.0.1
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:14.764731 [INFO] consul: Adding LAN server Node 6d0b618e-989f-819e-8a41-88b32b098845 (Addr: tcp/127.0.0.1:40018) (DC: dc1)
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:14.765253 [INFO] consul: Handled member-join event for server "Node 6d0b618e-989f-819e-8a41-88b32b098845.dc1" in area "wan"
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:14.765809 [INFO] agent: Started DNS server 127.0.0.1:40013 (tcp)
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:14.766570 [INFO] agent: Started DNS server 127.0.0.1:40013 (udp)
2019/11/27 02:17:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:14 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:14.780538 [INFO] agent: Started HTTP server on 127.0.0.1:40014 (tcp)
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:14.780677 [INFO] agent: started state syncer
2019/11/27 02:17:15 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:15 [INFO]  raft: Node at 127.0.0.1:40012 [Leader] entering Leader state
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:15.324999 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:15.325589 [INFO] consul: New leader elected: Node de3ed3e8-147a-befd-f6dd-b1b4aba33873
2019/11/27 02:17:15 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:15 [INFO]  raft: Node at 127.0.0.1:40018 [Leader] entering Leader state
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:15.327760 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:15.328158 [INFO] consul: New leader elected: Node 6d0b618e-989f-819e-8a41-88b32b098845
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:15.409202 [ERR] agent: failed to sync remote state: ACL not found
2019/11/27 02:17:15 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:15 [INFO]  raft: Node at 127.0.0.1:40024 [Leader] entering Leader state
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:15.423635 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:15.424103 [INFO] consul: New leader elected: Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7
2019/11/27 02:17:15 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:15 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:15.426780 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:15.427186 [INFO] consul: New leader elected: Node db37f41c-ed6b-1eea-4c0a-49d910c31f68
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:15.465454 [INFO] acl: initializing acls
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:15.469824 [INFO] acl: initializing acls
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:15.733562 [ERR] agent: failed to sync remote state: ACL not found
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:15.747027 [INFO] consul: Created ACL 'global-management' policy
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:15.747112 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:15.750757 [INFO] acl: initializing acls
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:15.750919 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:15.890537 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:15.891427 [INFO] consul: Created ACL 'global-management' policy
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:15.891503 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:15.895386 [INFO] acl: initializing acls
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:15.895537 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:16.147456 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:16.147948 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:16.237266 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:16.492575 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:16.852104 [ERR] agent: failed to sync remote state: ACL not found
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:17.140191 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:17.140286 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:17.141090 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:17.141146 [INFO] serf: EventMemberUpdate: Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:17.142059 [INFO] serf: EventMemberUpdate: Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7.dc1
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:17.142306 [INFO] serf: EventMemberUpdate: Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:17.142908 [INFO] serf: EventMemberUpdate: Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7.dc1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:17.272589 [ERR] agent: failed to sync remote state: ACL not found
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:17.403920 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:17.404798 [INFO] serf: EventMemberUpdate: Node de3ed3e8-147a-befd-f6dd-b1b4aba33873
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:17.405442 [INFO] serf: EventMemberUpdate: Node de3ed3e8-147a-befd-f6dd-b1b4aba33873.dc1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:17.406040 [INFO] consul: Created ACL anonymous token from configuration
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:17.406104 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:17.407037 [INFO] serf: EventMemberUpdate: Node de3ed3e8-147a-befd-f6dd-b1b4aba33873
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:17.407733 [INFO] serf: EventMemberUpdate: Node de3ed3e8-147a-befd-f6dd-b1b4aba33873.dc1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:17.535631 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:17.536198 [DEBUG] consul: Skipping self join check for "Node db37f41c-ed6b-1eea-4c0a-49d910c31f68" since the cluster is too small
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:17.536362 [INFO] consul: member 'Node db37f41c-ed6b-1eea-4c0a-49d910c31f68' joined, marking health alive
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:17.958420 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:17.958533 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:18.258582 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:18.259089 [DEBUG] consul: Skipping self join check for "Node 6d0b618e-989f-819e-8a41-88b32b098845" since the cluster is too small
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:18.259250 [INFO] consul: member 'Node 6d0b618e-989f-819e-8a41-88b32b098845' joined, marking health alive
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:18.591503 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:18.591575 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:19.090003 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:19.090559 [DEBUG] consul: Skipping self join check for "Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7" since the cluster is too small
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:19.090726 [INFO] consul: member 'Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7' joined, marking health alive
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:19.567703 [WARN] consul: error getting server health from "Node 6d0b618e-989f-819e-8a41-88b32b098845": context deadline exceeded
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:19.634707 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:19.635155 [DEBUG] consul: Skipping self join check for "Node de3ed3e8-147a-befd-f6dd-b1b4aba33873" since the cluster is too small
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:19.635260 [INFO] consul: member 'Node de3ed3e8-147a-befd-f6dd-b1b4aba33873' joined, marking health alive
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:19.637357 [INFO] agent: Synced check "mysql"
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:19.637418 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:19.637684 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:19.770755 [INFO] agent: Synced service "redis-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:19.770891 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:19.770992 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:19.771038 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:19.771071 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:19.771233 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:19.771292 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:19.974976 [DEBUG] consul: Skipping self join check for "Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7" since the cluster is too small
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:19.975515 [DEBUG] consul: Skipping self join check for "Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7" since the cluster is too small
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:19.995756 [DEBUG] consul: dropping node "Node 354c1c83-caba-e6e7-b2e4-487bb52f56e7" from result due to ACLs
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:20.138405 [DEBUG] consul: Skipping self join check for "Node de3ed3e8-147a-befd-f6dd-b1b4aba33873" since the cluster is too small
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:20.138931 [DEBUG] consul: Skipping self join check for "Node de3ed3e8-147a-befd-f6dd-b1b4aba33873" since the cluster is too small
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:20.157642 [DEBUG] consul: dropping node "Node de3ed3e8-147a-befd-f6dd-b1b4aba33873" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:20.505985 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:20.508479 [WARN] agent: Service "mysql" registration blocked by ACLs
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:20.905782 [INFO] agent: Synced check "redis"
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:20.905864 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:20.906586 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:20.906658 [DEBUG] agent: Check "redis" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:20.912162 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:21.036020 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:21.036099 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:21.037209 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:21.037658 [WARN] agent: Service "mysql" registration blocked by ACLs
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.039792 [INFO] agent: Synced service "web-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.040415 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.042823 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.042892 [DEBUG] agent: Service "cache-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.042978 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.043023 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.043097 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.044000 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.044062 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.044147 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:21.044221 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:21.536815 [INFO] agent: Synced check "web"
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:21.537044 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:21.537608 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:21.537774 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:21.537890 [DEBUG] agent: Check "redis" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:21.537992 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:21.802101 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:21.971226 [INFO] agent: Synced check "cache"
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:21.973490 [INFO] agent: Deregistered service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:22.080757 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:22.080829 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:22.081806 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:22.225634 [INFO] agent: Synced service "cache-proxy"
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.228710 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.229192 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.229276 [INFO] consul: shutting down server
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.229338 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:22.357127 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.357628 [INFO] agent: Deregistered check "lb"
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.357705 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.358720 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.358842 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.363758 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.517143 [INFO] manager: shutting down
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.518148 [INFO] agent: consul server down
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.518219 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.518283 [INFO] agent: Stopping DNS server 127.0.0.1:40019 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.518453 [INFO] agent: Stopping DNS server 127.0.0.1:40019 (udp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.518619 [INFO] agent: Stopping HTTP server 127.0.0.1:40020 (tcp)
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.518838 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Services_ACLDeny - 2019/11/27 02:17:22.518913 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Services_ACLDeny (10.19s)
=== CONT  TestAgentAntiEntropy_Services_WithChecks
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:22.602659 [WARN] agent: Node name "Node 60996a87-a1cf-5077-7d58-a635423cb453" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:22.603036 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:22.603097 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:22.603262 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:22.603370 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:22.634732 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:22.636151 [WARN] agent: Check "mysql-check" registration blocked by ACLs
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.636418 [INFO] agent: Deregistered check "redis"
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.636484 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.636526 [DEBUG] agent: Check "cache" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:22.748464 [INFO] agent: Deregistered service "lb-proxy"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:22.748547 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:22.749778 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:22.749858 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:22.749898 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:22.749933 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.964816 [INFO] agent: Synced node info
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.965279 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.965353 [INFO] consul: shutting down server
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.965405 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.965950 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.966031 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.966076 [DEBUG] agent: Check "cache" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.966119 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.966230 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.966284 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.966324 [DEBUG] agent: Check "cache" in sync
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:22.966376 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.215422 [INFO] agent: Deregistered service "cache-proxy"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:23.215481 [INFO] agent: Synced check "api-check"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.215513 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:23.215533 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.216258 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:23.216361 [DEBUG] consul: dropping check "api-check" from result due to ACLs
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.216387 [DEBUG] agent: Service "mysql-proxy" in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:23.216416 [DEBUG] consul: dropping check "serfHealth" from result due to ACLs
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.216425 [DEBUG] agent: Service "redis-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.218194 [DEBUG] agent: Service "web-proxy" in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.218425 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.219010 [INFO] consul: shutting down server
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.219122 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:23.457480 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.464361 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:23.633876 [INFO] manager: shutting down
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:23.634737 [INFO] agent: consul server down
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:23.634800 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:23.634868 [INFO] agent: Stopping DNS server 127.0.0.1:40013 (tcp)
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:23.635030 [INFO] agent: Stopping DNS server 127.0.0.1:40013 (udp)
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:23.635213 [INFO] agent: Stopping HTTP server 127.0.0.1:40014 (tcp)
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:23.635426 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Checks - 2019/11/27 02:17:23.635499 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Checks (11.47s)
=== CONT  TestAgent_ServiceWatchCh
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:23.636147 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.638989 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_ServiceWatchCh - 2019/11/27 02:17:23.710599 [WARN] agent: Node name "Node fc56da02-93f9-328c-1b67-9a2e581b0313" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_ServiceWatchCh - 2019/11/27 02:17:23.710972 [DEBUG] tlsutil: Update with version 1
TestAgent_ServiceWatchCh - 2019/11/27 02:17:23.711049 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_ServiceWatchCh - 2019/11/27 02:17:23.711247 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_ServiceWatchCh - 2019/11/27 02:17:23.711382 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.768023 [INFO] agent: consul server down
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.768119 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.768194 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.768382 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.768566 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.768883 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Services_ConnectProxy - 2019/11/27 02:17:23.768974 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Services_ConnectProxy (11.63s)
=== CONT  TestAgent_AliasCheck
=== CONT  TestAgent_sendCoordinate
--- PASS: TestAgent_AliasCheck (0.05s)
WARNING: bootstrap = true: do not enable unless necessary
TestAgent_sendCoordinate - 2019/11/27 02:17:23.880712 [WARN] agent: Node name "Node 7848ff23-3c77-6b71-df8d-b921b49f91af" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgent_sendCoordinate - 2019/11/27 02:17:23.881177 [DEBUG] tlsutil: Update with version 1
TestAgent_sendCoordinate - 2019/11/27 02:17:23.881356 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgent_sendCoordinate - 2019/11/27 02:17:23.881786 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgent_sendCoordinate - 2019/11/27 02:17:23.881980 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:24.051151 [INFO] agent: Synced service "api"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:25.942409 [INFO] agent: Deregistered check "api-check"
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:25.942875 [WARN] agent: Check "mysql-check" registration blocked by ACLs
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:25.942944 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:25.943311 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:25.943402 [INFO] consul: shutting down server
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:25.943455 [WARN] serf: Shutdown without a Leave
2019/11/27 02:17:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:60996a87-a1cf-5077-7d58-a635423cb453 Address:127.0.0.1:40030}]
2019/11/27 02:17:26 [INFO]  raft: Node at 127.0.0.1:40030 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:26.185926 [INFO] serf: EventMemberJoin: Node 60996a87-a1cf-5077-7d58-a635423cb453.dc1 127.0.0.1
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:26.200233 [INFO] serf: EventMemberJoin: Node 60996a87-a1cf-5077-7d58-a635423cb453 127.0.0.1
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:26.202517 [INFO] consul: Adding LAN server Node 60996a87-a1cf-5077-7d58-a635423cb453 (Addr: tcp/127.0.0.1:40030) (DC: dc1)
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:26.202970 [INFO] consul: Handled member-join event for server "Node 60996a87-a1cf-5077-7d58-a635423cb453.dc1" in area "wan"
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:26.204243 [INFO] agent: Started DNS server 127.0.0.1:40025 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:26.204322 [INFO] agent: Started DNS server 127.0.0.1:40025 (udp)
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:26.209601 [INFO] agent: Started HTTP server on 127.0.0.1:40026 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:26.209733 [INFO] agent: started state syncer
2019/11/27 02:17:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:26 [INFO]  raft: Node at 127.0.0.1:40030 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:26.322537 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:26.469588 [INFO] manager: shutting down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:26.470404 [INFO] agent: consul server down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:26.470463 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:26.470523 [INFO] agent: Stopping DNS server 127.0.0.1:40007 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:26.470688 [INFO] agent: Stopping DNS server 127.0.0.1:40007 (udp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:26.470872 [INFO] agent: Stopping HTTP server 127.0.0.1:40008 (tcp)
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:26.471103 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Checks_ACLDeny - 2019/11/27 02:17:26.471180 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Checks_ACLDeny (14.31s)
=== CONT  TestAgent_AddCheckFailure
=== CONT  TestAgentAntiEntropy_NodeInfo
--- PASS: TestAgent_AddCheckFailure (0.04s)
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:26.586474 [WARN] agent: Node name "Node 2fba0c0f-b144-933e-c538-7143bb68518d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:26.587044 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:26.587306 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:26.587596 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:26.587815 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:17:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fc56da02-93f9-328c-1b67-9a2e581b0313 Address:127.0.0.1:40036}]
2019/11/27 02:17:26 [INFO]  raft: Node at 127.0.0.1:40036 [Follower] entering Follower state (Leader: "")
TestAgent_ServiceWatchCh - 2019/11/27 02:17:26.795704 [INFO] serf: EventMemberJoin: Node fc56da02-93f9-328c-1b67-9a2e581b0313.dc1 127.0.0.1
TestAgent_ServiceWatchCh - 2019/11/27 02:17:26.799805 [INFO] serf: EventMemberJoin: Node fc56da02-93f9-328c-1b67-9a2e581b0313 127.0.0.1
TestAgent_ServiceWatchCh - 2019/11/27 02:17:26.800553 [INFO] consul: Adding LAN server Node fc56da02-93f9-328c-1b67-9a2e581b0313 (Addr: tcp/127.0.0.1:40036) (DC: dc1)
TestAgent_ServiceWatchCh - 2019/11/27 02:17:26.800606 [INFO] consul: Handled member-join event for server "Node fc56da02-93f9-328c-1b67-9a2e581b0313.dc1" in area "wan"
TestAgent_ServiceWatchCh - 2019/11/27 02:17:26.801729 [INFO] agent: Started DNS server 127.0.0.1:40031 (tcp)
TestAgent_ServiceWatchCh - 2019/11/27 02:17:26.801964 [INFO] agent: Started DNS server 127.0.0.1:40031 (udp)
TestAgent_ServiceWatchCh - 2019/11/27 02:17:26.803881 [INFO] agent: Started HTTP server on 127.0.0.1:40032 (tcp)
TestAgent_ServiceWatchCh - 2019/11/27 02:17:26.803974 [INFO] agent: started state syncer
2019/11/27 02:17:26 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:26 [INFO]  raft: Node at 127.0.0.1:40036 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:26 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7848ff23-3c77-6b71-df8d-b921b49f91af Address:127.0.0.1:40042}]
TestAgent_sendCoordinate - 2019/11/27 02:17:26.963854 [INFO] serf: EventMemberJoin: Node 7848ff23-3c77-6b71-df8d-b921b49f91af.dc1 127.0.0.1
2019/11/27 02:17:26 [INFO]  raft: Node at 127.0.0.1:40042 [Follower] entering Follower state (Leader: "")
TestAgent_sendCoordinate - 2019/11/27 02:17:26.971011 [INFO] serf: EventMemberJoin: Node 7848ff23-3c77-6b71-df8d-b921b49f91af 127.0.0.1
TestAgent_sendCoordinate - 2019/11/27 02:17:26.972538 [INFO] consul: Adding LAN server Node 7848ff23-3c77-6b71-df8d-b921b49f91af (Addr: tcp/127.0.0.1:40042) (DC: dc1)
TestAgent_sendCoordinate - 2019/11/27 02:17:26.973231 [INFO] consul: Handled member-join event for server "Node 7848ff23-3c77-6b71-df8d-b921b49f91af.dc1" in area "wan"
TestAgent_sendCoordinate - 2019/11/27 02:17:26.975819 [INFO] agent: Started DNS server 127.0.0.1:40037 (tcp)
TestAgent_sendCoordinate - 2019/11/27 02:17:26.976087 [INFO] agent: Started DNS server 127.0.0.1:40037 (udp)
TestAgent_sendCoordinate - 2019/11/27 02:17:26.978914 [INFO] agent: Started HTTP server on 127.0.0.1:40038 (tcp)
TestAgent_sendCoordinate - 2019/11/27 02:17:26.979010 [INFO] agent: started state syncer
2019/11/27 02:17:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:40042 [Candidate] entering Candidate state in term 2
2019/11/27 02:17:27 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:40030 [Leader] entering Leader state
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:27.259966 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:27.260415 [INFO] consul: New leader elected: Node 60996a87-a1cf-5077-7d58-a635423cb453
2019/11/27 02:17:27 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:27 [INFO]  raft: Node at 127.0.0.1:40036 [Leader] entering Leader state
TestAgent_ServiceWatchCh - 2019/11/27 02:17:27.868313 [INFO] consul: cluster leadership acquired
TestAgent_ServiceWatchCh - 2019/11/27 02:17:27.868829 [INFO] consul: New leader elected: Node fc56da02-93f9-328c-1b67-9a2e581b0313
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:27.870216 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:27.870312 [DEBUG] agent: Node info in sync
2019/11/27 02:17:28 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:28 [INFO]  raft: Node at 127.0.0.1:40042 [Leader] entering Leader state
TestAgent_sendCoordinate - 2019/11/27 02:17:28.012310 [INFO] consul: cluster leadership acquired
TestAgent_sendCoordinate - 2019/11/27 02:17:28.012787 [INFO] consul: New leader elected: Node 7848ff23-3c77-6b71-df8d-b921b49f91af
2019/11/27 02:17:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:40e4a748-2192-161a-0510-9bf59fe950b5 Address:127.0.0.1:40048}]
2019/11/27 02:17:28 [INFO]  raft: Node at 127.0.0.1:40048 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:28.143592 [INFO] serf: EventMemberJoin: Node 2fba0c0f-b144-933e-c538-7143bb68518d.dc1 127.0.0.1
2019/11/27 02:17:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:28 [INFO]  raft: Node at 127.0.0.1:40048 [Candidate] entering Candidate state in term 2
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:28.308336 [INFO] serf: EventMemberJoin: Node 2fba0c0f-b144-933e-c538-7143bb68518d 127.0.0.1
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:28.314907 [INFO] consul: Handled member-join event for server "Node 2fba0c0f-b144-933e-c538-7143bb68518d.dc1" in area "wan"
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:28.315214 [INFO] consul: Adding LAN server Node 2fba0c0f-b144-933e-c538-7143bb68518d (Addr: tcp/127.0.0.1:40048) (DC: dc1)
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:28.315865 [INFO] agent: Started DNS server 127.0.0.1:40043 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:28.315942 [INFO] agent: Started DNS server 127.0.0.1:40043 (udp)
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:28.322805 [INFO] agent: Started HTTP server on 127.0.0.1:40044 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:28.322900 [INFO] agent: started state syncer
TestAgent_ServiceWatchCh - 2019/11/27 02:17:28.401215 [INFO] agent: Synced node info
TestAgent_ServiceWatchCh - 2019/11/27 02:17:28.401330 [DEBUG] agent: Node info in sync
TestAgent_sendCoordinate - 2019/11/27 02:17:28.557003 [INFO] agent: Synced node info
2019/11/27 02:17:29 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:29 [INFO]  raft: Node at 127.0.0.1:40048 [Leader] entering Leader state
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:29.355466 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:29.355897 [INFO] consul: New leader elected: Node 2fba0c0f-b144-933e-c538-7143bb68518d
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:29.496671 [DEBUG] agent: Node info in sync
TestAgent_sendCoordinate - 2019/11/27 02:17:29.581888 [INFO] agent: Requesting shutdown
TestAgent_sendCoordinate - 2019/11/27 02:17:29.582003 [INFO] consul: shutting down server
TestAgent_sendCoordinate - 2019/11/27 02:17:29.582049 [WARN] serf: Shutdown without a Leave
TestAgent_sendCoordinate - 2019/11/27 02:17:29.862083 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:29.864850 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:29.865276 [DEBUG] consul: Skipping self join check for "Node 60996a87-a1cf-5077-7d58-a635423cb453" since the cluster is too small
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:29.865446 [INFO] consul: member 'Node 60996a87-a1cf-5077-7d58-a635423cb453' joined, marking health alive
TestAgent_sendCoordinate - 2019/11/27 02:17:29.955326 [DEBUG] agent: Node info in sync
TestAgent_sendCoordinate - 2019/11/27 02:17:29.955448 [DEBUG] agent: Node info in sync
TestAgent_ServiceWatchCh - 2019/11/27 02:17:30.095645 [DEBUG] agent: Node info in sync
TestAgent_sendCoordinate - 2019/11/27 02:17:30.166878 [INFO] manager: shutting down
TestAgent_sendCoordinate - 2019/11/27 02:17:30.168206 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgent_sendCoordinate - 2019/11/27 02:17:30.168375 [INFO] agent: consul server down
TestAgent_sendCoordinate - 2019/11/27 02:17:30.168429 [INFO] agent: shutdown complete
TestAgent_sendCoordinate - 2019/11/27 02:17:30.168484 [INFO] agent: Stopping DNS server 127.0.0.1:40037 (tcp)
TestAgent_sendCoordinate - 2019/11/27 02:17:30.168628 [INFO] agent: Stopping DNS server 127.0.0.1:40037 (udp)
TestAgent_sendCoordinate - 2019/11/27 02:17:30.168813 [INFO] agent: Stopping HTTP server 127.0.0.1:40038 (tcp)
TestAgent_sendCoordinate - 2019/11/27 02:17:30.169234 [INFO] agent: Waiting for endpoints to shut down
TestAgent_sendCoordinate - 2019/11/27 02:17:30.169316 [INFO] agent: Endpoints down
--- PASS: TestAgent_sendCoordinate (6.35s)
    state_test.go:1858: 10 1 100ms
=== CONT  TestAgent_ServiceTokens
--- PASS: TestAgent_ServiceTokens (0.06s)
=== CONT  TestAgentAntiEntropy_Check_DeferSync
WARNING: bootstrap = true: do not enable unless necessary
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:30.288596 [WARN] agent: Node name "Node 05e785eb-41ee-d939-63ed-2976f9c34892" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:30.289605 [DEBUG] tlsutil: Update with version 1
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:30.289894 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:30.290443 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:30.290808 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:30.380496 [INFO] agent: Synced node info
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:30.399895 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgent_ServiceWatchCh - 2019/11/27 02:17:30.747576 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgent_ServiceWatchCh - 2019/11/27 02:17:30.748065 [DEBUG] consul: Skipping self join check for "Node fc56da02-93f9-328c-1b67-9a2e581b0313" since the cluster is too small
TestAgent_ServiceWatchCh - 2019/11/27 02:17:30.748238 [INFO] consul: member 'Node fc56da02-93f9-328c-1b67-9a2e581b0313' joined, marking health alive
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.171363 [INFO] agent: Synced service "mysql"
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.171443 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.171478 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.172173 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.590706 [INFO] agent: Synced service "mysql"
TestAgent_ServiceWatchCh - 2019/11/27 02:17:31.701201 [INFO] agent: Synced service "svc_id1"
TestAgent_ServiceWatchCh - 2019/11/27 02:17:31.701271 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.799100 [INFO] agent: Synced service "redis"
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.799201 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.799254 [DEBUG] agent: Check "redis:1" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.799293 [DEBUG] agent: Check "redis:2" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.799321 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.799622 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.799738 [INFO] consul: shutting down server
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.799788 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800080 [DEBUG] agent: Service "redis" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800132 [DEBUG] agent: Service "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800178 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800216 [DEBUG] agent: Check "redis:1" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800252 [DEBUG] agent: Check "redis:2" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800281 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800356 [DEBUG] agent: Service "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800395 [DEBUG] agent: Service "redis" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800437 [DEBUG] agent: Check "mysql" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800478 [DEBUG] agent: Check "redis:1" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800512 [DEBUG] agent: Check "redis:2" in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.800541 [DEBUG] agent: Node info in sync
TestAgent_ServiceWatchCh - 2019/11/27 02:17:31.901427 [INFO] agent: Synced service "svc_id1"
TestAgent_ServiceWatchCh - 2019/11/27 02:17:31.903254 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:31.901851 [WARN] serf: Shutdown without a Leave
TestAgent_ServiceWatchCh - 2019/11/27 02:17:31.903990 [DEBUG] agent: Service "svc_id1" in sync
TestAgent_ServiceWatchCh - 2019/11/27 02:17:31.904128 [DEBUG] agent: Node info in sync
TestAgent_ServiceWatchCh - 2019/11/27 02:17:31.904791 [INFO] agent: Requesting shutdown
TestAgent_ServiceWatchCh - 2019/11/27 02:17:31.904877 [INFO] consul: shutting down server
TestAgent_ServiceWatchCh - 2019/11/27 02:17:31.904924 [WARN] serf: Shutdown without a Leave
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.035449 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:32.035576 [INFO] manager: shutting down
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:32.036400 [INFO] agent: consul server down
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:32.036467 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:32.036524 [INFO] agent: Stopping DNS server 127.0.0.1:40025 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:32.036747 [INFO] agent: Stopping DNS server 127.0.0.1:40025 (udp)
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:32.036940 [INFO] agent: Stopping HTTP server 127.0.0.1:40026 (tcp)
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:32.037171 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Services_WithChecks - 2019/11/27 02:17:32.037250 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Services_WithChecks (9.52s)
2019/11/27 02:17:32 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:05e785eb-41ee-d939-63ed-2976f9c34892 Address:127.0.0.1:40054}]
2019/11/27 02:17:32 [INFO]  raft: Node at 127.0.0.1:40054 [Follower] entering Follower state (Leader: "")
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.041348 [INFO] serf: EventMemberJoin: Node 05e785eb-41ee-d939-63ed-2976f9c34892.dc1 127.0.0.1
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.044275 [INFO] serf: EventMemberJoin: Node 05e785eb-41ee-d939-63ed-2976f9c34892 127.0.0.1
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.045082 [INFO] consul: Adding LAN server Node 05e785eb-41ee-d939-63ed-2976f9c34892 (Addr: tcp/127.0.0.1:40054) (DC: dc1)
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.045414 [INFO] agent: Started DNS server 127.0.0.1:40049 (udp)
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.045442 [INFO] consul: Handled member-join event for server "Node 05e785eb-41ee-d939-63ed-2976f9c34892.dc1" in area "wan"
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.045785 [INFO] agent: Started DNS server 127.0.0.1:40049 (tcp)
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.047855 [INFO] agent: Started HTTP server on 127.0.0.1:40050 (tcp)
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.047942 [INFO] agent: started state syncer
2019/11/27 02:17:32 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:17:32 [INFO]  raft: Node at 127.0.0.1:40054 [Candidate] entering Candidate state in term 2
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.122424 [INFO] manager: shutting down
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.125196 [WARN] agent: Deregistering service "svc_id1" failed. leadership lost while committing log
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.125281 [ERR] agent: failed to sync changes: leadership lost while committing log
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.125641 [INFO] agent: consul server down
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.125703 [INFO] agent: shutdown complete
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.125762 [INFO] agent: Stopping DNS server 127.0.0.1:40031 (tcp)
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.125916 [INFO] agent: Stopping DNS server 127.0.0.1:40031 (udp)
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.126089 [INFO] agent: Stopping HTTP server 127.0.0.1:40032 (tcp)
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.126306 [INFO] agent: Waiting for endpoints to shut down
TestAgent_ServiceWatchCh - 2019/11/27 02:17:32.126401 [INFO] agent: Endpoints down
--- PASS: TestAgent_ServiceWatchCh (8.49s)
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:32.126876 [INFO] agent: Synced node info
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:32.723619 [INFO] agent: Synced node info
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:32.723922 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:32.723993 [INFO] consul: shutting down server
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:32.724041 [WARN] serf: Shutdown without a Leave
2019/11/27 02:17:32 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:17:32 [INFO]  raft: Node at 127.0.0.1:40054 [Leader] entering Leader state
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.728013 [INFO] consul: cluster leadership acquired
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:32.728522 [INFO] consul: New leader elected: Node 05e785eb-41ee-d939-63ed-2976f9c34892
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:32.822186 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:32.945593 [INFO] manager: shutting down
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:32.950651 [WARN] consul: error getting server health from "Node 2fba0c0f-b144-933e-c538-7143bb68518d": rpc error making call: EOF
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:33.234309 [INFO] agent: Synced node info
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.236854 [INFO] agent: consul server down
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.236914 [INFO] agent: shutdown complete
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.236967 [INFO] agent: Stopping DNS server 127.0.0.1:40043 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.237107 [INFO] agent: Stopping DNS server 127.0.0.1:40043 (udp)
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.237264 [INFO] agent: Stopping HTTP server 127.0.0.1:40044 (tcp)
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.237475 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.237544 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_NodeInfo (6.72s)
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.237715 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.237765 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.237859 [ERR] connect: Apply failed leadership lost while committing log
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.237897 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestAgentAntiEntropy_NodeInfo - 2019/11/27 02:17:33.946162 [WARN] consul: error getting server health from "Node 2fba0c0f-b144-933e-c538-7143bb68518d": context deadline exceeded
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:35.268918 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:35.269416 [DEBUG] consul: Skipping self join check for "Node 05e785eb-41ee-d939-63ed-2976f9c34892" since the cluster is too small
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:35.269590 [INFO] consul: member 'Node 05e785eb-41ee-d939-63ed-2976f9c34892' joined, marking health alive
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:35.606399 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:36.047073 [INFO] agent: Synced check "web"
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:36.047167 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:36.047435 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:36.567806 [INFO] agent: Synced check "web"
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:36.567884 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:36.568000 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:36.568042 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:37.280378 [INFO] agent: Synced check "web"
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:37.280474 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:37.571088 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:37.768998 [INFO] agent: Synced check "web"
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:37.769077 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:37.945847 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:37.945968 [DEBUG] agent: Check "web" in sync
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.122986 [INFO] agent: Synced node info
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.589745 [INFO] agent: Synced check "web"
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.589829 [DEBUG] agent: Node info in sync
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.615064 [INFO] agent: Requesting shutdown
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.615192 [INFO] consul: shutting down server
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.615258 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.679114 [WARN] serf: Shutdown without a Leave
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.799588 [INFO] manager: shutting down
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.800445 [INFO] agent: consul server down
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.800515 [INFO] agent: shutdown complete
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.800600 [INFO] agent: Stopping DNS server 127.0.0.1:40049 (tcp)
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.800821 [INFO] agent: Stopping DNS server 127.0.0.1:40049 (udp)
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.801016 [INFO] agent: Stopping HTTP server 127.0.0.1:40050 (tcp)
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.801289 [INFO] agent: Waiting for endpoints to shut down
TestAgentAntiEntropy_Check_DeferSync - 2019/11/27 02:17:38.801369 [INFO] agent: Endpoints down
--- PASS: TestAgentAntiEntropy_Check_DeferSync (8.57s)
PASS
ok  	github.com/hashicorp/consul/agent/local	26.888s
=== RUN   TestBuild
=== RUN   TestBuild/no_version
=== RUN   TestBuild/bad_version
=== RUN   TestBuild/good_version
=== RUN   TestBuild/rc_version
=== RUN   TestBuild/ent_version
--- PASS: TestBuild (0.00s)
    --- PASS: TestBuild/no_version (0.00s)
    --- PASS: TestBuild/bad_version (0.00s)
    --- PASS: TestBuild/good_version (0.00s)
    --- PASS: TestBuild/rc_version (0.00s)
    --- PASS: TestBuild/ent_version (0.00s)
=== RUN   TestServer_Key_Equal
--- PASS: TestServer_Key_Equal (0.00s)
=== RUN   TestServer_Key
--- PASS: TestServer_Key (0.00s)
=== RUN   TestServer_Key_params
--- PASS: TestServer_Key_params (0.00s)
=== RUN   TestIsConsulServer
--- PASS: TestIsConsulServer (0.00s)
=== RUN   TestIsConsulServer_Optional
--- PASS: TestIsConsulServer_Optional (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/metadata	0.087s
?   	github.com/hashicorp/consul/agent/mock	[no test files]
?   	github.com/hashicorp/consul/agent/pool	[no test files]
=== RUN   TestManager_BasicLifecycle
--- SKIP: TestManager_BasicLifecycle (0.00s)
    manager_test.go:42: DM-skipped
=== RUN   TestManager_deliverLatest
--- PASS: TestManager_deliverLatest (0.00s)
=== RUN   TestStateChanged
=== RUN   TestStateChanged/nil_node_service
=== RUN   TestStateChanged/same_service
=== RUN   TestStateChanged/same_service,_different_token
=== RUN   TestStateChanged/different_service_ID
=== RUN   TestStateChanged/different_address
=== RUN   TestStateChanged/different_port
=== RUN   TestStateChanged/different_service_kind
=== RUN   TestStateChanged/different_proxy_target
=== RUN   TestStateChanged/different_proxy_upstreams
--- PASS: TestStateChanged (0.01s)
    --- PASS: TestStateChanged/nil_node_service (0.00s)
    --- PASS: TestStateChanged/same_service (0.00s)
    --- PASS: TestStateChanged/same_service,_different_token (0.00s)
    --- PASS: TestStateChanged/different_service_ID (0.00s)
    --- PASS: TestStateChanged/different_address (0.00s)
    --- PASS: TestStateChanged/different_port (0.00s)
    --- PASS: TestStateChanged/different_service_kind (0.00s)
    --- PASS: TestStateChanged/different_proxy_target (0.00s)
    --- PASS: TestStateChanged/different_proxy_upstreams (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/proxycfg	0.189s
=== RUN   TestDaemon_impl
--- PASS: TestDaemon_impl (0.00s)
=== RUN   TestDaemonStartStop
=== PAUSE TestDaemonStartStop
=== RUN   TestDaemonRestart
=== PAUSE TestDaemonRestart
=== RUN   TestDaemonLaunchesNewProcessGroup
=== PAUSE TestDaemonLaunchesNewProcessGroup
=== RUN   TestDaemonStop_kill
=== PAUSE TestDaemonStop_kill
=== RUN   TestDaemonStop_killAdopted
=== PAUSE TestDaemonStop_killAdopted
=== RUN   TestDaemonStart_pidFile
=== PAUSE TestDaemonStart_pidFile
=== RUN   TestDaemonRestart_pidFile
=== PAUSE TestDaemonRestart_pidFile
=== RUN   TestDaemonEqual
=== RUN   TestDaemonEqual/Different_type
=== RUN   TestDaemonEqual/Nil
=== RUN   TestDaemonEqual/Equal
=== RUN   TestDaemonEqual/Different_proxy_ID
=== RUN   TestDaemonEqual/Different_path
=== RUN   TestDaemonEqual/Different_dir
=== RUN   TestDaemonEqual/Different_args
=== RUN   TestDaemonEqual/Different_token
--- PASS: TestDaemonEqual (0.00s)
    --- PASS: TestDaemonEqual/Different_type (0.00s)
    --- PASS: TestDaemonEqual/Nil (0.00s)
    --- PASS: TestDaemonEqual/Equal (0.00s)
    --- PASS: TestDaemonEqual/Different_proxy_ID (0.00s)
    --- PASS: TestDaemonEqual/Different_path (0.00s)
    --- PASS: TestDaemonEqual/Different_dir (0.00s)
    --- PASS: TestDaemonEqual/Different_args (0.00s)
    --- PASS: TestDaemonEqual/Different_token (0.00s)
=== RUN   TestDaemonMarshalSnapshot
=== RUN   TestDaemonMarshalSnapshot/stopped_daemon
=== RUN   TestDaemonMarshalSnapshot/basic
--- PASS: TestDaemonMarshalSnapshot (0.00s)
    --- PASS: TestDaemonMarshalSnapshot/stopped_daemon (0.00s)
    --- PASS: TestDaemonMarshalSnapshot/basic (0.00s)
=== RUN   TestDaemonUnmarshalSnapshot
=== PAUSE TestDaemonUnmarshalSnapshot
=== RUN   TestDaemonUnmarshalSnapshot_notRunning
=== PAUSE TestDaemonUnmarshalSnapshot_notRunning
=== RUN   TestManagerClose_noRun
=== PAUSE TestManagerClose_noRun
=== RUN   TestManagerRun_initialSync
=== PAUSE TestManagerRun_initialSync
=== RUN   TestManagerRun_syncNew
=== PAUSE TestManagerRun_syncNew
=== RUN   TestManagerRun_syncDelete
=== PAUSE TestManagerRun_syncDelete
=== RUN   TestManagerRun_syncUpdate
=== PAUSE TestManagerRun_syncUpdate
=== RUN   TestManagerRun_daemonLogs
=== PAUSE TestManagerRun_daemonLogs
=== RUN   TestManagerRun_daemonPid
--- SKIP: TestManagerRun_daemonPid (0.00s)
    manager_test.go:262: DM-skipped
=== RUN   TestManagerPassesEnvironment
=== PAUSE TestManagerPassesEnvironment
=== RUN   TestManagerPassesProxyEnv
--- SKIP: TestManagerPassesProxyEnv (0.00s)
    manager_test.go:354: DM-skipped
=== RUN   TestManagerRun_snapshotRestore
=== PAUSE TestManagerRun_snapshotRestore
=== RUN   TestManagerRun_rootDisallow
2019/11/27 02:17:24 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/11/27 02:17:24 [WARN] agent/proxy: running as root, will not start managed proxies
2019/11/27 02:17:25 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
--- PASS: TestManagerRun_rootDisallow (0.78s)
=== RUN   TestNoop_impl
--- PASS: TestNoop_impl (0.00s)
=== RUN   TestHelperProcess
--- PASS: TestHelperProcess (0.00s)
=== CONT  TestDaemonStartStop
=== CONT  TestManagerRun_snapshotRestore
=== CONT  TestManagerPassesEnvironment
=== CONT  TestManagerRun_daemonLogs
logger: 2019/11/27 02:17:25 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy919268582/file"}
2019/11/27 02:17:25 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/11/27 02:17:25 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/11/27 02:17:25 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/11/27 02:17:25 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy109088520/file"}
2019/11/27 02:17:25 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "environ", "/tmp/test-agent-proxy344266618/env-variables"}
2019/11/27 02:17:25 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "output", "/tmp/test-agent-proxy783165585/notify"}
logger: 2019/11/27 02:17:25 [INFO] agent/proxy: daemon exited with exit code: 0
=== CONT  TestManagerRun_syncUpdate
--- PASS: TestDaemonStartStop (0.09s)
2019/11/27 02:17:25 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/11/27 02:17:25 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy317397739/file"}
2019/11/27 02:17:25 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/11/27 02:17:26 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/11/27 02:17:26 [INFO] agent/proxy: daemon left running
--- PASS: TestManagerRun_daemonLogs (0.63s)
=== CONT  TestManagerRun_syncDelete
2019/11/27 02:17:26 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy117348437/file"}
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestManagerPassesEnvironment (0.65s)
=== CONT  TestManagerRun_syncNew
2019/11/27 02:17:26 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy317397739/file2"}
2019/11/27 02:17:26 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/11/27 02:17:26 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy109088520/file2"}
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy089597538/file"}
2019/11/27 02:17:26 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy317397739/file2"}
2019/11/27 02:17:26 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy089597538/file2"}
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
2019/11/27 02:17:26 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
--- PASS: TestManagerRun_syncDelete (0.26s)
=== CONT  TestManagerRun_initialSync
2019/11/27 02:17:26 [DEBUG] agent/proxy: managed Connect proxy manager started
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy496592868/file"}
2019/11/27 02:17:26 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 1
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy089597538/file"}
2019/11/27 02:17:26 [DEBUG] agent/proxy: Stopping managed Connect proxy manager
=== CONT  TestManagerClose_noRun
--- PASS: TestManagerRun_syncUpdate (1.04s)
--- PASS: TestManagerClose_noRun (0.00s)
=== CONT  TestDaemonUnmarshalSnapshot_notRunning
logger: 2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy787228854/file"}
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy089597538/file2"}
logger: 2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestDaemonUnmarshalSnapshot_notRunning (0.09s)
=== CONT  TestDaemonUnmarshalSnapshot
logger: 2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy333871261/file"}
logger: 2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 0
2019/11/27 02:17:26 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy496592868/file"}
2019/11/27 02:17:26 [INFO] agent/proxy: daemon exited with exit code: 1
2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with exit code: 1
2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with exit code: 1
--- PASS: TestManagerRun_initialSync (0.75s)
=== CONT  TestDaemonRestart_pidFile
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy040290136/file"}
--- PASS: TestManagerRun_syncNew (1.00s)
=== CONT  TestDaemonStart_pidFile
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-once", "/tmp/test-agent-proxy246737111/file"}
Unknown command: "start-once"
2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with error: process 32460 is dead or running as another user
2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestManagerRun_snapshotRestore (1.72s)
=== CONT  TestDaemonStop_killAdopted
logger: 2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with exit code: 2
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-once", "/tmp/test-agent-proxy246737111/file"}
logger: 2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with exit code: 0
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy040290136/file"}
Unknown command: "start-once"
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: graceful wait of 200ms passed, killing
logger: 2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with exit code: 2
=== CONT  TestDaemonStop_kill
--- PASS: TestDaemonStart_pidFile (0.39s)
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "stop-kill", "/tmp/test-agent-proxy081287457/file"}
logger: 2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestDaemonRestart_pidFile (0.44s)
=== CONT  TestDaemonLaunchesNewProcessGroup
--- PASS: TestDaemonStop_killAdopted (0.37s)
=== CONT  TestDaemonRestart
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy660199419/file"}
2019/11/27 02:17:27 Started child
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy498041868/file"}
logger: 2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with exit code: 0
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "restart", "/tmp/test-agent-proxy660199419/file"}
logger: 2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with error: process 32584 is dead or running as another user
--- PASS: TestDaemonUnmarshalSnapshot (1.05s)
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: graceful wait of 200ms passed, killing
logger: 2019/11/27 02:17:27 [INFO] agent/proxy: daemon left running
logger: 2019/11/27 02:17:27 [INFO] agent/proxy: daemon exited with exit code: 0
--- PASS: TestDaemonRestart (0.29s)
2019/11/27 02:17:27 Started child
logger: 2019/11/27 02:17:27 [DEBUG] agent/proxy: starting proxy: "/tmp/go-build133710598/b583/proxyprocess.test" []string{"-test.run=TestHelperProcess", "--", "WANT_HELPER_PROCESS", "start-stop", "/tmp/test-agent-proxy498041868/file"}
--- PASS: TestDaemonStop_kill (0.44s)
logger: 2019/11/27 02:17:28 [INFO] agent/proxy: daemon exited with error: process 32614 is dead or running as another user
--- PASS: TestDaemonLaunchesNewProcessGroup (0.74s)
    daemon_test.go:224: Child PID was 32662 and still 32662
    daemon_test.go:241: Child PID was 32662 and is now 32690
PASS
ok  	github.com/hashicorp/consul/agent/proxyprocess	3.729s
=== RUN   TestManagerInternal_cycleServer
--- PASS: TestManagerInternal_cycleServer (0.00s)
=== RUN   TestManagerInternal_getServerList
--- PASS: TestManagerInternal_getServerList (0.00s)
=== RUN   TestManagerInternal_New
--- PASS: TestManagerInternal_New (0.00s)
=== RUN   TestManagerInternal_reconcileServerList
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [WARN] manager: No servers available
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s00 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [WARN] manager: No servers available
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s01 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 2 servers, next active server is s00 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s01 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s03 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s02 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s03 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s04 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 4 servers, next active server is s02 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 6 servers, next active server is s09 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s51 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 72 servers, next active server is s33 (Addr: /) (DC: )
--- PASS: TestManagerInternal_reconcileServerList (0.00s)
=== RUN   TestManagerInternal_refreshServerRebalanceTimer
--- PASS: TestManagerInternal_refreshServerRebalanceTimer (0.00s)
=== RUN   TestManagerInternal_saveServerList
--- PASS: TestManagerInternal_saveServerList (0.00s)
=== RUN   TestRouter_Shutdown
2019/11/27 02:17:32 [INFO] manager: shutting down
--- PASS: TestRouter_Shutdown (0.00s)
2019/11/27 02:17:32 [INFO] manager: shutting down
=== RUN   TestRouter_Routing
2019/11/27 02:17:32 [INFO] manager: shutting down
2019/11/27 02:17:32 [INFO] manager: shutting down
2019/11/27 02:17:32 [INFO] manager: shutting down
2019/11/27 02:17:32 [INFO] manager: shutting down
2019/11/27 02:17:32 [INFO] manager: shutting down
--- PASS: TestRouter_Routing (0.00s)
2019/11/27 02:17:32 [INFO] manager: shutting down
=== RUN   TestRouter_Routing_Offline
2019/11/27 02:17:32 [INFO] manager: shutting down
2019/11/27 02:17:32 [INFO] manager: shutting down
2019/11/27 02:17:32 [INFO] manager: shutting down
2019/11/27 02:17:32 [INFO] manager: shutting down
2019/11/27 02:17:32 [DEBUG] manager: pinging server "node3.dc1 (Addr: tcp/127.0.0.4:8300) (DC: dc1)" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "node4.dc1 (Addr: tcp/127.0.0.5:8300) (DC: dc1)" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "node1.dc1 (Addr: tcp/127.0.0.2:8300) (DC: dc1)" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "node2.dc1 (Addr: tcp/127.0.0.3:8300) (DC: dc1)" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
--- PASS: TestRouter_Routing_Offline (0.00s)
=== RUN   TestRouter_GetDatacenters
--- PASS: TestRouter_GetDatacenters (0.00s)
=== RUN   TestRouter_distanceSorter
--- PASS: TestRouter_distanceSorter (0.00s)
=== RUN   TestRouter_GetDatacentersByDistance
--- PASS: TestRouter_GetDatacentersByDistance (0.00s)
=== RUN   TestRouter_GetDatacenterMaps
--- PASS: TestRouter_GetDatacenterMaps (0.00s)
=== RUN   TestServers_AddServer
--- PASS: TestServers_AddServer (0.00s)
=== RUN   TestServers_IsOffline
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 1 servers, next active server is s1 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s1 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: No healthy servers during rebalance, aborting
--- PASS: TestServers_IsOffline (0.01s)
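The TestServers_IsOffline output above exercises the manager's rebalance loop: each pass shuffles the server list, probes the candidate server, and aborts with "No healthy servers during rebalance" when the ping hook reports failure. The odd-looking "%!s(<nil>)" text is simply Go's fmt output when a nil error is formatted with the %s verb by the test's failing ping stub; it is not log corruption. The following is a minimal, hypothetical sketch of that shuffle/ping/abort pattern, assuming a toy Pinger interface and serverList type rather than consul's actual agent/router API:

package main

import (
	"fmt"
	"log"
	"math/rand"
)

// Pinger is a hypothetical health-check hook; the tests above plug in a
// stub that always reports failure, which is what produces the repeated
// "No healthy servers during rebalance, aborting" lines.
type Pinger interface {
	Ping(name string) (bool, error)
}

type failPinger struct{}

func (failPinger) Ping(string) (bool, error) { return false, nil }

// serverList is a toy stand-in for the manager's server list.
type serverList struct {
	servers []string
	offline bool
}

// rebalance shuffles the list and probes candidates until one responds,
// mirroring the shuffle / ping / abort cycle visible in the log.
func (l *serverList) rebalance(p Pinger) {
	if len(l.servers) == 0 {
		log.Printf("[WARN] manager: No servers available")
		return
	}
	rand.Shuffle(len(l.servers), func(i, j int) {
		l.servers[i], l.servers[j] = l.servers[j], l.servers[i]
	})
	log.Printf("[DEBUG] manager: Rebalanced %d servers, next active server is %s",
		len(l.servers), l.servers[0])
	for _, s := range l.servers {
		ok, err := p.Ping(s)
		if ok {
			l.offline = false
			return
		}
		// Formatting a nil error with %s prints "%!s(<nil>)", matching the log above.
		log.Printf("[DEBUG] manager: pinging server %q failed: %s", s, err)
	}
	log.Printf("[DEBUG] manager: No healthy servers during rebalance, aborting")
	l.offline = true
}

func main() {
	l := &serverList{servers: []string{"s1"}}
	l.rebalance(failPinger{})
	fmt.Println("offline:", l.offline)
}

Run once against the single-server list, this sketch prints the same three-line Rebalanced / pinging failed / aborting pattern that repeats throughout the test output above.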
=== RUN   TestServers_FindServer
2019/11/27 02:17:32 [WARN] manager: No servers available
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s1"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s2"
--- PASS: TestServers_FindServer (0.00s)
=== RUN   TestServers_New
--- PASS: TestServers_New (0.00s)
=== RUN   TestServers_NotifyFailedServer
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s1"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s2"
--- PASS: TestServers_NotifyFailedServer (0.00s)
=== RUN   TestServers_NumServers
--- PASS: TestServers_NumServers (0.00s)
=== RUN   TestServers_RebalanceServers
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s65 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s18 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s65 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s48 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s53 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s91 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s07 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s49 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s33 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s09 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s02 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s20 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s24 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s39 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s13 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
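The runs of "Rebalanced 100 servers, next active server is ..." followed by long series of "cycled away from server ..." lines come from consul's client-side server-manager tests exercising a shuffle-and-rotate loop: a rebalance reshuffles the known server list, and each failed attempt rotates to the next entry. The sketch below is a minimal, generic illustration of that pattern under those assumptions; the serverManager type and the Rebalance/cycleFailed names are invented for illustration and are not consul's actual code.

    package main

    import (
        "fmt"
        "math/rand"
    )

    // serverManager is a toy stand-in for the round-robin server list the
    // log lines above exercise: Rebalance reshuffles it, and cycleFailed
    // rotates the current entry to the back so the next server becomes active.
    type serverManager struct {
        servers []string
    }

    // Rebalance reshuffles the list; the first entry afterwards is reported
    // as the "next active server", mirroring the message format in the log.
    func (m *serverManager) Rebalance() {
        rand.Shuffle(len(m.servers), func(i, j int) {
            m.servers[i], m.servers[j] = m.servers[j], m.servers[i]
        })
        fmt.Printf("Rebalanced %d servers, next active server is %s\n",
            len(m.servers), m.servers[0])
    }

    // cycleFailed moves the current server to the end of the list.
    func (m *serverManager) cycleFailed() {
        fmt.Printf("cycled away from server %q\n", m.servers[0])
        m.servers = append(m.servers[1:], m.servers[0])
    }

    func main() {
        m := &serverManager{servers: []string{"s00", "s01", "s02", "s03"}}
        m.Rebalance()
        for i := 0; i < 3; i++ {
            m.cycleFailed()
        }
    }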
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s14 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s22 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s72 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s06 (Addr: /) (DC: )" failed: %!s(<nil>)
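The "failed: %!s(<nil>)" suffix on the two pinging lines above is not log corruption: %!s(<nil>) is how Go's fmt package renders a nil value formatted with the %s verb, which suggests the test's ping-failure path logged a nil error here. A short illustration of that fmt behaviour, assuming the message is produced by a %s format of a nil error as the token implies:

    package main

    import "fmt"

    func main() {
        var err error // nil, as the ping path above presumably reported
        // fmt renders a nil operand under %s as "%!s(<nil>)", matching the log.
        fmt.Printf("pinging server %q failed: %s\n", "s72 (Addr: /) (DC: )", err)
    }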
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s65 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s65 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s77 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s24 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s04 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s26 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s31 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s72 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s71 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s79 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s25 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s91 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s57 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s15 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s87 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s17 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s82 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s73 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s83 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s16 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s22 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s94 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s19 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s53 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s17 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s57 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s64 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s53 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s20 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s76 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s56 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s95 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s27 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s05 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s10 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s44 (Addr: /) (DC: )
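[Editor's note] The repeated [DEBUG] manager lines above come from the package's Go test suite exercising the server-manager round-robin logic: the client cycles the active server to the back of its list (the "cycled away from server" lines), and a periodic rebalance reshuffles the list and announces the new active server (the "Rebalanced 100 servers, next active server is ..." lines). The "%!s(<nil>)" fragments are how Go's fmt package renders a nil error passed to %s and are part of the original log output, not transcription damage. A minimal, hypothetical sketch of the cycle/rebalance pattern these lines reflect (names such as serverList, cycle and rebalance are illustrative only, not the package's actual API):

    package main

    import (
            "fmt"
            "math/rand"
    )

    // serverList is a hypothetical stand-in for the manager's server list;
    // index 0 is treated as the currently active server.
    type serverList struct {
            servers []string
    }

    // cycle moves the active server to the back of the list, mirroring the
    // "cycled away from server" debug lines in the log above.
    func (l *serverList) cycle() {
            if len(l.servers) < 2 {
                    return
            }
            fmt.Printf("cycled away from server %q\n", l.servers[0])
            l.servers = append(l.servers[1:], l.servers[0])
    }

    // rebalance shuffles the list and reports the new active server,
    // mirroring the "Rebalanced N servers, next active server is ..." lines.
    func (l *serverList) rebalance() {
            rand.Shuffle(len(l.servers), func(i, j int) {
                    l.servers[i], l.servers[j] = l.servers[j], l.servers[i]
            })
            fmt.Printf("Rebalanced %d servers, next active server is %s\n",
                    len(l.servers), l.servers[0])
    }

    func main() {
            l := &serverList{}
            for i := 0; i < 100; i++ {
                    l.servers = append(l.servers, fmt.Sprintf("s%02d", i))
            }
            // One round, as seen in the log: cycle through every server,
            // then reshuffle and pick the next active one.
            for range l.servers {
                    l.cycle()
            }
            l.rebalance()
    }

This is only a sketch of the pattern visible in the log; the real behaviour lives in the consul source fetched at the top of this build.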
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s91 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s19 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s10 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s41 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s33 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s07 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s76 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s74 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s25 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s08 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s57 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s34 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s14 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s45 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s92 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s71 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s36 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s80 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s26 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s48 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s15 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s59 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s80 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s43 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s41 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s81 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s11 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s79 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s78 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s14 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s61 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s31 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s56 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s88 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s39 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s81 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s44 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s28 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s20 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s93 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s06 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s31 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s82 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s17 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s35 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s62 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: pinging server "s13 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:32 [DEBUG] manager: Rebalanced 100 servers, next active server is s41 (Addr: /) (DC: )
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:32 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s71 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s99 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s22 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s69 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s30 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s02 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s57 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s48 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s31 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s42 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s86 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s90 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s72 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s80 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s53 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s55 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s52 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s26 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s12 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s95 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s25 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s50 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s22 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s61 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s01 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s23 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s72 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s90 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s22 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s78 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s72 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s24 (Addr: /) (DC: )
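The surrounding pattern, every server s00 through s99 being "cycled away from" once and then a "Rebalanced 100 servers, next active server is ..." line, is the test exercising round-robin rotation of the server list plus a periodic shuffle. The following is a minimal, hypothetical Go sketch of that pattern only; the type and field names are assumptions and this is not Consul's actual manager implementation:

    package main

    import (
            "fmt"
            "math/rand"
    )

    // server is a stand-in for the entries named s00..s99 in the log.
    type server struct{ name string }

    // manager keeps an ordered list of servers and talks to the one at the head.
    type manager struct{ servers []server }

    // cycle moves the current head to the back of the list, which is what each
    // `cycled away from server "sNN"` line corresponds to.
    func (m *manager) cycle() {
            if len(m.servers) < 2 {
                    return
            }
            m.servers = append(m.servers[1:], m.servers[0])
    }

    // rebalance shuffles the list so load spreads evenly, matching the
    // "Rebalanced 100 servers, next active server is ..." lines.
    func (m *manager) rebalance() {
            rand.Shuffle(len(m.servers), func(i, j int) {
                    m.servers[i], m.servers[j] = m.servers[j], m.servers[i]
            })
            fmt.Printf("Rebalanced %d servers, next active server is %s\n",
                    len(m.servers), m.servers[0].name)
    }

    func main() {
            m := &manager{}
            for i := 0; i < 100; i++ {
                    m.servers = append(m.servers, server{name: fmt.Sprintf("s%02d", i)})
            }
            m.rebalance()
            for i := 0; i < 3; i++ {
                    fmt.Printf("cycled away from server %q\n", m.servers[0].name)
                    m.cycle()
            }
    }

Under this sketch, one full pass of cycle() over all 100 entries followed by rebalance() would emit exactly the sequence of lines visible above and below.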
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s95 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s63 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s34 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s58 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s36 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s13 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s42 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s93 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s87 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s31 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s91 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s56 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s88 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s59 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s69 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s76 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s29 (Addr: /) (DC: )
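(Note: the single failure line above, 'pinging server "s76 (Addr: /) (DC: )" failed: %!s(<nil>)', is a Go formatting quirk rather than a garbled log: %!s(<nil>) is what the fmt package prints when a nil value is given to a %s verb, so the test formatted a nil error as if it were a message. A minimal reproduction of that behaviour, illustrative only and not the test's own code:

    package main

    import "fmt"

    func main() {
        // A nil error passed to a %s verb makes fmt print "%!s(<nil>)",
        // the exact token that appears in the log line above.
        var err error
        fmt.Printf("pinging server %q failed: %s\n", "s76 (Addr: /) (DC: )", err)
    }

The rebalance-and-cycle pattern then continues unchanged below.)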
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s45 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s58 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s51 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s31 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s42 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s80 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s18 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s29 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s25 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s38 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s60 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s73 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s94 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s07 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s92 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s81 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s10 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s63 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s08 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s70 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s47 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s89 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s99 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s51 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s65 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s96 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s92 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s46 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s97 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s09 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s36 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s56 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s44 (Addr: /) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s47 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 100 servers, next active server is s17 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s75"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s51"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s91"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s93"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s61"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s48"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s78"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s67"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s72"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s63"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s62"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s88"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s45"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s99"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s85"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s83"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s41"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s53"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s82"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s94"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s77"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s50"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s32"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s00"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s42"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s11"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s46"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s30"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s36"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s87"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s74"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s25"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s60"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s95"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s33"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s22"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s52"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s20"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s38"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s65"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s68"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s49"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s26"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s73"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s23"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s80"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s71"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s21"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s56"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s27"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s76"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s98"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s31"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s34"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s24"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s84"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s44"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s10"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s43"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s47"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s96"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s90"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s81"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s92"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s66"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s55"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s79"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s57"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s59"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s58"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s89"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s70"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s97"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s69"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s86"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s35"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s37"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s40"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s29"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s64"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s39"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s06"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s54"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s28"
--- PASS: TestServers_RebalanceServers (1.12s)
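
The rebalance/cycle pattern above (shuffle the server list, announce the new head, then rotate through it) can be illustrated with a small, self-contained Go sketch. This is only an approximation of what the router tests exercise; the pool type and method names below are hypothetical and are not Consul's agent/router implementation.

package main

import (
	"fmt"
	"math/rand"
)

// pool is a toy round-robin server list used to illustrate the
// "Rebalanced N servers" / "cycled away from server" debug lines above.
type pool struct {
	servers []string
}

// rebalance shuffles the list so load spreads across servers; the first
// entry becomes the next active server.
func (p *pool) rebalance(r *rand.Rand) {
	r.Shuffle(len(p.servers), func(i, j int) {
		p.servers[i], p.servers[j] = p.servers[j], p.servers[i]
	})
	fmt.Printf("[DEBUG] rebalanced %d servers, next active server is %s\n",
		len(p.servers), p.servers[0])
}

// cycle moves the current head to the back of the list.
func (p *pool) cycle() {
	head := p.servers[0]
	p.servers = append(p.servers[1:], head)
	fmt.Printf("[DEBUG] cycled away from server %q\n", head)
}

func main() {
	p := &pool{servers: []string{"s01", "s02", "s03", "s04"}}
	r := rand.New(rand.NewSource(1))
	p.rebalance(r)
	for i := 0; i < len(p.servers); i++ {
		p.cycle()
	}
}
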
=== RUN   TestServers_RebalanceServers_AvoidFailed
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: pinging server "s2 (Addr: faux/s2) (DC: )" failed: %!s(<nil>)
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s1 (Addr: faux/s1) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 3 servers, next active server is s3 (Addr: faux/s3) (DC: )
--- PASS: TestServers_RebalanceServers_AvoidFailed (0.03s)
=== RUN   TestManager_RemoveServer
2019/11/27 02:17:33 [DEBUG] manager: Rebalanced 19 servers, next active server is s10 (Addr: /) (DC: )
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s13"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s08"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s14"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s16"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s17"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s09"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s01"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s18"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s12"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s03"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s19"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s07"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s04"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s05"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s15"
2019/11/27 02:17:33 [DEBUG] manager: cycled away from server "s02"
--- PASS: TestManager_RemoveServer (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/router	1.247s
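
The repeated "pinging server ... failed: %!s(<nil>)" lines above are not failures; %!s(<nil>) is the fmt package's marker for a nil value formatted with the %s verb. A minimal sketch that reproduces the artifact (the log call below is illustrative, not the actual router code):

package main

import (
	"fmt"
	"log"
)

func main() {
	// A nil error formatted with %s produces the "%!s(<nil>)" marker seen in
	// the router test output; %v would print "<nil>" instead.
	var err error // nil
	log.Printf("[DEBUG] manager: pinging server %q failed: %s",
		"s2 (Addr: faux/s2) (DC: )", err)
	// Guarding against nil (or using %v) avoids the artifact.
	if err != nil {
		fmt.Printf("ping failed: %v\n", err)
	}
}
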
=== RUN   TestStructs_ACLCaches
=== PAUSE TestStructs_ACLCaches
=== RUN   TestStructs_ACL_IsSame
--- PASS: TestStructs_ACL_IsSame (0.00s)
=== RUN   TestStructs_ACL_Convert
=== PAUSE TestStructs_ACL_Convert
=== RUN   TestStructs_ACLToken_Convert
=== PAUSE TestStructs_ACLToken_Convert
=== RUN   TestStructs_ACLToken_PolicyIDs
=== PAUSE TestStructs_ACLToken_PolicyIDs
=== RUN   TestStructs_ACLToken_EmbeddedPolicy
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy
=== RUN   TestStructs_ACLToken_SetHash
=== PAUSE TestStructs_ACLToken_SetHash
=== RUN   TestStructs_ACLToken_EstimateSize
=== PAUSE TestStructs_ACLToken_EstimateSize
=== RUN   TestStructs_ACLToken_Stub
=== PAUSE TestStructs_ACLToken_Stub
=== RUN   TestStructs_ACLTokens_Sort
=== PAUSE TestStructs_ACLTokens_Sort
=== RUN   TestStructs_ACLTokenListStubs_Sort
=== PAUSE TestStructs_ACLTokenListStubs_Sort
=== RUN   TestStructs_ACLPolicy_Stub
=== PAUSE TestStructs_ACLPolicy_Stub
=== RUN   TestStructs_ACLPolicy_SetHash
=== PAUSE TestStructs_ACLPolicy_SetHash
=== RUN   TestStructs_ACLPolicy_EstimateSize
=== PAUSE TestStructs_ACLPolicy_EstimateSize
=== RUN   TestStructs_ACLPolicies_Sort
=== PAUSE TestStructs_ACLPolicies_Sort
=== RUN   TestStructs_ACLPolicyListStubs_Sort
=== PAUSE TestStructs_ACLPolicyListStubs_Sort
=== RUN   TestStructs_ACLPolicies_resolveWithCache
=== PAUSE TestStructs_ACLPolicies_resolveWithCache
=== RUN   TestStructs_ACLPolicies_Compile
=== PAUSE TestStructs_ACLPolicies_Compile
=== RUN   TestCheckDefinition_Defaults
=== PAUSE TestCheckDefinition_Defaults
=== RUN   TestCheckDefinition_CheckType
=== PAUSE TestCheckDefinition_CheckType
=== RUN   TestCheckDefinitionToCheckType
=== PAUSE TestCheckDefinitionToCheckType
=== RUN   TestCAConfiguration_GetCommonConfig
=== RUN   TestCAConfiguration_GetCommonConfig/basic_defaults
=== RUN   TestCAConfiguration_GetCommonConfig/basic_defaults_after_encoding_fun
--- PASS: TestCAConfiguration_GetCommonConfig (0.00s)
    --- PASS: TestCAConfiguration_GetCommonConfig/basic_defaults (0.00s)
    --- PASS: TestCAConfiguration_GetCommonConfig/basic_defaults_after_encoding_fun (0.00s)
=== RUN   TestConnectProxyConfig_ToAPI
=== RUN   TestConnectProxyConfig_ToAPI/service
--- PASS: TestConnectProxyConfig_ToAPI (0.00s)
    --- PASS: TestConnectProxyConfig_ToAPI/service (0.00s)
=== RUN   TestUpstream_MarshalJSON
=== RUN   TestUpstream_MarshalJSON/service
=== RUN   TestUpstream_MarshalJSON/pq
--- PASS: TestUpstream_MarshalJSON (0.00s)
    --- PASS: TestUpstream_MarshalJSON/service (0.00s)
    --- PASS: TestUpstream_MarshalJSON/pq (0.00s)
=== RUN   TestUpstream_UnmarshalJSON
=== RUN   TestUpstream_UnmarshalJSON/service
=== RUN   TestUpstream_UnmarshalJSON/pq
--- PASS: TestUpstream_UnmarshalJSON (0.00s)
    --- PASS: TestUpstream_UnmarshalJSON/service (0.00s)
    --- PASS: TestUpstream_UnmarshalJSON/pq (0.00s)
=== RUN   TestConnectManagedProxy_ParseConfig
=== RUN   TestConnectManagedProxy_ParseConfig/empty
=== RUN   TestConnectManagedProxy_ParseConfig/specified
=== RUN   TestConnectManagedProxy_ParseConfig/stringy_port
=== RUN   TestConnectManagedProxy_ParseConfig/empty_addr
=== RUN   TestConnectManagedProxy_ParseConfig/empty_port
=== RUN   TestConnectManagedProxy_ParseConfig/junk_address
=== RUN   TestConnectManagedProxy_ParseConfig/zero_port,_missing_addr
=== RUN   TestConnectManagedProxy_ParseConfig/extra_fields_present
--- PASS: TestConnectManagedProxy_ParseConfig (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/empty (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/specified (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/stringy_port (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/empty_addr (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/empty_port (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/junk_address (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/zero_port,_missing_addr (0.00s)
    --- PASS: TestConnectManagedProxy_ParseConfig/extra_fields_present (0.00s)
=== RUN   TestIntentionGetACLPrefix
=== RUN   TestIntentionGetACLPrefix/unset_name
=== RUN   TestIntentionGetACLPrefix/set_name
--- PASS: TestIntentionGetACLPrefix (0.00s)
    --- PASS: TestIntentionGetACLPrefix/unset_name (0.00s)
    --- PASS: TestIntentionGetACLPrefix/set_name (0.00s)
=== RUN   TestIntentionValidate
=== RUN   TestIntentionValidate/long_description
=== RUN   TestIntentionValidate/no_action_set
=== RUN   TestIntentionValidate/no_SourceNS
=== RUN   TestIntentionValidate/no_SourceName
=== RUN   TestIntentionValidate/no_DestinationNS
=== RUN   TestIntentionValidate/no_DestinationName
=== RUN   TestIntentionValidate/SourceNS_partial_wildcard
=== RUN   TestIntentionValidate/SourceName_partial_wildcard
=== RUN   TestIntentionValidate/SourceName_exact_following_wildcard
=== RUN   TestIntentionValidate/DestinationNS_partial_wildcard
=== RUN   TestIntentionValidate/DestinationName_partial_wildcard
=== RUN   TestIntentionValidate/DestinationName_exact_following_wildcard
=== RUN   TestIntentionValidate/SourceType_is_not_set
=== RUN   TestIntentionValidate/SourceType_is_other
--- PASS: TestIntentionValidate (0.01s)
    --- PASS: TestIntentionValidate/long_description (0.00s)
    --- PASS: TestIntentionValidate/no_action_set (0.00s)
    --- PASS: TestIntentionValidate/no_SourceNS (0.00s)
    --- PASS: TestIntentionValidate/no_SourceName (0.00s)
    --- PASS: TestIntentionValidate/no_DestinationNS (0.00s)
    --- PASS: TestIntentionValidate/no_DestinationName (0.00s)
    --- PASS: TestIntentionValidate/SourceNS_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/SourceName_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/SourceName_exact_following_wildcard (0.00s)
    --- PASS: TestIntentionValidate/DestinationNS_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/DestinationName_partial_wildcard (0.00s)
    --- PASS: TestIntentionValidate/DestinationName_exact_following_wildcard (0.00s)
    --- PASS: TestIntentionValidate/SourceType_is_not_set (0.00s)
    --- PASS: TestIntentionValidate/SourceType_is_other (0.00s)
=== RUN   TestIntentionPrecedenceSorter
=== RUN   TestIntentionPrecedenceSorter/exhaustive_list
=== RUN   TestIntentionPrecedenceSorter/tiebreak_deterministically
--- PASS: TestIntentionPrecedenceSorter (0.00s)
    --- PASS: TestIntentionPrecedenceSorter/exhaustive_list (0.00s)
    --- PASS: TestIntentionPrecedenceSorter/tiebreak_deterministically (0.00s)
=== RUN   TestStructs_PreparedQuery_GetACLPrefix
--- PASS: TestStructs_PreparedQuery_GetACLPrefix (0.00s)
=== RUN   TestAgentStructs_CheckTypes
=== PAUSE TestAgentStructs_CheckTypes
=== RUN   TestServiceDefinitionValidate
=== RUN   TestServiceDefinitionValidate/valid
=== RUN   TestServiceDefinitionValidate/managed_proxy_with_a_port_set
=== RUN   TestServiceDefinitionValidate/managed_proxy_with_no_port_set
=== RUN   TestServiceDefinitionValidate/managed_proxy_with_native_set
--- PASS: TestServiceDefinitionValidate (0.00s)
    --- PASS: TestServiceDefinitionValidate/valid (0.00s)
    --- PASS: TestServiceDefinitionValidate/managed_proxy_with_a_port_set (0.00s)
    --- PASS: TestServiceDefinitionValidate/managed_proxy_with_no_port_set (0.00s)
    --- PASS: TestServiceDefinitionValidate/managed_proxy_with_native_set (0.00s)
=== RUN   TestServiceDefinitionConnectProxy_json
=== RUN   TestServiceDefinitionConnectProxy_json/no_config
=== RUN   TestServiceDefinitionConnectProxy_json/basic_config
=== RUN   TestServiceDefinitionConnectProxy_json/config_with_upstreams
--- PASS: TestServiceDefinitionConnectProxy_json (0.01s)
    --- PASS: TestServiceDefinitionConnectProxy_json/no_config (0.00s)
        service_definition_test.go:196: error: %!s(<nil>)
    --- PASS: TestServiceDefinitionConnectProxy_json/basic_config (0.00s)
        service_definition_test.go:196: error: %!s(<nil>)
    --- PASS: TestServiceDefinitionConnectProxy_json/config_with_upstreams (0.01s)
        service_definition_test.go:196: error: %!s(<nil>)
=== RUN   TestEncodeDecode
--- PASS: TestEncodeDecode (0.00s)
=== RUN   TestStructs_Implements
--- PASS: TestStructs_Implements (0.00s)
=== RUN   TestStructs_RegisterRequest_ChangesNode
--- PASS: TestStructs_RegisterRequest_ChangesNode (0.00s)
=== RUN   TestNode_IsSame
--- PASS: TestNode_IsSame (0.00s)
=== RUN   TestStructs_ServiceNode_IsSameService
--- PASS: TestStructs_ServiceNode_IsSameService (0.00s)
=== RUN   TestStructs_ServiceNode_PartialClone
--- PASS: TestStructs_ServiceNode_PartialClone (0.00s)
=== RUN   TestStructs_ServiceNode_Conversions
--- PASS: TestStructs_ServiceNode_Conversions (0.00s)
=== RUN   TestStructs_NodeService_ValidateConnectProxy
=== RUN   TestStructs_NodeService_ValidateConnectProxy/valid
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_ProxyDestination
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_whitespace_ProxyDestination
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_valid_ProxyDestination
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_port_set
=== RUN   TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_ConnectNative_set
--- PASS: TestStructs_NodeService_ValidateConnectProxy (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/valid (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_ProxyDestination (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_whitespace_ProxyDestination (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_valid_ProxyDestination (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_no_port_set (0.00s)
    --- PASS: TestStructs_NodeService_ValidateConnectProxy/connect-proxy:_ConnectNative_set (0.00s)
=== RUN   TestStructs_NodeService_ValidateSidecarService
=== RUN   TestStructs_NodeService_ValidateSidecarService/valid
=== RUN   TestStructs_NodeService_ValidateSidecarService/ID_can't_be_set
=== RUN   TestStructs_NodeService_ValidateSidecarService/Nested_sidecar_can't_be_set
=== RUN   TestStructs_NodeService_ValidateSidecarService/Sidecar_can't_have_managed_proxy
--- PASS: TestStructs_NodeService_ValidateSidecarService (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/valid (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/ID_can't_be_set (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/Nested_sidecar_can't_be_set (0.00s)
    --- PASS: TestStructs_NodeService_ValidateSidecarService/Sidecar_can't_have_managed_proxy (0.00s)
=== RUN   TestStructs_NodeService_IsSame
--- PASS: TestStructs_NodeService_IsSame (0.00s)
=== RUN   TestStructs_HealthCheck_IsSame
--- PASS: TestStructs_HealthCheck_IsSame (0.00s)
=== RUN   TestStructs_HealthCheck_Marshalling
--- PASS: TestStructs_HealthCheck_Marshalling (0.00s)
=== RUN   TestStructs_HealthCheck_Clone
--- PASS: TestStructs_HealthCheck_Clone (0.00s)
=== RUN   TestStructs_CheckServiceNodes_Shuffle
--- PASS: TestStructs_CheckServiceNodes_Shuffle (0.02s)
=== RUN   TestStructs_CheckServiceNodes_Filter
--- PASS: TestStructs_CheckServiceNodes_Filter (0.00s)
=== RUN   TestStructs_DirEntry_Clone
--- PASS: TestStructs_DirEntry_Clone (0.01s)
=== RUN   TestStructs_ValidateMetadata
--- PASS: TestStructs_ValidateMetadata (0.00s)
=== RUN   TestStructs_validateMetaPair
--- PASS: TestStructs_validateMetaPair (0.00s)
=== RUN   TestSpecificServiceRequest_CacheInfo
=== RUN   TestSpecificServiceRequest_CacheInfo/basic_params
=== RUN   TestSpecificServiceRequest_CacheInfo/name_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/node_meta_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/address_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/tag_filter_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/connect_should_be_considered
=== RUN   TestSpecificServiceRequest_CacheInfo/tags_should_be_different
=== RUN   TestSpecificServiceRequest_CacheInfo/tags_should_not_depend_on_order
=== RUN   TestSpecificServiceRequest_CacheInfo/legacy_requests_with_singular_tag_should_be_different
--- PASS: TestSpecificServiceRequest_CacheInfo (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/basic_params (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/name_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/node_meta_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/address_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/tag_filter_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/connect_should_be_considered (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/tags_should_be_different (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/tags_should_not_depend_on_order (0.00s)
    --- PASS: TestSpecificServiceRequest_CacheInfo/legacy_requests_with_singular_tag_should_be_different (0.00s)
=== CONT  TestStructs_ACLCaches
=== CONT  TestStructs_ACLPolicy_SetHash
=== CONT  TestAgentStructs_CheckTypes
=== CONT  TestStructs_ACLToken_EstimateSize
--- PASS: TestStructs_ACLToken_EstimateSize (0.00s)
=== CONT  TestCheckDefinition_CheckType
=== CONT  TestCheckDefinitionToCheckType
--- PASS: TestCheckDefinitionToCheckType (0.00s)
=== CONT  TestStructs_ACLPolicy_Stub
--- PASS: TestStructs_ACLPolicy_Stub (0.00s)
=== CONT  TestStructs_ACLTokenListStubs_Sort
--- PASS: TestStructs_ACLTokenListStubs_Sort (0.00s)
=== CONT  TestStructs_ACLTokens_Sort
--- PASS: TestStructs_ACLTokens_Sort (0.00s)
=== CONT  TestStructs_ACLToken_Stub
=== RUN   TestStructs_ACLToken_Stub/Basic
=== PAUSE TestStructs_ACLToken_Stub/Basic
=== RUN   TestStructs_ACLToken_Stub/Legacy
=== PAUSE TestStructs_ACLToken_Stub/Legacy
--- PASS: TestAgentStructs_CheckTypes (0.00s)
=== CONT  TestStructs_ACLPolicies_resolveWithCache
=== RUN   TestStructs_ACLPolicies_resolveWithCache/Cache_Misses
--- PASS: TestCheckDefinition_CheckType (0.00s)
=== CONT  TestCheckDefinition_Defaults
--- PASS: TestCheckDefinition_Defaults (0.00s)
=== CONT  TestStructs_ACLPolicies_Compile
=== RUN   TestStructs_ACLPolicy_SetHash/Nil_Hash_-_Generate
=== RUN   TestStructs_ACLPolicies_resolveWithCache/Check_Cache
=== RUN   TestStructs_ACLPolicies_Compile/Cache_Miss
=== RUN   TestStructs_ACLPolicy_SetHash/Hash_Set_-_Dont_Generate
=== RUN   TestStructs_ACLPolicy_SetHash/Hash_Set_-_Generate
--- PASS: TestStructs_ACLPolicy_SetHash (0.01s)
    --- PASS: TestStructs_ACLPolicy_SetHash/Nil_Hash_-_Generate (0.00s)
    --- PASS: TestStructs_ACLPolicy_SetHash/Hash_Set_-_Dont_Generate (0.00s)
    --- PASS: TestStructs_ACLPolicy_SetHash/Hash_Set_-_Generate (0.00s)
=== CONT  TestStructs_ACLToken_PolicyIDs
=== RUN   TestStructs_ACLToken_PolicyIDs/Basic
=== PAUSE TestStructs_ACLToken_PolicyIDs/Basic
=== RUN   TestStructs_ACLToken_PolicyIDs/Legacy_Management
=== PAUSE TestStructs_ACLToken_PolicyIDs/Legacy_Management
=== RUN   TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules
=== PAUSE TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules
=== RUN   TestStructs_ACLToken_PolicyIDs/No_Policies
=== PAUSE TestStructs_ACLToken_PolicyIDs/No_Policies
=== CONT  TestStructs_ACLToken_SetHash
=== RUN   TestStructs_ACLToken_SetHash/Nil_Hash_-_Generate
=== RUN   TestStructs_ACLToken_SetHash/Hash_Set_-_Dont_Generate
=== RUN   TestStructs_ACLToken_SetHash/Hash_Set_-_Generate
--- PASS: TestStructs_ACLToken_SetHash (0.00s)
    --- PASS: TestStructs_ACLToken_SetHash/Nil_Hash_-_Generate (0.00s)
    --- PASS: TestStructs_ACLToken_SetHash/Hash_Set_-_Dont_Generate (0.00s)
    --- PASS: TestStructs_ACLToken_SetHash/Hash_Set_-_Generate (0.00s)
=== CONT  TestStructs_ACLToken_EmbeddedPolicy
=== RUN   TestStructs_ACLToken_EmbeddedPolicy/No_Rules
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy/No_Rules
=== RUN   TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client
=== RUN   TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules
=== PAUSE TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules
=== CONT  TestStructs_ACLPolicies_Sort
--- PASS: TestStructs_ACLPolicies_Sort (0.00s)
=== CONT  TestStructs_ACLPolicyListStubs_Sort
--- PASS: TestStructs_ACLPolicyListStubs_Sort (0.00s)
=== CONT  TestStructs_ACLPolicy_EstimateSize
--- PASS: TestStructs_ACLPolicy_EstimateSize (0.00s)
=== CONT  TestStructs_ACLToken_Convert
=== RUN   TestStructs_ACLToken_Convert/Management
=== PAUSE TestStructs_ACLToken_Convert/Management
=== RUN   TestStructs_ACLToken_Convert/Client
=== PAUSE TestStructs_ACLToken_Convert/Client
=== RUN   TestStructs_ACLToken_Convert/Unconvertible
=== PAUSE TestStructs_ACLToken_Convert/Unconvertible
=== CONT  TestStructs_ACL_Convert
--- PASS: TestStructs_ACL_Convert (0.00s)
=== CONT  TestStructs_ACLToken_Stub/Basic
=== CONT  TestStructs_ACLToken_Stub/Legacy
=== RUN   TestStructs_ACLPolicies_resolveWithCache/Cache_Hits
--- PASS: TestStructs_ACLToken_Stub (0.00s)
    --- PASS: TestStructs_ACLToken_Stub/Basic (0.00s)
    --- PASS: TestStructs_ACLToken_Stub/Legacy (0.00s)
=== CONT  TestStructs_ACLToken_PolicyIDs/Basic
=== RUN   TestStructs_ACLPolicies_Compile/Check_Cache
=== RUN   TestStructs_ACLCaches/New
=== PAUSE TestStructs_ACLCaches/New
=== RUN   TestStructs_ACLCaches/Identities
=== PAUSE TestStructs_ACLCaches/Identities
=== RUN   TestStructs_ACLCaches/Policies
=== PAUSE TestStructs_ACLCaches/Policies
=== RUN   TestStructs_ACLCaches/ParsedPolicies
=== PAUSE TestStructs_ACLCaches/ParsedPolicies
=== RUN   TestStructs_ACLCaches/Authorizers
=== PAUSE TestStructs_ACLCaches/Authorizers
=== CONT  TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules
--- PASS: TestStructs_ACLPolicies_resolveWithCache (0.01s)
    --- PASS: TestStructs_ACLPolicies_resolveWithCache/Cache_Misses (0.00s)
    --- PASS: TestStructs_ACLPolicies_resolveWithCache/Check_Cache (0.00s)
    --- PASS: TestStructs_ACLPolicies_resolveWithCache/Cache_Hits (0.00s)
=== CONT  TestStructs_ACLToken_EmbeddedPolicy/No_Rules
=== CONT  TestStructs_ACLToken_Convert/Management
=== CONT  TestStructs_ACLToken_PolicyIDs/No_Policies
=== CONT  TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules
=== CONT  TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client
=== CONT  TestStructs_ACLToken_Convert/Unconvertible
=== CONT  TestStructs_ACLToken_Convert/Client
=== RUN   TestStructs_ACLPolicies_Compile/Cache_Hit
--- PASS: TestStructs_ACLToken_Convert (0.00s)
    --- PASS: TestStructs_ACLToken_Convert/Management (0.00s)
    --- PASS: TestStructs_ACLToken_Convert/Unconvertible (0.00s)
    --- PASS: TestStructs_ACLToken_Convert/Client (0.00s)
=== CONT  TestStructs_ACLCaches/Authorizers
--- PASS: TestStructs_ACLToken_EmbeddedPolicy (0.00s)
    --- PASS: TestStructs_ACLToken_EmbeddedPolicy/No_Rules (0.00s)
    --- PASS: TestStructs_ACLToken_EmbeddedPolicy/Same_Policy_for_Tokens_with_same_Rules (0.00s)
    --- PASS: TestStructs_ACLToken_EmbeddedPolicy/Legacy_Client (0.00s)
=== CONT  TestStructs_ACLToken_PolicyIDs/Legacy_Management
=== CONT  TestStructs_ACLCaches/New
=== CONT  TestStructs_ACLCaches/ParsedPolicies
=== CONT  TestStructs_ACLCaches/Policies
=== RUN   TestStructs_ACLCaches/New/Valid_Sizes
--- PASS: TestStructs_ACLPolicies_Compile (0.02s)
    --- PASS: TestStructs_ACLPolicies_Compile/Cache_Miss (0.01s)
    --- PASS: TestStructs_ACLPolicies_Compile/Check_Cache (0.00s)
    --- PASS: TestStructs_ACLPolicies_Compile/Cache_Hit (0.00s)
=== CONT  TestStructs_ACLCaches/Identities
--- PASS: TestStructs_ACLToken_PolicyIDs (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/Basic (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/Legacy_Management_With_Rules (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/No_Policies (0.00s)
    --- PASS: TestStructs_ACLToken_PolicyIDs/Legacy_Management (0.00s)
=== PAUSE TestStructs_ACLCaches/New/Valid_Sizes
=== RUN   TestStructs_ACLCaches/New/Zero_Sizes
=== PAUSE TestStructs_ACLCaches/New/Zero_Sizes
=== CONT  TestStructs_ACLCaches/New/Valid_Sizes
=== CONT  TestStructs_ACLCaches/New/Zero_Sizes
--- PASS: TestStructs_ACLCaches (0.01s)
    --- PASS: TestStructs_ACLCaches/Authorizers (0.00s)
    --- PASS: TestStructs_ACLCaches/ParsedPolicies (0.00s)
    --- PASS: TestStructs_ACLCaches/Policies (0.00s)
    --- PASS: TestStructs_ACLCaches/Identities (0.00s)
    --- PASS: TestStructs_ACLCaches/New (0.00s)
        --- PASS: TestStructs_ACLCaches/New/Valid_Sizes (0.00s)
        --- PASS: TestStructs_ACLCaches/New/Zero_Sizes (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/structs	0.213s
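
The === PAUSE / === CONT interleaving in the agent/structs output above is what go test -v prints for parallel subtests: a subtest that calls t.Parallel() is paused until the parent test's sequential phase finishes and is then resumed. A minimal sketch (hypothetical test and case names, not taken from the consul sources) that produces the same markers:

package example

import "testing"

// Running this file with "go test -v" prints === RUN, === PAUSE and
// === CONT lines analogous to the agent/structs output above.
func TestParallelSubtests(t *testing.T) {
	cases := []string{"Basic", "Legacy"}
	for _, name := range cases {
		name := name // capture the loop variable for the parallel closure
		t.Run(name, func(t *testing.T) {
			t.Parallel() // the subtest is PAUSEd here and CONTinued later
			if len(name) == 0 {
				t.Fatal("empty case name")
			}
		})
	}
}
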
?   	github.com/hashicorp/consul/agent/systemd	[no test files]
=== RUN   TestStore_RegularTokens
=== PAUSE TestStore_RegularTokens
=== RUN   TestStore_AgentMasterToken
=== PAUSE TestStore_AgentMasterToken
=== CONT  TestStore_RegularTokens
=== RUN   TestStore_RegularTokens/set_user_-_config
=== PAUSE TestStore_RegularTokens/set_user_-_config
=== RUN   TestStore_RegularTokens/set_user_-_api
=== PAUSE TestStore_RegularTokens/set_user_-_api
=== RUN   TestStore_RegularTokens/set_agent_-_config
=== PAUSE TestStore_RegularTokens/set_agent_-_config
=== RUN   TestStore_RegularTokens/set_agent_-_api
=== PAUSE TestStore_RegularTokens/set_agent_-_api
=== RUN   TestStore_RegularTokens/set_user_and_agent
=== PAUSE TestStore_RegularTokens/set_user_and_agent
=== RUN   TestStore_RegularTokens/set_repl_-_config
=== PAUSE TestStore_RegularTokens/set_repl_-_config
=== RUN   TestStore_RegularTokens/set_repl_-_api
=== PAUSE TestStore_RegularTokens/set_repl_-_api
=== RUN   TestStore_RegularTokens/set_master_-_config
=== PAUSE TestStore_RegularTokens/set_master_-_config
=== RUN   TestStore_RegularTokens/set_master_-_api
=== PAUSE TestStore_RegularTokens/set_master_-_api
=== RUN   TestStore_RegularTokens/set_all
=== PAUSE TestStore_RegularTokens/set_all
=== CONT  TestStore_RegularTokens/set_user_-_config
=== CONT  TestStore_RegularTokens/set_repl_-_config
=== CONT  TestStore_RegularTokens/set_all
=== CONT  TestStore_RegularTokens/set_agent_-_api
=== CONT  TestStore_RegularTokens/set_user_and_agent
=== CONT  TestStore_RegularTokens/set_master_-_api
=== CONT  TestStore_RegularTokens/set_repl_-_api
=== CONT  TestStore_RegularTokens/set_master_-_config
=== CONT  TestStore_RegularTokens/set_agent_-_config
=== CONT  TestStore_RegularTokens/set_user_-_api
=== CONT  TestStore_AgentMasterToken
--- PASS: TestStore_AgentMasterToken (0.00s)
--- PASS: TestStore_RegularTokens (0.00s)
    --- PASS: TestStore_RegularTokens/set_user_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_all (0.00s)
    --- PASS: TestStore_RegularTokens/set_repl_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_agent_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_repl_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_master_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_agent_-_config (0.00s)
    --- PASS: TestStore_RegularTokens/set_user_-_api (0.00s)
    --- PASS: TestStore_RegularTokens/set_user_and_agent (0.00s)
    --- PASS: TestStore_RegularTokens/set_master_-_api (0.00s)
PASS
ok  	github.com/hashicorp/consul/agent/token	0.049s
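
The TestStore_RegularTokens cases above set tokens from both configuration and the API. A rough sketch of the layered lookup they suggest follows; the type and the fall-back rule (agent token defaulting to the user token when unset) are assumptions for illustration, not the package's actual API.

package main

import "fmt"

// tokenStore is an illustrative stand-in for a token store that accepts
// values from config or from the API and falls back between token kinds.
type tokenStore struct {
	userToken  string
	agentToken string
}

func (s *tokenStore) UserToken() string { return s.userToken }

// AgentToken returns the agent token, falling back to the user (default)
// token when no agent-specific token has been set.
func (s *tokenStore) AgentToken() string {
	if s.agentToken != "" {
		return s.agentToken
	}
	return s.userToken
}

func main() {
	s := &tokenStore{userToken: "from-config"}
	fmt.Println(s.AgentToken()) // from-config (fallback)
	s.agentToken = "from-api"   // later overridden via the API
	fmt.Println(s.AgentToken()) // from-api
}
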
=== RUN   TestServer_StreamAggregatedResources_BasicProtocol
--- PASS: TestServer_StreamAggregatedResources_BasicProtocol (0.17s)
=== RUN   TestServer_StreamAggregatedResources_ACLEnforcement
--- SKIP: TestServer_StreamAggregatedResources_ACLEnforcement (0.00s)
    server_test.go:560: DM-skipped
=== RUN   TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedDuringDiscoveryRequest
2019/11/27 02:18:06 [DEBUG] Error handling ADS stream: rpc error: code = Unauthenticated desc = unauthenticated: ACL not found
--- PASS: TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedDuringDiscoveryRequest (0.05s)
=== RUN   TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedInBackground
2019/11/27 02:18:06 [DEBUG] Error handling ADS stream: rpc error: code = Unauthenticated desc = unauthenticated: ACL not found
--- PASS: TestServer_StreamAggregatedResources_ACLTokenDeleted_StreamTerminatedInBackground (0.17s)
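
The "rpc error: code = Unauthenticated desc = unauthenticated: ACL not found" lines above are the standard string form of a gRPC status error. The sketch below shows how such an error is typically built and inspected with google.golang.org/grpc/status and codes; it is illustrative only, not the xDS server's actual code path.

package main

import (
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// aclNotFound stands in for the ACL resolution failure surfaced by the test.
var aclNotFound = errors.New("ACL not found")

func handleStream() error {
	// Wrapping the failure as an Unauthenticated status yields the
	// "rpc error: code = Unauthenticated desc = ..." text seen above.
	return status.Errorf(codes.Unauthenticated, "unauthenticated: %v", aclNotFound)
}

func main() {
	err := handleStream()
	fmt.Println(err) // rpc error: code = Unauthenticated desc = unauthenticated: ACL not found
	if s, ok := status.FromError(err); ok && s.Code() == codes.Unauthenticated {
		fmt.Println("stream terminated: client must re-authenticate")
	}
}
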
=== RUN   TestServer_Check
=== RUN   TestServer_Check/auth_allowed
=== RUN   TestServer_Check/auth_denied
=== RUN   TestServer_Check/no_source
=== RUN   TestServer_Check/no_dest
=== RUN   TestServer_Check/dest_invalid_format
=== RUN   TestServer_Check/dest_not_a_service_URI
=== RUN   TestServer_Check/ACL_not_got_permission_for_authz_call
=== RUN   TestServer_Check/Random_error_running_authz
--- PASS: TestServer_Check (0.01s)
    --- PASS: TestServer_Check/auth_allowed (0.00s)
    --- PASS: TestServer_Check/auth_denied (0.00s)
    --- PASS: TestServer_Check/no_source (0.00s)
    --- PASS: TestServer_Check/no_dest (0.00s)
    --- PASS: TestServer_Check/dest_invalid_format (0.00s)
    --- PASS: TestServer_Check/dest_not_a_service_URI (0.00s)
    --- PASS: TestServer_Check/ACL_not_got_permission_for_authz_call (0.00s)
    --- PASS: TestServer_Check/Random_error_running_authz (0.00s)
=== RUN   TestServer_ConfigOverridesListeners
=== RUN   TestServer_ConfigOverridesListeners/sanity_check_no_custom
=== RUN   TestServer_ConfigOverridesListeners/custom_public_listener_no_type
=== RUN   TestServer_ConfigOverridesListeners/custom_public_listener_with_type
=== RUN   TestServer_ConfigOverridesListeners/custom_public_listener_with_TLS_should_be_overridden
=== RUN   TestServer_ConfigOverridesListeners/custom_upstream_no_type
=== RUN   TestServer_ConfigOverridesListeners/custom_upstream_with_type
--- PASS: TestServer_ConfigOverridesListeners (0.26s)
    --- PASS: TestServer_ConfigOverridesListeners/sanity_check_no_custom (0.05s)
    --- PASS: TestServer_ConfigOverridesListeners/custom_public_listener_no_type (0.04s)
    --- PASS: TestServer_ConfigOverridesListeners/custom_public_listener_with_type (0.06s)
    --- PASS: TestServer_ConfigOverridesListeners/custom_public_listener_with_TLS_should_be_overridden (0.04s)
    --- PASS: TestServer_ConfigOverridesListeners/custom_upstream_no_type (0.04s)
    --- PASS: TestServer_ConfigOverridesListeners/custom_upstream_with_type (0.04s)
=== RUN   TestServer_ConfigOverridesClusters
=== RUN   TestServer_ConfigOverridesClusters/sanity_check_no_custom
version_info:"00000001" resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\tlocal_app\"\002\010\005:\020\n\016\022\t127.0.0.1\030\220?" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\nservice:db\020\003\032\004\n\002\032\000\"\002\010\005Z\237\020\n\234\020\n\000\022\237\t\n\263\007\032\260\007-----BEGIN CERTIFICATE-----\nMIICjTCCAjOgAwIBAgIIIQRMTrmtxfMwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMDAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABNoGHK9Wqx++EGEG\nCpT2Pr7rM8QLzOW6D5Vs5CZIxzluWrJc5BIwflSUIIpFFoSqQ5rvBQoAAi58hjqW\nrxZZ7tWjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEXzc4OjNiOjRkOmNl\nOjFhOjNiOjZlOmQzOmFiOjIyOjU0OjFiOjdkOjI2OjQ5OjVlOmE0OmJhOmE3OjBl\nOjg0Ojk3Ojc5OjYxOjdhOmJjOjVlOmE3OjI2OmVmOmI4OjdjMGoGA1UdIwRjMGGA\nXzhhOjk3OmQ3OmU1OjVhOjBhOjA5OmQ4OjlkOmVhOjBkOjQ2Ojg3OmJkOmE4Ojli\nOjQ5OjNhOmExOjgxOmE4OmJkOmQxOjI1OjZiOmNiOjY3OjAxOmIzOjkxOmMwOjFl\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNIADBFAiAWkUk+/MQcQUwrd9OQEfO/gq9oZt3DbP5wBKZzLEfXEgIh\nAPM4rHS71y23gO5ivm5V9BjiADXJY9RJYATVPMn0QOCT\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEICY/8cYixSWjO9kMJRb32lhKMwnqRMLOo+NE4MJZFKdjoAoGCCqGSM49\nAwEHoUQDQgAE2gYcr1arH74QYQYKlPY+vuszxAvM5boPlWzkJkjHOW5aslzkEjB+\nVJQgikUWhKpDmu8FCgACLnyGOpavFlnu1Q==\n-----END EC PRIVATE KEY-----\n\032\365\006\n\362\006\032\357\006-----BEGIN CERTIFICATE-----\nMIICXTCCAgSgAwIBAgIIBAD0bXZhg+UwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMDAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTAwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAAR/SljI\nHXeENvDVasdHCpAva2YomgKIgVbaXR4QfnNjNjA5x7FwzMDyZ7/k7xGltC31fDRp\nd50UJ50duHsAHz8Io4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEXzhhOjk3OmQ3OmU1OjVhOjBhOjA5OmQ4OjlkOmVhOjBk\nOjQ2Ojg3OmJkOmE4OjliOjQ5OjNhOmExOjgxOmE4OmJkOmQxOjI1OjZiOmNiOjY3\nOjAxOmIzOjkxOmMwOjFlMGoGA1UdIwRjMGGAXzhhOjk3OmQ3OmU1OjVhOjBhOjA5\nOmQ4OjlkOmVhOjBkOjQ2Ojg3OmJkOmE4OjliOjQ5OjNhOmExOjgxOmE4OmJkOmQx\nOjI1OjZiOmNiOjY3OjAxOmIzOjkxOmMwOjFlMD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDRwAwRAIgNYrK5DzYD4/Keq0XfnXzhWLQouD3YyENBqNRU7cIpFYC\nIBELaJ8WBJru3kL9gncK/Iy/brfNsIQcdri8aKcrPpxg\n-----END CERTIFICATE-----\n" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\030prepared_query:geo-cache\020\003\032\004\n\002\032\000\"\002\010\005Z\237\020\n\234\020\n\000\022\237\t\n\263\007\032\260\007-----BEGIN 
CERTIFICATE-----\nMIICjTCCAjOgAwIBAgIIIQRMTrmtxfMwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMDAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABNoGHK9Wqx++EGEG\nCpT2Pr7rM8QLzOW6D5Vs5CZIxzluWrJc5BIwflSUIIpFFoSqQ5rvBQoAAi58hjqW\nrxZZ7tWjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEXzc4OjNiOjRkOmNl\nOjFhOjNiOjZlOmQzOmFiOjIyOjU0OjFiOjdkOjI2OjQ5OjVlOmE0OmJhOmE3OjBl\nOjg0Ojk3Ojc5OjYxOjdhOmJjOjVlOmE3OjI2OmVmOmI4OjdjMGoGA1UdIwRjMGGA\nXzhhOjk3OmQ3OmU1OjVhOjBhOjA5OmQ4OjlkOmVhOjBkOjQ2Ojg3OmJkOmE4Ojli\nOjQ5OjNhOmExOjgxOmE4OmJkOmQxOjI1OjZiOmNiOjY3OjAxOmIzOjkxOmMwOjFl\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNIADBFAiAWkUk+/MQcQUwrd9OQEfO/gq9oZt3DbP5wBKZzLEfXEgIh\nAPM4rHS71y23gO5ivm5V9BjiADXJY9RJYATVPMn0QOCT\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEICY/8cYixSWjO9kMJRb32lhKMwnqRMLOo+NE4MJZFKdjoAoGCCqGSM49\nAwEHoUQDQgAE2gYcr1arH74QYQYKlPY+vuszxAvM5boPlWzkJkjHOW5aslzkEjB+\nVJQgikUWhKpDmu8FCgACLnyGOpavFlnu1Q==\n-----END EC PRIVATE KEY-----\n\032\365\006\n\362\006\032\357\006-----BEGIN CERTIFICATE-----\nMIICXTCCAgSgAwIBAgIIBAD0bXZhg+UwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMDAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTAwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAAR/SljI\nHXeENvDVasdHCpAva2YomgKIgVbaXR4QfnNjNjA5x7FwzMDyZ7/k7xGltC31fDRp\nd50UJ50duHsAHz8Io4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEXzhhOjk3OmQ3OmU1OjVhOjBhOjA5OmQ4OjlkOmVhOjBk\nOjQ2Ojg3OmJkOmE4OjliOjQ5OjNhOmExOjgxOmE4OmJkOmQxOjI1OjZiOmNiOjY3\nOjAxOmIzOjkxOmMwOjFlMGoGA1UdIwRjMGGAXzhhOjk3OmQ3OmU1OjVhOjBhOjA5\nOmQ4OjlkOmVhOjBkOjQ2Ojg3OmJkOmE4OjliOjQ5OjNhOmExOjgxOmE4OmJkOmQx\nOjI1OjZiOmNiOjY3OjAxOmIzOjkxOmMwOjFlMD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDRwAwRAIgNYrK5DzYD4/Keq0XfnXzhWLQouD3YyENBqNRU7cIpFYC\nIBELaJ8WBJru3kL9gncK/Iy/brfNsIQcdri8aKcrPpxg\n-----END CERTIFICATE-----\n" > type_url:"type.googleapis.com/envoy.api.v2.Cluster" nonce:"00000001" 
=== RUN   TestServer_ConfigOverridesClusters/custom_public_with_no_type
version_info:"00000001" resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\007mylocal\"\002\010\005:\020\n\016\022\t127.0.0.1\030\220?" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\nservice:db\020\003\032\004\n\002\032\000\"\002\010\005Z\247\020\n\244\020\n\000\022\243\t\n\267\007\032\264\007-----BEGIN CERTIFICATE-----\nMIICjjCCAjOgAwIBAgIIPys2Zj/SSQEwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMTAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABEDWoq1XKw9QmPar\nF/mxjx9TKC5dy3aEDch/67x9e9HblxoRpjaV1LbEyfe4O+jsqfGb/RHa/vwg003v\n7ClHLayjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEX2M5OjE1OmVjOjI4\nOmQwOjcyOmIyOjc5OjIxOjhhOmU0OjMxOjQwOmIwOjk4OjJkOmRhOjFiOjUyOjUz\nOjY5OjY4OmZlOmZmOjE2OmQ0OjcwOjA4OmVkOjE4Ojg3OjNkMGoGA1UdIwRjMGGA\nXzAzOjhjOjY3OjgxOjU1OjQ5OjI4Ojc2OjlmOjhmOmJkOmNiOjE3OjYxOjA2Ojk1\nOjIxOjJkOjhiOmVhOjA4OjgwOjI2OmU4OjMyOmFkOmQ1OmM4OjMxOjg4Ojk0OjBi\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNJADBGAiEAz80MSc/3kI7AjLbxtZooYDvnT9XmpmRt71MqAKEMRCkC\nIQDUK249VitccyQcX7cnGFWKm2tL/N4lodWu0Kx+wL5GZQ==\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEIMo/b8HBPS311bSO/NQtoivZsC+pp3DF4nVezm1uo+QLoAoGCCqGSM49\nAwEHoUQDQgAEQNairVcrD1CY9qsX+bGPH1MoLl3LdoQNyH/rvH170duXGhGmNpXU\ntsTJ97g76Oyp8Zv9Edr+/CDTTe/sKUctrA==\n-----END EC PRIVATE KEY-----\n\032\371\006\n\366\006\032\363\006-----BEGIN CERTIFICATE-----\nMIICXjCCAgSgAwIBAgIIYWSBnKYqPnIwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMTAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTEwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAAQW/5yX\n0VKa7jGJxpLrU846PdcApKaBovebnn2lUVwwh4p71D9MGZa25kr7vtHWwb1P8w3b\n9+otq9obzsIsBTF+o4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEXzAzOjhjOjY3OjgxOjU1OjQ5OjI4Ojc2OjlmOjhmOmJk\nOmNiOjE3OjYxOjA2Ojk1OjIxOjJkOjhiOmVhOjA4OjgwOjI2OmU4OjMyOmFkOmQ1\nOmM4OjMxOjg4Ojk0OjBiMGoGA1UdIwRjMGGAXzAzOjhjOjY3OjgxOjU1OjQ5OjI4\nOjc2OjlmOjhmOmJkOmNiOjE3OjYxOjA2Ojk1OjIxOjJkOjhiOmVhOjA4OjgwOjI2\nOmU4OjMyOmFkOmQ1OmM4OjMxOjg4Ojk0OjBiMD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDSAAwRQIhAP+1oUo1ARdwmVuuk7kq31pmVaZDRQL7vUE+y3NicgQK\nAiBhqu6p3aylxlmI4+Mi5lc+/vJ10Nw6oo7LUeQgyXMaeQ==\n-----END CERTIFICATE-----\n" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\030prepared_query:geo-cache\020\003\032\004\n\002\032\000\"\002\010\005Z\247\020\n\244\020\n\000\022\243\t\n\267\007\032\264\007-----BEGIN 
CERTIFICATE-----\nMIICjjCCAjOgAwIBAgIIPys2Zj/SSQEwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMTAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABEDWoq1XKw9QmPar\nF/mxjx9TKC5dy3aEDch/67x9e9HblxoRpjaV1LbEyfe4O+jsqfGb/RHa/vwg003v\n7ClHLayjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEX2M5OjE1OmVjOjI4\nOmQwOjcyOmIyOjc5OjIxOjhhOmU0OjMxOjQwOmIwOjk4OjJkOmRhOjFiOjUyOjUz\nOjY5OjY4OmZlOmZmOjE2OmQ0OjcwOjA4OmVkOjE4Ojg3OjNkMGoGA1UdIwRjMGGA\nXzAzOjhjOjY3OjgxOjU1OjQ5OjI4Ojc2OjlmOjhmOmJkOmNiOjE3OjYxOjA2Ojk1\nOjIxOjJkOjhiOmVhOjA4OjgwOjI2OmU4OjMyOmFkOmQ1OmM4OjMxOjg4Ojk0OjBi\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNJADBGAiEAz80MSc/3kI7AjLbxtZooYDvnT9XmpmRt71MqAKEMRCkC\nIQDUK249VitccyQcX7cnGFWKm2tL/N4lodWu0Kx+wL5GZQ==\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEIMo/b8HBPS311bSO/NQtoivZsC+pp3DF4nVezm1uo+QLoAoGCCqGSM49\nAwEHoUQDQgAEQNairVcrD1CY9qsX+bGPH1MoLl3LdoQNyH/rvH170duXGhGmNpXU\ntsTJ97g76Oyp8Zv9Edr+/CDTTe/sKUctrA==\n-----END EC PRIVATE KEY-----\n\032\371\006\n\366\006\032\363\006-----BEGIN CERTIFICATE-----\nMIICXjCCAgSgAwIBAgIIYWSBnKYqPnIwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMTAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTEwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAAQW/5yX\n0VKa7jGJxpLrU846PdcApKaBovebnn2lUVwwh4p71D9MGZa25kr7vtHWwb1P8w3b\n9+otq9obzsIsBTF+o4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEXzAzOjhjOjY3OjgxOjU1OjQ5OjI4Ojc2OjlmOjhmOmJk\nOmNiOjE3OjYxOjA2Ojk1OjIxOjJkOjhiOmVhOjA4OjgwOjI2OmU4OjMyOmFkOmQ1\nOmM4OjMxOjg4Ojk0OjBiMGoGA1UdIwRjMGGAXzAzOjhjOjY3OjgxOjU1OjQ5OjI4\nOjc2OjlmOjhmOmJkOmNiOjE3OjYxOjA2Ojk1OjIxOjJkOjhiOmVhOjA4OjgwOjI2\nOmU4OjMyOmFkOmQ1OmM4OjMxOjg4Ojk0OjBiMD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDSAAwRQIhAP+1oUo1ARdwmVuuk7kq31pmVaZDRQL7vUE+y3NicgQK\nAiBhqu6p3aylxlmI4+Mi5lc+/vJ10Nw6oo7LUeQgyXMaeQ==\n-----END CERTIFICATE-----\n" > type_url:"type.googleapis.com/envoy.api.v2.Cluster" nonce:"00000001" 
=== RUN   TestServer_ConfigOverridesClusters/custom_public_with_type
version_info:"00000001" resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\007mylocal\"\002\010\005:\020\n\016\022\t127.0.0.1\030\220?" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\nservice:db\020\003\032\004\n\002\032\000\"\002\010\005Z\243\020\n\240\020\n\000\022\243\t\n\267\007\032\264\007-----BEGIN CERTIFICATE-----\nMIICjjCCAjOgAwIBAgIIdkWJ+sxRt/UwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMjAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABF+GkINWqLvuUyNg\nslaVUPDTBbj9p2sTOILQZ0C8oaF13JYdSlYr2++BuSpd3x3BAzvpEgfpNBWbnEZk\nrbZKarGjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEXzY2OjZkOjc0OjM1\nOmM3OjlmOjQ4OjJlOjljOjlmOjlmOmZjOjJmOjRkOjJmOjMxOjM5Ojg4OmI0OmFk\nOmVhOjA0OjQ5OjRkOjcyOjVjOmQyOjRkOmJkOjI1Ojk5OmU1MGoGA1UdIwRjMGGA\nXzY0Ojc0OjlhOmQ4OjY5OjUyOjEyOjg1OjI3OmQyOjUxOjBmOjk4OjI5OjI4OjA1\nOjg3OmNmOjc5OmExOjFiOmQyOjg2OmYzOjg3OmE3OjI5OmFmOmVkOjVjOjE1OjBi\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNJADBGAiEAq8y9723AkYMiJVW3bZDHErbyPn7MKZ5GiW0ohF1tnScC\nIQD0GghMnQFY2m7xHNVZYRWlNQhbO5zjNiWnjIk+oVVvMg==\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEIGM+CMUFdUwKcFgrf8cldjsx+lI+yq8O+UGkXgwgljR1oAoGCCqGSM49\nAwEHoUQDQgAEX4aQg1aou+5TI2CyVpVQ8NMFuP2naxM4gtBnQLyhoXXclh1KVivb\n74G5Kl3fHcEDO+kSB+k0FZucRmSttkpqsQ==\n-----END EC PRIVATE KEY-----\n\032\365\006\n\362\006\032\357\006-----BEGIN CERTIFICATE-----\nMIICXTCCAgSgAwIBAgIIHpaNMI7Tn7YwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMjAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTIwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAAQTqYPJ\n/QgaPvZ2fFwIjgngf6feJZa3cXEde6t9/AsnmF5DMEgqdERQdQDWY3hjy6O58DA/\nC9YfdbZcqLtjvU5Ao4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEXzY0Ojc0OjlhOmQ4OjY5OjUyOjEyOjg1OjI3OmQyOjUx\nOjBmOjk4OjI5OjI4OjA1Ojg3OmNmOjc5OmExOjFiOmQyOjg2OmYzOjg3OmE3OjI5\nOmFmOmVkOjVjOjE1OjBiMGoGA1UdIwRjMGGAXzY0Ojc0OjlhOmQ4OjY5OjUyOjEy\nOjg1OjI3OmQyOjUxOjBmOjk4OjI5OjI4OjA1Ojg3OmNmOjc5OmExOjFiOmQyOjg2\nOmYzOjg3OmE3OjI5OmFmOmVkOjVjOjE1OjBiMD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDRwAwRAIgcYl1FCcYoAzdoPXDJYs96smmLek8FcUj+UhHwelISRsC\nIHM0I0y2VgaLYPqKmdpisL+CpX8JxxAYT5jkTnCtOLqi\n-----END CERTIFICATE-----\n" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\030prepared_query:geo-cache\020\003\032\004\n\002\032\000\"\002\010\005Z\243\020\n\240\020\n\000\022\243\t\n\267\007\032\264\007-----BEGIN 
CERTIFICATE-----\nMIICjjCCAjOgAwIBAgIIdkWJ+sxRt/UwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMjAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABF+GkINWqLvuUyNg\nslaVUPDTBbj9p2sTOILQZ0C8oaF13JYdSlYr2++BuSpd3x3BAzvpEgfpNBWbnEZk\nrbZKarGjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEXzY2OjZkOjc0OjM1\nOmM3OjlmOjQ4OjJlOjljOjlmOjlmOmZjOjJmOjRkOjJmOjMxOjM5Ojg4OmI0OmFk\nOmVhOjA0OjQ5OjRkOjcyOjVjOmQyOjRkOmJkOjI1Ojk5OmU1MGoGA1UdIwRjMGGA\nXzY0Ojc0OjlhOmQ4OjY5OjUyOjEyOjg1OjI3OmQyOjUxOjBmOjk4OjI5OjI4OjA1\nOjg3OmNmOjc5OmExOjFiOmQyOjg2OmYzOjg3OmE3OjI5OmFmOmVkOjVjOjE1OjBi\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNJADBGAiEAq8y9723AkYMiJVW3bZDHErbyPn7MKZ5GiW0ohF1tnScC\nIQD0GghMnQFY2m7xHNVZYRWlNQhbO5zjNiWnjIk+oVVvMg==\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEIGM+CMUFdUwKcFgrf8cldjsx+lI+yq8O+UGkXgwgljR1oAoGCCqGSM49\nAwEHoUQDQgAEX4aQg1aou+5TI2CyVpVQ8NMFuP2naxM4gtBnQLyhoXXclh1KVivb\n74G5Kl3fHcEDO+kSB+k0FZucRmSttkpqsQ==\n-----END EC PRIVATE KEY-----\n\032\365\006\n\362\006\032\357\006-----BEGIN CERTIFICATE-----\nMIICXTCCAgSgAwIBAgIIHpaNMI7Tn7YwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMjAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTIwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAAQTqYPJ\n/QgaPvZ2fFwIjgngf6feJZa3cXEde6t9/AsnmF5DMEgqdERQdQDWY3hjy6O58DA/\nC9YfdbZcqLtjvU5Ao4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEXzY0Ojc0OjlhOmQ4OjY5OjUyOjEyOjg1OjI3OmQyOjUx\nOjBmOjk4OjI5OjI4OjA1Ojg3OmNmOjc5OmExOjFiOmQyOjg2OmYzOjg3OmE3OjI5\nOmFmOmVkOjVjOjE1OjBiMGoGA1UdIwRjMGGAXzY0Ojc0OjlhOmQ4OjY5OjUyOjEy\nOjg1OjI3OmQyOjUxOjBmOjk4OjI5OjI4OjA1Ojg3OmNmOjc5OmExOjFiOmQyOjg2\nOmYzOjg3OmE3OjI5OmFmOmVkOjVjOjE1OjBiMD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDRwAwRAIgcYl1FCcYoAzdoPXDJYs96smmLek8FcUj+UhHwelISRsC\nIHM0I0y2VgaLYPqKmdpisL+CpX8JxxAYT5jkTnCtOLqi\n-----END CERTIFICATE-----\n" > type_url:"type.googleapis.com/envoy.api.v2.Cluster" nonce:"00000001" 
=== RUN   TestServer_ConfigOverridesClusters/custom_upstream_with_no_type
version_info:"00000001" resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\tlocal_app\"\002\010\005:\020\n\016\022\t127.0.0.1\030\220?" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\tmyservice\020\003\032\004\n\002\032\000\"\002\010\005Z\243\020\n\240\020\n\000\022\237\t\n\263\007\032\260\007-----BEGIN CERTIFICATE-----\nMIICjTCCAjOgAwIBAgIIS767Sy/OTeEwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMzAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABFIY1UYko9R1qJhQ\n2rrihGjPcc1UhkPQ5Yva54xzPdlep+KGjtTiq3CW+Z40L5Mn4u2RzH9wzZ7MHl98\nhs6naGWjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEX2FmOjFkOmMwOjE5\nOjM5OjUyOjgxOmM5OjEyOjA1OmI2OjU2OjVhOjQ1OmUxOjc5OmEzOmI1OjUzOjBk\nOjBiOjZlOjU3OjJiOmMwOjkyOmJiOjllOmI5Ojg5OjRlOjM2MGoGA1UdIwRjMGGA\nXzIwOjZmOjdjOmZkOmQ2OjI5OmUzOjMyOmZjOjcyOjhjOjcwOjgwOjRhOjRjOjVm\nOmU4OmViOjhlOmNlOjQ3OjQ3OmE1OjA5OmQyOjIwOjg4OmVkOjFlOmI2OmZjOjU2\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNIADBFAiEAowfmFpFG6DfgqX0DtYZQKhRl1QPqxhr+H+gvSmM0cQUC\nIEv0rtbjExJHFX2RzyE5QArnFscUQ24ssUMD/k337GCu\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEIBXGW7qGI/xxwSPilMtkPVmKHYKAPLNny9B0loxqImxsoAoGCCqGSM49\nAwEHoUQDQgAEUhjVRiSj1HWomFDauuKEaM9xzVSGQ9Dli9rnjHM92V6n4oaO1OKr\ncJb5njQvkyfi7ZHMf3DNnsweX3yGzqdoZQ==\n-----END EC PRIVATE KEY-----\n\032\371\006\n\366\006\032\363\006-----BEGIN CERTIFICATE-----\nMIICXzCCAgSgAwIBAgIIVDur7lvaVO4wCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMzAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTMwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAASO5T/1\nHDKgw4gdUQpZMpUYjtU/ycTMDQ5R+sYqpSBuTWWNTsFD69oHTpjLaJJryQYoNdIk\nR6qT+SFS0Eyo/DcHo4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEXzIwOjZmOjdjOmZkOmQ2OjI5OmUzOjMyOmZjOjcyOjhj\nOjcwOjgwOjRhOjRjOjVmOmU4OmViOjhlOmNlOjQ3OjQ3OmE1OjA5OmQyOjIwOjg4\nOmVkOjFlOmI2OmZjOjU2MGoGA1UdIwRjMGGAXzIwOjZmOjdjOmZkOmQ2OjI5OmUz\nOjMyOmZjOjcyOjhjOjcwOjgwOjRhOjRjOjVmOmU4OmViOjhlOmNlOjQ3OjQ3OmE1\nOjA5OmQyOjIwOjg4OmVkOjFlOmI2OmZjOjU2MD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDSQAwRgIhAN8V2txObkE1WdW1yBiVyyksjSpcTPs1TVG3L2/dAvZ/\nAiEAoJdzmTdLPMfIFaOV4yyciJJMOPAhszjjG+TYJaHoTmg=\n-----END CERTIFICATE-----\n" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\030prepared_query:geo-cache\020\003\032\004\n\002\032\000\"\002\010\005Z\243\020\n\240\020\n\000\022\237\t\n\263\007\032\260\007-----BEGIN 
CERTIFICATE-----\nMIICjTCCAjOgAwIBAgIIS767Sy/OTeEwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMzAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABFIY1UYko9R1qJhQ\n2rrihGjPcc1UhkPQ5Yva54xzPdlep+KGjtTiq3CW+Z40L5Mn4u2RzH9wzZ7MHl98\nhs6naGWjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEX2FmOjFkOmMwOjE5\nOjM5OjUyOjgxOmM5OjEyOjA1OmI2OjU2OjVhOjQ1OmUxOjc5OmEzOmI1OjUzOjBk\nOjBiOjZlOjU3OjJiOmMwOjkyOmJiOjllOmI5Ojg5OjRlOjM2MGoGA1UdIwRjMGGA\nXzIwOjZmOjdjOmZkOmQ2OjI5OmUzOjMyOmZjOjcyOjhjOjcwOjgwOjRhOjRjOjVm\nOmU4OmViOjhlOmNlOjQ3OjQ3OmE1OjA5OmQyOjIwOjg4OmVkOjFlOmI2OmZjOjU2\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNIADBFAiEAowfmFpFG6DfgqX0DtYZQKhRl1QPqxhr+H+gvSmM0cQUC\nIEv0rtbjExJHFX2RzyE5QArnFscUQ24ssUMD/k337GCu\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEIBXGW7qGI/xxwSPilMtkPVmKHYKAPLNny9B0loxqImxsoAoGCCqGSM49\nAwEHoUQDQgAEUhjVRiSj1HWomFDauuKEaM9xzVSGQ9Dli9rnjHM92V6n4oaO1OKr\ncJb5njQvkyfi7ZHMf3DNnsweX3yGzqdoZQ==\n-----END EC PRIVATE KEY-----\n\032\371\006\n\366\006\032\363\006-----BEGIN CERTIFICATE-----\nMIICXzCCAgSgAwIBAgIIVDur7lvaVO4wCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxMzAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTMwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAASO5T/1\nHDKgw4gdUQpZMpUYjtU/ycTMDQ5R+sYqpSBuTWWNTsFD69oHTpjLaJJryQYoNdIk\nR6qT+SFS0Eyo/DcHo4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEXzIwOjZmOjdjOmZkOmQ2OjI5OmUzOjMyOmZjOjcyOjhj\nOjcwOjgwOjRhOjRjOjVmOmU4OmViOjhlOmNlOjQ3OjQ3OmE1OjA5OmQyOjIwOjg4\nOmVkOjFlOmI2OmZjOjU2MGoGA1UdIwRjMGGAXzIwOjZmOjdjOmZkOmQ2OjI5OmUz\nOjMyOmZjOjcyOjhjOjcwOjgwOjRhOjRjOjVmOmU4OmViOjhlOmNlOjQ3OjQ3OmE1\nOjA5OmQyOjIwOjg4OmVkOjFlOmI2OmZjOjU2MD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDSQAwRgIhAN8V2txObkE1WdW1yBiVyyksjSpcTPs1TVG3L2/dAvZ/\nAiEAoJdzmTdLPMfIFaOV4yyciJJMOPAhszjjG+TYJaHoTmg=\n-----END CERTIFICATE-----\n" > type_url:"type.googleapis.com/envoy.api.v2.Cluster" nonce:"00000001" 
=== RUN   TestServer_ConfigOverridesClusters/custom_upstream_with_type
version_info:"00000001" resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\tlocal_app\"\002\010\005:\020\n\016\022\t127.0.0.1\030\220?" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\tmyservice\020\003\032\004\n\002\032\000\"\002\010\005Z\237\020\n\234\020\n\000\022\237\t\n\263\007\032\260\007-----BEGIN CERTIFICATE-----\nMIICjTCCAjOgAwIBAgIIV/oacs9zqCswCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxNDAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABAca8WvEDe/eI99s\nWJfyKbLnR3RxuqpOA2nST3qn0glO21XF58QTg7eo+QImMJ/GNFIplWZiNk2y0T+v\npPifzVOjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEX2E0Ojk5OmM1OjI0\nOjFiOjU0OmI3OjcwOjBiOmViOjI1OjI0OmE0OjRmOjNmOjc4OjUxOmM2OjQwOmU1\nOmRlOmRiOjA4OjFhOjNiOmEyOjZhOmUyOmE0OjY4OjJmOjFlMGoGA1UdIwRjMGGA\nX2MyOmViOmZhOjllOjJhOjYwOmU5Ojk0Ojg2OmRjOmFkOjQ2OmEyOjNmOjE1OjEz\nOmU1OmI5OmIxOjZkOjI0OmE5OjkwOmY5OjA2OjBmOjVkOjM4OjhiOjQ0OjdmOmMx\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNIADBFAiAOYjIAJOG3KS4hlZOJJ3+6KTJwl82+0pYjx4JWRO24EgIh\nANXP5M9H4ekgemLFWownMZXvasJxvpr+ptdaU0bZYT1o\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEIEUFWpy5lYSgnfCvLE2dUxf87/lhyqSAExM5FWn7X0B0oAoGCCqGSM49\nAwEHoUQDQgAEBxrxa8QN794j32xYl/IpsudHdHG6qk4DadJPeqfSCU7bVcXnxBOD\nt6j5AiYwn8Y0UimVZmI2TbLRP6+k+J/NUw==\n-----END EC PRIVATE KEY-----\n\032\365\006\n\362\006\032\357\006-----BEGIN CERTIFICATE-----\nMIICXTCCAgSgAwIBAgIIDqY060jmzfwwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxNDAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTQwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAAQngSlO\nc5XYWTAyv6KqrEPdMk3AYdy9PM9abao9wrQQX9GEiyiVjatQSiXdShDE3onfnB4r\nI1aCbYGDGFbkNNXPo4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEX2MyOmViOmZhOjllOjJhOjYwOmU5Ojk0Ojg2OmRjOmFk\nOjQ2OmEyOjNmOjE1OjEzOmU1OmI5OmIxOjZkOjI0OmE5OjkwOmY5OjA2OjBmOjVk\nOjM4OjhiOjQ0OjdmOmMxMGoGA1UdIwRjMGGAX2MyOmViOmZhOjllOjJhOjYwOmU5\nOjk0Ojg2OmRjOmFkOjQ2OmEyOjNmOjE1OjEzOmU1OmI5OmIxOjZkOjI0OmE5Ojkw\nOmY5OjA2OjBmOjVkOjM4OjhiOjQ0OjdmOmMxMD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDRwAwRAIgKVKnNt4Py0WKGdM9uH8qZAp9AAyXDbjtHhbwWN4xw+IC\nIG/t4f2kDfnsc2qFxoPV8W29462VwLdLBscmS3BAEicS\n-----END CERTIFICATE-----\n" > resources:<type_url:"type.googleapis.com/envoy.api.v2.Cluster" value:"\n\030prepared_query:geo-cache\020\003\032\004\n\002\032\000\"\002\010\005Z\237\020\n\234\020\n\000\022\237\t\n\263\007\032\260\007-----BEGIN 
CERTIFICATE-----\nMIICjTCCAjOgAwIBAgIIV/oacs9zqCswCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxNDAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMA4xDDAK\nBgNVBAMTA3dlYjBZMBMGByqGSM49AgEGCCqGSM49AwEHA0IABAca8WvEDe/eI99s\nWJfyKbLnR3RxuqpOA2nST3qn0glO21XF58QTg7eo+QImMJ/GNFIplWZiNk2y0T+v\npPifzVOjggFyMIIBbjAOBgNVHQ8BAf8EBAMCA7gwHQYDVR0lBBYwFAYIKwYBBQUH\nAwIGCCsGAQUFBwMBMAwGA1UdEwEB/wQCMAAwaAYDVR0OBGEEX2E0Ojk5OmM1OjI0\nOjFiOjU0OmI3OjcwOjBiOmViOjI1OjI0OmE0OjRmOjNmOjc4OjUxOmM2OjQwOmU1\nOmRlOmRiOjA4OjFhOjNiOmEyOjZhOmUyOmE0OjY4OjJmOjFlMGoGA1UdIwRjMGGA\nX2MyOmViOmZhOjllOjJhOjYwOmU5Ojk0Ojg2OmRjOmFkOjQ2OmEyOjNmOjE1OjEz\nOmU1OmI5OmIxOjZkOjI0OmE5OjkwOmY5OjA2OjBmOjVkOjM4OjhiOjQ0OjdmOmMx\nMFkGA1UdEQRSMFCGTnNwaWZmZTovLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1\nNTU1NTU1NTU1NS5jb25zdWwvbnMvZGVmYXVsdC9kYy9kYzEvc3ZjL3dlYjAKBggq\nhkjOPQQDAgNIADBFAiAOYjIAJOG3KS4hlZOJJ3+6KTJwl82+0pYjx4JWRO24EgIh\nANXP5M9H4ekgemLFWownMZXvasJxvpr+ptdaU0bZYT1o\n-----END CERTIFICATE-----\n\022\346\001\032\343\001-----BEGIN EC PRIVATE KEY-----\nMHcCAQEEIEUFWpy5lYSgnfCvLE2dUxf87/lhyqSAExM5FWn7X0B0oAoGCCqGSM49\nAwEHoUQDQgAEBxrxa8QN794j32xYl/IpsudHdHG6qk4DadJPeqfSCU7bVcXnxBOD\nt6j5AiYwn8Y0UimVZmI2TbLRP6+k+J/NUw==\n-----END EC PRIVATE KEY-----\n\032\365\006\n\362\006\032\357\006-----BEGIN CERTIFICATE-----\nMIICXTCCAgSgAwIBAgIIDqY060jmzfwwCgYIKoZIzj0EAwIwFTETMBEGA1UEAxMK\nVGVzdCBDQSAxNDAeFw0xOTExMjcwMjE4MDdaFw0yOTExMjcwMjE4MDdaMBUxEzAR\nBgNVBAMTClRlc3QgQ0EgMTQwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAAQngSlO\nc5XYWTAyv6KqrEPdMk3AYdy9PM9abao9wrQQX9GEiyiVjatQSiXdShDE3onfnB4r\nI1aCbYGDGFbkNNXPo4IBPDCCATgwDgYDVR0PAQH/BAQDAgGGMA8GA1UdEwEB/wQF\nMAMBAf8waAYDVR0OBGEEX2MyOmViOmZhOjllOjJhOjYwOmU5Ojk0Ojg2OmRjOmFk\nOjQ2OmEyOjNmOjE1OjEzOmU1OmI5OmIxOjZkOjI0OmE5OjkwOmY5OjA2OjBmOjVk\nOjM4OjhiOjQ0OjdmOmMxMGoGA1UdIwRjMGGAX2MyOmViOmZhOjllOjJhOjYwOmU5\nOjk0Ojg2OmRjOmFkOjQ2OmEyOjNmOjE1OjEzOmU1OmI5OmIxOjZkOjI0OmE5Ojkw\nOmY5OjA2OjBmOjVkOjM4OjhiOjQ0OjdmOmMxMD8GA1UdEQQ4MDaGNHNwaWZmZTov\nLzExMTExMTExLTIyMjItMzMzMy00NDQ0LTU1NTU1NTU1NTU1NS5jb25zdWwwCgYI\nKoZIzj0EAwIDRwAwRAIgKVKnNt4Py0WKGdM9uH8qZAp9AAyXDbjtHhbwWN4xw+IC\nIG/t4f2kDfnsc2qFxoPV8W29462VwLdLBscmS3BAEicS\n-----END CERTIFICATE-----\n" > type_url:"type.googleapis.com/envoy.api.v2.Cluster" nonce:"00000001" 
--- PASS: TestServer_ConfigOverridesClusters (0.18s)
    --- PASS: TestServer_ConfigOverridesClusters/sanity_check_no_custom (0.04s)
    --- PASS: TestServer_ConfigOverridesClusters/custom_public_with_no_type (0.04s)
    --- PASS: TestServer_ConfigOverridesClusters/custom_public_with_type (0.03s)
    --- PASS: TestServer_ConfigOverridesClusters/custom_upstream_with_no_type (0.04s)
    --- PASS: TestServer_ConfigOverridesClusters/custom_upstream_with_type (0.04s)
PASS
ok  	github.com/hashicorp/consul/agent/xds	1.083s
?   	github.com/hashicorp/consul/command	[no test files]
?   	github.com/hashicorp/consul/command/acl	[no test files]
=== RUN   TestAgentTokensCommand_noTabs
=== PAUSE TestAgentTokensCommand_noTabs
=== RUN   TestAgentTokensCommand
=== PAUSE TestAgentTokensCommand
=== CONT  TestAgentTokensCommand_noTabs
=== CONT  TestAgentTokensCommand
--- PASS: TestAgentTokensCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestAgentTokensCommand - 2019/11/27 02:18:23.842014 [WARN] agent: Node name "Node 96330595-b808-8736-00bd-ed7cfa77c737" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentTokensCommand - 2019/11/27 02:18:23.842837 [DEBUG] tlsutil: Update with version 1
TestAgentTokensCommand - 2019/11/27 02:18:23.842908 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentTokensCommand - 2019/11/27 02:18:23.843218 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgentTokensCommand - 2019/11/27 02:18:23.843539 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:28 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:96330595-b808-8736-00bd-ed7cfa77c737 Address:127.0.0.1:43006}]
2019/11/27 02:18:28 [INFO]  raft: Node at 127.0.0.1:43006 [Follower] entering Follower state (Leader: "")
2019/11/27 02:18:28 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:28 [INFO]  raft: Node at 127.0.0.1:43006 [Candidate] entering Candidate state in term 2
TestAgentTokensCommand - 2019/11/27 02:18:28.722711 [INFO] serf: EventMemberJoin: Node 96330595-b808-8736-00bd-ed7cfa77c737.dc1 127.0.0.1
TestAgentTokensCommand - 2019/11/27 02:18:28.726788 [INFO] serf: EventMemberJoin: Node 96330595-b808-8736-00bd-ed7cfa77c737 127.0.0.1
TestAgentTokensCommand - 2019/11/27 02:18:28.727701 [INFO] consul: Adding LAN server Node 96330595-b808-8736-00bd-ed7cfa77c737 (Addr: tcp/127.0.0.1:43006) (DC: dc1)
TestAgentTokensCommand - 2019/11/27 02:18:28.728590 [INFO] consul: Handled member-join event for server "Node 96330595-b808-8736-00bd-ed7cfa77c737.dc1" in area "wan"
TestAgentTokensCommand - 2019/11/27 02:18:28.729209 [INFO] agent: Started DNS server 127.0.0.1:43001 (udp)
TestAgentTokensCommand - 2019/11/27 02:18:28.729274 [INFO] agent: Started DNS server 127.0.0.1:43001 (tcp)
TestAgentTokensCommand - 2019/11/27 02:18:28.736808 [INFO] agent: Started HTTP server on 127.0.0.1:43002 (tcp)
TestAgentTokensCommand - 2019/11/27 02:18:28.736928 [INFO] agent: started state syncer
2019/11/27 02:18:29 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:29 [INFO]  raft: Node at 127.0.0.1:43006 [Leader] entering Leader state
TestAgentTokensCommand - 2019/11/27 02:18:29.374942 [INFO] consul: cluster leadership acquired
TestAgentTokensCommand - 2019/11/27 02:18:29.375578 [INFO] consul: New leader elected: Node 96330595-b808-8736-00bd-ed7cfa77c737
TestAgentTokensCommand - 2019/11/27 02:18:29.482295 [INFO] acl: initializing acls
TestAgentTokensCommand - 2019/11/27 02:18:29.703875 [ERR] agent: failed to sync remote state: ACL not found
TestAgentTokensCommand - 2019/11/27 02:18:29.774828 [INFO] acl: initializing acls
TestAgentTokensCommand - 2019/11/27 02:18:29.775220 [INFO] consul: Created ACL 'global-management' policy
TestAgentTokensCommand - 2019/11/27 02:18:29.775275 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentTokensCommand - 2019/11/27 02:18:29.944885 [INFO] consul: Created ACL 'global-management' policy
TestAgentTokensCommand - 2019/11/27 02:18:29.944979 [WARN] consul: Configuring a non-UUID master token is deprecated
TestAgentTokensCommand - 2019/11/27 02:18:30.153511 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentTokensCommand - 2019/11/27 02:18:30.475384 [INFO] consul: Bootstrapped ACL master token from configuration
TestAgentTokensCommand - 2019/11/27 02:18:30.476055 [INFO] consul: Created ACL anonymous token from configuration
TestAgentTokensCommand - 2019/11/27 02:18:30.476150 [DEBUG] acl: transitioning out of legacy ACL mode
TestAgentTokensCommand - 2019/11/27 02:18:30.477170 [INFO] serf: EventMemberUpdate: Node 96330595-b808-8736-00bd-ed7cfa77c737
TestAgentTokensCommand - 2019/11/27 02:18:30.481633 [INFO] serf: EventMemberUpdate: Node 96330595-b808-8736-00bd-ed7cfa77c737.dc1
TestAgentTokensCommand - 2019/11/27 02:18:30.630952 [INFO] consul: Created ACL anonymous token from configuration
TestAgentTokensCommand - 2019/11/27 02:18:30.632145 [INFO] serf: EventMemberUpdate: Node 96330595-b808-8736-00bd-ed7cfa77c737
TestAgentTokensCommand - 2019/11/27 02:18:30.632939 [INFO] serf: EventMemberUpdate: Node 96330595-b808-8736-00bd-ed7cfa77c737.dc1
TestAgentTokensCommand - 2019/11/27 02:18:31.564150 [INFO] agent: Synced node info
TestAgentTokensCommand - 2019/11/27 02:18:31.564273 [DEBUG] agent: Node info in sync
TestAgentTokensCommand - 2019/11/27 02:18:31.946065 [DEBUG] http: Request PUT /v1/acl/token (367.485271ms) from=127.0.0.1:37308
TestAgentTokensCommand - 2019/11/27 02:18:31.952765 [INFO] agent: Updated agent's ACL token "default"
TestAgentTokensCommand - 2019/11/27 02:18:31.952874 [DEBUG] http: Request PUT /v1/agent/token/default (838.365µs) from=127.0.0.1:37310
TestAgentTokensCommand - 2019/11/27 02:18:31.968033 [INFO] agent: Updated agent's ACL token "agent"
TestAgentTokensCommand - 2019/11/27 02:18:31.968164 [DEBUG] http: Request PUT /v1/agent/token/agent (821.031µs) from=127.0.0.1:37312
TestAgentTokensCommand - 2019/11/27 02:18:31.971484 [INFO] agent: Updated agent's ACL token "agent_master"
TestAgentTokensCommand - 2019/11/27 02:18:31.971601 [DEBUG] http: Request PUT /v1/agent/token/agent_master (750.361µs) from=127.0.0.1:37314
TestAgentTokensCommand - 2019/11/27 02:18:31.975543 [INFO] agent: Updated agent's ACL token "replication"
TestAgentTokensCommand - 2019/11/27 02:18:31.975650 [DEBUG] http: Request PUT /v1/agent/token/replication (581.689µs) from=127.0.0.1:37316
TestAgentTokensCommand - 2019/11/27 02:18:31.976327 [INFO] agent: Requesting shutdown
TestAgentTokensCommand - 2019/11/27 02:18:31.976409 [INFO] consul: shutting down server
TestAgentTokensCommand - 2019/11/27 02:18:31.976452 [WARN] serf: Shutdown without a Leave
TestAgentTokensCommand - 2019/11/27 02:18:32.085308 [WARN] serf: Shutdown without a Leave
TestAgentTokensCommand - 2019/11/27 02:18:32.202923 [INFO] manager: shutting down
TestAgentTokensCommand - 2019/11/27 02:18:32.203310 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestAgentTokensCommand - 2019/11/27 02:18:32.203455 [INFO] agent: consul server down
TestAgentTokensCommand - 2019/11/27 02:18:32.203495 [INFO] agent: shutdown complete
TestAgentTokensCommand - 2019/11/27 02:18:32.203543 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (tcp)
TestAgentTokensCommand - 2019/11/27 02:18:32.203672 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (udp)
TestAgentTokensCommand - 2019/11/27 02:18:32.203822 [INFO] agent: Stopping HTTP server 127.0.0.1:43002 (tcp)
TestAgentTokensCommand - 2019/11/27 02:18:32.204710 [INFO] agent: Waiting for endpoints to shut down
TestAgentTokensCommand - 2019/11/27 02:18:32.204896 [INFO] agent: Endpoints down
--- PASS: TestAgentTokensCommand (8.44s)
PASS
ok  	github.com/hashicorp/consul/command/acl/agenttokens	8.585s
=== RUN   TestBootstrapCommand_noTabs
=== PAUSE TestBootstrapCommand_noTabs
=== RUN   TestBootstrapCommand
=== PAUSE TestBootstrapCommand
=== CONT  TestBootstrapCommand_noTabs
=== CONT  TestBootstrapCommand
--- PASS: TestBootstrapCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestBootstrapCommand - 2019/11/27 02:18:33.218746 [WARN] agent: Node name "Node 981352ac-d0b7-b7ae-918e-780c939bca18" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestBootstrapCommand - 2019/11/27 02:18:33.220148 [DEBUG] tlsutil: Update with version 1
TestBootstrapCommand - 2019/11/27 02:18:33.220412 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestBootstrapCommand - 2019/11/27 02:18:33.223730 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestBootstrapCommand - 2019/11/27 02:18:33.224426 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:38 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:981352ac-d0b7-b7ae-918e-780c939bca18 Address:127.0.0.1:13006}]
2019/11/27 02:18:38 [INFO]  raft: Node at 127.0.0.1:13006 [Follower] entering Follower state (Leader: "")
TestBootstrapCommand - 2019/11/27 02:18:38.270656 [INFO] serf: EventMemberJoin: Node 981352ac-d0b7-b7ae-918e-780c939bca18.dc1 127.0.0.1
TestBootstrapCommand - 2019/11/27 02:18:38.276377 [INFO] serf: EventMemberJoin: Node 981352ac-d0b7-b7ae-918e-780c939bca18 127.0.0.1
TestBootstrapCommand - 2019/11/27 02:18:38.278752 [INFO] consul: Adding LAN server Node 981352ac-d0b7-b7ae-918e-780c939bca18 (Addr: tcp/127.0.0.1:13006) (DC: dc1)
TestBootstrapCommand - 2019/11/27 02:18:38.280954 [INFO] consul: Handled member-join event for server "Node 981352ac-d0b7-b7ae-918e-780c939bca18.dc1" in area "wan"
TestBootstrapCommand - 2019/11/27 02:18:38.281853 [INFO] agent: Started DNS server 127.0.0.1:13001 (udp)
TestBootstrapCommand - 2019/11/27 02:18:38.281927 [INFO] agent: Started DNS server 127.0.0.1:13001 (tcp)
TestBootstrapCommand - 2019/11/27 02:18:38.284570 [INFO] agent: Started HTTP server on 127.0.0.1:13002 (tcp)
TestBootstrapCommand - 2019/11/27 02:18:38.284825 [INFO] agent: started state syncer
2019/11/27 02:18:38 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:38 [INFO]  raft: Node at 127.0.0.1:13006 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:39 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:39 [INFO]  raft: Node at 127.0.0.1:13006 [Leader] entering Leader state
TestBootstrapCommand - 2019/11/27 02:18:39.685914 [INFO] consul: cluster leadership acquired
TestBootstrapCommand - 2019/11/27 02:18:39.686744 [INFO] consul: New leader elected: Node 981352ac-d0b7-b7ae-918e-780c939bca18
TestBootstrapCommand - 2019/11/27 02:18:39.832879 [INFO] acl: initializing acls
TestBootstrapCommand - 2019/11/27 02:18:40.011762 [ERR] agent: failed to sync remote state: ACL not found
TestBootstrapCommand - 2019/11/27 02:18:40.660645 [INFO] acl: initializing acls
TestBootstrapCommand - 2019/11/27 02:18:40.661300 [INFO] consul: Created ACL 'global-management' policy
TestBootstrapCommand - 2019/11/27 02:18:41.430039 [INFO] consul: Created ACL 'global-management' policy
TestBootstrapCommand - 2019/11/27 02:18:41.785631 [INFO] consul: Created ACL anonymous token from configuration
TestBootstrapCommand - 2019/11/27 02:18:41.785746 [DEBUG] acl: transitioning out of legacy ACL mode
TestBootstrapCommand - 2019/11/27 02:18:41.787950 [INFO] serf: EventMemberUpdate: Node 981352ac-d0b7-b7ae-918e-780c939bca18
TestBootstrapCommand - 2019/11/27 02:18:41.789243 [INFO] serf: EventMemberUpdate: Node 981352ac-d0b7-b7ae-918e-780c939bca18.dc1
TestBootstrapCommand - 2019/11/27 02:18:42.074478 [INFO] consul: Created ACL anonymous token from configuration
TestBootstrapCommand - 2019/11/27 02:18:42.075292 [INFO] serf: EventMemberUpdate: Node 981352ac-d0b7-b7ae-918e-780c939bca18
TestBootstrapCommand - 2019/11/27 02:18:42.075870 [INFO] serf: EventMemberUpdate: Node 981352ac-d0b7-b7ae-918e-780c939bca18.dc1
TestBootstrapCommand - 2019/11/27 02:18:43.552080 [INFO] agent: Synced node info
TestBootstrapCommand - 2019/11/27 02:18:43.552210 [DEBUG] agent: Node info in sync
TestBootstrapCommand - 2019/11/27 02:18:43.552542 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestBootstrapCommand - 2019/11/27 02:18:43.552967 [DEBUG] consul: Skipping self join check for "Node 981352ac-d0b7-b7ae-918e-780c939bca18" since the cluster is too small
TestBootstrapCommand - 2019/11/27 02:18:43.553098 [INFO] consul: member 'Node 981352ac-d0b7-b7ae-918e-780c939bca18' joined, marking health alive
TestBootstrapCommand - 2019/11/27 02:18:43.562108 [WARN] acl.bootstrap: failed to remove bootstrap file: remove /tmp/TestBootstrapCommand-agent098207276/acl-bootstrap-reset: no such file or directory
TestBootstrapCommand - 2019/11/27 02:18:43.917986 [DEBUG] consul: Skipping self join check for "Node 981352ac-d0b7-b7ae-918e-780c939bca18" since the cluster is too small
TestBootstrapCommand - 2019/11/27 02:18:43.918492 [DEBUG] consul: Skipping self join check for "Node 981352ac-d0b7-b7ae-918e-780c939bca18" since the cluster is too small
TestBootstrapCommand - 2019/11/27 02:18:43.919673 [INFO] consul.acl: ACL bootstrap completed
TestBootstrapCommand - 2019/11/27 02:18:43.932480 [DEBUG] http: Request PUT /v1/acl/bootstrap (370.961348ms) from=127.0.0.1:44662
TestBootstrapCommand - 2019/11/27 02:18:43.937043 [INFO] agent: Requesting shutdown
TestBootstrapCommand - 2019/11/27 02:18:43.937145 [INFO] consul: shutting down server
TestBootstrapCommand - 2019/11/27 02:18:43.937191 [WARN] serf: Shutdown without a Leave
TestBootstrapCommand - 2019/11/27 02:18:44.140077 [WARN] serf: Shutdown without a Leave
TestBootstrapCommand - 2019/11/27 02:18:44.206877 [INFO] manager: shutting down
TestBootstrapCommand - 2019/11/27 02:18:44.207333 [INFO] agent: consul server down
TestBootstrapCommand - 2019/11/27 02:18:44.207391 [INFO] agent: shutdown complete
TestBootstrapCommand - 2019/11/27 02:18:44.207454 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (tcp)
TestBootstrapCommand - 2019/11/27 02:18:44.207596 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (udp)
TestBootstrapCommand - 2019/11/27 02:18:44.207825 [INFO] agent: Stopping HTTP server 127.0.0.1:13002 (tcp)
TestBootstrapCommand - 2019/11/27 02:18:44.208356 [INFO] agent: Waiting for endpoints to shut down
TestBootstrapCommand - 2019/11/27 02:18:44.208452 [INFO] agent: Endpoints down
--- PASS: TestBootstrapCommand (11.09s)
PASS
ok  	github.com/hashicorp/consul/command/acl/bootstrap	11.228s
?   	github.com/hashicorp/consul/command/acl/policy	[no test files]
=== RUN   TestPolicyCreateCommand_noTabs
=== PAUSE TestPolicyCreateCommand_noTabs
=== RUN   TestPolicyCreateCommand
=== PAUSE TestPolicyCreateCommand
=== CONT  TestPolicyCreateCommand_noTabs
=== CONT  TestPolicyCreateCommand
--- PASS: TestPolicyCreateCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyCreateCommand - 2019/11/27 02:18:46.402049 [WARN] agent: Node name "Node 18c48bcb-de89-255c-2eaf-36575de08404" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyCreateCommand - 2019/11/27 02:18:46.403128 [DEBUG] tlsutil: Update with version 1
TestPolicyCreateCommand - 2019/11/27 02:18:46.403217 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestPolicyCreateCommand - 2019/11/27 02:18:46.403459 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestPolicyCreateCommand - 2019/11/27 02:18:46.403991 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:18:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:18c48bcb-de89-255c-2eaf-36575de08404 Address:127.0.0.1:17506}]
2019/11/27 02:18:47 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
TestPolicyCreateCommand - 2019/11/27 02:18:47.915435 [INFO] serf: EventMemberJoin: Node 18c48bcb-de89-255c-2eaf-36575de08404.dc1 127.0.0.1
TestPolicyCreateCommand - 2019/11/27 02:18:47.920374 [INFO] serf: EventMemberJoin: Node 18c48bcb-de89-255c-2eaf-36575de08404 127.0.0.1
TestPolicyCreateCommand - 2019/11/27 02:18:47.921393 [INFO] consul: Adding LAN server Node 18c48bcb-de89-255c-2eaf-36575de08404 (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestPolicyCreateCommand - 2019/11/27 02:18:47.923900 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestPolicyCreateCommand - 2019/11/27 02:18:47.924202 [INFO] consul: Handled member-join event for server "Node 18c48bcb-de89-255c-2eaf-36575de08404.dc1" in area "wan"
TestPolicyCreateCommand - 2019/11/27 02:18:47.924879 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestPolicyCreateCommand - 2019/11/27 02:18:47.927724 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestPolicyCreateCommand - 2019/11/27 02:18:47.928054 [INFO] agent: started state syncer
2019/11/27 02:18:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:18:47 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
2019/11/27 02:18:48 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:18:48 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestPolicyCreateCommand - 2019/11/27 02:18:48.418146 [INFO] consul: cluster leadership acquired
TestPolicyCreateCommand - 2019/11/27 02:18:48.418880 [INFO] consul: New leader elected: Node 18c48bcb-de89-255c-2eaf-36575de08404
TestPolicyCreateCommand - 2019/11/27 02:18:48.676043 [INFO] acl: initializing acls
TestPolicyCreateCommand - 2019/11/27 02:18:48.699185 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyCreateCommand - 2019/11/27 02:18:49.654990 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyCreateCommand - 2019/11/27 02:18:49.684684 [INFO] acl: initializing acls
TestPolicyCreateCommand - 2019/11/27 02:18:49.685103 [INFO] consul: Created ACL 'global-management' policy
TestPolicyCreateCommand - 2019/11/27 02:18:49.685168 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyCreateCommand - 2019/11/27 02:18:50.275060 [INFO] consul: Created ACL 'global-management' policy
TestPolicyCreateCommand - 2019/11/27 02:18:50.275143 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyCreateCommand - 2019/11/27 02:18:50.275699 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyCreateCommand - 2019/11/27 02:18:51.584504 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyCreateCommand - 2019/11/27 02:18:51.585277 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyCreateCommand - 2019/11/27 02:18:51.585361 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyCreateCommand - 2019/11/27 02:18:51.586313 [INFO] serf: EventMemberUpdate: Node 18c48bcb-de89-255c-2eaf-36575de08404
TestPolicyCreateCommand - 2019/11/27 02:18:51.587023 [INFO] serf: EventMemberUpdate: Node 18c48bcb-de89-255c-2eaf-36575de08404.dc1
TestPolicyCreateCommand - 2019/11/27 02:18:51.810265 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyCreateCommand - 2019/11/27 02:18:51.811154 [INFO] serf: EventMemberUpdate: Node 18c48bcb-de89-255c-2eaf-36575de08404
TestPolicyCreateCommand - 2019/11/27 02:18:51.815892 [INFO] serf: EventMemberUpdate: Node 18c48bcb-de89-255c-2eaf-36575de08404.dc1
TestPolicyCreateCommand - 2019/11/27 02:18:52.818840 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyCreateCommand - 2019/11/27 02:18:52.819811 [DEBUG] consul: Skipping self join check for "Node 18c48bcb-de89-255c-2eaf-36575de08404" since the cluster is too small
TestPolicyCreateCommand - 2019/11/27 02:18:52.819930 [INFO] consul: member 'Node 18c48bcb-de89-255c-2eaf-36575de08404' joined, marking health alive
TestPolicyCreateCommand - 2019/11/27 02:18:53.087285 [DEBUG] consul: Skipping self join check for "Node 18c48bcb-de89-255c-2eaf-36575de08404" since the cluster is too small
TestPolicyCreateCommand - 2019/11/27 02:18:53.088212 [DEBUG] consul: Skipping self join check for "Node 18c48bcb-de89-255c-2eaf-36575de08404" since the cluster is too small
TestPolicyCreateCommand - 2019/11/27 02:18:53.320072 [DEBUG] http: Request PUT /v1/acl/policy (216.8315ms) from=127.0.0.1:44792
TestPolicyCreateCommand - 2019/11/27 02:18:53.324579 [INFO] agent: Requesting shutdown
TestPolicyCreateCommand - 2019/11/27 02:18:53.324696 [INFO] consul: shutting down server
TestPolicyCreateCommand - 2019/11/27 02:18:53.324746 [WARN] serf: Shutdown without a Leave
TestPolicyCreateCommand - 2019/11/27 02:18:53.461833 [WARN] serf: Shutdown without a Leave
TestPolicyCreateCommand - 2019/11/27 02:18:53.517345 [INFO] manager: shutting down
TestPolicyCreateCommand - 2019/11/27 02:18:53.518012 [INFO] agent: consul server down
TestPolicyCreateCommand - 2019/11/27 02:18:53.518069 [INFO] agent: shutdown complete
TestPolicyCreateCommand - 2019/11/27 02:18:53.518129 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestPolicyCreateCommand - 2019/11/27 02:18:53.518264 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestPolicyCreateCommand - 2019/11/27 02:18:53.518420 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestPolicyCreateCommand - 2019/11/27 02:18:53.518893 [INFO] agent: Waiting for endpoints to shut down
TestPolicyCreateCommand - 2019/11/27 02:18:53.519058 [INFO] agent: Endpoints down
--- PASS: TestPolicyCreateCommand (7.21s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/create	7.351s
=== RUN   TestPolicyDeleteCommand_noTabs
=== PAUSE TestPolicyDeleteCommand_noTabs
=== RUN   TestPolicyDeleteCommand
=== PAUSE TestPolicyDeleteCommand
=== CONT  TestPolicyDeleteCommand_noTabs
=== CONT  TestPolicyDeleteCommand
--- PASS: TestPolicyDeleteCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyDeleteCommand - 2019/11/27 02:19:05.829820 [WARN] agent: Node name "Node f1580117-c4b5-8e62-0058-a04818483d96" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyDeleteCommand - 2019/11/27 02:19:05.830826 [DEBUG] tlsutil: Update with version 1
TestPolicyDeleteCommand - 2019/11/27 02:19:05.830909 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestPolicyDeleteCommand - 2019/11/27 02:19:05.831320 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestPolicyDeleteCommand - 2019/11/27 02:19:05.831786 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:19:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f1580117-c4b5-8e62-0058-a04818483d96 Address:127.0.0.1:14506}]
2019/11/27 02:19:07 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestPolicyDeleteCommand - 2019/11/27 02:19:07.411417 [INFO] serf: EventMemberJoin: Node f1580117-c4b5-8e62-0058-a04818483d96.dc1 127.0.0.1
TestPolicyDeleteCommand - 2019/11/27 02:19:07.417958 [INFO] serf: EventMemberJoin: Node f1580117-c4b5-8e62-0058-a04818483d96 127.0.0.1
TestPolicyDeleteCommand - 2019/11/27 02:19:07.418986 [INFO] consul: Adding LAN server Node f1580117-c4b5-8e62-0058-a04818483d96 (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestPolicyDeleteCommand - 2019/11/27 02:19:07.419357 [INFO] consul: Handled member-join event for server "Node f1580117-c4b5-8e62-0058-a04818483d96.dc1" in area "wan"
TestPolicyDeleteCommand - 2019/11/27 02:19:07.420004 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestPolicyDeleteCommand - 2019/11/27 02:19:07.420460 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestPolicyDeleteCommand - 2019/11/27 02:19:07.423033 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestPolicyDeleteCommand - 2019/11/27 02:19:07.423149 [INFO] agent: started state syncer
2019/11/27 02:19:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:07 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/11/27 02:19:08 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:08 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestPolicyDeleteCommand - 2019/11/27 02:19:08.462538 [INFO] consul: cluster leadership acquired
TestPolicyDeleteCommand - 2019/11/27 02:19:08.463142 [INFO] consul: New leader elected: Node f1580117-c4b5-8e62-0058-a04818483d96
TestPolicyDeleteCommand - 2019/11/27 02:19:08.532924 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyDeleteCommand - 2019/11/27 02:19:08.970898 [INFO] acl: initializing acls
TestPolicyDeleteCommand - 2019/11/27 02:19:09.043430 [INFO] acl: initializing acls
TestPolicyDeleteCommand - 2019/11/27 02:19:09.697070 [INFO] consul: Created ACL 'global-management' policy
TestPolicyDeleteCommand - 2019/11/27 02:19:09.697153 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyDeleteCommand - 2019/11/27 02:19:09.698653 [INFO] consul: Created ACL 'global-management' policy
TestPolicyDeleteCommand - 2019/11/27 02:19:09.698725 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyDeleteCommand - 2019/11/27 02:19:10.618533 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyDeleteCommand - 2019/11/27 02:19:10.619114 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyDeleteCommand - 2019/11/27 02:19:10.996438 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyDeleteCommand - 2019/11/27 02:19:11.695737 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyDeleteCommand - 2019/11/27 02:19:11.695830 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyDeleteCommand - 2019/11/27 02:19:11.696796 [INFO] serf: EventMemberUpdate: Node f1580117-c4b5-8e62-0058-a04818483d96
TestPolicyDeleteCommand - 2019/11/27 02:19:11.697172 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyDeleteCommand - 2019/11/27 02:19:11.698107 [INFO] serf: EventMemberUpdate: Node f1580117-c4b5-8e62-0058-a04818483d96
TestPolicyDeleteCommand - 2019/11/27 02:19:11.698114 [INFO] serf: EventMemberUpdate: Node f1580117-c4b5-8e62-0058-a04818483d96.dc1
TestPolicyDeleteCommand - 2019/11/27 02:19:11.698819 [INFO] serf: EventMemberUpdate: Node f1580117-c4b5-8e62-0058-a04818483d96.dc1
TestPolicyDeleteCommand - 2019/11/27 02:19:13.506033 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyDeleteCommand - 2019/11/27 02:19:13.506499 [DEBUG] consul: Skipping self join check for "Node f1580117-c4b5-8e62-0058-a04818483d96" since the cluster is too small
TestPolicyDeleteCommand - 2019/11/27 02:19:13.506614 [INFO] consul: member 'Node f1580117-c4b5-8e62-0058-a04818483d96' joined, marking health alive
TestPolicyDeleteCommand - 2019/11/27 02:19:14.114328 [DEBUG] consul: Skipping self join check for "Node f1580117-c4b5-8e62-0058-a04818483d96" since the cluster is too small
TestPolicyDeleteCommand - 2019/11/27 02:19:14.115090 [DEBUG] consul: Skipping self join check for "Node f1580117-c4b5-8e62-0058-a04818483d96" since the cluster is too small
TestPolicyDeleteCommand - 2019/11/27 02:19:14.940592 [DEBUG] http: Request PUT /v1/acl/policy (806.965203ms) from=127.0.0.1:48912
TestPolicyDeleteCommand - 2019/11/27 02:19:15.673256 [DEBUG] http: Request DELETE /v1/acl/policy/6d7880c3-3966-1d01-9f49-89c51136a8f7 (728.404256ms) from=127.0.0.1:48914
TestPolicyDeleteCommand - 2019/11/27 02:19:15.675649 [ERR] http: Request GET /v1/acl/policy/6d7880c3-3966-1d01-9f49-89c51136a8f7, error: ACL not found from=127.0.0.1:48912
TestPolicyDeleteCommand - 2019/11/27 02:19:15.679115 [DEBUG] http: Request GET /v1/acl/policy/6d7880c3-3966-1d01-9f49-89c51136a8f7 (3.858144ms) from=127.0.0.1:48912
TestPolicyDeleteCommand - 2019/11/27 02:19:15.680828 [INFO] agent: Requesting shutdown
TestPolicyDeleteCommand - 2019/11/27 02:19:15.680912 [INFO] consul: shutting down server
TestPolicyDeleteCommand - 2019/11/27 02:19:15.680961 [WARN] serf: Shutdown without a Leave
TestPolicyDeleteCommand - 2019/11/27 02:19:15.804929 [WARN] serf: Shutdown without a Leave
TestPolicyDeleteCommand - 2019/11/27 02:19:15.927233 [INFO] manager: shutting down
TestPolicyDeleteCommand - 2019/11/27 02:19:15.928051 [INFO] agent: consul server down
TestPolicyDeleteCommand - 2019/11/27 02:19:15.928129 [INFO] agent: shutdown complete
TestPolicyDeleteCommand - 2019/11/27 02:19:15.928211 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestPolicyDeleteCommand - 2019/11/27 02:19:15.928361 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestPolicyDeleteCommand - 2019/11/27 02:19:15.928520 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestPolicyDeleteCommand - 2019/11/27 02:19:15.928989 [INFO] agent: Waiting for endpoints to shut down
TestPolicyDeleteCommand - 2019/11/27 02:19:15.929074 [INFO] agent: Endpoints down
--- PASS: TestPolicyDeleteCommand (10.20s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/delete	10.341s
=== RUN   TestPolicyListCommand_noTabs
=== PAUSE TestPolicyListCommand_noTabs
=== RUN   TestPolicyListCommand
=== PAUSE TestPolicyListCommand
=== CONT  TestPolicyListCommand_noTabs
=== CONT  TestPolicyListCommand
--- PASS: TestPolicyListCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyListCommand - 2019/11/27 02:19:16.555910 [WARN] agent: Node name "Node 12df14d5-1698-efa1-2986-fb6cba56eb33" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyListCommand - 2019/11/27 02:19:16.557149 [DEBUG] tlsutil: Update with version 1
TestPolicyListCommand - 2019/11/27 02:19:16.557241 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestPolicyListCommand - 2019/11/27 02:19:16.557579 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestPolicyListCommand - 2019/11/27 02:19:16.558082 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:19:17 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:12df14d5-1698-efa1-2986-fb6cba56eb33 Address:127.0.0.1:38506}]
2019/11/27 02:19:17 [INFO]  raft: Node at 127.0.0.1:38506 [Follower] entering Follower state (Leader: "")
TestPolicyListCommand - 2019/11/27 02:19:17.900415 [INFO] serf: EventMemberJoin: Node 12df14d5-1698-efa1-2986-fb6cba56eb33.dc1 127.0.0.1
TestPolicyListCommand - 2019/11/27 02:19:17.905471 [INFO] serf: EventMemberJoin: Node 12df14d5-1698-efa1-2986-fb6cba56eb33 127.0.0.1
TestPolicyListCommand - 2019/11/27 02:19:17.910694 [INFO] consul: Adding LAN server Node 12df14d5-1698-efa1-2986-fb6cba56eb33 (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestPolicyListCommand - 2019/11/27 02:19:17.911217 [INFO] consul: Handled member-join event for server "Node 12df14d5-1698-efa1-2986-fb6cba56eb33.dc1" in area "wan"
TestPolicyListCommand - 2019/11/27 02:19:17.911897 [INFO] agent: Started DNS server 127.0.0.1:38501 (udp)
TestPolicyListCommand - 2019/11/27 02:19:17.912134 [INFO] agent: Started DNS server 127.0.0.1:38501 (tcp)
TestPolicyListCommand - 2019/11/27 02:19:17.916294 [INFO] agent: Started HTTP server on 127.0.0.1:38502 (tcp)
TestPolicyListCommand - 2019/11/27 02:19:17.916593 [INFO] agent: started state syncer
2019/11/27 02:19:17 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:17 [INFO]  raft: Node at 127.0.0.1:38506 [Candidate] entering Candidate state in term 2
2019/11/27 02:19:18 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:18 [INFO]  raft: Node at 127.0.0.1:38506 [Leader] entering Leader state
TestPolicyListCommand - 2019/11/27 02:19:18.917841 [INFO] consul: cluster leadership acquired
TestPolicyListCommand - 2019/11/27 02:19:18.918501 [INFO] consul: New leader elected: Node 12df14d5-1698-efa1-2986-fb6cba56eb33
TestPolicyListCommand - 2019/11/27 02:19:19.166602 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyListCommand - 2019/11/27 02:19:19.384055 [INFO] acl: initializing acls
TestPolicyListCommand - 2019/11/27 02:19:19.462938 [INFO] acl: initializing acls
TestPolicyListCommand - 2019/11/27 02:19:19.950312 [INFO] consul: Created ACL 'global-management' policy
TestPolicyListCommand - 2019/11/27 02:19:19.950442 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyListCommand - 2019/11/27 02:19:19.950972 [INFO] consul: Created ACL 'global-management' policy
TestPolicyListCommand - 2019/11/27 02:19:19.951053 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyListCommand - 2019/11/27 02:19:20.863003 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyListCommand - 2019/11/27 02:19:21.189864 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyListCommand - 2019/11/27 02:19:21.273263 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyListCommand - 2019/11/27 02:19:21.273436 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyListCommand - 2019/11/27 02:19:21.274525 [INFO] serf: EventMemberUpdate: Node 12df14d5-1698-efa1-2986-fb6cba56eb33
TestPolicyListCommand - 2019/11/27 02:19:21.273264 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyListCommand - 2019/11/27 02:19:21.275378 [INFO] serf: EventMemberUpdate: Node 12df14d5-1698-efa1-2986-fb6cba56eb33.dc1
TestPolicyListCommand - 2019/11/27 02:19:21.275639 [INFO] serf: EventMemberUpdate: Node 12df14d5-1698-efa1-2986-fb6cba56eb33
TestPolicyListCommand - 2019/11/27 02:19:21.276238 [INFO] serf: EventMemberUpdate: Node 12df14d5-1698-efa1-2986-fb6cba56eb33.dc1
TestPolicyListCommand - 2019/11/27 02:19:22.661103 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyListCommand - 2019/11/27 02:19:22.661538 [DEBUG] consul: Skipping self join check for "Node 12df14d5-1698-efa1-2986-fb6cba56eb33" since the cluster is too small
TestPolicyListCommand - 2019/11/27 02:19:22.661636 [INFO] consul: member 'Node 12df14d5-1698-efa1-2986-fb6cba56eb33' joined, marking health alive
TestPolicyListCommand - 2019/11/27 02:19:23.167840 [DEBUG] consul: Skipping self join check for "Node 12df14d5-1698-efa1-2986-fb6cba56eb33" since the cluster is too small
TestPolicyListCommand - 2019/11/27 02:19:23.168286 [DEBUG] consul: Skipping self join check for "Node 12df14d5-1698-efa1-2986-fb6cba56eb33" since the cluster is too small
TestPolicyListCommand - 2019/11/27 02:19:23.685312 [DEBUG] http: Request PUT /v1/acl/policy (485.287451ms) from=127.0.0.1:47706
TestPolicyListCommand - 2019/11/27 02:19:23.945147 [DEBUG] http: Request PUT /v1/acl/policy (253.800809ms) from=127.0.0.1:47706
TestPolicyListCommand - 2019/11/27 02:19:24.257089 [DEBUG] http: Request PUT /v1/acl/policy (308.922531ms) from=127.0.0.1:47706
TestPolicyListCommand - 2019/11/27 02:19:24.550512 [DEBUG] http: Request PUT /v1/acl/policy (281.539509ms) from=127.0.0.1:47706
TestPolicyListCommand - 2019/11/27 02:19:24.728665 [DEBUG] http: Request PUT /v1/acl/policy (170.727039ms) from=127.0.0.1:47706
TestPolicyListCommand - 2019/11/27 02:19:24.737231 [DEBUG] http: Request GET /v1/acl/policies (1.570725ms) from=127.0.0.1:47708
TestPolicyListCommand - 2019/11/27 02:19:24.740688 [INFO] agent: Requesting shutdown
TestPolicyListCommand - 2019/11/27 02:19:24.740806 [INFO] consul: shutting down server
TestPolicyListCommand - 2019/11/27 02:19:24.740854 [WARN] serf: Shutdown without a Leave
TestPolicyListCommand - 2019/11/27 02:19:24.906855 [WARN] serf: Shutdown without a Leave
TestPolicyListCommand - 2019/11/27 02:19:25.093287 [INFO] manager: shutting down
TestPolicyListCommand - 2019/11/27 02:19:25.093959 [INFO] agent: consul server down
TestPolicyListCommand - 2019/11/27 02:19:25.094020 [INFO] agent: shutdown complete
TestPolicyListCommand - 2019/11/27 02:19:25.094071 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (tcp)
TestPolicyListCommand - 2019/11/27 02:19:25.094200 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (udp)
TestPolicyListCommand - 2019/11/27 02:19:25.094335 [INFO] agent: Stopping HTTP server 127.0.0.1:38502 (tcp)
TestPolicyListCommand - 2019/11/27 02:19:25.094932 [INFO] agent: Waiting for endpoints to shut down
TestPolicyListCommand - 2019/11/27 02:19:25.095041 [INFO] agent: Endpoints down
--- PASS: TestPolicyListCommand (8.68s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/list	8.867s
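
The TestPolicyListCommand run above drives five PUT /v1/acl/policy requests followed by one GET /v1/acl/policies against the throwaway test agent. For orientation only, a minimal sketch of the same two calls with the github.com/hashicorp/consul/api client shipped in this source tree might look as follows; the agent address, token and policy contents are placeholders, not values taken from this build.

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        cfg := api.DefaultConfig()
        cfg.Address = "127.0.0.1:8500" // placeholder agent address, not the test agent from the log
        cfg.Token = "root"             // placeholder token with acl:write

        client, err := api.NewClient(cfg)
        if err != nil {
            log.Fatal(err)
        }
        acl := client.ACL()

        // PUT /v1/acl/policy
        policy, _, err := acl.PolicyCreate(&api.ACLPolicy{
            Name:  "test-policy",
            Rules: `node_prefix "" { policy = "read" }`,
        }, nil)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("created policy:", policy.ID)

        // GET /v1/acl/policies
        policies, _, err := acl.PolicyList(nil)
        if err != nil {
            log.Fatal(err)
        }
        for _, p := range policies {
            fmt.Println(p.ID, p.Name)
        }
    }
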
=== RUN   TestPolicyReadCommand_noTabs
=== PAUSE TestPolicyReadCommand_noTabs
=== RUN   TestPolicyReadCommand
=== PAUSE TestPolicyReadCommand
=== CONT  TestPolicyReadCommand_noTabs
=== CONT  TestPolicyReadCommand
--- PASS: TestPolicyReadCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyReadCommand - 2019/11/27 02:19:25.787383 [WARN] agent: Node name "Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyReadCommand - 2019/11/27 02:19:25.791212 [DEBUG] tlsutil: Update with version 1
TestPolicyReadCommand - 2019/11/27 02:19:25.791518 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestPolicyReadCommand - 2019/11/27 02:19:25.792059 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestPolicyReadCommand - 2019/11/27 02:19:25.792823 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:19:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1c51a77c-70d9-7cff-bb48-077f9c7b7284 Address:127.0.0.1:31006}]
2019/11/27 02:19:27 [INFO]  raft: Node at 127.0.0.1:31006 [Follower] entering Follower state (Leader: "")
TestPolicyReadCommand - 2019/11/27 02:19:27.656119 [INFO] serf: EventMemberJoin: Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284.dc1 127.0.0.1
TestPolicyReadCommand - 2019/11/27 02:19:27.659920 [INFO] serf: EventMemberJoin: Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284 127.0.0.1
TestPolicyReadCommand - 2019/11/27 02:19:27.662566 [INFO] consul: Adding LAN server Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284 (Addr: tcp/127.0.0.1:31006) (DC: dc1)
TestPolicyReadCommand - 2019/11/27 02:19:27.663151 [INFO] consul: Handled member-join event for server "Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284.dc1" in area "wan"
TestPolicyReadCommand - 2019/11/27 02:19:27.663231 [INFO] agent: Started DNS server 127.0.0.1:31001 (udp)
TestPolicyReadCommand - 2019/11/27 02:19:27.663780 [INFO] agent: Started DNS server 127.0.0.1:31001 (tcp)
TestPolicyReadCommand - 2019/11/27 02:19:27.666135 [INFO] agent: Started HTTP server on 127.0.0.1:31002 (tcp)
TestPolicyReadCommand - 2019/11/27 02:19:27.666541 [INFO] agent: started state syncer
2019/11/27 02:19:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:27 [INFO]  raft: Node at 127.0.0.1:31006 [Candidate] entering Candidate state in term 2
2019/11/27 02:19:28 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:28 [INFO]  raft: Node at 127.0.0.1:31006 [Leader] entering Leader state
TestPolicyReadCommand - 2019/11/27 02:19:28.696436 [INFO] consul: cluster leadership acquired
TestPolicyReadCommand - 2019/11/27 02:19:28.697321 [INFO] consul: New leader elected: Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284
TestPolicyReadCommand - 2019/11/27 02:19:28.823279 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyReadCommand - 2019/11/27 02:19:29.215020 [INFO] acl: initializing acls
TestPolicyReadCommand - 2019/11/27 02:19:29.638658 [INFO] consul: Created ACL 'global-management' policy
TestPolicyReadCommand - 2019/11/27 02:19:29.638763 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyReadCommand - 2019/11/27 02:19:29.640047 [INFO] acl: initializing acls
TestPolicyReadCommand - 2019/11/27 02:19:29.640182 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyReadCommand - 2019/11/27 02:19:29.909982 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyReadCommand - 2019/11/27 02:19:30.505398 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyReadCommand - 2019/11/27 02:19:30.505517 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyReadCommand - 2019/11/27 02:19:30.505431 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyReadCommand - 2019/11/27 02:19:30.506549 [INFO] serf: EventMemberUpdate: Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284
TestPolicyReadCommand - 2019/11/27 02:19:30.507315 [INFO] serf: EventMemberUpdate: Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284.dc1
TestPolicyReadCommand - 2019/11/27 02:19:30.507455 [INFO] serf: EventMemberUpdate: Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284
TestPolicyReadCommand - 2019/11/27 02:19:30.508109 [INFO] serf: EventMemberUpdate: Node 1c51a77c-70d9-7cff-bb48-077f9c7b7284.dc1
TestPolicyReadCommand - 2019/11/27 02:19:31.016241 [INFO] agent: Synced node info
TestPolicyReadCommand - 2019/11/27 02:19:31.016367 [DEBUG] agent: Node info in sync
TestPolicyReadCommand - 2019/11/27 02:19:31.832200 [DEBUG] http: Request PUT /v1/acl/policy (800.236478ms) from=127.0.0.1:36486
TestPolicyReadCommand - 2019/11/27 02:19:31.861358 [DEBUG] http: Request GET /v1/acl/policy/55c16f98-b513-447b-82d0-0d81e07a4fbc (10.643063ms) from=127.0.0.1:36488
TestPolicyReadCommand - 2019/11/27 02:19:31.865480 [INFO] agent: Requesting shutdown
TestPolicyReadCommand - 2019/11/27 02:19:31.869584 [INFO] consul: shutting down server
TestPolicyReadCommand - 2019/11/27 02:19:31.869809 [WARN] serf: Shutdown without a Leave
TestPolicyReadCommand - 2019/11/27 02:19:31.963149 [WARN] serf: Shutdown without a Leave
TestPolicyReadCommand - 2019/11/27 02:19:32.048566 [INFO] manager: shutting down
TestPolicyReadCommand - 2019/11/27 02:19:32.049463 [INFO] agent: consul server down
TestPolicyReadCommand - 2019/11/27 02:19:32.049514 [INFO] agent: shutdown complete
TestPolicyReadCommand - 2019/11/27 02:19:32.049563 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (tcp)
TestPolicyReadCommand - 2019/11/27 02:19:32.049695 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (udp)
TestPolicyReadCommand - 2019/11/27 02:19:32.049839 [INFO] agent: Stopping HTTP server 127.0.0.1:31002 (tcp)
TestPolicyReadCommand - 2019/11/27 02:19:32.050458 [INFO] agent: Waiting for endpoints to shut down
TestPolicyReadCommand - 2019/11/27 02:19:32.050621 [INFO] agent: Endpoints down
TestPolicyReadCommand - 2019/11/27 02:19:32.051071 [ERR] connect: Apply failed raft is already shutdown
TestPolicyReadCommand - 2019/11/27 02:19:32.051130 [ERR] consul: failed to establish leadership: raft is already shutdown
--- PASS: TestPolicyReadCommand (6.45s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/read	6.602s
=== RUN   TestPolicyUpdateCommand_noTabs
=== PAUSE TestPolicyUpdateCommand_noTabs
=== RUN   TestPolicyUpdateCommand
=== PAUSE TestPolicyUpdateCommand
=== CONT  TestPolicyUpdateCommand_noTabs
--- PASS: TestPolicyUpdateCommand_noTabs (0.00s)
=== CONT  TestPolicyUpdateCommand
WARNING: bootstrap = true: do not enable unless necessary
TestPolicyUpdateCommand - 2019/11/27 02:19:58.201164 [WARN] agent: Node name "Node 6d301d67-b7fc-8775-aac2-70c764b01f90" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestPolicyUpdateCommand - 2019/11/27 02:19:58.212091 [DEBUG] tlsutil: Update with version 1
TestPolicyUpdateCommand - 2019/11/27 02:19:58.212195 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestPolicyUpdateCommand - 2019/11/27 02:19:58.212444 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestPolicyUpdateCommand - 2019/11/27 02:19:58.212796 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:19:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6d301d67-b7fc-8775-aac2-70c764b01f90 Address:127.0.0.1:10006}]
2019/11/27 02:19:58 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestPolicyUpdateCommand - 2019/11/27 02:19:58.964151 [INFO] serf: EventMemberJoin: Node 6d301d67-b7fc-8775-aac2-70c764b01f90.dc1 127.0.0.1
TestPolicyUpdateCommand - 2019/11/27 02:19:58.972453 [INFO] serf: EventMemberJoin: Node 6d301d67-b7fc-8775-aac2-70c764b01f90 127.0.0.1
TestPolicyUpdateCommand - 2019/11/27 02:19:58.978323 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestPolicyUpdateCommand - 2019/11/27 02:19:58.978412 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestPolicyUpdateCommand - 2019/11/27 02:19:58.979033 [INFO] consul: Handled member-join event for server "Node 6d301d67-b7fc-8775-aac2-70c764b01f90.dc1" in area "wan"
TestPolicyUpdateCommand - 2019/11/27 02:19:58.980828 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestPolicyUpdateCommand - 2019/11/27 02:19:58.980928 [INFO] agent: started state syncer
TestPolicyUpdateCommand - 2019/11/27 02:19:58.984371 [INFO] consul: Adding LAN server Node 6d301d67-b7fc-8775-aac2-70c764b01f90 (Addr: tcp/127.0.0.1:10006) (DC: dc1)
2019/11/27 02:19:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:19:59 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/11/27 02:19:59 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:19:59 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
TestPolicyUpdateCommand - 2019/11/27 02:19:59.548774 [INFO] consul: cluster leadership acquired
TestPolicyUpdateCommand - 2019/11/27 02:19:59.549399 [INFO] consul: New leader elected: Node 6d301d67-b7fc-8775-aac2-70c764b01f90
TestPolicyUpdateCommand - 2019/11/27 02:19:59.730077 [INFO] acl: initializing acls
TestPolicyUpdateCommand - 2019/11/27 02:19:59.850074 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyUpdateCommand - 2019/11/27 02:19:59.858086 [INFO] acl: initializing acls
TestPolicyUpdateCommand - 2019/11/27 02:20:00.327382 [INFO] consul: Created ACL 'global-management' policy
TestPolicyUpdateCommand - 2019/11/27 02:20:00.327485 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyUpdateCommand - 2019/11/27 02:20:00.528069 [INFO] consul: Created ACL 'global-management' policy
TestPolicyUpdateCommand - 2019/11/27 02:20:00.528189 [WARN] consul: Configuring a non-UUID master token is deprecated
TestPolicyUpdateCommand - 2019/11/27 02:20:01.137332 [ERR] agent: failed to sync remote state: ACL not found
TestPolicyUpdateCommand - 2019/11/27 02:20:01.181491 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyUpdateCommand - 2019/11/27 02:20:01.647562 [INFO] consul: Bootstrapped ACL master token from configuration
TestPolicyUpdateCommand - 2019/11/27 02:20:01.648120 [INFO] consul: Created ACL anonymous token from configuration
TestPolicyUpdateCommand - 2019/11/27 02:20:01.648202 [DEBUG] acl: transitioning out of legacy ACL mode
TestPolicyUpdateCommand - 2019/11/27 02:20:01.649195 [INFO] serf: EventMemberUpdate: Node 6d301d67-b7fc-8775-aac2-70c764b01f90
TestPolicyUpdateCommand - 2019/11/27 02:20:01.649892 [INFO] serf: EventMemberUpdate: Node 6d301d67-b7fc-8775-aac2-70c764b01f90.dc1
TestPolicyUpdateCommand - 2019/11/27 02:20:01.650723 [INFO] serf: EventMemberUpdate: Node 6d301d67-b7fc-8775-aac2-70c764b01f90
TestPolicyUpdateCommand - 2019/11/27 02:20:01.651768 [INFO] serf: EventMemberUpdate: Node 6d301d67-b7fc-8775-aac2-70c764b01f90.dc1
TestPolicyUpdateCommand - 2019/11/27 02:20:03.481075 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestPolicyUpdateCommand - 2019/11/27 02:20:03.481568 [DEBUG] consul: Skipping self join check for "Node 6d301d67-b7fc-8775-aac2-70c764b01f90" since the cluster is too small
TestPolicyUpdateCommand - 2019/11/27 02:20:03.484138 [INFO] consul: member 'Node 6d301d67-b7fc-8775-aac2-70c764b01f90' joined, marking health alive
TestPolicyUpdateCommand - 2019/11/27 02:20:03.806896 [DEBUG] consul: Skipping self join check for "Node 6d301d67-b7fc-8775-aac2-70c764b01f90" since the cluster is too small
TestPolicyUpdateCommand - 2019/11/27 02:20:03.808249 [DEBUG] consul: Skipping self join check for "Node 6d301d67-b7fc-8775-aac2-70c764b01f90" since the cluster is too small
TestPolicyUpdateCommand - 2019/11/27 02:20:04.160267 [DEBUG] http: Request PUT /v1/acl/policy (284.086829ms) from=127.0.0.1:35582
TestPolicyUpdateCommand - 2019/11/27 02:20:04.175594 [DEBUG] http: Request GET /v1/acl/policy/75ce99d1-20cf-e4ff-ca85-f3301fa2aff5 (4.319826ms) from=127.0.0.1:35584
TestPolicyUpdateCommand - 2019/11/27 02:20:04.431257 [DEBUG] http: Request PUT /v1/acl/policy/75ce99d1-20cf-e4ff-ca85-f3301fa2aff5 (251.473957ms) from=127.0.0.1:35584
TestPolicyUpdateCommand - 2019/11/27 02:20:04.436626 [INFO] agent: Requesting shutdown
TestPolicyUpdateCommand - 2019/11/27 02:20:04.438624 [INFO] consul: shutting down server
TestPolicyUpdateCommand - 2019/11/27 02:20:04.438747 [WARN] serf: Shutdown without a Leave
TestPolicyUpdateCommand - 2019/11/27 02:20:04.592060 [WARN] serf: Shutdown without a Leave
TestPolicyUpdateCommand - 2019/11/27 02:20:04.824632 [INFO] manager: shutting down
TestPolicyUpdateCommand - 2019/11/27 02:20:04.825567 [INFO] agent: consul server down
TestPolicyUpdateCommand - 2019/11/27 02:20:04.825635 [INFO] agent: shutdown complete
TestPolicyUpdateCommand - 2019/11/27 02:20:04.825703 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestPolicyUpdateCommand - 2019/11/27 02:20:04.825870 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestPolicyUpdateCommand - 2019/11/27 02:20:04.826081 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestPolicyUpdateCommand - 2019/11/27 02:20:04.826906 [INFO] agent: Waiting for endpoints to shut down
TestPolicyUpdateCommand - 2019/11/27 02:20:04.827008 [INFO] agent: Endpoints down
--- PASS: TestPolicyUpdateCommand (6.78s)
PASS
ok  	github.com/hashicorp/consul/command/acl/policy/update	7.136s
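
TestPolicyUpdateCommand above performs a read-modify-write: GET /v1/acl/policy/<id> followed by PUT /v1/acl/policy/<id>. A comparable sketch with the Go api client, assuming CONSUL_HTTP_ADDR/CONSUL_HTTP_TOKEN point at a reachable agent and reusing the policy ID from the log purely as an illustration:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        // DefaultConfig honours CONSUL_HTTP_ADDR and CONSUL_HTTP_TOKEN.
        client, err := api.NewClient(api.DefaultConfig())
        if err != nil {
            log.Fatal(err)
        }
        acl := client.ACL()

        // Policy ID copied from the log above, for illustration only.
        policyID := "75ce99d1-20cf-e4ff-ca85-f3301fa2aff5"

        // GET /v1/acl/policy/<id>
        policy, _, err := acl.PolicyRead(policyID, nil)
        if err != nil {
            log.Fatal(err)
        }

        // PUT /v1/acl/policy/<id>
        policy.Description = "updated description"
        updated, _, err := acl.PolicyUpdate(policy, nil)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("updated:", updated.ID, updated.Description)
    }
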
=== RUN   TestRulesTranslateCommand_noTabs
=== PAUSE TestRulesTranslateCommand_noTabs
=== RUN   TestRulesTranslateCommand
=== PAUSE TestRulesTranslateCommand
=== CONT  TestRulesTranslateCommand_noTabs
--- PASS: TestRulesTranslateCommand_noTabs (0.00s)
=== CONT  TestRulesTranslateCommand
WARNING: bootstrap = true: do not enable unless necessary
TestRulesTranslateCommand - 2019/11/27 02:20:11.795781 [WARN] agent: Node name "Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRulesTranslateCommand - 2019/11/27 02:20:11.796882 [DEBUG] tlsutil: Update with version 1
TestRulesTranslateCommand - 2019/11/27 02:20:11.796968 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRulesTranslateCommand - 2019/11/27 02:20:11.797244 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestRulesTranslateCommand - 2019/11/27 02:20:11.797954 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33 Address:127.0.0.1:31006}]
2019/11/27 02:20:16 [INFO]  raft: Node at 127.0.0.1:31006 [Follower] entering Follower state (Leader: "")
TestRulesTranslateCommand - 2019/11/27 02:20:16.052399 [INFO] serf: EventMemberJoin: Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33.dc1 127.0.0.1
TestRulesTranslateCommand - 2019/11/27 02:20:16.057530 [INFO] serf: EventMemberJoin: Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33 127.0.0.1
TestRulesTranslateCommand - 2019/11/27 02:20:16.062715 [INFO] consul: Adding LAN server Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33 (Addr: tcp/127.0.0.1:31006) (DC: dc1)
TestRulesTranslateCommand - 2019/11/27 02:20:16.064751 [INFO] agent: Started DNS server 127.0.0.1:31001 (udp)
TestRulesTranslateCommand - 2019/11/27 02:20:16.065680 [INFO] agent: Started DNS server 127.0.0.1:31001 (tcp)
TestRulesTranslateCommand - 2019/11/27 02:20:16.068501 [INFO] agent: Started HTTP server on 127.0.0.1:31002 (tcp)
TestRulesTranslateCommand - 2019/11/27 02:20:16.065090 [INFO] consul: Handled member-join event for server "Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33.dc1" in area "wan"
TestRulesTranslateCommand - 2019/11/27 02:20:16.082264 [INFO] agent: started state syncer
2019/11/27 02:20:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:16 [INFO]  raft: Node at 127.0.0.1:31006 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:18 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:18 [INFO]  raft: Node at 127.0.0.1:31006 [Leader] entering Leader state
TestRulesTranslateCommand - 2019/11/27 02:20:18.526367 [INFO] consul: cluster leadership acquired
TestRulesTranslateCommand - 2019/11/27 02:20:18.527149 [INFO] consul: New leader elected: Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33
TestRulesTranslateCommand - 2019/11/27 02:20:18.602814 [ERR] agent: failed to sync remote state: ACL not found
TestRulesTranslateCommand - 2019/11/27 02:20:19.111585 [ERR] agent: failed to sync remote state: ACL not found
TestRulesTranslateCommand - 2019/11/27 02:20:19.219240 [INFO] acl: initializing acls
TestRulesTranslateCommand - 2019/11/27 02:20:19.717475 [INFO] acl: initializing acls
TestRulesTranslateCommand - 2019/11/27 02:20:19.717860 [INFO] consul: Created ACL 'global-management' policy
TestRulesTranslateCommand - 2019/11/27 02:20:19.717927 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRulesTranslateCommand - 2019/11/27 02:20:20.034011 [INFO] consul: Created ACL 'global-management' policy
TestRulesTranslateCommand - 2019/11/27 02:20:20.034115 [WARN] consul: Configuring a non-UUID master token is deprecated
TestRulesTranslateCommand - 2019/11/27 02:20:21.046964 [INFO] consul: Bootstrapped ACL master token from configuration
TestRulesTranslateCommand - 2019/11/27 02:20:21.050247 [INFO] consul: Bootstrapped ACL master token from configuration
TestRulesTranslateCommand - 2019/11/27 02:20:21.980816 [INFO] consul: Created ACL anonymous token from configuration
TestRulesTranslateCommand - 2019/11/27 02:20:21.981559 [INFO] consul: Created ACL anonymous token from configuration
TestRulesTranslateCommand - 2019/11/27 02:20:21.981653 [DEBUG] acl: transitioning out of legacy ACL mode
TestRulesTranslateCommand - 2019/11/27 02:20:21.981981 [INFO] serf: EventMemberUpdate: Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33
TestRulesTranslateCommand - 2019/11/27 02:20:21.982712 [INFO] serf: EventMemberUpdate: Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33.dc1
TestRulesTranslateCommand - 2019/11/27 02:20:21.983674 [INFO] serf: EventMemberUpdate: Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33
TestRulesTranslateCommand - 2019/11/27 02:20:21.984393 [INFO] serf: EventMemberUpdate: Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33.dc1
TestRulesTranslateCommand - 2019/11/27 02:20:23.657018 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRulesTranslateCommand - 2019/11/27 02:20:23.657570 [DEBUG] consul: Skipping self join check for "Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33" since the cluster is too small
TestRulesTranslateCommand - 2019/11/27 02:20:23.657727 [INFO] consul: member 'Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33' joined, marking health alive
TestRulesTranslateCommand - 2019/11/27 02:20:24.170456 [DEBUG] consul: Skipping self join check for "Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33" since the cluster is too small
TestRulesTranslateCommand - 2019/11/27 02:20:24.170968 [DEBUG] consul: Skipping self join check for "Node 7ac0e81d-5f13-3a8b-04e5-8a79d9c2ac33" since the cluster is too small
TestRulesTranslateCommand - 2019/11/27 02:20:24.198278 [INFO] agent: Requesting shutdown
TestRulesTranslateCommand - 2019/11/27 02:20:24.198510 [INFO] consul: shutting down server
TestRulesTranslateCommand - 2019/11/27 02:20:24.199030 [WARN] serf: Shutdown without a Leave
TestRulesTranslateCommand - 2019/11/27 02:20:24.523340 [WARN] serf: Shutdown without a Leave
TestRulesTranslateCommand - 2019/11/27 02:20:24.978930 [INFO] manager: shutting down
TestRulesTranslateCommand - 2019/11/27 02:20:24.979522 [INFO] agent: consul server down
TestRulesTranslateCommand - 2019/11/27 02:20:24.979581 [INFO] agent: shutdown complete
TestRulesTranslateCommand - 2019/11/27 02:20:24.979646 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (tcp)
TestRulesTranslateCommand - 2019/11/27 02:20:24.979822 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (udp)
TestRulesTranslateCommand - 2019/11/27 02:20:24.979989 [INFO] agent: Stopping HTTP server 127.0.0.1:31002 (tcp)
TestRulesTranslateCommand - 2019/11/27 02:20:24.980309 [INFO] agent: Waiting for endpoints to shut down
TestRulesTranslateCommand - 2019/11/27 02:20:24.980396 [INFO] agent: Endpoints down
--- PASS: TestRulesTranslateCommand (13.32s)
PASS
ok  	github.com/hashicorp/consul/command/acl/rules	13.525s
?   	github.com/hashicorp/consul/command/acl/token	[no test files]
=== RUN   TestTokenCloneCommand_noTabs
=== PAUSE TestTokenCloneCommand_noTabs
=== RUN   TestTokenCloneCommand
=== PAUSE TestTokenCloneCommand
=== CONT  TestTokenCloneCommand_noTabs
--- PASS: TestTokenCloneCommand_noTabs (0.00s)
=== CONT  TestTokenCloneCommand
WARNING: bootstrap = true: do not enable unless necessary
TestTokenCloneCommand - 2019/11/27 02:20:19.585077 [WARN] agent: Node name "Node fe69ea8c-0282-7102-dfb4-f761ab1d1395" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenCloneCommand - 2019/11/27 02:20:19.586018 [DEBUG] tlsutil: Update with version 1
TestTokenCloneCommand - 2019/11/27 02:20:19.586124 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenCloneCommand - 2019/11/27 02:20:19.586388 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestTokenCloneCommand - 2019/11/27 02:20:19.586509 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:21 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fe69ea8c-0282-7102-dfb4-f761ab1d1395 Address:127.0.0.1:52006}]
2019/11/27 02:20:21 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestTokenCloneCommand - 2019/11/27 02:20:21.418719 [INFO] serf: EventMemberJoin: Node fe69ea8c-0282-7102-dfb4-f761ab1d1395.dc1 127.0.0.1
TestTokenCloneCommand - 2019/11/27 02:20:21.423828 [INFO] serf: EventMemberJoin: Node fe69ea8c-0282-7102-dfb4-f761ab1d1395 127.0.0.1
TestTokenCloneCommand - 2019/11/27 02:20:21.425891 [INFO] consul: Adding LAN server Node fe69ea8c-0282-7102-dfb4-f761ab1d1395 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestTokenCloneCommand - 2019/11/27 02:20:21.425991 [INFO] consul: Handled member-join event for server "Node fe69ea8c-0282-7102-dfb4-f761ab1d1395.dc1" in area "wan"
TestTokenCloneCommand - 2019/11/27 02:20:21.427577 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
TestTokenCloneCommand - 2019/11/27 02:20:21.427662 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestTokenCloneCommand - 2019/11/27 02:20:21.430317 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestTokenCloneCommand - 2019/11/27 02:20:21.430470 [INFO] agent: started state syncer
2019/11/27 02:20:21 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:21 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:22 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:22 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
TestTokenCloneCommand - 2019/11/27 02:20:22.827641 [INFO] consul: cluster leadership acquired
TestTokenCloneCommand - 2019/11/27 02:20:22.828228 [INFO] consul: New leader elected: Node fe69ea8c-0282-7102-dfb4-f761ab1d1395
TestTokenCloneCommand - 2019/11/27 02:20:22.979458 [INFO] acl: initializing acls
TestTokenCloneCommand - 2019/11/27 02:20:23.058991 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCloneCommand - 2019/11/27 02:20:23.488782 [INFO] consul: Created ACL 'global-management' policy
TestTokenCloneCommand - 2019/11/27 02:20:23.488876 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCloneCommand - 2019/11/27 02:20:23.492442 [INFO] acl: initializing acls
TestTokenCloneCommand - 2019/11/27 02:20:23.492587 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCloneCommand - 2019/11/27 02:20:23.970986 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCloneCommand - 2019/11/27 02:20:25.172347 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCloneCommand - 2019/11/27 02:20:25.492284 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCloneCommand - 2019/11/27 02:20:25.492381 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenCloneCommand - 2019/11/27 02:20:25.492502 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCloneCommand - 2019/11/27 02:20:25.493487 [INFO] serf: EventMemberUpdate: Node fe69ea8c-0282-7102-dfb4-f761ab1d1395
TestTokenCloneCommand - 2019/11/27 02:20:25.494148 [INFO] serf: EventMemberUpdate: Node fe69ea8c-0282-7102-dfb4-f761ab1d1395.dc1
TestTokenCloneCommand - 2019/11/27 02:20:25.496102 [INFO] serf: EventMemberUpdate: Node fe69ea8c-0282-7102-dfb4-f761ab1d1395
TestTokenCloneCommand - 2019/11/27 02:20:25.496876 [INFO] serf: EventMemberUpdate: Node fe69ea8c-0282-7102-dfb4-f761ab1d1395.dc1
TestTokenCloneCommand - 2019/11/27 02:20:27.325291 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenCloneCommand - 2019/11/27 02:20:27.325933 [DEBUG] consul: Skipping self join check for "Node fe69ea8c-0282-7102-dfb4-f761ab1d1395" since the cluster is too small
TestTokenCloneCommand - 2019/11/27 02:20:27.326037 [INFO] consul: member 'Node fe69ea8c-0282-7102-dfb4-f761ab1d1395' joined, marking health alive
TestTokenCloneCommand - 2019/11/27 02:20:27.847798 [DEBUG] consul: Skipping self join check for "Node fe69ea8c-0282-7102-dfb4-f761ab1d1395" since the cluster is too small
TestTokenCloneCommand - 2019/11/27 02:20:27.848309 [DEBUG] consul: Skipping self join check for "Node fe69ea8c-0282-7102-dfb4-f761ab1d1395" since the cluster is too small
TestTokenCloneCommand - 2019/11/27 02:20:28.693888 [DEBUG] http: Request PUT /v1/acl/policy (820.046131ms) from=127.0.0.1:48946
TestTokenCloneCommand - 2019/11/27 02:20:29.458957 [DEBUG] http: Request PUT /v1/acl/token (754.621055ms) from=127.0.0.1:48946
=== RUN   TestTokenCloneCommand/Description
TestTokenCloneCommand - 2019/11/27 02:20:30.059275 [DEBUG] http: Request PUT /v1/acl/token/34aaea2c-b90c-503d-abe3-f93e1599754b/clone (585.174828ms) from=127.0.0.1:48948
TestTokenCloneCommand - 2019/11/27 02:20:30.071099 [DEBUG] http: Request GET /v1/acl/token/b8473eed-cbfc-c9e8-e5d3-f4fa26298458 (2.986443ms) from=127.0.0.1:48946
=== RUN   TestTokenCloneCommand/Without_Description
TestTokenCloneCommand - 2019/11/27 02:20:30.369500 [DEBUG] http: Request PUT /v1/acl/token/34aaea2c-b90c-503d-abe3-f93e1599754b/clone (291.649377ms) from=127.0.0.1:48950
TestTokenCloneCommand - 2019/11/27 02:20:30.379279 [DEBUG] http: Request GET /v1/acl/token/7d86ae68-9641-befc-58d8-42be3390e2f0 (3.476461ms) from=127.0.0.1:48946
TestTokenCloneCommand - 2019/11/27 02:20:30.386280 [INFO] agent: Requesting shutdown
TestTokenCloneCommand - 2019/11/27 02:20:30.386383 [INFO] consul: shutting down server
TestTokenCloneCommand - 2019/11/27 02:20:30.386433 [WARN] serf: Shutdown without a Leave
TestTokenCloneCommand - 2019/11/27 02:20:30.735689 [WARN] serf: Shutdown without a Leave
TestTokenCloneCommand - 2019/11/27 02:20:31.212168 [INFO] manager: shutting down
TestTokenCloneCommand - 2019/11/27 02:20:31.213229 [INFO] agent: consul server down
TestTokenCloneCommand - 2019/11/27 02:20:31.213295 [INFO] agent: shutdown complete
TestTokenCloneCommand - 2019/11/27 02:20:31.213370 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestTokenCloneCommand - 2019/11/27 02:20:31.213561 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestTokenCloneCommand - 2019/11/27 02:20:31.213743 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestTokenCloneCommand - 2019/11/27 02:20:31.214841 [INFO] agent: Waiting for endpoints to shut down
TestTokenCloneCommand - 2019/11/27 02:20:31.215110 [INFO] agent: Endpoints down
--- PASS: TestTokenCloneCommand (11.72s)
    --- PASS: TestTokenCloneCommand/Description (0.61s)
    --- PASS: TestTokenCloneCommand/Without_Description (0.31s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/clone	11.937s
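
The TestTokenCloneCommand subtests above exercise PUT /v1/acl/token, PUT /v1/acl/token/<accessor>/clone and GET /v1/acl/token/<accessor>, once with and once without a new description. Roughly equivalent api-client calls, under the same placeholder assumptions as the earlier sketches:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        client, err := api.NewClient(api.DefaultConfig())
        if err != nil {
            log.Fatal(err)
        }
        acl := client.ACL()

        // PUT /v1/acl/token
        token, _, err := acl.TokenCreate(&api.ACLToken{
            Description: "test token",
            Policies:    []*api.ACLTokenPolicyLink{{Name: "test-policy"}},
        }, nil)
        if err != nil {
            log.Fatal(err)
        }

        // PUT /v1/acl/token/<accessor>/clone
        clone, _, err := acl.TokenClone(token.AccessorID, "cloned with a new description", nil)
        if err != nil {
            log.Fatal(err)
        }

        // GET /v1/acl/token/<accessor>
        read, _, err := acl.TokenRead(clone.AccessorID, nil)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(read.AccessorID, read.Description)
    }
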
=== RUN   TestTokenCreateCommand_noTabs
=== PAUSE TestTokenCreateCommand_noTabs
=== RUN   TestTokenCreateCommand
=== PAUSE TestTokenCreateCommand
=== CONT  TestTokenCreateCommand_noTabs
=== CONT  TestTokenCreateCommand
--- PASS: TestTokenCreateCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenCreateCommand - 2019/11/27 02:20:37.684721 [WARN] agent: Node name "Node 593c70a1-30f8-0bd5-ca3d-2929c1290026" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenCreateCommand - 2019/11/27 02:20:37.685719 [DEBUG] tlsutil: Update with version 1
TestTokenCreateCommand - 2019/11/27 02:20:37.685799 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenCreateCommand - 2019/11/27 02:20:37.686110 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestTokenCreateCommand - 2019/11/27 02:20:37.686523 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:20:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:593c70a1-30f8-0bd5-ca3d-2929c1290026 Address:127.0.0.1:35506}]
2019/11/27 02:20:39 [INFO]  raft: Node at 127.0.0.1:35506 [Follower] entering Follower state (Leader: "")
TestTokenCreateCommand - 2019/11/27 02:20:39.254158 [INFO] serf: EventMemberJoin: Node 593c70a1-30f8-0bd5-ca3d-2929c1290026.dc1 127.0.0.1
TestTokenCreateCommand - 2019/11/27 02:20:39.263036 [INFO] serf: EventMemberJoin: Node 593c70a1-30f8-0bd5-ca3d-2929c1290026 127.0.0.1
TestTokenCreateCommand - 2019/11/27 02:20:39.265622 [INFO] agent: Started DNS server 127.0.0.1:35501 (udp)
TestTokenCreateCommand - 2019/11/27 02:20:39.268584 [INFO] agent: Started DNS server 127.0.0.1:35501 (tcp)
TestTokenCreateCommand - 2019/11/27 02:20:39.267123 [INFO] consul: Handled member-join event for server "Node 593c70a1-30f8-0bd5-ca3d-2929c1290026.dc1" in area "wan"
TestTokenCreateCommand - 2019/11/27 02:20:39.267722 [INFO] consul: Adding LAN server Node 593c70a1-30f8-0bd5-ca3d-2929c1290026 (Addr: tcp/127.0.0.1:35506) (DC: dc1)
TestTokenCreateCommand - 2019/11/27 02:20:39.278369 [INFO] agent: Started HTTP server on 127.0.0.1:35502 (tcp)
TestTokenCreateCommand - 2019/11/27 02:20:39.278501 [INFO] agent: started state syncer
2019/11/27 02:20:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:20:39 [INFO]  raft: Node at 127.0.0.1:35506 [Candidate] entering Candidate state in term 2
2019/11/27 02:20:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:20:40 [INFO]  raft: Node at 127.0.0.1:35506 [Leader] entering Leader state
TestTokenCreateCommand - 2019/11/27 02:20:40.079941 [INFO] consul: cluster leadership acquired
TestTokenCreateCommand - 2019/11/27 02:20:40.080523 [INFO] consul: New leader elected: Node 593c70a1-30f8-0bd5-ca3d-2929c1290026
TestTokenCreateCommand - 2019/11/27 02:20:40.356027 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCreateCommand - 2019/11/27 02:20:40.689124 [INFO] acl: initializing acls
TestTokenCreateCommand - 2019/11/27 02:20:40.818649 [INFO] acl: initializing acls
TestTokenCreateCommand - 2019/11/27 02:20:41.181040 [INFO] consul: Created ACL 'global-management' policy
TestTokenCreateCommand - 2019/11/27 02:20:41.181124 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCreateCommand - 2019/11/27 02:20:41.181863 [INFO] consul: Created ACL 'global-management' policy
TestTokenCreateCommand - 2019/11/27 02:20:41.181929 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenCreateCommand - 2019/11/27 02:20:41.803593 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCreateCommand - 2019/11/27 02:20:42.325400 [ERR] agent: failed to sync remote state: ACL not found
TestTokenCreateCommand - 2019/11/27 02:20:42.348791 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenCreateCommand - 2019/11/27 02:20:43.005275 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCreateCommand - 2019/11/27 02:20:43.005418 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenCreateCommand - 2019/11/27 02:20:43.006273 [INFO] serf: EventMemberUpdate: Node 593c70a1-30f8-0bd5-ca3d-2929c1290026
TestTokenCreateCommand - 2019/11/27 02:20:43.007060 [INFO] serf: EventMemberUpdate: Node 593c70a1-30f8-0bd5-ca3d-2929c1290026.dc1
TestTokenCreateCommand - 2019/11/27 02:20:43.235950 [INFO] consul: Created ACL anonymous token from configuration
TestTokenCreateCommand - 2019/11/27 02:20:43.236895 [INFO] serf: EventMemberUpdate: Node 593c70a1-30f8-0bd5-ca3d-2929c1290026
TestTokenCreateCommand - 2019/11/27 02:20:43.237515 [INFO] serf: EventMemberUpdate: Node 593c70a1-30f8-0bd5-ca3d-2929c1290026.dc1
TestTokenCreateCommand - 2019/11/27 02:20:44.645615 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenCreateCommand - 2019/11/27 02:20:44.646071 [DEBUG] consul: Skipping self join check for "Node 593c70a1-30f8-0bd5-ca3d-2929c1290026" since the cluster is too small
TestTokenCreateCommand - 2019/11/27 02:20:44.646193 [INFO] consul: member 'Node 593c70a1-30f8-0bd5-ca3d-2929c1290026' joined, marking health alive
TestTokenCreateCommand - 2019/11/27 02:20:44.881205 [DEBUG] consul: Skipping self join check for "Node 593c70a1-30f8-0bd5-ca3d-2929c1290026" since the cluster is too small
TestTokenCreateCommand - 2019/11/27 02:20:44.881868 [DEBUG] consul: Skipping self join check for "Node 593c70a1-30f8-0bd5-ca3d-2929c1290026" since the cluster is too small
TestTokenCreateCommand - 2019/11/27 02:20:45.309285 [DEBUG] http: Request PUT /v1/acl/policy (409.217316ms) from=127.0.0.1:42528
TestTokenCreateCommand - 2019/11/27 02:20:45.913937 [DEBUG] http: Request PUT /v1/acl/token (597.391537ms) from=127.0.0.1:42530
TestTokenCreateCommand - 2019/11/27 02:20:46.285555 [DEBUG] http: Request PUT /v1/acl/token (364.685016ms) from=127.0.0.1:42532
TestTokenCreateCommand - 2019/11/27 02:20:46.292072 [INFO] agent: Requesting shutdown
TestTokenCreateCommand - 2019/11/27 02:20:46.292171 [INFO] consul: shutting down server
TestTokenCreateCommand - 2019/11/27 02:20:46.292266 [WARN] serf: Shutdown without a Leave
TestTokenCreateCommand - 2019/11/27 02:20:46.623986 [WARN] serf: Shutdown without a Leave
TestTokenCreateCommand - 2019/11/27 02:20:48.177341 [INFO] manager: shutting down
TestTokenCreateCommand - 2019/11/27 02:20:48.178084 [INFO] agent: consul server down
TestTokenCreateCommand - 2019/11/27 02:20:48.178149 [INFO] agent: shutdown complete
TestTokenCreateCommand - 2019/11/27 02:20:48.178214 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (tcp)
TestTokenCreateCommand - 2019/11/27 02:20:48.178389 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (udp)
TestTokenCreateCommand - 2019/11/27 02:20:48.178584 [INFO] agent: Stopping HTTP server 127.0.0.1:35502 (tcp)
TestTokenCreateCommand - 2019/11/27 02:20:48.179501 [INFO] agent: Waiting for endpoints to shut down
TestTokenCreateCommand - 2019/11/27 02:20:48.179769 [INFO] agent: Endpoints down
--- PASS: TestTokenCreateCommand (10.58s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/create	10.734s
=== RUN   TestTokenDeleteCommand_noTabs
=== PAUSE TestTokenDeleteCommand_noTabs
=== RUN   TestTokenDeleteCommand
=== PAUSE TestTokenDeleteCommand
=== CONT  TestTokenDeleteCommand_noTabs
=== CONT  TestTokenDeleteCommand
--- PASS: TestTokenDeleteCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenDeleteCommand - 2019/11/27 02:21:12.621543 [WARN] agent: Node name "Node 407c2086-86c8-1a6a-de12-dce212af67ec" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenDeleteCommand - 2019/11/27 02:21:12.622824 [DEBUG] tlsutil: Update with version 1
TestTokenDeleteCommand - 2019/11/27 02:21:12.622891 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenDeleteCommand - 2019/11/27 02:21:12.623148 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestTokenDeleteCommand - 2019/11/27 02:21:12.623449 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:21:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:407c2086-86c8-1a6a-de12-dce212af67ec Address:127.0.0.1:38506}]
2019/11/27 02:21:14 [INFO]  raft: Node at 127.0.0.1:38506 [Follower] entering Follower state (Leader: "")
TestTokenDeleteCommand - 2019/11/27 02:21:14.853364 [INFO] serf: EventMemberJoin: Node 407c2086-86c8-1a6a-de12-dce212af67ec.dc1 127.0.0.1
TestTokenDeleteCommand - 2019/11/27 02:21:14.858139 [INFO] serf: EventMemberJoin: Node 407c2086-86c8-1a6a-de12-dce212af67ec 127.0.0.1
TestTokenDeleteCommand - 2019/11/27 02:21:14.859343 [INFO] consul: Adding LAN server Node 407c2086-86c8-1a6a-de12-dce212af67ec (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestTokenDeleteCommand - 2019/11/27 02:21:14.859683 [INFO] consul: Handled member-join event for server "Node 407c2086-86c8-1a6a-de12-dce212af67ec.dc1" in area "wan"
TestTokenDeleteCommand - 2019/11/27 02:21:14.860455 [INFO] agent: Started DNS server 127.0.0.1:38501 (tcp)
TestTokenDeleteCommand - 2019/11/27 02:21:14.861282 [INFO] agent: Started DNS server 127.0.0.1:38501 (udp)
TestTokenDeleteCommand - 2019/11/27 02:21:14.865213 [INFO] agent: Started HTTP server on 127.0.0.1:38502 (tcp)
TestTokenDeleteCommand - 2019/11/27 02:21:14.865337 [INFO] agent: started state syncer
2019/11/27 02:21:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:14 [INFO]  raft: Node at 127.0.0.1:38506 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:17 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:17 [INFO]  raft: Node at 127.0.0.1:38506 [Leader] entering Leader state
TestTokenDeleteCommand - 2019/11/27 02:21:17.113415 [INFO] consul: cluster leadership acquired
TestTokenDeleteCommand - 2019/11/27 02:21:17.113923 [INFO] consul: New leader elected: Node 407c2086-86c8-1a6a-de12-dce212af67ec
TestTokenDeleteCommand - 2019/11/27 02:21:17.266929 [ERR] agent: failed to sync remote state: ACL not found
TestTokenDeleteCommand - 2019/11/27 02:21:18.012266 [INFO] acl: initializing acls
TestTokenDeleteCommand - 2019/11/27 02:21:18.121672 [INFO] acl: initializing acls
TestTokenDeleteCommand - 2019/11/27 02:21:18.690525 [INFO] consul: Created ACL 'global-management' policy
TestTokenDeleteCommand - 2019/11/27 02:21:18.690628 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenDeleteCommand - 2019/11/27 02:21:18.691799 [INFO] consul: Created ACL 'global-management' policy
TestTokenDeleteCommand - 2019/11/27 02:21:18.691998 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenDeleteCommand - 2019/11/27 02:21:19.333302 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenDeleteCommand - 2019/11/27 02:21:19.333317 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenDeleteCommand - 2019/11/27 02:21:19.780417 [ERR] agent: failed to sync remote state: ACL not found
TestTokenDeleteCommand - 2019/11/27 02:21:20.465138 [INFO] consul: Created ACL anonymous token from configuration
TestTokenDeleteCommand - 2019/11/27 02:21:20.465259 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenDeleteCommand - 2019/11/27 02:21:20.465933 [INFO] consul: Created ACL anonymous token from configuration
TestTokenDeleteCommand - 2019/11/27 02:21:20.466199 [INFO] serf: EventMemberUpdate: Node 407c2086-86c8-1a6a-de12-dce212af67ec
TestTokenDeleteCommand - 2019/11/27 02:21:20.466920 [INFO] serf: EventMemberUpdate: Node 407c2086-86c8-1a6a-de12-dce212af67ec.dc1
TestTokenDeleteCommand - 2019/11/27 02:21:20.467502 [INFO] serf: EventMemberUpdate: Node 407c2086-86c8-1a6a-de12-dce212af67ec
TestTokenDeleteCommand - 2019/11/27 02:21:20.468040 [INFO] serf: EventMemberUpdate: Node 407c2086-86c8-1a6a-de12-dce212af67ec.dc1
TestTokenDeleteCommand - 2019/11/27 02:21:22.756405 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenDeleteCommand - 2019/11/27 02:21:22.756978 [DEBUG] consul: Skipping self join check for "Node 407c2086-86c8-1a6a-de12-dce212af67ec" since the cluster is too small
TestTokenDeleteCommand - 2019/11/27 02:21:22.757093 [INFO] consul: member 'Node 407c2086-86c8-1a6a-de12-dce212af67ec' joined, marking health alive
TestTokenDeleteCommand - 2019/11/27 02:21:23.022655 [DEBUG] consul: Skipping self join check for "Node 407c2086-86c8-1a6a-de12-dce212af67ec" since the cluster is too small
TestTokenDeleteCommand - 2019/11/27 02:21:23.023590 [DEBUG] consul: Skipping self join check for "Node 407c2086-86c8-1a6a-de12-dce212af67ec" since the cluster is too small
TestTokenDeleteCommand - 2019/11/27 02:21:23.302379 [DEBUG] http: Request PUT /v1/acl/token (275.133333ms) from=127.0.0.1:47746
TestTokenDeleteCommand - 2019/11/27 02:21:23.603962 [DEBUG] http: Request DELETE /v1/acl/token/b8d3548e-f248-f8db-bc8d-d8a92126678f (290.55456ms) from=127.0.0.1:47748
TestTokenDeleteCommand - 2019/11/27 02:21:23.612269 [ERR] http: Request GET /v1/acl/token/b8d3548e-f248-f8db-bc8d-d8a92126678f, error: ACL not found from=127.0.0.1:47746
TestTokenDeleteCommand - 2019/11/27 02:21:23.616188 [DEBUG] http: Request GET /v1/acl/token/b8d3548e-f248-f8db-bc8d-d8a92126678f (4.409827ms) from=127.0.0.1:47746
TestTokenDeleteCommand - 2019/11/27 02:21:23.618067 [INFO] agent: Requesting shutdown
TestTokenDeleteCommand - 2019/11/27 02:21:23.618163 [INFO] consul: shutting down server
TestTokenDeleteCommand - 2019/11/27 02:21:23.618219 [WARN] serf: Shutdown without a Leave
TestTokenDeleteCommand - 2019/11/27 02:21:23.852802 [WARN] serf: Shutdown without a Leave
TestTokenDeleteCommand - 2019/11/27 02:21:23.998650 [INFO] manager: shutting down
TestTokenDeleteCommand - 2019/11/27 02:21:23.999949 [INFO] agent: consul server down
TestTokenDeleteCommand - 2019/11/27 02:21:24.000032 [INFO] agent: shutdown complete
TestTokenDeleteCommand - 2019/11/27 02:21:24.000144 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (tcp)
TestTokenDeleteCommand - 2019/11/27 02:21:24.000335 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (udp)
TestTokenDeleteCommand - 2019/11/27 02:21:24.000542 [INFO] agent: Stopping HTTP server 127.0.0.1:38502 (tcp)
TestTokenDeleteCommand - 2019/11/27 02:21:24.001135 [INFO] agent: Waiting for endpoints to shut down
TestTokenDeleteCommand - 2019/11/27 02:21:24.001240 [INFO] agent: Endpoints down
--- PASS: TestTokenDeleteCommand (11.45s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/delete	11.591s
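
In TestTokenDeleteCommand above, the [ERR] line for GET /v1/acl/token/<accessor> appears to be the expected negative check after the DELETE rather than a test failure (the suite still passes). A sketch of that delete-then-read sequence with the api client, again with a placeholder accessor ID:

    package main

    import (
        "fmt"
        "log"

        "github.com/hashicorp/consul/api"
    )

    func main() {
        client, err := api.NewClient(api.DefaultConfig())
        if err != nil {
            log.Fatal(err)
        }
        acl := client.ACL()

        // Accessor ID copied from the log above, for illustration only.
        accessorID := "b8d3548e-f248-f8db-bc8d-d8a92126678f"

        // DELETE /v1/acl/token/<accessor>
        if _, err := acl.TokenDelete(accessorID, nil); err != nil {
            log.Fatal(err)
        }

        // GET /v1/acl/token/<accessor> should now fail with "ACL not found".
        if _, _, err := acl.TokenRead(accessorID, nil); err != nil {
            fmt.Println("read after delete failed as expected:", err)
        }
    }
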
=== RUN   TestTokenListCommand_noTabs
=== PAUSE TestTokenListCommand_noTabs
=== RUN   TestTokenListCommand
=== PAUSE TestTokenListCommand
=== CONT  TestTokenListCommand_noTabs
=== CONT  TestTokenListCommand
--- PASS: TestTokenListCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenListCommand - 2019/11/27 02:21:12.590775 [WARN] agent: Node name "Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenListCommand - 2019/11/27 02:21:12.591848 [DEBUG] tlsutil: Update with version 1
TestTokenListCommand - 2019/11/27 02:21:12.591978 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenListCommand - 2019/11/27 02:21:12.592998 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestTokenListCommand - 2019/11/27 02:21:12.593408 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:21:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bfd2d748-fcf2-66ca-3d0e-f36517646ccf Address:127.0.0.1:19006}]
2019/11/27 02:21:14 [INFO]  raft: Node at 127.0.0.1:19006 [Follower] entering Follower state (Leader: "")
TestTokenListCommand - 2019/11/27 02:21:14.853365 [INFO] serf: EventMemberJoin: Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf.dc1 127.0.0.1
TestTokenListCommand - 2019/11/27 02:21:14.868473 [INFO] serf: EventMemberJoin: Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf 127.0.0.1
TestTokenListCommand - 2019/11/27 02:21:14.869976 [INFO] consul: Adding LAN server Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf (Addr: tcp/127.0.0.1:19006) (DC: dc1)
TestTokenListCommand - 2019/11/27 02:21:14.870058 [INFO] consul: Handled member-join event for server "Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf.dc1" in area "wan"
TestTokenListCommand - 2019/11/27 02:21:14.871810 [INFO] agent: Started DNS server 127.0.0.1:19001 (tcp)
TestTokenListCommand - 2019/11/27 02:21:14.872473 [INFO] agent: Started DNS server 127.0.0.1:19001 (udp)
TestTokenListCommand - 2019/11/27 02:21:14.875310 [INFO] agent: Started HTTP server on 127.0.0.1:19002 (tcp)
TestTokenListCommand - 2019/11/27 02:21:14.875424 [INFO] agent: started state syncer
2019/11/27 02:21:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:14 [INFO]  raft: Node at 127.0.0.1:19006 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:17 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:17 [INFO]  raft: Node at 127.0.0.1:19006 [Leader] entering Leader state
TestTokenListCommand - 2019/11/27 02:21:17.111067 [INFO] consul: cluster leadership acquired
TestTokenListCommand - 2019/11/27 02:21:17.111598 [INFO] consul: New leader elected: Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf
TestTokenListCommand - 2019/11/27 02:21:17.290797 [ERR] agent: failed to sync remote state: ACL not found
TestTokenListCommand - 2019/11/27 02:21:17.569823 [INFO] acl: initializing acls
TestTokenListCommand - 2019/11/27 02:21:17.858563 [ERR] agent: failed to sync remote state: ACL not found
TestTokenListCommand - 2019/11/27 02:21:18.023324 [INFO] acl: initializing acls
TestTokenListCommand - 2019/11/27 02:21:18.573331 [INFO] consul: Created ACL 'global-management' policy
TestTokenListCommand - 2019/11/27 02:21:18.573414 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenListCommand - 2019/11/27 02:21:18.575275 [INFO] consul: Created ACL 'global-management' policy
TestTokenListCommand - 2019/11/27 02:21:18.575343 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenListCommand - 2019/11/27 02:21:18.909833 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenListCommand - 2019/11/27 02:21:19.494074 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenListCommand - 2019/11/27 02:21:19.494928 [INFO] consul: Created ACL anonymous token from configuration
TestTokenListCommand - 2019/11/27 02:21:19.495018 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenListCommand - 2019/11/27 02:21:19.496153 [INFO] serf: EventMemberUpdate: Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf
TestTokenListCommand - 2019/11/27 02:21:19.497539 [INFO] serf: EventMemberUpdate: Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf.dc1
TestTokenListCommand - 2019/11/27 02:21:19.809881 [INFO] consul: Created ACL anonymous token from configuration
TestTokenListCommand - 2019/11/27 02:21:19.810815 [INFO] serf: EventMemberUpdate: Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf
TestTokenListCommand - 2019/11/27 02:21:19.811604 [INFO] serf: EventMemberUpdate: Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf.dc1
TestTokenListCommand - 2019/11/27 02:21:22.598362 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenListCommand - 2019/11/27 02:21:22.598929 [DEBUG] consul: Skipping self join check for "Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf" since the cluster is too small
TestTokenListCommand - 2019/11/27 02:21:22.599453 [INFO] consul: member 'Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf' joined, marking health alive
TestTokenListCommand - 2019/11/27 02:21:22.999962 [DEBUG] consul: Skipping self join check for "Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf" since the cluster is too small
TestTokenListCommand - 2019/11/27 02:21:23.002886 [DEBUG] consul: Skipping self join check for "Node bfd2d748-fcf2-66ca-3d0e-f36517646ccf" since the cluster is too small
TestTokenListCommand - 2019/11/27 02:21:23.303858 [DEBUG] http: Request PUT /v1/acl/token (288.901166ms) from=127.0.0.1:58802
TestTokenListCommand - 2019/11/27 02:21:23.598981 [DEBUG] http: Request PUT /v1/acl/token (288.864165ms) from=127.0.0.1:58802
TestTokenListCommand - 2019/11/27 02:21:24.000442 [DEBUG] http: Request PUT /v1/acl/token (396.694085ms) from=127.0.0.1:58802
TestTokenListCommand - 2019/11/27 02:21:24.366607 [DEBUG] http: Request PUT /v1/acl/token (360.322761ms) from=127.0.0.1:58802
TestTokenListCommand - 2019/11/27 02:21:24.744413 [DEBUG] http: Request PUT /v1/acl/token (374.640614ms) from=127.0.0.1:58802
TestTokenListCommand - 2019/11/27 02:21:24.751597 [DEBUG] http: Request GET /v1/acl/tokens (2.668763ms) from=127.0.0.1:58808
TestTokenListCommand - 2019/11/27 02:21:24.757524 [INFO] agent: Requesting shutdown
TestTokenListCommand - 2019/11/27 02:21:24.757748 [INFO] consul: shutting down server
TestTokenListCommand - 2019/11/27 02:21:24.757808 [WARN] serf: Shutdown without a Leave
TestTokenListCommand - 2019/11/27 02:21:24.863889 [WARN] serf: Shutdown without a Leave
TestTokenListCommand - 2019/11/27 02:21:24.986223 [INFO] manager: shutting down
TestTokenListCommand - 2019/11/27 02:21:24.987179 [INFO] agent: consul server down
TestTokenListCommand - 2019/11/27 02:21:24.987241 [INFO] agent: shutdown complete
TestTokenListCommand - 2019/11/27 02:21:24.987295 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (tcp)
TestTokenListCommand - 2019/11/27 02:21:24.987458 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (udp)
TestTokenListCommand - 2019/11/27 02:21:24.989533 [INFO] agent: Stopping HTTP server 127.0.0.1:19002 (tcp)
TestTokenListCommand - 2019/11/27 02:21:24.990268 [INFO] agent: Waiting for endpoints to shut down
TestTokenListCommand - 2019/11/27 02:21:24.990358 [INFO] agent: Endpoints down
--- PASS: TestTokenListCommand (12.51s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/list	12.691s
=== RUN   TestTokenReadCommand_noTabs
=== PAUSE TestTokenReadCommand_noTabs
=== RUN   TestTokenReadCommand
=== PAUSE TestTokenReadCommand
=== CONT  TestTokenReadCommand_noTabs
=== CONT  TestTokenReadCommand
--- PASS: TestTokenReadCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenReadCommand - 2019/11/27 02:21:21.334452 [WARN] agent: Node name "Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenReadCommand - 2019/11/27 02:21:21.335387 [DEBUG] tlsutil: Update with version 1
TestTokenReadCommand - 2019/11/27 02:21:21.335452 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenReadCommand - 2019/11/27 02:21:21.335882 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestTokenReadCommand - 2019/11/27 02:21:21.336639 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:21:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4653518a-3cf5-6c5d-3f96-2d5e48afb4bc Address:127.0.0.1:50506}]
2019/11/27 02:21:23 [INFO]  raft: Node at 127.0.0.1:50506 [Follower] entering Follower state (Leader: "")
TestTokenReadCommand - 2019/11/27 02:21:23.313837 [INFO] serf: EventMemberJoin: Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc.dc1 127.0.0.1
TestTokenReadCommand - 2019/11/27 02:21:23.320495 [INFO] serf: EventMemberJoin: Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc 127.0.0.1
TestTokenReadCommand - 2019/11/27 02:21:23.322410 [INFO] consul: Adding LAN server Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc (Addr: tcp/127.0.0.1:50506) (DC: dc1)
TestTokenReadCommand - 2019/11/27 02:21:23.327042 [INFO] consul: Handled member-join event for server "Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc.dc1" in area "wan"
TestTokenReadCommand - 2019/11/27 02:21:23.328903 [INFO] agent: Started DNS server 127.0.0.1:50501 (tcp)
TestTokenReadCommand - 2019/11/27 02:21:23.329518 [INFO] agent: Started DNS server 127.0.0.1:50501 (udp)
TestTokenReadCommand - 2019/11/27 02:21:23.332447 [INFO] agent: Started HTTP server on 127.0.0.1:50502 (tcp)
TestTokenReadCommand - 2019/11/27 02:21:23.332618 [INFO] agent: started state syncer
2019/11/27 02:21:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:21:23 [INFO]  raft: Node at 127.0.0.1:50506 [Candidate] entering Candidate state in term 2
2019/11/27 02:21:24 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:21:24 [INFO]  raft: Node at 127.0.0.1:50506 [Leader] entering Leader state
TestTokenReadCommand - 2019/11/27 02:21:24.364868 [INFO] consul: cluster leadership acquired
TestTokenReadCommand - 2019/11/27 02:21:24.365778 [INFO] consul: New leader elected: Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc
TestTokenReadCommand - 2019/11/27 02:21:24.503918 [ERR] agent: failed to sync remote state: ACL not found
TestTokenReadCommand - 2019/11/27 02:21:24.880147 [INFO] acl: initializing acls
TestTokenReadCommand - 2019/11/27 02:21:24.988331 [INFO] acl: initializing acls
TestTokenReadCommand - 2019/11/27 02:21:25.287397 [INFO] consul: Created ACL 'global-management' policy
TestTokenReadCommand - 2019/11/27 02:21:25.287494 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenReadCommand - 2019/11/27 02:21:25.568706 [INFO] consul: Created ACL 'global-management' policy
TestTokenReadCommand - 2019/11/27 02:21:25.568810 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenReadCommand - 2019/11/27 02:21:26.254365 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenReadCommand - 2019/11/27 02:21:26.254584 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenReadCommand - 2019/11/27 02:21:26.844541 [INFO] consul: Created ACL anonymous token from configuration
TestTokenReadCommand - 2019/11/27 02:21:26.844642 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenReadCommand - 2019/11/27 02:21:26.845406 [INFO] consul: Created ACL anonymous token from configuration
TestTokenReadCommand - 2019/11/27 02:21:26.845899 [INFO] serf: EventMemberUpdate: Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc
TestTokenReadCommand - 2019/11/27 02:21:26.847332 [INFO] serf: EventMemberUpdate: Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc
TestTokenReadCommand - 2019/11/27 02:21:26.849584 [INFO] serf: EventMemberUpdate: Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc.dc1
TestTokenReadCommand - 2019/11/27 02:21:26.850348 [INFO] serf: EventMemberUpdate: Node 4653518a-3cf5-6c5d-3f96-2d5e48afb4bc.dc1
TestTokenReadCommand - 2019/11/27 02:21:28.047349 [INFO] agent: Synced node info
TestTokenReadCommand - 2019/11/27 02:21:28.047462 [DEBUG] agent: Node info in sync
TestTokenReadCommand - 2019/11/27 02:21:28.754368 [DEBUG] http: Request PUT /v1/acl/token (684.366185ms) from=127.0.0.1:54414
TestTokenReadCommand - 2019/11/27 02:21:28.771358 [DEBUG] http: Request GET /v1/acl/token/0b5e409a-c527-adea-e437-32f99d263ff1 (3.672467ms) from=127.0.0.1:54416
TestTokenReadCommand - 2019/11/27 02:21:28.778138 [INFO] agent: Requesting shutdown
TestTokenReadCommand - 2019/11/27 02:21:28.778250 [INFO] consul: shutting down server
TestTokenReadCommand - 2019/11/27 02:21:28.778297 [WARN] serf: Shutdown without a Leave
TestTokenReadCommand - 2019/11/27 02:21:29.041569 [WARN] serf: Shutdown without a Leave
TestTokenReadCommand - 2019/11/27 02:21:29.186111 [INFO] manager: shutting down
TestTokenReadCommand - 2019/11/27 02:21:29.319798 [INFO] agent: consul server down
TestTokenReadCommand - 2019/11/27 02:21:29.319923 [INFO] agent: shutdown complete
TestTokenReadCommand - 2019/11/27 02:21:29.319989 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (tcp)
TestTokenReadCommand - 2019/11/27 02:21:29.320176 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (udp)
TestTokenReadCommand - 2019/11/27 02:21:29.320212 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestTokenReadCommand - 2019/11/27 02:21:29.320354 [INFO] agent: Stopping HTTP server 127.0.0.1:50502 (tcp)
TestTokenReadCommand - 2019/11/27 02:21:29.321219 [INFO] agent: Waiting for endpoints to shut down
TestTokenReadCommand - 2019/11/27 02:21:29.321331 [INFO] agent: Endpoints down
--- PASS: TestTokenReadCommand (8.09s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/read	8.239s
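Note: the TestTokenReadCommand run above boots a single-server agent, bootstraps ACLs, writes a token with PUT /v1/acl/token and reads it back with GET /v1/acl/token/<accessor-id>. A minimal hand-written sketch of the same read against an already-running agent, using only the standard library; the agent address, the "root" operator token and the accessor ID are placeholders loosely taken from this log, not reusable values:

```go
package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
)

func main() {
	// Placeholder agent address and accessor ID; the test uses a random local port.
	const agent = "http://127.0.0.1:8500"
	const accessor = "0b5e409a-c527-adea-e437-32f99d263ff1"

	req, err := http.NewRequest("GET", agent+"/v1/acl/token/"+accessor, nil)
	if err != nil {
		panic(err)
	}
	// The HTTP API accepts the operator token in this header.
	req.Header.Set("X-Consul-Token", "root")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := ioutil.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}
```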
=== RUN   TestTokenUpdateCommand_noTabs
=== PAUSE TestTokenUpdateCommand_noTabs
=== RUN   TestTokenUpdateCommand
=== PAUSE TestTokenUpdateCommand
=== CONT  TestTokenUpdateCommand_noTabs
=== CONT  TestTokenUpdateCommand
--- PASS: TestTokenUpdateCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestTokenUpdateCommand - 2019/11/27 02:22:04.085327 [WARN] agent: Node name "Node b1348f07-eb63-1847-5665-79ed8482e9f0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestTokenUpdateCommand - 2019/11/27 02:22:04.086169 [DEBUG] tlsutil: Update with version 1
TestTokenUpdateCommand - 2019/11/27 02:22:04.086244 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestTokenUpdateCommand - 2019/11/27 02:22:04.086499 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestTokenUpdateCommand - 2019/11/27 02:22:04.086980 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:22:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b1348f07-eb63-1847-5665-79ed8482e9f0 Address:127.0.0.1:17506}]
TestTokenUpdateCommand - 2019/11/27 02:22:07.451274 [INFO] serf: EventMemberJoin: Node b1348f07-eb63-1847-5665-79ed8482e9f0.dc1 127.0.0.1
2019/11/27 02:22:07 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
TestTokenUpdateCommand - 2019/11/27 02:22:07.465244 [INFO] serf: EventMemberJoin: Node b1348f07-eb63-1847-5665-79ed8482e9f0 127.0.0.1
TestTokenUpdateCommand - 2019/11/27 02:22:07.467989 [INFO] consul: Adding LAN server Node b1348f07-eb63-1847-5665-79ed8482e9f0 (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestTokenUpdateCommand - 2019/11/27 02:22:07.467994 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestTokenUpdateCommand - 2019/11/27 02:22:07.468226 [INFO] consul: Handled member-join event for server "Node b1348f07-eb63-1847-5665-79ed8482e9f0.dc1" in area "wan"
TestTokenUpdateCommand - 2019/11/27 02:22:07.468502 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestTokenUpdateCommand - 2019/11/27 02:22:07.470982 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestTokenUpdateCommand - 2019/11/27 02:22:07.471106 [INFO] agent: started state syncer
2019/11/27 02:22:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:07 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:08 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:08 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestTokenUpdateCommand - 2019/11/27 02:22:08.275813 [INFO] consul: cluster leadership acquired
TestTokenUpdateCommand - 2019/11/27 02:22:08.276446 [INFO] consul: New leader elected: Node b1348f07-eb63-1847-5665-79ed8482e9f0
TestTokenUpdateCommand - 2019/11/27 02:22:08.581773 [ERR] agent: failed to sync remote state: ACL not found
TestTokenUpdateCommand - 2019/11/27 02:22:08.674660 [INFO] acl: initializing acls
TestTokenUpdateCommand - 2019/11/27 02:22:08.892017 [INFO] consul: Created ACL 'global-management' policy
TestTokenUpdateCommand - 2019/11/27 02:22:08.892122 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenUpdateCommand - 2019/11/27 02:22:09.019977 [INFO] acl: initializing acls
TestTokenUpdateCommand - 2019/11/27 02:22:09.020136 [WARN] consul: Configuring a non-UUID master token is deprecated
TestTokenUpdateCommand - 2019/11/27 02:22:09.308592 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenUpdateCommand - 2019/11/27 02:22:10.785719 [INFO] consul: Created ACL anonymous token from configuration
TestTokenUpdateCommand - 2019/11/27 02:22:10.787629 [INFO] consul: Bootstrapped ACL master token from configuration
TestTokenUpdateCommand - 2019/11/27 02:22:10.787819 [DEBUG] acl: transitioning out of legacy ACL mode
TestTokenUpdateCommand - 2019/11/27 02:22:10.799931 [INFO] serf: EventMemberUpdate: Node b1348f07-eb63-1847-5665-79ed8482e9f0
TestTokenUpdateCommand - 2019/11/27 02:22:10.800659 [INFO] serf: EventMemberUpdate: Node b1348f07-eb63-1847-5665-79ed8482e9f0.dc1
TestTokenUpdateCommand - 2019/11/27 02:22:10.802841 [INFO] serf: EventMemberUpdate: Node b1348f07-eb63-1847-5665-79ed8482e9f0
TestTokenUpdateCommand - 2019/11/27 02:22:10.803600 [INFO] serf: EventMemberUpdate: Node b1348f07-eb63-1847-5665-79ed8482e9f0.dc1
TestTokenUpdateCommand - 2019/11/27 02:22:11.963338 [INFO] agent: Synced node info
TestTokenUpdateCommand - 2019/11/27 02:22:11.963453 [DEBUG] agent: Node info in sync
TestTokenUpdateCommand - 2019/11/27 02:22:12.548484 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestTokenUpdateCommand - 2019/11/27 02:22:12.549912 [DEBUG] consul: Skipping self join check for "Node b1348f07-eb63-1847-5665-79ed8482e9f0" since the cluster is too small
TestTokenUpdateCommand - 2019/11/27 02:22:12.550111 [INFO] consul: member 'Node b1348f07-eb63-1847-5665-79ed8482e9f0' joined, marking health alive
TestTokenUpdateCommand - 2019/11/27 02:22:12.559633 [DEBUG] http: Request PUT /v1/acl/policy (571.619612ms) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:13.153915 [DEBUG] consul: Skipping self join check for "Node b1348f07-eb63-1847-5665-79ed8482e9f0" since the cluster is too small
TestTokenUpdateCommand - 2019/11/27 02:22:13.154367 [DEBUG] consul: Skipping self join check for "Node b1348f07-eb63-1847-5665-79ed8482e9f0" since the cluster is too small
TestTokenUpdateCommand - 2019/11/27 02:22:13.156238 [DEBUG] http: Request PUT /v1/acl/token (581.781644ms) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:13.352220 [DEBUG] http: Request PUT /v1/acl/create (191.587241ms) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:13.362877 [DEBUG] http: Request GET /v1/acl/token/52c9a943-7511-785c-2131-48694fc0da23 (4.923511ms) from=127.0.0.1:44864
TestTokenUpdateCommand - 2019/11/27 02:22:13.609327 [DEBUG] http: Request PUT /v1/acl/token/52c9a943-7511-785c-2131-48694fc0da23 (233.029735ms) from=127.0.0.1:44864
TestTokenUpdateCommand - 2019/11/27 02:22:13.614247 [DEBUG] http: Request GET /v1/acl/token/52c9a943-7511-785c-2131-48694fc0da23 (1.406051ms) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:13.621771 [DEBUG] http: Request GET /v1/acl/token/52c9a943-7511-785c-2131-48694fc0da23 (1.289047ms) from=127.0.0.1:44866
TestTokenUpdateCommand - 2019/11/27 02:22:13.851532 [DEBUG] http: Request PUT /v1/acl/token/52c9a943-7511-785c-2131-48694fc0da23 (226.61517ms) from=127.0.0.1:44866
TestTokenUpdateCommand - 2019/11/27 02:22:13.857300 [DEBUG] http: Request GET /v1/acl/token/52c9a943-7511-785c-2131-48694fc0da23 (1.222711ms) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:13.867853 [DEBUG] http: Request GET /v1/acl/token/52c9a943-7511-785c-2131-48694fc0da23 (2.477422ms) from=127.0.0.1:44868
TestTokenUpdateCommand - 2019/11/27 02:22:14.120318 [DEBUG] http: Request PUT /v1/acl/token/52c9a943-7511-785c-2131-48694fc0da23 (246.234877ms) from=127.0.0.1:44868
TestTokenUpdateCommand - 2019/11/27 02:22:14.130204 [DEBUG] http: Request GET /v1/acl/token/52c9a943-7511-785c-2131-48694fc0da23 (1.221711ms) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.135582 [DEBUG] http: Request GET /v1/acl/token/self (1.015037ms) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.164811 [DEBUG] http: Request GET /v1/acl/token/self (653.024µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.194421 [DEBUG] http: Request GET /v1/acl/token/self (914.033µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.223558 [DEBUG] http: Request GET /v1/acl/token/self (735.36µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.252824 [DEBUG] http: Request GET /v1/acl/token/self (902.699µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.282450 [DEBUG] http: Request GET /v1/acl/token/self (1.161709ms) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.313392 [DEBUG] http: Request GET /v1/acl/token/self (820.696µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.344258 [DEBUG] http: Request GET /v1/acl/token/self (830.363µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.373842 [DEBUG] http: Request GET /v1/acl/token/self (841.364µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.403105 [DEBUG] http: Request GET /v1/acl/token/self (825.363µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.433506 [DEBUG] http: Request GET /v1/acl/token/self (846.364µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.462650 [DEBUG] http: Request GET /v1/acl/token/self (747.36µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.491812 [DEBUG] http: Request GET /v1/acl/token/self (783.028µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.520640 [DEBUG] http: Request GET /v1/acl/token/self (720.693µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.528537 [DEBUG] http: Request GET /v1/acl/token/b19a86d1-f847-7e9d-ba26-cdedf08ed00c (1.176042ms) from=127.0.0.1:44870
TestTokenUpdateCommand - 2019/11/27 02:22:14.807247 [DEBUG] http: Request PUT /v1/acl/token/b19a86d1-f847-7e9d-ba26-cdedf08ed00c (276.013949ms) from=127.0.0.1:44870
TestTokenUpdateCommand - 2019/11/27 02:22:14.811184 [DEBUG] http: Request GET /v1/acl/token/b19a86d1-f847-7e9d-ba26-cdedf08ed00c (985.369µs) from=127.0.0.1:44862
TestTokenUpdateCommand - 2019/11/27 02:22:14.813648 [INFO] agent: Requesting shutdown
TestTokenUpdateCommand - 2019/11/27 02:22:14.813761 [INFO] consul: shutting down server
TestTokenUpdateCommand - 2019/11/27 02:22:14.813817 [WARN] serf: Shutdown without a Leave
TestTokenUpdateCommand - 2019/11/27 02:22:14.994242 [WARN] serf: Shutdown without a Leave
TestTokenUpdateCommand - 2019/11/27 02:22:15.083335 [INFO] manager: shutting down
TestTokenUpdateCommand - 2019/11/27 02:22:15.084373 [INFO] agent: consul server down
TestTokenUpdateCommand - 2019/11/27 02:22:15.084448 [INFO] agent: shutdown complete
TestTokenUpdateCommand - 2019/11/27 02:22:15.084552 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestTokenUpdateCommand - 2019/11/27 02:22:15.084806 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestTokenUpdateCommand - 2019/11/27 02:22:15.085010 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestTokenUpdateCommand - 2019/11/27 02:22:15.086189 [INFO] agent: Waiting for endpoints to shut down
TestTokenUpdateCommand - 2019/11/27 02:22:15.086447 [INFO] agent: Endpoints down
--- PASS: TestTokenUpdateCommand (11.15s)
PASS
ok  	github.com/hashicorp/consul/command/acl/token/update	11.325s
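Note: TestTokenUpdateCommand follows the same pattern but repeatedly PUTs to /v1/acl/token/<accessor-id> and polls GET /v1/acl/token/self, as the request lines above show. A rough sketch of such an update, assuming a local agent; the accessor ID is a placeholder copied from the log and the request body is illustrative only:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	const agent = "http://127.0.0.1:8500"
	const accessor = "52c9a943-7511-785c-2131-48694fc0da23" // placeholder from the log

	// Illustrative update payload; only a description change.
	payload, _ := json.Marshal(map[string]interface{}{
		"Description": "token updated by hand",
	})

	req, err := http.NewRequest("PUT", agent+"/v1/acl/token/"+accessor, bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	req.Header.Set("X-Consul-Token", "root") // placeholder operator token
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("update:", resp.Status)
}
```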
=== RUN   TestConfigFail
=== PAUSE TestConfigFail
=== RUN   TestRetryJoin
--- SKIP: TestRetryJoin (0.00s)
    agent_test.go:85: DM-skipped
=== RUN   TestRetryJoinFail
=== PAUSE TestRetryJoinFail
=== RUN   TestRetryJoinWanFail
=== PAUSE TestRetryJoinWanFail
=== RUN   TestProtectDataDir
=== PAUSE TestProtectDataDir
=== RUN   TestBadDataDirPermissions
=== PAUSE TestBadDataDirPermissions
=== CONT  TestConfigFail
=== CONT  TestRetryJoinWanFail
=== RUN   TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter=
=== CONT  TestRetryJoinFail
=== CONT  TestBadDataDirPermissions
--- PASS: TestBadDataDirPermissions (0.12s)
=== CONT  TestProtectDataDir
--- PASS: TestProtectDataDir (0.08s)
--- PASS: TestRetryJoinFail (0.60s)
--- PASS: TestRetryJoinWanFail (3.07s)
=== RUN   TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter=foo_some-other-arg
=== RUN   TestConfigFail/agent_-server_-bind=10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise_0.0.0.0_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise_::_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise_[::]_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise-wan_0.0.0.0_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise-wan_::_-bind_10.0.0.1
=== RUN   TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise-wan_[::]_-bind_10.0.0.1
--- PASS: TestConfigFail (5.86s)
    --- PASS: TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter= (3.12s)
    --- PASS: TestConfigFail/agent_-server_-bind=10.0.0.1_-datacenter=foo_some-other-arg (0.31s)
    --- PASS: TestConfigFail/agent_-server_-bind=10.0.0.1 (0.38s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise_0.0.0.0_-bind_10.0.0.1 (0.35s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise_::_-bind_10.0.0.1 (0.35s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise_[::]_-bind_10.0.0.1 (0.36s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise-wan_0.0.0.0_-bind_10.0.0.1 (0.33s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise-wan_::_-bind_10.0.0.1 (0.37s)
    --- PASS: TestConfigFail/agent_-server_-data-dir_/tmp/consul-test/TestConfigFail-consul816907490_-advertise-wan_[::]_-bind_10.0.0.1 (0.28s)
PASS
ok  	github.com/hashicorp/consul/command/agent	6.020s
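Note: TestConfigFail above is a table test whose subtest names are the command lines the agent must refuse to start with (for example `agent -server -bind=10.0.0.1 -datacenter=`). A rough way to reproduce one case by hand, assuming a `consul` binary on PATH; the exact exit status and error wording may differ between versions:

```go
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// One of the invalid flag sets exercised above: an empty -datacenter value.
	cmd := exec.Command("consul", "agent", "-server", "-bind=10.0.0.1", "-datacenter=")
	out, err := cmd.CombinedOutput()

	// The agent is expected to exit non-zero and print a validation error.
	fmt.Printf("exited with error: %v\n", err != nil)
	fmt.Println(string(out))
}
```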
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/catalog	0.045s
=== RUN   TestCatalogListDatacentersCommand_noTabs
=== PAUSE TestCatalogListDatacentersCommand_noTabs
=== RUN   TestCatalogListDatacentersCommand_Validation
=== PAUSE TestCatalogListDatacentersCommand_Validation
=== RUN   TestCatalogListDatacentersCommand
=== PAUSE TestCatalogListDatacentersCommand
=== CONT  TestCatalogListDatacentersCommand_noTabs
=== CONT  TestCatalogListDatacentersCommand
--- PASS: TestCatalogListDatacentersCommand_noTabs (0.00s)
=== CONT  TestCatalogListDatacentersCommand_Validation
--- PASS: TestCatalogListDatacentersCommand_Validation (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogListDatacentersCommand - 2019/11/27 02:22:15.520358 [WARN] agent: Node name "Node 60f6f106-7a45-4168-c987-508788fd93c5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogListDatacentersCommand - 2019/11/27 02:22:15.521301 [DEBUG] tlsutil: Update with version 1
TestCatalogListDatacentersCommand - 2019/11/27 02:22:15.521372 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogListDatacentersCommand - 2019/11/27 02:22:15.521612 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCatalogListDatacentersCommand - 2019/11/27 02:22:15.521948 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:22:16 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:60f6f106-7a45-4168-c987-508788fd93c5 Address:127.0.0.1:10006}]
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestCatalogListDatacentersCommand - 2019/11/27 02:22:16.545328 [INFO] serf: EventMemberJoin: Node 60f6f106-7a45-4168-c987-508788fd93c5.dc1 127.0.0.1
TestCatalogListDatacentersCommand - 2019/11/27 02:22:16.551618 [INFO] serf: EventMemberJoin: Node 60f6f106-7a45-4168-c987-508788fd93c5 127.0.0.1
TestCatalogListDatacentersCommand - 2019/11/27 02:22:16.553101 [INFO] consul: Handled member-join event for server "Node 60f6f106-7a45-4168-c987-508788fd93c5.dc1" in area "wan"
TestCatalogListDatacentersCommand - 2019/11/27 02:22:16.553204 [INFO] consul: Adding LAN server Node 60f6f106-7a45-4168-c987-508788fd93c5 (Addr: tcp/127.0.0.1:10006) (DC: dc1)
TestCatalogListDatacentersCommand - 2019/11/27 02:22:16.553862 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestCatalogListDatacentersCommand - 2019/11/27 02:22:16.553955 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestCatalogListDatacentersCommand - 2019/11/27 02:22:16.556089 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestCatalogListDatacentersCommand - 2019/11/27 02:22:16.556251 [INFO] agent: started state syncer
2019/11/27 02:22:16 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:16 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:17 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:17 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.174093 [INFO] consul: cluster leadership acquired
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.174684 [INFO] consul: New leader elected: Node 60f6f106-7a45-4168-c987-508788fd93c5
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.458888 [DEBUG] http: Request GET /v1/catalog/datacenters (1.233711ms) from=127.0.0.1:35646
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.460125 [INFO] agent: Requesting shutdown
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.460221 [INFO] consul: shutting down server
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.460270 [WARN] serf: Shutdown without a Leave
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.587951 [WARN] serf: Shutdown without a Leave
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.588986 [INFO] agent: Synced node info
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.589133 [DEBUG] agent: Node info in sync
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.675084 [INFO] manager: shutting down
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.750028 [INFO] agent: consul server down
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.750141 [INFO] agent: shutdown complete
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.750209 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.750436 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.750651 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.751209 [INFO] agent: Waiting for endpoints to shut down
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.751326 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCatalogListDatacentersCommand - 2019/11/27 02:22:17.751562 [INFO] agent: Endpoints down
--- PASS: TestCatalogListDatacentersCommand (2.33s)
PASS
ok  	github.com/hashicorp/consul/command/catalog/list/dc	2.517s
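Note: the catalog list/dc test issues a single GET /v1/catalog/datacenters, which returns a JSON array of datacenter names (here just "dc1"). A small sketch against a local agent, address assumed:

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	resp, err := http.Get("http://127.0.0.1:8500/v1/catalog/datacenters")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// The endpoint returns a plain JSON array of datacenter names, e.g. ["dc1"].
	var dcs []string
	if err := json.NewDecoder(resp.Body).Decode(&dcs); err != nil {
		panic(err)
	}
	fmt.Println(dcs)
}
```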
=== RUN   TestCatalogListNodesCommand_noTabs
=== PAUSE TestCatalogListNodesCommand_noTabs
=== RUN   TestCatalogListNodesCommand_Validation
=== PAUSE TestCatalogListNodesCommand_Validation
=== RUN   TestCatalogListNodesCommand
=== PAUSE TestCatalogListNodesCommand
=== CONT  TestCatalogListNodesCommand_noTabs
=== CONT  TestCatalogListNodesCommand
=== CONT  TestCatalogListNodesCommand_Validation
--- PASS: TestCatalogListNodesCommand_noTabs (0.00s)
--- PASS: TestCatalogListNodesCommand_Validation (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogListNodesCommand - 2019/11/27 02:22:48.509428 [WARN] agent: Node name "Node afea5725-f563-09f1-01cc-ef45f9f49d14" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogListNodesCommand - 2019/11/27 02:22:48.510402 [DEBUG] tlsutil: Update with version 1
TestCatalogListNodesCommand - 2019/11/27 02:22:48.510478 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogListNodesCommand - 2019/11/27 02:22:48.510698 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCatalogListNodesCommand - 2019/11/27 02:22:48.510820 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:22:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:afea5725-f563-09f1-01cc-ef45f9f49d14 Address:127.0.0.1:22006}]
2019/11/27 02:22:49 [INFO]  raft: Node at 127.0.0.1:22006 [Follower] entering Follower state (Leader: "")
TestCatalogListNodesCommand - 2019/11/27 02:22:49.232083 [INFO] serf: EventMemberJoin: Node afea5725-f563-09f1-01cc-ef45f9f49d14.dc1 127.0.0.1
TestCatalogListNodesCommand - 2019/11/27 02:22:49.238045 [INFO] serf: EventMemberJoin: Node afea5725-f563-09f1-01cc-ef45f9f49d14 127.0.0.1
TestCatalogListNodesCommand - 2019/11/27 02:22:49.240406 [INFO] consul: Adding LAN server Node afea5725-f563-09f1-01cc-ef45f9f49d14 (Addr: tcp/127.0.0.1:22006) (DC: dc1)
TestCatalogListNodesCommand - 2019/11/27 02:22:49.242510 [INFO] agent: Started DNS server 127.0.0.1:22001 (udp)
TestCatalogListNodesCommand - 2019/11/27 02:22:49.242799 [INFO] consul: Handled member-join event for server "Node afea5725-f563-09f1-01cc-ef45f9f49d14.dc1" in area "wan"
TestCatalogListNodesCommand - 2019/11/27 02:22:49.243290 [INFO] agent: Started DNS server 127.0.0.1:22001 (tcp)
TestCatalogListNodesCommand - 2019/11/27 02:22:49.245234 [INFO] agent: Started HTTP server on 127.0.0.1:22002 (tcp)
TestCatalogListNodesCommand - 2019/11/27 02:22:49.245384 [INFO] agent: started state syncer
2019/11/27 02:22:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:49 [INFO]  raft: Node at 127.0.0.1:22006 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:49 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:49 [INFO]  raft: Node at 127.0.0.1:22006 [Leader] entering Leader state
TestCatalogListNodesCommand - 2019/11/27 02:22:49.749805 [INFO] consul: cluster leadership acquired
TestCatalogListNodesCommand - 2019/11/27 02:22:49.750352 [INFO] consul: New leader elected: Node afea5725-f563-09f1-01cc-ef45f9f49d14
TestCatalogListNodesCommand - 2019/11/27 02:22:50.037585 [INFO] agent: Synced node info
TestCatalogListNodesCommand - 2019/11/27 02:22:50.568211 [DEBUG] agent: Node info in sync
TestCatalogListNodesCommand - 2019/11/27 02:22:50.568340 [DEBUG] agent: Node info in sync
TestCatalogListNodesCommand - 2019/11/27 02:22:50.892938 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogListNodesCommand - 2019/11/27 02:22:50.893559 [DEBUG] consul: Skipping self join check for "Node afea5725-f563-09f1-01cc-ef45f9f49d14" since the cluster is too small
TestCatalogListNodesCommand - 2019/11/27 02:22:50.893733 [INFO] consul: member 'Node afea5725-f563-09f1-01cc-ef45f9f49d14' joined, marking health alive
=== RUN   TestCatalogListNodesCommand/simple
TestCatalogListNodesCommand - 2019/11/27 02:22:51.096996 [DEBUG] http: Request GET /v1/catalog/nodes (2.79777ms) from=127.0.0.1:36442
=== RUN   TestCatalogListNodesCommand/detailed
TestCatalogListNodesCommand - 2019/11/27 02:22:51.117418 [DEBUG] http: Request GET /v1/catalog/nodes (1.922404ms) from=127.0.0.1:36444
=== RUN   TestCatalogListNodesCommand/node-meta
TestCatalogListNodesCommand - 2019/11/27 02:22:51.161882 [DEBUG] http: Request GET /v1/catalog/nodes?node-meta=foo%3Abar (1.422719ms) from=127.0.0.1:36446
=== RUN   TestCatalogListNodesCommand/near
TestCatalogListNodesCommand - 2019/11/27 02:22:51.168679 [DEBUG] http: Request GET /v1/catalog/nodes?near=_agent (1.143376ms) from=127.0.0.1:36448
=== RUN   TestCatalogListNodesCommand/service_present
TestCatalogListNodesCommand - 2019/11/27 02:22:51.179140 [DEBUG] http: Request GET /v1/catalog/service/consul (3.170451ms) from=127.0.0.1:36450
=== RUN   TestCatalogListNodesCommand/service_missing
TestCatalogListNodesCommand - 2019/11/27 02:22:51.191137 [DEBUG] http: Request GET /v1/catalog/service/this-service-will-literally-never-exist (1.476722ms) from=127.0.0.1:36452
TestCatalogListNodesCommand - 2019/11/27 02:22:51.199350 [INFO] agent: Requesting shutdown
TestCatalogListNodesCommand - 2019/11/27 02:22:51.199459 [INFO] consul: shutting down server
TestCatalogListNodesCommand - 2019/11/27 02:22:51.199518 [WARN] serf: Shutdown without a Leave
TestCatalogListNodesCommand - 2019/11/27 02:22:51.327633 [WARN] serf: Shutdown without a Leave
TestCatalogListNodesCommand - 2019/11/27 02:22:51.458896 [INFO] manager: shutting down
TestCatalogListNodesCommand - 2019/11/27 02:22:51.459474 [INFO] agent: consul server down
TestCatalogListNodesCommand - 2019/11/27 02:22:51.459569 [INFO] agent: shutdown complete
TestCatalogListNodesCommand - 2019/11/27 02:22:51.459637 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (tcp)
TestCatalogListNodesCommand - 2019/11/27 02:22:51.459829 [INFO] agent: Stopping DNS server 127.0.0.1:22001 (udp)
TestCatalogListNodesCommand - 2019/11/27 02:22:51.460008 [INFO] agent: Stopping HTTP server 127.0.0.1:22002 (tcp)
TestCatalogListNodesCommand - 2019/11/27 02:22:51.461340 [INFO] agent: Waiting for endpoints to shut down
TestCatalogListNodesCommand - 2019/11/27 02:22:51.461533 [INFO] agent: Endpoints down
--- PASS: TestCatalogListNodesCommand (3.06s)
    --- PASS: TestCatalogListNodesCommand/simple (0.02s)
    --- PASS: TestCatalogListNodesCommand/detailed (0.01s)
    --- PASS: TestCatalogListNodesCommand/node-meta (0.04s)
    --- PASS: TestCatalogListNodesCommand/near (0.01s)
    --- PASS: TestCatalogListNodesCommand/service_present (0.01s)
    --- PASS: TestCatalogListNodesCommand/service_missing (0.02s)
PASS
ok  	github.com/hashicorp/consul/command/catalog/list/nodes	3.252s
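Note: the list/nodes subtests exercise the filters visible in the request lines above — plain GET /v1/catalog/nodes, ?node-meta=foo:bar and ?near=_agent. A sketch that builds the same query string (agent address assumed; only the node name and address fields are decoded):

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
)

func main() {
	// Same filters as the subtests above: node-meta and near.
	q := url.Values{}
	q.Set("node-meta", "foo:bar")
	q.Set("near", "_agent")

	resp, err := http.Get("http://127.0.0.1:8500/v1/catalog/nodes?" + q.Encode())
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Each entry carries at least a node name and address.
	var nodes []struct {
		Node    string
		Address string
	}
	if err := json.NewDecoder(resp.Body).Decode(&nodes); err != nil {
		panic(err)
	}
	for _, n := range nodes {
		fmt.Println(n.Node, n.Address)
	}
}
```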
=== RUN   TestCatalogListServicesCommand_noTabs
=== PAUSE TestCatalogListServicesCommand_noTabs
=== RUN   TestCatalogListServicesCommand_Validation
=== PAUSE TestCatalogListServicesCommand_Validation
=== RUN   TestCatalogListServicesCommand
=== PAUSE TestCatalogListServicesCommand
=== CONT  TestCatalogListServicesCommand_noTabs
=== CONT  TestCatalogListServicesCommand
=== CONT  TestCatalogListServicesCommand_Validation
--- PASS: TestCatalogListServicesCommand_Validation (0.00s)
--- PASS: TestCatalogListServicesCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestCatalogListServicesCommand - 2019/11/27 02:22:51.010892 [WARN] agent: Node name "Node 0f01d4b4-f163-8221-22ff-fdd47d2d26e2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCatalogListServicesCommand - 2019/11/27 02:22:51.011846 [DEBUG] tlsutil: Update with version 1
TestCatalogListServicesCommand - 2019/11/27 02:22:51.011922 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCatalogListServicesCommand - 2019/11/27 02:22:51.012140 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCatalogListServicesCommand - 2019/11/27 02:22:51.012278 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:22:51 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0f01d4b4-f163-8221-22ff-fdd47d2d26e2 Address:127.0.0.1:52006}]
2019/11/27 02:22:51 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
TestCatalogListServicesCommand - 2019/11/27 02:22:51.908638 [INFO] serf: EventMemberJoin: Node 0f01d4b4-f163-8221-22ff-fdd47d2d26e2.dc1 127.0.0.1
TestCatalogListServicesCommand - 2019/11/27 02:22:51.920247 [INFO] serf: EventMemberJoin: Node 0f01d4b4-f163-8221-22ff-fdd47d2d26e2 127.0.0.1
TestCatalogListServicesCommand - 2019/11/27 02:22:51.922492 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestCatalogListServicesCommand - 2019/11/27 02:22:51.922585 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
TestCatalogListServicesCommand - 2019/11/27 02:22:51.922656 [INFO] consul: Handled member-join event for server "Node 0f01d4b4-f163-8221-22ff-fdd47d2d26e2.dc1" in area "wan"
TestCatalogListServicesCommand - 2019/11/27 02:22:51.922826 [INFO] consul: Adding LAN server Node 0f01d4b4-f163-8221-22ff-fdd47d2d26e2 (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestCatalogListServicesCommand - 2019/11/27 02:22:51.924741 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestCatalogListServicesCommand - 2019/11/27 02:22:51.924913 [INFO] agent: started state syncer
2019/11/27 02:22:51 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:22:51 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
2019/11/27 02:22:52 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:22:52 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
TestCatalogListServicesCommand - 2019/11/27 02:22:52.392646 [INFO] consul: cluster leadership acquired
TestCatalogListServicesCommand - 2019/11/27 02:22:52.393348 [INFO] consul: New leader elected: Node 0f01d4b4-f163-8221-22ff-fdd47d2d26e2
TestCatalogListServicesCommand - 2019/11/27 02:22:52.695012 [INFO] agent: Synced node info
TestCatalogListServicesCommand - 2019/11/27 02:22:52.695151 [DEBUG] agent: Node info in sync
TestCatalogListServicesCommand - 2019/11/27 02:22:53.559705 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCatalogListServicesCommand - 2019/11/27 02:22:53.560366 [DEBUG] consul: Skipping self join check for "Node 0f01d4b4-f163-8221-22ff-fdd47d2d26e2" since the cluster is too small
TestCatalogListServicesCommand - 2019/11/27 02:22:53.560556 [INFO] consul: member 'Node 0f01d4b4-f163-8221-22ff-fdd47d2d26e2' joined, marking health alive
TestCatalogListServicesCommand - 2019/11/27 02:22:53.937509 [INFO] agent: Synced service "testing"
TestCatalogListServicesCommand - 2019/11/27 02:22:53.937592 [DEBUG] agent: Node info in sync
TestCatalogListServicesCommand - 2019/11/27 02:22:53.937676 [DEBUG] http: Request PUT /v1/agent/service/register (212.885543ms) from=127.0.0.1:49012
=== RUN   TestCatalogListServicesCommand/simple
TestCatalogListServicesCommand - 2019/11/27 02:22:53.950299 [DEBUG] http: Request GET /v1/catalog/services (2.232416ms) from=127.0.0.1:49014
=== RUN   TestCatalogListServicesCommand/tags
TestCatalogListServicesCommand - 2019/11/27 02:22:53.967469 [DEBUG] http: Request GET /v1/catalog/services (1.245379ms) from=127.0.0.1:49016
=== RUN   TestCatalogListServicesCommand/node_missing
TestCatalogListServicesCommand - 2019/11/27 02:22:53.977662 [DEBUG] http: Request GET /v1/catalog/node/not-a-real-node (2.16608ms) from=127.0.0.1:49018
=== RUN   TestCatalogListServicesCommand/node_present
TestCatalogListServicesCommand - 2019/11/27 02:22:53.986501 [DEBUG] http: Request GET /v1/catalog/node/Node%200f01d4b4-f163-8221-22ff-fdd47d2d26e2 (1.075706ms) from=127.0.0.1:49020
=== RUN   TestCatalogListServicesCommand/node-meta
TestCatalogListServicesCommand - 2019/11/27 02:22:54.014108 [DEBUG] http: Request GET /v1/catalog/services?node-meta=foo%3Abar (1.539724ms) from=127.0.0.1:49022
TestCatalogListServicesCommand - 2019/11/27 02:22:54.020030 [INFO] agent: Requesting shutdown
TestCatalogListServicesCommand - 2019/11/27 02:22:54.020155 [INFO] consul: shutting down server
TestCatalogListServicesCommand - 2019/11/27 02:22:54.020214 [WARN] serf: Shutdown without a Leave
TestCatalogListServicesCommand - 2019/11/27 02:22:54.069927 [WARN] serf: Shutdown without a Leave
TestCatalogListServicesCommand - 2019/11/27 02:22:54.125491 [INFO] manager: shutting down
TestCatalogListServicesCommand - 2019/11/27 02:22:54.125940 [INFO] agent: consul server down
TestCatalogListServicesCommand - 2019/11/27 02:22:54.125988 [INFO] agent: shutdown complete
TestCatalogListServicesCommand - 2019/11/27 02:22:54.126038 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestCatalogListServicesCommand - 2019/11/27 02:22:54.126163 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestCatalogListServicesCommand - 2019/11/27 02:22:54.126298 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestCatalogListServicesCommand - 2019/11/27 02:22:54.127483 [INFO] agent: Waiting for endpoints to shut down
TestCatalogListServicesCommand - 2019/11/27 02:22:54.127688 [INFO] agent: Endpoints down
--- PASS: TestCatalogListServicesCommand (3.21s)
    --- PASS: TestCatalogListServicesCommand/simple (0.02s)
    --- PASS: TestCatalogListServicesCommand/tags (0.01s)
    --- PASS: TestCatalogListServicesCommand/node_missing (0.01s)
    --- PASS: TestCatalogListServicesCommand/node_present (0.01s)
    --- PASS: TestCatalogListServicesCommand/node-meta (0.03s)
PASS
ok  	github.com/hashicorp/consul/command/catalog/list/services	3.360s
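Note: list/services registers a service called "testing" and then queries GET /v1/catalog/services (optionally with ?node-meta=...) and GET /v1/catalog/node/<name>. The services endpoint returns a map of service name to tag list; a sketch, agent address assumed:

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	resp, err := http.Get("http://127.0.0.1:8500/v1/catalog/services")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Response shape: {"consul": [], "testing": [...tags...]}
	var services map[string][]string
	if err := json.NewDecoder(resp.Body).Decode(&services); err != nil {
		panic(err)
	}
	for name, tags := range services {
		fmt.Println(name, tags)
	}
}
```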
=== RUN   TestConnectCommand_noTabs
=== PAUSE TestConnectCommand_noTabs
=== CONT  TestConnectCommand_noTabs
--- PASS: TestConnectCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect	0.041s
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect/ca	0.105s
=== RUN   TestConnectCAGetConfigCommand_noTabs
=== PAUSE TestConnectCAGetConfigCommand_noTabs
=== RUN   TestConnectCAGetConfigCommand
=== PAUSE TestConnectCAGetConfigCommand
=== CONT  TestConnectCAGetConfigCommand_noTabs
=== CONT  TestConnectCAGetConfigCommand
--- PASS: TestConnectCAGetConfigCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCAGetConfigCommand - 2019/11/27 02:23:08.376784 [WARN] agent: Node name "Node eae2e56d-34dc-4263-bf2f-510b41fa5afb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCAGetConfigCommand - 2019/11/27 02:23:08.378538 [DEBUG] tlsutil: Update with version 1
TestConnectCAGetConfigCommand - 2019/11/27 02:23:08.378623 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConnectCAGetConfigCommand - 2019/11/27 02:23:08.378842 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestConnectCAGetConfigCommand - 2019/11/27 02:23:08.378950 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:23:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:eae2e56d-34dc-4263-bf2f-510b41fa5afb Address:127.0.0.1:26506}]
2019/11/27 02:23:09 [INFO]  raft: Node at 127.0.0.1:26506 [Follower] entering Follower state (Leader: "")
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.204001 [INFO] serf: EventMemberJoin: Node eae2e56d-34dc-4263-bf2f-510b41fa5afb.dc1 127.0.0.1
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.220118 [INFO] serf: EventMemberJoin: Node eae2e56d-34dc-4263-bf2f-510b41fa5afb 127.0.0.1
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.222210 [INFO] agent: Started DNS server 127.0.0.1:26501 (udp)
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.223468 [INFO] consul: Adding LAN server Node eae2e56d-34dc-4263-bf2f-510b41fa5afb (Addr: tcp/127.0.0.1:26506) (DC: dc1)
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.223731 [INFO] consul: Handled member-join event for server "Node eae2e56d-34dc-4263-bf2f-510b41fa5afb.dc1" in area "wan"
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.224312 [INFO] agent: Started DNS server 127.0.0.1:26501 (tcp)
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.226228 [INFO] agent: Started HTTP server on 127.0.0.1:26502 (tcp)
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.226358 [INFO] agent: started state syncer
2019/11/27 02:23:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:09 [INFO]  raft: Node at 127.0.0.1:26506 [Candidate] entering Candidate state in term 2
2019/11/27 02:23:09 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:09 [INFO]  raft: Node at 127.0.0.1:26506 [Leader] entering Leader state
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.660662 [INFO] consul: cluster leadership acquired
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.661559 [INFO] consul: New leader elected: Node eae2e56d-34dc-4263-bf2f-510b41fa5afb
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.925587 [INFO] agent: Synced node info
TestConnectCAGetConfigCommand - 2019/11/27 02:23:09.925709 [DEBUG] agent: Node info in sync
TestConnectCAGetConfigCommand - 2019/11/27 02:23:11.205232 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectCAGetConfigCommand - 2019/11/27 02:23:11.206798 [DEBUG] consul: Skipping self join check for "Node eae2e56d-34dc-4263-bf2f-510b41fa5afb" since the cluster is too small
TestConnectCAGetConfigCommand - 2019/11/27 02:23:11.206969 [INFO] consul: member 'Node eae2e56d-34dc-4263-bf2f-510b41fa5afb' joined, marking health alive
TestConnectCAGetConfigCommand - 2019/11/27 02:23:11.892983 [DEBUG] http: Request GET /v1/connect/ca/configuration (11.236081ms) from=127.0.0.1:48976
TestConnectCAGetConfigCommand - 2019/11/27 02:23:11.897445 [INFO] agent: Requesting shutdown
TestConnectCAGetConfigCommand - 2019/11/27 02:23:11.898367 [INFO] consul: shutting down server
TestConnectCAGetConfigCommand - 2019/11/27 02:23:11.898440 [WARN] serf: Shutdown without a Leave
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.076879 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.076957 [DEBUG] agent: Node info in sync
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.279681 [WARN] serf: Shutdown without a Leave
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.373640 [INFO] manager: shutting down
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.374027 [INFO] agent: consul server down
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.374081 [INFO] agent: shutdown complete
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.374133 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (tcp)
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.374274 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (udp)
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.374429 [INFO] agent: Stopping HTTP server 127.0.0.1:26502 (tcp)
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.374928 [INFO] agent: Waiting for endpoints to shut down
TestConnectCAGetConfigCommand - 2019/11/27 02:23:12.375110 [INFO] agent: Endpoints down
--- PASS: TestConnectCAGetConfigCommand (4.21s)
PASS
ok  	github.com/hashicorp/consul/command/connect/ca/get	4.432s
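Note: the connect ca get-config test reads the current CA settings with GET /v1/connect/ca/configuration; the log above shows the built-in "consul" provider being initialized first. A sketch of the same read; the address and token are placeholders:

```go
package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
)

func main() {
	req, err := http.NewRequest("GET", "http://127.0.0.1:8500/v1/connect/ca/configuration", nil)
	if err != nil {
		panic(err)
	}
	req.Header.Set("X-Consul-Token", "root") // placeholder operator token

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// The body includes the active provider (e.g. "consul") and its config.
	body, _ := ioutil.ReadAll(resp.Body)
	fmt.Println(string(body))
}
```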
=== RUN   TestConnectCASetConfigCommand_noTabs
=== PAUSE TestConnectCASetConfigCommand_noTabs
=== RUN   TestConnectCASetConfigCommand
=== PAUSE TestConnectCASetConfigCommand
=== CONT  TestConnectCASetConfigCommand_noTabs
=== CONT  TestConnectCASetConfigCommand
--- PASS: TestConnectCASetConfigCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestConnectCASetConfigCommand - 2019/11/27 02:23:11.462017 [WARN] agent: Node name "Node 727354b1-4dec-3451-8541-768a89b15be0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectCASetConfigCommand - 2019/11/27 02:23:11.462828 [DEBUG] tlsutil: Update with version 1
TestConnectCASetConfigCommand - 2019/11/27 02:23:11.462896 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConnectCASetConfigCommand - 2019/11/27 02:23:11.463115 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestConnectCASetConfigCommand - 2019/11/27 02:23:11.463336 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:23:12 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:727354b1-4dec-3451-8541-768a89b15be0 Address:127.0.0.1:53506}]
2019/11/27 02:23:12 [INFO]  raft: Node at 127.0.0.1:53506 [Follower] entering Follower state (Leader: "")
TestConnectCASetConfigCommand - 2019/11/27 02:23:12.885197 [INFO] serf: EventMemberJoin: Node 727354b1-4dec-3451-8541-768a89b15be0.dc1 127.0.0.1
TestConnectCASetConfigCommand - 2019/11/27 02:23:12.891071 [INFO] serf: EventMemberJoin: Node 727354b1-4dec-3451-8541-768a89b15be0 127.0.0.1
TestConnectCASetConfigCommand - 2019/11/27 02:23:12.894577 [INFO] consul: Adding LAN server Node 727354b1-4dec-3451-8541-768a89b15be0 (Addr: tcp/127.0.0.1:53506) (DC: dc1)
TestConnectCASetConfigCommand - 2019/11/27 02:23:12.896632 [INFO] agent: Started DNS server 127.0.0.1:53501 (udp)
TestConnectCASetConfigCommand - 2019/11/27 02:23:12.907385 [INFO] agent: Started DNS server 127.0.0.1:53501 (tcp)
TestConnectCASetConfigCommand - 2019/11/27 02:23:12.907867 [INFO] consul: Handled member-join event for server "Node 727354b1-4dec-3451-8541-768a89b15be0.dc1" in area "wan"
TestConnectCASetConfigCommand - 2019/11/27 02:23:12.909704 [INFO] agent: Started HTTP server on 127.0.0.1:53502 (tcp)
TestConnectCASetConfigCommand - 2019/11/27 02:23:12.909914 [INFO] agent: started state syncer
2019/11/27 02:23:12 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:12 [INFO]  raft: Node at 127.0.0.1:53506 [Candidate] entering Candidate state in term 2
2019/11/27 02:23:13 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:13 [INFO]  raft: Node at 127.0.0.1:53506 [Leader] entering Leader state
TestConnectCASetConfigCommand - 2019/11/27 02:23:13.748773 [INFO] consul: cluster leadership acquired
TestConnectCASetConfigCommand - 2019/11/27 02:23:13.749378 [INFO] consul: New leader elected: Node 727354b1-4dec-3451-8541-768a89b15be0
TestConnectCASetConfigCommand - 2019/11/27 02:23:14.214209 [INFO] agent: Synced node info
TestConnectCASetConfigCommand - 2019/11/27 02:23:14.214322 [DEBUG] agent: Node info in sync
TestConnectCASetConfigCommand - 2019/11/27 02:23:14.676110 [DEBUG] agent: Node info in sync
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.035932 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.036366 [DEBUG] consul: Skipping self join check for "Node 727354b1-4dec-3451-8541-768a89b15be0" since the cluster is too small
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.036520 [INFO] consul: member 'Node 727354b1-4dec-3451-8541-768a89b15be0' joined, marking health alive
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.380140 [INFO] connect: CA provider config updated
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.380311 [DEBUG] http: Request PUT /v1/connect/ca/configuration (159.808551ms) from=127.0.0.1:42944
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.381753 [INFO] agent: Requesting shutdown
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.381846 [INFO] consul: shutting down server
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.381893 [WARN] serf: Shutdown without a Leave
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.435123 [WARN] serf: Shutdown without a Leave
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.490782 [INFO] manager: shutting down
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.491200 [INFO] agent: consul server down
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.491247 [INFO] agent: shutdown complete
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.491296 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (tcp)
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.491418 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (udp)
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.491558 [INFO] agent: Stopping HTTP server 127.0.0.1:53502 (tcp)
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.492421 [INFO] agent: Waiting for endpoints to shut down
TestConnectCASetConfigCommand - 2019/11/27 02:23:15.492740 [INFO] agent: Endpoints down
--- PASS: TestConnectCASetConfigCommand (4.12s)
PASS
ok  	github.com/hashicorp/consul/command/connect/ca/set	4.266s
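Note: ca set-config is the write side, a PUT to /v1/connect/ca/configuration, which the leader acknowledges above with "CA provider config updated". A minimal sketch; the body below merely names the built-in "consul" provider with an empty Config and is illustrative, not a recommended configuration:

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
)

func main() {
	// Illustrative payload: keep the built-in provider, no custom settings.
	payload := []byte(`{"Provider": "consul", "Config": {}}`)

	req, err := http.NewRequest("PUT", "http://127.0.0.1:8500/v1/connect/ca/configuration",
		bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	req.Header.Set("X-Consul-Token", "root") // placeholder operator token
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status)
}
```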
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== RUN   TestGenerateConfig
=== RUN   TestGenerateConfig/no-args
=== RUN   TestGenerateConfig/defaults
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 8502
            }
          }
        ]
      }
    ]
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/grpc-addr-flag
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 9999
            }
          }
        ]
      }
    ]
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
=== RUN   TestGenerateConfig/grpc-addr-env
{
  "admin": {
    "access_log_path": "/dev/null",
    "address": {
      "socket_address": {
        "address": "127.0.0.1",
        "port_value": 19000
      }
    }
  },
  "node": {
    "cluster": "test-proxy",
    "id": "test-proxy"
  },
  "static_resources": {
    "clusters": [
      {
        "name": "local_agent",
        "connect_timeout": "1s",
        "type": "STATIC",
        "http2_protocol_options": {},
        "hosts": [
          {
            "socket_address": {
              "address": "127.0.0.1",
              "port_value": 9999
            }
          }
        ]
      }
    ]
  },
  "dynamic_resources": {
    "lds_config": { "ads": {} },
    "cds_config": { "ads": {} },
    "ads_config": {
      "api_type": "GRPC",
      "grpc_services": {
        "initial_metadata": [
          {
            "key": "x-consul-token",
            "value": ""
          }
        ],
        "envoy_grpc": {
          "cluster_name": "local_agent"
        }
      }
    }
  }
}
--- PASS: TestGenerateConfig (0.14s)
    --- PASS: TestGenerateConfig/no-args (0.00s)
    --- PASS: TestGenerateConfig/defaults (0.11s)
    --- PASS: TestGenerateConfig/grpc-addr-flag (0.01s)
    --- PASS: TestGenerateConfig/grpc-addr-env (0.02s)
=== RUN   TestExecEnvoy
=== RUN   TestExecEnvoy/default
=== RUN   TestExecEnvoy/hot-restart-epoch
=== RUN   TestExecEnvoy/hot-restart-version
=== RUN   TestExecEnvoy/hot-restart-version#01
=== RUN   TestExecEnvoy/hot-restart-version#02
--- PASS: TestExecEnvoy (1.41s)
    --- PASS: TestExecEnvoy/default (0.29s)
    --- PASS: TestExecEnvoy/hot-restart-epoch (0.23s)
    --- PASS: TestExecEnvoy/hot-restart-version (0.32s)
    --- PASS: TestExecEnvoy/hot-restart-version#01 (0.25s)
    --- PASS: TestExecEnvoy/hot-restart-version#02 (0.32s)
=== RUN   TestHelperProcess
--- PASS: TestHelperProcess (0.00s)
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/connect/envoy	1.659s
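Note: TestGenerateConfig prints the Envoy bootstrap JSON it produces: an admin listener on 127.0.0.1:19000, a static `local_agent` cluster pointing at the agent's gRPC port (8502 in the defaults case, 9999 when overridden by flag or environment), and ADS-based dynamic resources carrying the token in `x-consul-token` metadata. A quick sketch that parses one of those documents and pulls out the cluster address, using only fields shown above:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Only the parts of the bootstrap document that this sketch reads.
type bootstrap struct {
	StaticResources struct {
		Clusters []struct {
			Name  string `json:"name"`
			Hosts []struct {
				SocketAddress struct {
					Address   string `json:"address"`
					PortValue int    `json:"port_value"`
				} `json:"socket_address"`
			} `json:"hosts"`
		} `json:"clusters"`
	} `json:"static_resources"`
}

func main() {
	// Trimmed copy of the "defaults" document printed above.
	doc := []byte(`{
	  "static_resources": {
	    "clusters": [
	      {
	        "name": "local_agent",
	        "hosts": [{"socket_address": {"address": "127.0.0.1", "port_value": 8502}}]
	      }
	    ]
	  }
	}`)

	var b bootstrap
	if err := json.Unmarshal(doc, &b); err != nil {
		panic(err)
	}
	c := b.StaticResources.Clusters[0]
	fmt.Printf("%s -> %s:%d\n", c.Name,
		c.Hosts[0].SocketAddress.Address, c.Hosts[0].SocketAddress.PortValue)
}
```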
=== RUN   TestFlagUpstreams_impl
--- PASS: TestFlagUpstreams_impl (0.00s)
=== RUN   TestFlagUpstreams
=== RUN   TestFlagUpstreams/bad_format
=== RUN   TestFlagUpstreams/port_not_int
=== RUN   TestFlagUpstreams/4_parts
=== RUN   TestFlagUpstreams/single_value
=== RUN   TestFlagUpstreams/single_value_prepared_query
=== RUN   TestFlagUpstreams/invalid_type
=== RUN   TestFlagUpstreams/address_specified
=== RUN   TestFlagUpstreams/repeat_value,_overwrite
--- PASS: TestFlagUpstreams (0.01s)
    --- PASS: TestFlagUpstreams/bad_format (0.00s)
    --- PASS: TestFlagUpstreams/port_not_int (0.00s)
    --- PASS: TestFlagUpstreams/4_parts (0.00s)
    --- PASS: TestFlagUpstreams/single_value (0.00s)
    --- PASS: TestFlagUpstreams/single_value_prepared_query (0.00s)
    --- PASS: TestFlagUpstreams/invalid_type (0.00s)
    --- PASS: TestFlagUpstreams/address_specified (0.00s)
    --- PASS: TestFlagUpstreams/repeat_value,_overwrite (0.00s)
=== RUN   TestCommandConfigWatcher
=== PAUSE TestCommandConfigWatcher
=== RUN   TestCatalogCommand_noTabs
=== PAUSE TestCatalogCommand_noTabs
=== RUN   TestRegisterMonitor_good
=== PAUSE TestRegisterMonitor_good
=== RUN   TestRegisterMonitor_heartbeat
=== PAUSE TestRegisterMonitor_heartbeat
=== CONT  TestCommandConfigWatcher
=== RUN   TestCommandConfigWatcher/-service_flag_only
=== CONT  TestRegisterMonitor_good
=== CONT  TestRegisterMonitor_heartbeat
=== CONT  TestCatalogCommand_noTabs
--- PASS: TestCatalogCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestRegisterMonitor_good - 2019/11/27 02:23:36.368459 [WARN] agent: Node name "Node 3d5c7ea3-3028-e24f-4d98-faacbfc47180" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRegisterMonitor_good - 2019/11/27 02:23:36.369584 [DEBUG] tlsutil: Update with version 1
TestRegisterMonitor_good - 2019/11/27 02:23:36.373118 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_good - 2019/11/27 02:23:36.373622 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestRegisterMonitor_good - 2019/11/27 02:23:36.374523 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:36.413903 [WARN] agent: Node name "Node 054ec3f9-6818-b823-0386-24f5367110ee" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:36.414406 [DEBUG] tlsutil: Update with version 1
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:36.414525 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:36.414728 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:36.414876 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:36.453093 [WARN] agent: Node name "Node 3525d8d7-fb26-168f-d0d9-09074b6b564b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:36.453685 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:36.453807 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:36.453971 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:36.454101 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:23:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3525d8d7-fb26-168f-d0d9-09074b6b564b Address:127.0.0.1:14506}]
2019/11/27 02:23:37 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
2019/11/27 02:23:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3d5c7ea3-3028-e24f-4d98-faacbfc47180 Address:127.0.0.1:14512}]
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.249970 [INFO] serf: EventMemberJoin: Node 3525d8d7-fb26-168f-d0d9-09074b6b564b.dc1 127.0.0.1
2019/11/27 02:23:37 [INFO]  raft: Node at 127.0.0.1:14512 [Follower] entering Follower state (Leader: "")
TestRegisterMonitor_good - 2019/11/27 02:23:37.252309 [INFO] serf: EventMemberJoin: Node 3d5c7ea3-3028-e24f-4d98-faacbfc47180.dc1 127.0.0.1
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.253609 [INFO] serf: EventMemberJoin: Node 3525d8d7-fb26-168f-d0d9-09074b6b564b 127.0.0.1
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.254644 [INFO] consul: Adding LAN server Node 3525d8d7-fb26-168f-d0d9-09074b6b564b (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.254938 [INFO] consul: Handled member-join event for server "Node 3525d8d7-fb26-168f-d0d9-09074b6b564b.dc1" in area "wan"
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.256119 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.256589 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestRegisterMonitor_good - 2019/11/27 02:23:37.261642 [INFO] serf: EventMemberJoin: Node 3d5c7ea3-3028-e24f-4d98-faacbfc47180 127.0.0.1
TestRegisterMonitor_good - 2019/11/27 02:23:37.262969 [INFO] consul: Handled member-join event for server "Node 3d5c7ea3-3028-e24f-4d98-faacbfc47180.dc1" in area "wan"
TestRegisterMonitor_good - 2019/11/27 02:23:37.263271 [INFO] consul: Adding LAN server Node 3d5c7ea3-3028-e24f-4d98-faacbfc47180 (Addr: tcp/127.0.0.1:14512) (DC: dc1)
TestRegisterMonitor_good - 2019/11/27 02:23:37.263291 [INFO] agent: Started DNS server 127.0.0.1:14507 (udp)
TestRegisterMonitor_good - 2019/11/27 02:23:37.263701 [INFO] agent: Started DNS server 127.0.0.1:14507 (tcp)
TestRegisterMonitor_good - 2019/11/27 02:23:37.266461 [INFO] agent: Started HTTP server on 127.0.0.1:14508 (tcp)
TestRegisterMonitor_good - 2019/11/27 02:23:37.266591 [INFO] agent: started state syncer
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.269509 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.269634 [INFO] agent: started state syncer
2019/11/27 02:23:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:37 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/11/27 02:23:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:37 [INFO]  raft: Node at 127.0.0.1:14512 [Candidate] entering Candidate state in term 2
2019/11/27 02:23:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:054ec3f9-6818-b823-0386-24f5367110ee Address:127.0.0.1:14518}]
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.338543 [INFO] serf: EventMemberJoin: Node 054ec3f9-6818-b823-0386-24f5367110ee.dc1 127.0.0.1
2019/11/27 02:23:37 [INFO]  raft: Node at 127.0.0.1:14518 [Follower] entering Follower state (Leader: "")
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.385223 [INFO] serf: EventMemberJoin: Node 054ec3f9-6818-b823-0386-24f5367110ee 127.0.0.1
2019/11/27 02:23:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:37 [INFO]  raft: Node at 127.0.0.1:14518 [Candidate] entering Candidate state in term 2
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.400716 [INFO] consul: Adding LAN server Node 054ec3f9-6818-b823-0386-24f5367110ee (Addr: tcp/127.0.0.1:14518) (DC: dc1)
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.403351 [INFO] consul: Handled member-join event for server "Node 054ec3f9-6818-b823-0386-24f5367110ee.dc1" in area "wan"
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.417700 [INFO] agent: Started DNS server 127.0.0.1:14513 (udp)
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.418276 [INFO] agent: Started DNS server 127.0.0.1:14513 (tcp)
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.423159 [INFO] agent: Started HTTP server on 127.0.0.1:14514 (tcp)
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.423574 [INFO] agent: started state syncer
2019/11/27 02:23:37 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:37 [INFO]  raft: Node at 127.0.0.1:14512 [Leader] entering Leader state
2019/11/27 02:23:37 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:37 [INFO]  raft: Node at 127.0.0.1:14518 [Leader] entering Leader state
2019/11/27 02:23:37 [INFO]  raft: Election won. Tally: 1
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.858217 [INFO] consul: cluster leadership acquired
2019/11/27 02:23:37 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.858485 [INFO] consul: cluster leadership acquired
TestRegisterMonitor_good - 2019/11/27 02:23:37.858715 [INFO] consul: cluster leadership acquired
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:37.859502 [INFO] consul: New leader elected: Node 054ec3f9-6818-b823-0386-24f5367110ee
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:37.859625 [INFO] consul: New leader elected: Node 3525d8d7-fb26-168f-d0d9-09074b6b564b
TestRegisterMonitor_good - 2019/11/27 02:23:37.859926 [INFO] consul: New leader elected: Node 3d5c7ea3-3028-e24f-4d98-faacbfc47180
TestRegisterMonitor_good - 2019/11/27 02:23:37.981425 [DEBUG] http: Request GET /v1/catalog/service/foo-proxy?stale= (3.758804ms) from=127.0.0.1:58370
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.129273 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.129563 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.129781 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:38.223475 [INFO] agent: Synced node info
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:38.223588 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.223923 [INFO] agent: Synced service "no-sidecar"
TestRegisterMonitor_good - 2019/11/27 02:23:38.226636 [INFO] agent: Synced node info
TestRegisterMonitor_good - 2019/11/27 02:23:38.243936 [DEBUG] http: Request GET /v1/agent/services (266.256092ms) from=127.0.0.1:58368
TestRegisterMonitor_good - 2019/11/27 02:23:38.282187 [DEBUG] http: Request GET /v1/agent/services (1.824067ms) from=127.0.0.1:58368
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.322684 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.391997 [INFO] manager: shutting down
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.392186 [WARN] agent: Syncing service "one-sidecar" failed. raft is already shutdown
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.392258 [ERR] agent: failed to sync remote state: raft is already shutdown
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.467393 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.467483 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.467544 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.467695 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.467860 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.468077 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.468152 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-service_flag_with_upstreams
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.472300 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCommandConfigWatcher/-service_flag_only - 2019/11/27 02:23:38.472820 [ERR] consul: failed to establish leadership: raft is already shutdown
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:38.558713 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:38.590676 [WARN] agent: Node name "Node cadd0b2a-2353-6385-8e12-5aac22eb4c07" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:38.591430 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:38.596304 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:38.596745 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:38.596984 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_good - 2019/11/27 02:23:38.728007 [INFO] agent: Synced service "foo-proxy"
TestRegisterMonitor_good - 2019/11/27 02:23:38.732924 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_good - 2019/11/27 02:23:38.733223 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/11/27 02:23:38.733448 [DEBUG] http: Request PUT /v1/agent/service/register (746.182016ms) from=127.0.0.1:58370
2019/11/27 02:23:38 [INFO] proxy: registered Consul service: foo-proxy
2019/11/27 02:23:38 [INFO] proxy: stop request received, deregistering
TestRegisterMonitor_good - 2019/11/27 02:23:38.738794 [DEBUG] agent: removed check "foo-proxy-ttl"
TestRegisterMonitor_good - 2019/11/27 02:23:38.739167 [DEBUG] agent: removed service "foo-proxy"
TestRegisterMonitor_good - 2019/11/27 02:23:39.267052 [INFO] agent: Deregistered service "foo-proxy"
TestRegisterMonitor_good - 2019/11/27 02:23:39.611944 [INFO] agent: Deregistered check "foo-proxy-ttl"
TestRegisterMonitor_good - 2019/11/27 02:23:39.612031 [DEBUG] agent: Node info in sync
TestRegisterMonitor_good - 2019/11/27 02:23:39.612123 [DEBUG] http: Request PUT /v1/agent/service/deregister/foo-proxy (876.421118ms) from=127.0.0.1:58370
TestRegisterMonitor_good - 2019/11/27 02:23:39.614663 [DEBUG] http: Request GET /v1/agent/services (599.022µs) from=127.0.0.1:58370
TestRegisterMonitor_good - 2019/11/27 02:23:39.616262 [INFO] agent: Requesting shutdown
TestRegisterMonitor_good - 2019/11/27 02:23:39.616367 [INFO] consul: shutting down server
TestRegisterMonitor_good - 2019/11/27 02:23:39.616418 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_good - 2019/11/27 02:23:39.700364 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:39.701340 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:39.701857 [DEBUG] consul: Skipping self join check for "Node 054ec3f9-6818-b823-0386-24f5367110ee" since the cluster is too small
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:39.702068 [INFO] consul: member 'Node 054ec3f9-6818-b823-0386-24f5367110ee' joined, marking health alive
2019/11/27 02:23:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cadd0b2a-2353-6385-8e12-5aac22eb4c07 Address:127.0.0.1:14524}]
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:39.705810 [INFO] serf: EventMemberJoin: Node cadd0b2a-2353-6385-8e12-5aac22eb4c07.dc1 127.0.0.1
2019/11/27 02:23:39 [INFO]  raft: Node at 127.0.0.1:14524 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:39.713836 [INFO] serf: EventMemberJoin: Node cadd0b2a-2353-6385-8e12-5aac22eb4c07 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:39.715083 [INFO] consul: Adding LAN server Node cadd0b2a-2353-6385-8e12-5aac22eb4c07 (Addr: tcp/127.0.0.1:14524) (DC: dc1)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:39.715749 [INFO] consul: Handled member-join event for server "Node cadd0b2a-2353-6385-8e12-5aac22eb4c07.dc1" in area "wan"
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:39.724829 [INFO] agent: Started DNS server 127.0.0.1:14519 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:39.725588 [INFO] agent: Started DNS server 127.0.0.1:14519 (udp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:39.728566 [INFO] agent: Started HTTP server on 127.0.0.1:14520 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:39.729180 [INFO] agent: started state syncer
2019/11/27 02:23:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:39 [INFO]  raft: Node at 127.0.0.1:14524 [Candidate] entering Candidate state in term 2
TestRegisterMonitor_good - 2019/11/27 02:23:39.822596 [INFO] manager: shutting down
TestRegisterMonitor_good - 2019/11/27 02:23:39.823645 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestRegisterMonitor_good - 2019/11/27 02:23:39.823987 [INFO] agent: consul server down
TestRegisterMonitor_good - 2019/11/27 02:23:39.824103 [INFO] agent: shutdown complete
TestRegisterMonitor_good - 2019/11/27 02:23:39.824158 [INFO] agent: Stopping DNS server 127.0.0.1:14507 (tcp)
TestRegisterMonitor_good - 2019/11/27 02:23:39.824339 [INFO] agent: Stopping DNS server 127.0.0.1:14507 (udp)
TestRegisterMonitor_good - 2019/11/27 02:23:39.824505 [INFO] agent: Stopping HTTP server 127.0.0.1:14508 (tcp)
TestRegisterMonitor_good - 2019/11/27 02:23:39.825257 [INFO] agent: Waiting for endpoints to shut down
TestRegisterMonitor_good - 2019/11/27 02:23:39.825350 [INFO] agent: Endpoints down
--- PASS: TestRegisterMonitor_good (3.66s)
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:39.970066 [DEBUG] http: Request GET /v1/catalog/service/foo-proxy?stale= (1.432719ms) from=127.0.0.1:41370
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:39.970564 [DEBUG] http: Request GET /v1/agent/services (2.046075ms) from=127.0.0.1:41372
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:39.974527 [WARN] agent: Check "service:two-sidecars-sidecar-proxy:1" socket connection failed: dial tcp 127.0.0.1:21000: connect: connection refused
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:39.999170 [DEBUG] http: Request GET /v1/agent/services (696.359µs) from=127.0.0.1:41372
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.002974 [DEBUG] http: Request GET /v1/agent/checks (681.358µs) from=127.0.0.1:41372
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.202861 [DEBUG] agent: Check "foo-proxy-ttl" status is now critical
2019/11/27 02:23:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:40 [INFO]  raft: Node at 127.0.0.1:14524 [Leader] entering Leader state
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.456331 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.456870 [INFO] consul: New leader elected: Node cadd0b2a-2353-6385-8e12-5aac22eb4c07
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.459036 [INFO] agent: Synced service "foo-proxy"
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.459135 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.459174 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.459291 [DEBUG] http: Request PUT /v1/agent/service/register (485.648793ms) from=127.0.0.1:41370
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.459910 [DEBUG] agent: Service "foo-proxy" in sync
2019/11/27 02:23:40 [INFO] proxy: registered Consul service: foo-proxy
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.530944 [DEBUG] agent: Check "foo-proxy-ttl" status is now passing
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.568789 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.568937 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.586923 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.689217 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.778191 [INFO] manager: shutting down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.855982 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.856186 [WARN] agent: Syncing service "two-sidecars-sidecar-proxy" failed. leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.856251 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.856271 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.856412 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.856479 [INFO] agent: Stopping DNS server 127.0.0.1:14519 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.856650 [INFO] agent: Stopping DNS server 127.0.0.1:14519 (udp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.856901 [INFO] agent: Stopping HTTP server 127.0.0.1:14520 (tcp)
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.857144 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service_flag_with_upstreams - 2019/11/27 02:23:40.857220 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-service_flag_with_-service-addr
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858145 [INFO] agent: Synced check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858210 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858331 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858389 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858425 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858516 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858570 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858605 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858685 [DEBUG] http: Request PUT /v1/agent/check/fail/foo-proxy-ttl?note= (851.483863ms) from=127.0.0.1:41372
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:40.858922 [DEBUG] agent: Service "foo-proxy" in sync
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:40.996282 [WARN] agent: Node name "Node 659a24c0-59b1-c90e-c20c-d0d389b365e2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:40.996940 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:40.997033 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:40.997215 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:40.997344 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.024423 [INFO] agent: Synced check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.024527 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.025705 [DEBUG] http: Request GET /v1/agent/checks (162.731628ms) from=127.0.0.1:41372
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.026747 [DEBUG] agent: Service "foo-proxy" in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.026857 [DEBUG] agent: Check "foo-proxy-ttl" in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.026901 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.026973 [DEBUG] http: Request PUT /v1/agent/check/pass/foo-proxy-ttl?note= (496.057507ms) from=127.0.0.1:41370
2019/11/27 02:23:41 [INFO] proxy: stop request received, deregistering
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.030223 [DEBUG] agent: removed check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.030302 [DEBUG] agent: removed service "foo-proxy"
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.247570 [INFO] agent: Deregistered service "foo-proxy"
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.415467 [INFO] agent: Deregistered check "foo-proxy-ttl"
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.415550 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.415889 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.415982 [DEBUG] http: Request PUT /v1/agent/service/deregister/foo-proxy (387.591198ms) from=127.0.0.1:41370
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.416078 [DEBUG] agent: Node info in sync
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.416571 [INFO] agent: Requesting shutdown
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.416648 [INFO] consul: shutting down server
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.416745 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.489103 [WARN] serf: Shutdown without a Leave
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.566979 [INFO] manager: shutting down
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.567654 [INFO] agent: consul server down
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.567716 [INFO] agent: shutdown complete
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.567775 [INFO] agent: Stopping DNS server 127.0.0.1:14513 (tcp)
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.567907 [INFO] agent: Stopping DNS server 127.0.0.1:14513 (udp)
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.568075 [INFO] agent: Stopping HTTP server 127.0.0.1:14514 (tcp)
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.568716 [INFO] agent: Waiting for endpoints to shut down
TestRegisterMonitor_heartbeat - 2019/11/27 02:23:41.568806 [INFO] agent: Endpoints down
--- PASS: TestRegisterMonitor_heartbeat (5.40s)
2019/11/27 02:23:41 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:659a24c0-59b1-c90e-c20c-d0d389b365e2 Address:127.0.0.1:14530}]
2019/11/27 02:23:41 [INFO]  raft: Node at 127.0.0.1:14530 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:41.870386 [INFO] serf: EventMemberJoin: Node 659a24c0-59b1-c90e-c20c-d0d389b365e2.dc1 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:41.874162 [INFO] serf: EventMemberJoin: Node 659a24c0-59b1-c90e-c20c-d0d389b365e2 127.0.0.1
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:41.875390 [INFO] consul: Adding LAN server Node 659a24c0-59b1-c90e-c20c-d0d389b365e2 (Addr: tcp/127.0.0.1:14530) (DC: dc1)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:41.875880 [INFO] consul: Handled member-join event for server "Node 659a24c0-59b1-c90e-c20c-d0d389b365e2.dc1" in area "wan"
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:41.881234 [INFO] agent: Started DNS server 127.0.0.1:14525 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:41.882258 [INFO] agent: Started DNS server 127.0.0.1:14525 (udp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:41.884367 [INFO] agent: Started HTTP server on 127.0.0.1:14526 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:41.884473 [INFO] agent: started state syncer
2019/11/27 02:23:41 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:41 [INFO]  raft: Node at 127.0.0.1:14530 [Candidate] entering Candidate state in term 2
2019/11/27 02:23:42 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:42 [INFO]  raft: Node at 127.0.0.1:14530 [Leader] entering Leader state
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:42.357466 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:42.357910 [INFO] consul: New leader elected: Node 659a24c0-59b1-c90e-c20c-d0d389b365e2
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:42.498915 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:42.499088 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:42.499141 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:42.655958 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.133527 [INFO] manager: shutting down
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.511504 [INFO] agent: consul server down
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.511575 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.511633 [INFO] agent: Stopping DNS server 127.0.0.1:14525 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.511842 [INFO] agent: Stopping DNS server 127.0.0.1:14525 (udp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.512010 [INFO] agent: Stopping HTTP server 127.0.0.1:14526 (tcp)
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.512210 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.512281 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-service,_-service-addr,_-listen
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.512939 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.513113 [WARN] agent: Syncing service "no-sidecar" failed. leadership lost while committing log
TestCommandConfigWatcher/-service_flag_with_-service-addr - 2019/11/27 02:23:43.513179 [ERR] agent: failed to sync remote state: leadership lost while committing log
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:43.661665 [WARN] agent: Node name "Node 2adb46d1-b9e9-46e6-f769-429a3a257ac7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:43.664367 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:43.664440 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:43.664608 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:43.664706 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:23:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2adb46d1-b9e9-46e6-f769-429a3a257ac7 Address:127.0.0.1:14536}]
2019/11/27 02:23:45 [INFO]  raft: Node at 127.0.0.1:14536 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:45.171298 [INFO] serf: EventMemberJoin: Node 2adb46d1-b9e9-46e6-f769-429a3a257ac7.dc1 127.0.0.1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:45.204601 [INFO] serf: EventMemberJoin: Node 2adb46d1-b9e9-46e6-f769-429a3a257ac7 127.0.0.1
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:45.207627 [INFO] consul: Adding LAN server Node 2adb46d1-b9e9-46e6-f769-429a3a257ac7 (Addr: tcp/127.0.0.1:14536) (DC: dc1)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:45.208189 [INFO] consul: Handled member-join event for server "Node 2adb46d1-b9e9-46e6-f769-429a3a257ac7.dc1" in area "wan"
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:45.209046 [INFO] agent: Started DNS server 127.0.0.1:14531 (udp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:45.210826 [INFO] agent: Started DNS server 127.0.0.1:14531 (tcp)
2019/11/27 02:23:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:45 [INFO]  raft: Node at 127.0.0.1:14536 [Candidate] entering Candidate state in term 2
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:45.228679 [INFO] agent: Started HTTP server on 127.0.0.1:14532 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:45.228806 [INFO] agent: started state syncer
2019/11/27 02:23:45 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:45 [INFO]  raft: Node at 127.0.0.1:14536 [Leader] entering Leader state
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:45.999861 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:46.000282 [INFO] consul: New leader elected: Node 2adb46d1-b9e9-46e6-f769-429a3a257ac7
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:46.082570 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:46.082796 [INFO] consul: shutting down server
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:46.082862 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:46.083051 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:46.513531 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:46.822361 [INFO] manager: shutting down
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:47.044433 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:47.044701 [INFO] agent: consul server down
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:47.044764 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:47.044820 [INFO] agent: Stopping DNS server 127.0.0.1:14531 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:47.044979 [INFO] agent: Stopping DNS server 127.0.0.1:14531 (udp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:47.045176 [INFO] agent: Stopping HTTP server 127.0.0.1:14532 (tcp)
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:47.045443 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-service,_-service-addr,_-listen - 2019/11/27 02:23:47.045524 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_no_sidecar
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:47.116397 [WARN] agent: Node name "Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:47.116846 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:47.116917 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:47.117067 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:47.117164 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:23:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2b6e362e-51fd-35d4-828e-a43cbae1ecee Address:127.0.0.1:14542}]
2019/11/27 02:23:50 [INFO]  raft: Node at 127.0.0.1:14542 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:50.395420 [INFO] serf: EventMemberJoin: Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:50.400750 [INFO] serf: EventMemberJoin: Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:50.403692 [INFO] agent: Started DNS server 127.0.0.1:14537 (udp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:50.404260 [INFO] consul: Adding LAN server Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee (Addr: tcp/127.0.0.1:14542) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:50.404542 [INFO] consul: Handled member-join event for server "Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:50.406196 [INFO] agent: Started DNS server 127.0.0.1:14537 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:50.408837 [INFO] agent: Started HTTP server on 127.0.0.1:14538 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:50.408962 [INFO] agent: started state syncer
2019/11/27 02:23:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:50 [INFO]  raft: Node at 127.0.0.1:14542 [Candidate] entering Candidate state in term 2
2019/11/27 02:23:52 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:52 [INFO]  raft: Node at 127.0.0.1:14542 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:52.066779 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:52.067220 [INFO] consul: New leader elected: Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:52.368288 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:53.300222 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:53.593246 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:53.734011 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.024487 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.178178 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.890665 [DEBUG] consul: Skipping self join check for "Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee" since the cluster is too small
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.890886 [INFO] consul: member 'Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee' joined, marking health alive
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.894686 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.894850 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.894901 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.894992 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.895073 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.895137 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.896141 [DEBUG] http: Request GET /v1/agent/services (2.732978553s) from=127.0.0.1:49838
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.899143 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.899275 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.899332 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.902476 [WARN] consul: error getting server health from "Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee": rpc error making call: EOF
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:54.988257 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:55.066082 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:55.068355 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:55.068419 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:55.068474 [INFO] agent: Stopping DNS server 127.0.0.1:14537 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:55.068610 [INFO] agent: Stopping DNS server 127.0.0.1:14537 (udp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:55.068760 [INFO] agent: Stopping HTTP server 127.0.0.1:14538 (tcp)
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:55.069197 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:55.069283 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:55.156824 [WARN] agent: Node name "Node f136797b-468e-92fb-824f-683dcd505226" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:55.157413 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:55.157487 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:55.157716 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:55.157826 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-sidecar-for,_no_sidecar - 2019/11/27 02:23:55.889999 [WARN] consul: error getting server health from "Node 2b6e362e-51fd-35d4-828e-a43cbae1ecee": context deadline exceeded
2019/11/27 02:23:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f136797b-468e-92fb-824f-683dcd505226 Address:127.0.0.1:14548}]
2019/11/27 02:23:56 [INFO]  raft: Node at 127.0.0.1:14548 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:56.439611 [INFO] serf: EventMemberJoin: Node f136797b-468e-92fb-824f-683dcd505226.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:56.442927 [INFO] serf: EventMemberJoin: Node f136797b-468e-92fb-824f-683dcd505226 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:56.443962 [INFO] consul: Handled member-join event for server "Node f136797b-468e-92fb-824f-683dcd505226.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:56.444246 [INFO] consul: Adding LAN server Node f136797b-468e-92fb-824f-683dcd505226 (Addr: tcp/127.0.0.1:14548) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:56.445131 [INFO] agent: Started DNS server 127.0.0.1:14543 (udp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:56.445481 [INFO] agent: Started DNS server 127.0.0.1:14543 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:56.447449 [INFO] agent: Started HTTP server on 127.0.0.1:14544 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:56.447534 [INFO] agent: started state syncer
2019/11/27 02:23:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:56 [INFO]  raft: Node at 127.0.0.1:14548 [Candidate] entering Candidate state in term 2
2019/11/27 02:23:57 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:14548 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:57.099772 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:57.100132 [INFO] consul: New leader elected: Node f136797b-468e-92fb-824f-683dcd505226
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:57.445250 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:57.848933 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:58.486810 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:59.505545 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:59.825728 [WARN] agent: Check "service:one-sidecar-sidecar-proxy:1" socket connection failed: dial tcp 127.0.0.1:9999: connect: connection refused
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:59.978670 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:59.979085 [DEBUG] consul: Skipping self join check for "Node f136797b-468e-92fb-824f-683dcd505226" since the cluster is too small
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:59.979270 [INFO] consul: member 'Node f136797b-468e-92fb-824f-683dcd505226' joined, marking health alive
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:23:59.982187 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.646433 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.646542 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.646599 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.646645 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.646753 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.646794 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.646954 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647009 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647051 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647100 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647136 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647171 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647219 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647268 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647312 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647362 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.647394 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.653048 [DEBUG] http: Request GET /v1/agent/services (3.247014223s) from=127.0.0.1:51526
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.657751 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.657957 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.658020 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.811581 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.972620 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.973422 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.973535 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.973632 [INFO] agent: Stopping DNS server 127.0.0.1:14543 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.973880 [INFO] agent: Stopping DNS server 127.0.0.1:14543 (udp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.974124 [INFO] agent: Stopping HTTP server 127.0.0.1:14544 (tcp)
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.974712 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars - 2019/11/27 02:24:00.974820 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_non-existent
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:01.107279 [WARN] agent: Node name "Node 9880312e-a9b0-c843-c094-82ce5ae31c4c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:01.107706 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:01.107786 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:01.107975 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:01.108098 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:24:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:9880312e-a9b0-c843-c094-82ce5ae31c4c Address:127.0.0.1:14554}]
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:14554 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:02.524887 [INFO] serf: EventMemberJoin: Node 9880312e-a9b0-c843-c094-82ce5ae31c4c.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:02.532704 [INFO] serf: EventMemberJoin: Node 9880312e-a9b0-c843-c094-82ce5ae31c4c 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:02.533928 [INFO] consul: Adding LAN server Node 9880312e-a9b0-c843-c094-82ce5ae31c4c (Addr: tcp/127.0.0.1:14554) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:02.534179 [INFO] consul: Handled member-join event for server "Node 9880312e-a9b0-c843-c094-82ce5ae31c4c.dc1" in area "wan"
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:02.537319 [INFO] agent: Started DNS server 127.0.0.1:14549 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:02.539304 [INFO] agent: Started DNS server 127.0.0.1:14549 (udp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:02.546816 [INFO] agent: Started HTTP server on 127.0.0.1:14550 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:02.546916 [INFO] agent: started state syncer
2019/11/27 02:24:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:14554 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:03 [INFO]  raft: Node at 127.0.0.1:14554 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.114416 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.114979 [INFO] consul: New leader elected: Node 9880312e-a9b0-c843-c094-82ce5ae31c4c
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.206027 [DEBUG] http: Request GET /v1/agent/services (1.774065ms) from=127.0.0.1:40066
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.210115 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.210322 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.210395 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.211195 [ERR] agent: failed to sync remote state: No cluster leader
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.354564 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.436626 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.612191 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.612470 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.612531 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.612565 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.612590 [INFO] agent: Stopping DNS server 127.0.0.1:14549 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.612805 [INFO] agent: Stopping DNS server 127.0.0.1:14549 (udp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.612987 [INFO] agent: Stopping HTTP server 127.0.0.1:14550 (tcp)
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.613527 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_non-existent - 2019/11/27 02:24:03.613649 [INFO] agent: Endpoints down
=== RUN   TestCommandConfigWatcher/-sidecar-for,_one_sidecar
WARNING: bootstrap = true: do not enable unless necessary
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:03.710911 [WARN] agent: Node name "Node 975334ec-490a-8c60-12a3-3634e5c68ee8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:03.711347 [DEBUG] tlsutil: Update with version 1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:03.711430 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:03.711630 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:03.711796 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:24:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:975334ec-490a-8c60-12a3-3634e5c68ee8 Address:127.0.0.1:14560}]
2019/11/27 02:24:05 [INFO]  raft: Node at 127.0.0.1:14560 [Follower] entering Follower state (Leader: "")
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:05.728462 [INFO] serf: EventMemberJoin: Node 975334ec-490a-8c60-12a3-3634e5c68ee8.dc1 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:05.762122 [INFO] serf: EventMemberJoin: Node 975334ec-490a-8c60-12a3-3634e5c68ee8 127.0.0.1
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:05.764187 [INFO] consul: Adding LAN server Node 975334ec-490a-8c60-12a3-3634e5c68ee8 (Addr: tcp/127.0.0.1:14560) (DC: dc1)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:05.772428 [INFO] consul: Handled member-join event for server "Node 975334ec-490a-8c60-12a3-3634e5c68ee8.dc1" in area "wan"
2019/11/27 02:24:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:05 [INFO]  raft: Node at 127.0.0.1:14560 [Candidate] entering Candidate state in term 2
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:05.781335 [INFO] agent: Started DNS server 127.0.0.1:14555 (udp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:05.782425 [INFO] agent: Started DNS server 127.0.0.1:14555 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:05.786123 [INFO] agent: Started HTTP server on 127.0.0.1:14556 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:05.786211 [INFO] agent: started state syncer
2019/11/27 02:24:06 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:06 [INFO]  raft: Node at 127.0.0.1:14560 [Leader] entering Leader state
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:06.978845 [INFO] consul: cluster leadership acquired
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:06.979775 [INFO] consul: New leader elected: Node 975334ec-490a-8c60-12a3-3634e5c68ee8
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:07.645045 [INFO] agent: Synced service "no-sidecar"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:08.159760 [INFO] agent: Synced service "one-sidecar"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:09.158758 [INFO] agent: Synced service "one-sidecar-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:10.357252 [INFO] agent: Synced service "two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.023717 [INFO] agent: Synced service "two-sidecars-sidecar-proxy"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.277780 [INFO] agent: Synced service "other-sidecar-for-two-sidecars"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.277894 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.277954 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278050 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278109 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278141 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278283 [DEBUG] agent: Service "two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278350 [DEBUG] agent: Service "two-sidecars-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278392 [DEBUG] agent: Service "other-sidecar-for-two-sidecars" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278443 [DEBUG] agent: Service "no-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278478 [DEBUG] agent: Service "one-sidecar" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278522 [DEBUG] agent: Service "one-sidecar-sidecar-proxy" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278570 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278620 [DEBUG] agent: Check "service:one-sidecar-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278663 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:1" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278714 [DEBUG] agent: Check "service:two-sidecars-sidecar-proxy:2" in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.278745 [DEBUG] agent: Node info in sync
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.281271 [DEBUG] http: Request GET /v1/agent/services (4.019884847s) from=127.0.0.1:50304
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.286861 [DEBUG] http: Request GET /v1/agent/service/one-sidecar-sidecar-proxy (1.858735ms) from=127.0.0.1:50304
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.290356 [DEBUG] http: Request GET /v1/agent/service/one-sidecar-sidecar-proxy (3.868474ms) from=127.0.0.1:50320
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.299260 [INFO] agent: Requesting shutdown
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.299426 [INFO] consul: shutting down server
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.299484 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.476235 [WARN] serf: Shutdown without a Leave
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.554110 [INFO] manager: shutting down
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.555742 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.556170 [ERR] consul: failed to get raft configuration: raft is already shutdown
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.556456 [ERR] consul: failed to reconcile member: {Node 975334ec-490a-8c60-12a3-3634e5c68ee8 127.0.0.1 14558 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:975334ec-490a-8c60-12a3-3634e5c68ee8 port:14560 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:14559] alive 1 5 2 2 5 4}: raft is already shutdown
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.556589 [INFO] agent: consul server down
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.556644 [INFO] agent: shutdown complete
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.556763 [INFO] agent: Stopping DNS server 127.0.0.1:14555 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.556951 [INFO] agent: Stopping DNS server 127.0.0.1:14555 (udp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:11.557123 [INFO] agent: Stopping HTTP server 127.0.0.1:14556 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:12.557650 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:14556 (tcp)
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:12.557747 [INFO] agent: Waiting for endpoints to shut down
TestCommandConfigWatcher/-sidecar-for,_one_sidecar - 2019/11/27 02:24:12.557788 [INFO] agent: Endpoints down
--- PASS: TestCommandConfigWatcher (36.40s)
    --- PASS: TestCommandConfigWatcher/-service_flag_only (2.31s)
    --- PASS: TestCommandConfigWatcher/-service_flag_with_upstreams (2.39s)
    --- PASS: TestCommandConfigWatcher/-service_flag_with_-service-addr (2.65s)
    --- PASS: TestCommandConfigWatcher/-service,_-service-addr,_-listen (3.53s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_no_sidecar (8.02s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_multiple_sidecars (5.91s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_non-existent (2.64s)
    --- PASS: TestCommandConfigWatcher/-sidecar-for,_one_sidecar (8.94s)
PASS
ok  	github.com/hashicorp/consul/command/connect/proxy	36.659s
=== RUN   TestDebugCommand_noTabs
=== PAUSE TestDebugCommand_noTabs
=== RUN   TestDebugCommand
--- SKIP: TestDebugCommand (0.00s)
    debug_test.go:29: DM-skipped
=== RUN   TestDebugCommand_Archive
=== PAUSE TestDebugCommand_Archive
=== RUN   TestDebugCommand_ArgsBad
=== PAUSE TestDebugCommand_ArgsBad
=== RUN   TestDebugCommand_OutputPathBad
=== PAUSE TestDebugCommand_OutputPathBad
=== RUN   TestDebugCommand_OutputPathExists
=== PAUSE TestDebugCommand_OutputPathExists
=== RUN   TestDebugCommand_CaptureTargets
=== PAUSE TestDebugCommand_CaptureTargets
=== RUN   TestDebugCommand_ProfilesExist
=== PAUSE TestDebugCommand_ProfilesExist
=== RUN   TestDebugCommand_ValidateTiming
=== PAUSE TestDebugCommand_ValidateTiming
=== RUN   TestDebugCommand_DebugDisabled
=== PAUSE TestDebugCommand_DebugDisabled
=== CONT  TestDebugCommand_noTabs
=== CONT  TestDebugCommand_DebugDisabled
--- PASS: TestDebugCommand_noTabs (0.00s)
=== CONT  TestDebugCommand_OutputPathExists
=== CONT  TestDebugCommand_OutputPathBad
=== CONT  TestDebugCommand_CaptureTargets
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:56.556151 [WARN] agent: Node name "Node 1e3d6e38-620c-290d-c3f0-0f79493685a1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:56.556960 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:56.557180 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:56.557398 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:56.557497 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:56.584347 [WARN] agent: Node name "Node 5e62198d-767b-5a8b-1bd2-180a7b574e0a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:56.584784 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:56.584854 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:56.585020 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:56.585144 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:56.590590 [WARN] agent: Node name "Node c5476181-adf7-0f19-d211-e53cf982730b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:56.590996 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:56.591071 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:56.591797 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:56.591926 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:56.601836 [WARN] agent: Node name "Node 3a3cd9b7-cc68-ddfb-34a8-8cbd5f1cc0ff" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:56.602415 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:56.602559 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:56.602811 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:56.602992 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:23:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3a3cd9b7-cc68-ddfb-34a8-8cbd5f1cc0ff Address:127.0.0.1:16006}]
2019/11/27 02:23:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c5476181-adf7-0f19-d211-e53cf982730b Address:127.0.0.1:16018}]
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:16006 [Follower] entering Follower state (Leader: "")
2019/11/27 02:23:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1e3d6e38-620c-290d-c3f0-0f79493685a1 Address:127.0.0.1:16024}]
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:16018 [Follower] entering Follower state (Leader: "")
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:16024 [Follower] entering Follower state (Leader: "")
2019/11/27 02:23:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5e62198d-767b-5a8b-1bd2-180a7b574e0a Address:127.0.0.1:16012}]
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:16012 [Follower] entering Follower state (Leader: "")
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:57.685791 [INFO] serf: EventMemberJoin: Node 3a3cd9b7-cc68-ddfb-34a8-8cbd5f1cc0ff.dc1 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:57.687242 [INFO] serf: EventMemberJoin: Node 1e3d6e38-620c-290d-c3f0-0f79493685a1.dc1 127.0.0.1
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:57.696152 [INFO] serf: EventMemberJoin: Node 5e62198d-767b-5a8b-1bd2-180a7b574e0a.dc1 127.0.0.1
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:57.698705 [INFO] serf: EventMemberJoin: Node c5476181-adf7-0f19-d211-e53cf982730b.dc1 127.0.0.1
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:57.702361 [INFO] serf: EventMemberJoin: Node 5e62198d-767b-5a8b-1bd2-180a7b574e0a 127.0.0.1
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:57.704849 [INFO] consul: Adding LAN server Node 5e62198d-767b-5a8b-1bd2-180a7b574e0a (Addr: tcp/127.0.0.1:16012) (DC: dc1)
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:57.705223 [INFO] consul: Handled member-join event for server "Node 5e62198d-767b-5a8b-1bd2-180a7b574e0a.dc1" in area "wan"
2019/11/27 02:23:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:16024 [Candidate] entering Candidate state in term 2
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:57.728793 [INFO] serf: EventMemberJoin: Node c5476181-adf7-0f19-d211-e53cf982730b 127.0.0.1
2019/11/27 02:23:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:16012 [Candidate] entering Candidate state in term 2
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:57.730952 [INFO] agent: Started DNS server 127.0.0.1:16007 (udp)
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:57.731168 [INFO] agent: Started DNS server 127.0.0.1:16013 (udp)
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:57.731284 [INFO] agent: Started DNS server 127.0.0.1:16007 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:57.733415 [INFO] serf: EventMemberJoin: Node 1e3d6e38-620c-290d-c3f0-0f79493685a1 127.0.0.1
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:57.733733 [INFO] agent: Started HTTP server on 127.0.0.1:16008 (tcp)
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:57.734033 [INFO] agent: started state syncer
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:57.728793 [INFO] serf: EventMemberJoin: Node 3a3cd9b7-cc68-ddfb-34a8-8cbd5f1cc0ff 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:57.735042 [INFO] agent: Started DNS server 127.0.0.1:16019 (udp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:57.735727 [INFO] consul: Adding LAN server Node 1e3d6e38-620c-290d-c3f0-0f79493685a1 (Addr: tcp/127.0.0.1:16024) (DC: dc1)
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:57.735937 [INFO] consul: Handled member-join event for server "Node 1e3d6e38-620c-290d-c3f0-0f79493685a1.dc1" in area "wan"
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:57.736161 [INFO] agent: Started DNS server 127.0.0.1:16001 (udp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:57.736356 [INFO] agent: Started DNS server 127.0.0.1:16019 (tcp)
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:57.736791 [INFO] consul: Adding LAN server Node 3a3cd9b7-cc68-ddfb-34a8-8cbd5f1cc0ff (Addr: tcp/127.0.0.1:16006) (DC: dc1)
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:57.736992 [INFO] consul: Handled member-join event for server "Node 3a3cd9b7-cc68-ddfb-34a8-8cbd5f1cc0ff.dc1" in area "wan"
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:57.737085 [INFO] consul: Adding LAN server Node c5476181-adf7-0f19-d211-e53cf982730b (Addr: tcp/127.0.0.1:16018) (DC: dc1)
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:57.737259 [INFO] consul: Handled member-join event for server "Node c5476181-adf7-0f19-d211-e53cf982730b.dc1" in area "wan"
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:57.737406 [INFO] agent: Started DNS server 127.0.0.1:16001 (tcp)
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:57.737654 [INFO] agent: Started DNS server 127.0.0.1:16013 (tcp)
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:57.739166 [INFO] agent: Started HTTP server on 127.0.0.1:16002 (tcp)
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:57.739237 [INFO] agent: started state syncer
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:57.739390 [INFO] agent: Started HTTP server on 127.0.0.1:16014 (tcp)
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:57.739453 [INFO] agent: started state syncer
2019/11/27 02:23:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:16006 [Candidate] entering Candidate state in term 2
2019/11/27 02:23:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:16018 [Candidate] entering Candidate state in term 2
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:57.744451 [INFO] agent: Started HTTP server on 127.0.0.1:16020 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:57.744675 [INFO] agent: started state syncer
2019/11/27 02:23:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:58 [INFO]  raft: Node at 127.0.0.1:16012 [Leader] entering Leader state
2019/11/27 02:23:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:58 [INFO]  raft: Node at 127.0.0.1:16024 [Leader] entering Leader state
2019/11/27 02:23:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:58 [INFO]  raft: Node at 127.0.0.1:16006 [Leader] entering Leader state
2019/11/27 02:23:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:58 [INFO]  raft: Node at 127.0.0.1:16018 [Leader] entering Leader state
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:58.690083 [INFO] consul: cluster leadership acquired
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:58.690260 [INFO] consul: cluster leadership acquired
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:58.690606 [INFO] consul: cluster leadership acquired
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:58.690764 [INFO] consul: cluster leadership acquired
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:58.692558 [INFO] consul: New leader elected: Node 1e3d6e38-620c-290d-c3f0-0f79493685a1
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:58.692813 [INFO] consul: New leader elected: Node 3a3cd9b7-cc68-ddfb-34a8-8cbd5f1cc0ff
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:58.693000 [INFO] consul: New leader elected: Node c5476181-adf7-0f19-d211-e53cf982730b
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:58.693178 [INFO] consul: New leader elected: Node 5e62198d-767b-5a8b-1bd2-180a7b574e0a
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:59.499975 [INFO] agent: Synced node info
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:59.500449 [INFO] agent: Synced node info
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:59.500539 [DEBUG] agent: Node info in sync
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:59.501843 [INFO] agent: Synced node info
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:59.508072 [INFO] agent: Synced node info
TestDebugCommand_CaptureTargets - 2019/11/27 02:23:59.705617 [DEBUG] http: Request GET /v1/agent/self (194.55477ms) from=127.0.0.1:36690
TestDebugCommand_DebugDisabled - 2019/11/27 02:23:59.726617 [DEBUG] http: Request GET /v1/agent/self (196.866521ms) from=127.0.0.1:40500
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:59.733517 [DEBUG] http: Request GET /v1/agent/self (203.581099ms) from=127.0.0.1:43200
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:59.757895 [DEBUG] http: Request GET /v1/agent/self (222.667129ms) from=127.0.0.1:52446
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:59.762116 [INFO] agent: Requesting shutdown
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:59.762246 [INFO] consul: shutting down server
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:59.762307 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:59.769784 [INFO] agent: Requesting shutdown
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:59.769900 [INFO] consul: shutting down server
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:59.769956 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathBad - 2019/11/27 02:23:59.866243 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathExists - 2019/11/27 02:23:59.868815 [WARN] serf: Shutdown without a Leave
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.426002 [INFO] manager: shutting down
TestDebugCommand_OutputPathBad - 2019/11/27 02:24:00.432724 [INFO] manager: shutting down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:00.437941 [DEBUG] http: Request GET /v1/agent/self (673.450918ms) from=127.0.0.1:36690
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:00.665448 [WARN] agent: Node name "Node 678ebb70-526d-2c6d-aca2-4734767c7005" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:00.665922 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:00.665998 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:00.666169 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:00.666270 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:00.831125 [DEBUG] http: Request GET /v1/agent/host (1.059495343s) from=127.0.0.1:40500
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.949625 [ERR] agent: failed to sync remote state: No cluster leader
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.967917 [INFO] agent: consul server down
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.967983 [INFO] agent: shutdown complete
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.968044 [INFO] agent: Stopping DNS server 127.0.0.1:16007 (tcp)
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.968181 [INFO] agent: Stopping DNS server 127.0.0.1:16007 (udp)
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.968335 [INFO] agent: Stopping HTTP server 127.0.0.1:16008 (tcp)
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.968813 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.968922 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestDebugCommand_OutputPathExists - 2019/11/27 02:24:00.969107 [INFO] agent: Endpoints down
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:00.974471 [DEBUG] http: Request GET /v1/agent/self (138.698729ms) from=127.0.0.1:40500
TestDebugCommand_OutputPathBad - 2019/11/27 02:24:00.982518 [INFO] agent: consul server down
TestDebugCommand_OutputPathBad - 2019/11/27 02:24:00.982602 [INFO] agent: shutdown complete
TestDebugCommand_OutputPathBad - 2019/11/27 02:24:00.982724 [INFO] agent: Stopping DNS server 127.0.0.1:16013 (tcp)
TestDebugCommand_OutputPathBad - 2019/11/27 02:24:00.982887 [INFO] agent: Stopping DNS server 127.0.0.1:16013 (udp)
TestDebugCommand_OutputPathBad - 2019/11/27 02:24:00.983048 [INFO] agent: Stopping HTTP server 127.0.0.1:16014 (tcp)
TestDebugCommand_OutputPathBad - 2019/11/27 02:24:00.984903 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_OutputPathBad - 2019/11/27 02:24:00.985251 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
=== CONT  TestDebugCommand_ArgsBad
TestDebugCommand_OutputPathBad - 2019/11/27 02:24:00.987041 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_OutputPathExists (4.54s)
--- PASS: TestDebugCommand_ArgsBad (0.02s)
=== CONT  TestDebugCommand_Archive
--- PASS: TestDebugCommand_OutputPathBad (4.57s)
=== CONT  TestDebugCommand_ValidateTiming
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:01.011623 [DEBUG] http: Request GET /v1/agent/members?wan=1 (1.265712ms) from=127.0.0.1:40500
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:01.063060 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:01.063176 [DEBUG] agent: Node info in sync
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:01.090721 [DEBUG] http: Request GET /v1/agent/metrics (5.158855ms) from=127.0.0.1:40508
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:01.135145 [WARN] agent: Node name "Node c3ecb3f5-4ea7-b59a-450b-0eb32006cd5d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:01.135774 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:01.135940 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:01.136222 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:01.136411 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_Archive - 2019/11/27 02:24:01.138107 [WARN] agent: Node name "Node 1279af43-87c8-f5c0-5509-58c3f4a6c8dc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_Archive - 2019/11/27 02:24:01.138550 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_Archive - 2019/11/27 02:24:01.138625 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_Archive - 2019/11/27 02:24:01.138800 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_Archive - 2019/11/27 02:24:01.138901 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:01.203080 [DEBUG] agent: Node info in sync
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:01.203253 [DEBUG] agent: Node info in sync
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:01.955791 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:01.956335 [DEBUG] consul: Skipping self join check for "Node 3a3cd9b7-cc68-ddfb-34a8-8cbd5f1cc0ff" since the cluster is too small
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:01.956557 [INFO] consul: member 'Node 3a3cd9b7-cc68-ddfb-34a8-8cbd5f1cc0ff' joined, marking health alive
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:01.962742 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:01.963157 [DEBUG] consul: Skipping self join check for "Node 1e3d6e38-620c-290d-c3f0-0f79493685a1" since the cluster is too small
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:01.963318 [INFO] consul: member 'Node 1e3d6e38-620c-290d-c3f0-0f79493685a1' joined, marking health alive
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:02.103644 [DEBUG] http: Request GET /v1/agent/metrics (1.325382ms) from=127.0.0.1:40510
2019/11/27 02:24:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:678ebb70-526d-2c6d-aca2-4734767c7005 Address:127.0.0.1:16030}]
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:16030 [Follower] entering Follower state (Leader: "")
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.239713 [INFO] serf: EventMemberJoin: Node 678ebb70-526d-2c6d-aca2-4734767c7005.dc1 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.246256 [INFO] serf: EventMemberJoin: Node 678ebb70-526d-2c6d-aca2-4734767c7005 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.247503 [INFO] consul: Adding LAN server Node 678ebb70-526d-2c6d-aca2-4734767c7005 (Addr: tcp/127.0.0.1:16030) (DC: dc1)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.247761 [INFO] consul: Handled member-join event for server "Node 678ebb70-526d-2c6d-aca2-4734767c7005.dc1" in area "wan"
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.248272 [INFO] agent: Started DNS server 127.0.0.1:16025 (udp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.248345 [INFO] agent: Started DNS server 127.0.0.1:16025 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.250430 [INFO] agent: Started HTTP server on 127.0.0.1:16026 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.250530 [INFO] agent: started state syncer
2019/11/27 02:24:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:16030 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c3ecb3f5-4ea7-b59a-450b-0eb32006cd5d Address:127.0.0.1:16042}]
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:16042 [Follower] entering Follower state (Leader: "")
2019/11/27 02:24:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1279af43-87c8-f5c0-5509-58c3f4a6c8dc Address:127.0.0.1:16036}]
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:16036 [Follower] entering Follower state (Leader: "")
TestDebugCommand_Archive - 2019/11/27 02:24:02.518757 [INFO] serf: EventMemberJoin: Node 1279af43-87c8-f5c0-5509-58c3f4a6c8dc.dc1 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:02.522620 [INFO] serf: EventMemberJoin: Node c3ecb3f5-4ea7-b59a-450b-0eb32006cd5d.dc1 127.0.0.1
TestDebugCommand_Archive - 2019/11/27 02:24:02.527366 [INFO] serf: EventMemberJoin: Node 1279af43-87c8-f5c0-5509-58c3f4a6c8dc 127.0.0.1
TestDebugCommand_Archive - 2019/11/27 02:24:02.530574 [INFO] consul: Handled member-join event for server "Node 1279af43-87c8-f5c0-5509-58c3f4a6c8dc.dc1" in area "wan"
TestDebugCommand_Archive - 2019/11/27 02:24:02.530651 [INFO] consul: Adding LAN server Node 1279af43-87c8-f5c0-5509-58c3f4a6c8dc (Addr: tcp/127.0.0.1:16036) (DC: dc1)
TestDebugCommand_Archive - 2019/11/27 02:24:02.531219 [INFO] agent: Started DNS server 127.0.0.1:16031 (tcp)
TestDebugCommand_Archive - 2019/11/27 02:24:02.533203 [INFO] agent: Started DNS server 127.0.0.1:16031 (udp)
TestDebugCommand_Archive - 2019/11/27 02:24:02.535394 [INFO] agent: Started HTTP server on 127.0.0.1:16032 (tcp)
TestDebugCommand_Archive - 2019/11/27 02:24:02.535484 [INFO] agent: started state syncer
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:02.538604 [INFO] serf: EventMemberJoin: Node c3ecb3f5-4ea7-b59a-450b-0eb32006cd5d 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:02.540447 [INFO] consul: Handled member-join event for server "Node c3ecb3f5-4ea7-b59a-450b-0eb32006cd5d.dc1" in area "wan"
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:02.540749 [INFO] consul: Adding LAN server Node c3ecb3f5-4ea7-b59a-450b-0eb32006cd5d (Addr: tcp/127.0.0.1:16042) (DC: dc1)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:02.544930 [INFO] agent: Started DNS server 127.0.0.1:16037 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:02.545271 [INFO] agent: Started DNS server 127.0.0.1:16037 (udp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:02.548154 [INFO] agent: Started HTTP server on 127.0.0.1:16038 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:02.548317 [INFO] agent: started state syncer
2019/11/27 02:24:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:16042 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:16036 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:02 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:16030 [Leader] entering Leader state
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.943894 [INFO] consul: cluster leadership acquired
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:02.944486 [INFO] consul: New leader elected: Node 678ebb70-526d-2c6d-aca2-4734767c7005
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:03.110233 [DEBUG] http: Request GET /v1/agent/metrics (1.016037ms) from=127.0.0.1:40514
2019/11/27 02:24:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:03 [INFO]  raft: Node at 127.0.0.1:16042 [Leader] entering Leader state
2019/11/27 02:24:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:03 [INFO]  raft: Node at 127.0.0.1:16036 [Leader] entering Leader state
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:03.116629 [INFO] consul: cluster leadership acquired
TestDebugCommand_Archive - 2019/11/27 02:24:03.116802 [INFO] consul: cluster leadership acquired
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:03.117091 [INFO] consul: New leader elected: Node c3ecb3f5-4ea7-b59a-450b-0eb32006cd5d
TestDebugCommand_Archive - 2019/11/27 02:24:03.117091 [INFO] consul: New leader elected: Node 1279af43-87c8-f5c0-5509-58c3f4a6c8dc
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.269344 [INFO] agent: Synced node info
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.269474 [DEBUG] agent: Node info in sync
TestDebugCommand_Archive - 2019/11/27 02:24:03.434599 [INFO] agent: Synced node info
TestDebugCommand_Archive - 2019/11/27 02:24:03.434752 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.444174 [DEBUG] http: Request GET /v1/agent/self (166.178063ms) from=127.0.0.1:39134
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.512803 [DEBUG] http: Request GET /v1/agent/host (31.695489ms) from=127.0.0.1:39134
TestDebugCommand_Archive - 2019/11/27 02:24:03.625088 [DEBUG] http: Request GET /v1/agent/self (157.596416ms) from=127.0.0.1:58144
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.629376 [DEBUG] http: Request GET /v1/agent/self (110.854711ms) from=127.0.0.1:39134
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.646220 [DEBUG] http: Request GET /v1/agent/members?wan=1 (1.434052ms) from=127.0.0.1:39134
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.739131 [WARN] agent: Node name "Node 7e757fb1-7d37-be5b-2bb7-657a32998bf6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.739543 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.739609 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.739803 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:03.739910 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:03.869976 [INFO] agent: Synced node info
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:03.870131 [DEBUG] agent: Node info in sync
TestDebugCommand_Archive - 2019/11/27 02:24:03.875790 [DEBUG] http: Request GET /v1/agent/self (230.717417ms) from=127.0.0.1:58144
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:03.966299 [WARN] agent: Node name "Node d42a6ba3-1f8a-ddc6-c68c-f60c30aebc2c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:03.966725 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:03.966797 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:03.967021 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:03.967125 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.074406 [INFO] agent: Requesting shutdown
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.074528 [INFO] consul: shutting down server
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.074578 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/11/27 02:24:04.080164 [DEBUG] agent: Node info in sync
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.190806 [WARN] serf: Shutdown without a Leave
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.285653 [INFO] manager: shutting down
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.286340 [INFO] agent: consul server down
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.286402 [INFO] agent: shutdown complete
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.286462 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (tcp)
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.286610 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (udp)
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:04.288895 [INFO] agent: Stopping HTTP server 127.0.0.1:16002 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:04.563033 [DEBUG] agent: Node info in sync
TestDebugCommand_Archive - 2019/11/27 02:24:04.902334 [INFO] agent: Requesting shutdown
TestDebugCommand_Archive - 2019/11/27 02:24:04.902744 [INFO] consul: shutting down server
TestDebugCommand_Archive - 2019/11/27 02:24:04.903117 [WARN] serf: Shutdown without a Leave
TestDebugCommand_Archive - 2019/11/27 02:24:05.089417 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.094676 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.095105 [DEBUG] consul: Skipping self join check for "Node 678ebb70-526d-2c6d-aca2-4734767c7005" since the cluster is too small
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.095256 [INFO] consul: member 'Node 678ebb70-526d-2c6d-aca2-4734767c7005' joined, marking health alive
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:05.289258 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:16002 (tcp)
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:05.289332 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_DebugDisabled - 2019/11/27 02:24:05.289366 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_DebugDisabled (8.87s)
=== CONT  TestDebugCommand_ProfilesExist
TestDebugCommand_Archive - 2019/11/27 02:24:05.321155 [INFO] manager: shutting down
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:05.362280 [WARN] agent: Node name "Node c29e4987-e7cd-ef5f-eb2c-459faaf0b3bd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:05.362705 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:05.362765 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:05.363148 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:05.363296 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_Archive - 2019/11/27 02:24:05.454631 [INFO] agent: consul server down
TestDebugCommand_Archive - 2019/11/27 02:24:05.454715 [INFO] agent: shutdown complete
TestDebugCommand_Archive - 2019/11/27 02:24:05.454785 [INFO] agent: Stopping DNS server 127.0.0.1:16031 (tcp)
TestDebugCommand_Archive - 2019/11/27 02:24:05.454969 [INFO] agent: Stopping DNS server 127.0.0.1:16031 (udp)
TestDebugCommand_Archive - 2019/11/27 02:24:05.455161 [INFO] agent: Stopping HTTP server 127.0.0.1:16032 (tcp)
TestDebugCommand_Archive - 2019/11/27 02:24:05.455683 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_Archive - 2019/11/27 02:24:05.455792 [ERR] connect: Apply failed leadership lost while committing log
TestDebugCommand_Archive - 2019/11/27 02:24:05.455839 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestDebugCommand_Archive - 2019/11/27 02:24:05.456026 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_Archive (4.45s)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.725949 [INFO] connect: initialized primary datacenter CA with provider "consul"
2019/11/27 02:24:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7e757fb1-7d37-be5b-2bb7-657a32998bf6 Address:127.0.0.1:16048}]
2019/11/27 02:24:05 [INFO]  raft: Node at 127.0.0.1:16048 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.729744 [DEBUG] consul: Skipping self join check for "Node c3ecb3f5-4ea7-b59a-450b-0eb32006cd5d" since the cluster is too small
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.729928 [INFO] consul: member 'Node c3ecb3f5-4ea7-b59a-450b-0eb32006cd5d' joined, marking health alive
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.730615 [INFO] serf: EventMemberJoin: Node 7e757fb1-7d37-be5b-2bb7-657a32998bf6.dc1 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.734703 [INFO] serf: EventMemberJoin: Node 7e757fb1-7d37-be5b-2bb7-657a32998bf6 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.735567 [INFO] consul: Handled member-join event for server "Node 7e757fb1-7d37-be5b-2bb7-657a32998bf6.dc1" in area "wan"
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.735866 [INFO] consul: Adding LAN server Node 7e757fb1-7d37-be5b-2bb7-657a32998bf6 (Addr: tcp/127.0.0.1:16048) (DC: dc1)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.736430 [INFO] agent: Started DNS server 127.0.0.1:16043 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.737326 [INFO] agent: Started DNS server 127.0.0.1:16043 (udp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.739526 [INFO] agent: Started HTTP server on 127.0.0.1:16044 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:05.739621 [INFO] agent: started state syncer
2019/11/27 02:24:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:05 [INFO]  raft: Node at 127.0.0.1:16048 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:05 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d42a6ba3-1f8a-ddc6-c68c-f60c30aebc2c Address:127.0.0.1:16054}]
2019/11/27 02:24:05 [INFO]  raft: Node at 127.0.0.1:16054 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.936533 [INFO] serf: EventMemberJoin: Node d42a6ba3-1f8a-ddc6-c68c-f60c30aebc2c.dc1 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.939891 [INFO] serf: EventMemberJoin: Node d42a6ba3-1f8a-ddc6-c68c-f60c30aebc2c 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.940481 [INFO] consul: Handled member-join event for server "Node d42a6ba3-1f8a-ddc6-c68c-f60c30aebc2c.dc1" in area "wan"
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.940710 [INFO] consul: Adding LAN server Node d42a6ba3-1f8a-ddc6-c68c-f60c30aebc2c (Addr: tcp/127.0.0.1:16054) (DC: dc1)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.941244 [INFO] agent: Started DNS server 127.0.0.1:16049 (udp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.941315 [INFO] agent: Started DNS server 127.0.0.1:16049 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.943360 [INFO] agent: Started HTTP server on 127.0.0.1:16050 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:05.943458 [INFO] agent: started state syncer
2019/11/27 02:24:05 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:05 [INFO]  raft: Node at 127.0.0.1:16054 [Candidate] entering Candidate state in term 2
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:06.015051 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:06.015117 [DEBUG] agent: Node info in sync
2019/11/27 02:24:06 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:06 [INFO]  raft: Node at 127.0.0.1:16048 [Leader] entering Leader state
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:06.989787 [INFO] consul: cluster leadership acquired
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:06.990772 [INFO] consul: New leader elected: Node 7e757fb1-7d37-be5b-2bb7-657a32998bf6
2019/11/27 02:24:07 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:07 [INFO]  raft: Node at 127.0.0.1:16054 [Leader] entering Leader state
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:07.201412 [INFO] consul: cluster leadership acquired
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:07.201885 [INFO] consul: New leader elected: Node d42a6ba3-1f8a-ddc6-c68c-f60c30aebc2c
2019/11/27 02:24:07 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c29e4987-e7cd-ef5f-eb2c-459faaf0b3bd Address:127.0.0.1:16060}]
2019/11/27 02:24:07 [INFO]  raft: Node at 127.0.0.1:16060 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:07.309246 [INFO] serf: EventMemberJoin: Node c29e4987-e7cd-ef5f-eb2c-459faaf0b3bd.dc1 127.0.0.1
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:07.313648 [INFO] serf: EventMemberJoin: Node c29e4987-e7cd-ef5f-eb2c-459faaf0b3bd 127.0.0.1
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:07.314333 [INFO] consul: Handled member-join event for server "Node c29e4987-e7cd-ef5f-eb2c-459faaf0b3bd.dc1" in area "wan"
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:07.314382 [INFO] consul: Adding LAN server Node c29e4987-e7cd-ef5f-eb2c-459faaf0b3bd (Addr: tcp/127.0.0.1:16060) (DC: dc1)
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:07.315040 [INFO] agent: Started DNS server 127.0.0.1:16055 (udp)
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:07.315107 [INFO] agent: Started DNS server 127.0.0.1:16055 (tcp)
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:07.318624 [INFO] agent: Started HTTP server on 127.0.0.1:16056 (tcp)
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:07.318728 [INFO] agent: started state syncer
2019/11/27 02:24:07 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:07 [INFO]  raft: Node at 127.0.0.1:16060 [Candidate] entering Candidate state in term 2
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:07.645651 [INFO] agent: Synced node info
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:07.944228 [INFO] agent: Synced node info
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:07.944345 [DEBUG] agent: Node info in sync
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:08.020782 [WARN] agent: Node name "Node 0ffbd3fd-a899-284d-6e97-5634143cd185" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:08.021357 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:08.021517 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:08.021846 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:08.022033 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.163616 [DEBUG] http: Request GET /v1/agent/self (492.026938ms) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.177296 [DEBUG] http: Request GET /v1/agent/metrics (972.036µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.231296 [DEBUG] http: Request GET /v1/agent/metrics (677.691µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.285302 [DEBUG] http: Request GET /v1/agent/metrics (623.356µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.339232 [DEBUG] http: Request GET /v1/agent/metrics (528.019µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.393150 [DEBUG] http: Request GET /v1/agent/metrics (545.353µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.447766 [DEBUG] http: Request GET /v1/agent/metrics (926.033µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.502196 [DEBUG] http: Request GET /v1/agent/metrics (608.356µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.555902 [DEBUG] http: Request GET /v1/agent/metrics (555.021µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.609732 [DEBUG] http: Request GET /v1/agent/metrics (437.35µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.663916 [DEBUG] http: Request GET /v1/agent/metrics (586.355µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.718622 [DEBUG] http: Request GET /v1/agent/metrics (1.101373ms) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.772737 [DEBUG] http: Request GET /v1/agent/metrics (549.02µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.827256 [DEBUG] http: Request GET /v1/agent/metrics (628.023µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.881221 [DEBUG] http: Request GET /v1/agent/metrics (578.021µs) from=127.0.0.1:59944
2019/11/27 02:24:08 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:08 [INFO]  raft: Node at 127.0.0.1:16060 [Leader] entering Leader state
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:08.902813 [INFO] consul: cluster leadership acquired
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:08.903317 [INFO] consul: New leader elected: Node c29e4987-e7cd-ef5f-eb2c-459faaf0b3bd
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.925562 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.925681 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.935597 [DEBUG] http: Request GET /v1/agent/metrics (565.687µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:08.989501 [DEBUG] http: Request GET /v1/agent/metrics (543.353µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.045930 [DEBUG] http: Request GET /v1/agent/metrics (593.355µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.099844 [DEBUG] http: Request GET /v1/agent/metrics (582.355µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.159367 [DEBUG] http: Request GET /v1/agent/metrics (2.09841ms) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.220241 [DEBUG] http: Request GET /v1/agent/metrics (623.689µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.274140 [DEBUG] http: Request GET /v1/agent/metrics (536.353µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.329613 [DEBUG] http: Request GET /v1/agent/metrics (670.691µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.383951 [DEBUG] http: Request GET /v1/agent/metrics (633.356µs) from=127.0.0.1:59944
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:09.438346 [INFO] agent: Synced node info
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:09.438519 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.443031 [DEBUG] http: Request GET /v1/agent/metrics (644.024µs) from=127.0.0.1:59944
/tmp/consul-test/TestDebugCommand_ProfilesExist-debug293915459/debug
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.497213 [DEBUG] http: Request GET /v1/agent/metrics (652.024µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.552081 [DEBUG] http: Request GET /v1/agent/metrics (816.03µs) from=127.0.0.1:59944
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:09.572003 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.606880 [DEBUG] http: Request GET /v1/agent/metrics (708.026µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.661127 [DEBUG] http: Request GET /v1/agent/metrics (571.688µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.715731 [DEBUG] http: Request GET /v1/agent/metrics (541.019µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.770152 [DEBUG] http: Request GET /v1/agent/metrics (636.689µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.832020 [DEBUG] http: Request GET /v1/agent/metrics (649.023µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.886466 [DEBUG] http: Request GET /v1/agent/metrics (677.024µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.944836 [DEBUG] http: Request GET /v1/agent/metrics (2.7601ms) from=127.0.0.1:59944
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:09.969661 [DEBUG] http: Request GET /v1/agent/self (505.547759ms) from=127.0.0.1:35776
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:09.998561 [DEBUG] http: Request GET /v1/agent/metrics (524.353µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.052759 [DEBUG] http: Request GET /v1/agent/metrics (602.689µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.106563 [DEBUG] http: Request GET /v1/agent/metrics (611.689µs) from=127.0.0.1:59944
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.160771 [DEBUG] http: Request GET /v1/agent/metrics (695.358µs) from=127.0.0.1:59944
2019/11/27 02:24:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0ffbd3fd-a899-284d-6e97-5634143cd185 Address:127.0.0.1:16066}]
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.201238 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.201771 [DEBUG] consul: Skipping self join check for "Node 7e757fb1-7d37-be5b-2bb7-657a32998bf6" since the cluster is too small
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.201960 [INFO] consul: member 'Node 7e757fb1-7d37-be5b-2bb7-657a32998bf6' joined, marking health alive
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.205096 [INFO] serf: EventMemberJoin: Node 0ffbd3fd-a899-284d-6e97-5634143cd185.dc1 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.222397 [INFO] serf: EventMemberJoin: Node 0ffbd3fd-a899-284d-6e97-5634143cd185 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.223635 [INFO] agent: Started DNS server 127.0.0.1:16061 (udp)
2019/11/27 02:24:10 [INFO]  raft: Node at 127.0.0.1:16066 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.224102 [INFO] consul: Adding LAN server Node 0ffbd3fd-a899-284d-6e97-5634143cd185 (Addr: tcp/127.0.0.1:16066) (DC: dc1)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.224330 [INFO] consul: Handled member-join event for server "Node 0ffbd3fd-a899-284d-6e97-5634143cd185.dc1" in area "wan"
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.224941 [INFO] agent: Started DNS server 127.0.0.1:16061 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.226253 [DEBUG] http: Request GET /v1/agent/metrics (1.333049ms) from=127.0.0.1:59944
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.228062 [INFO] agent: Started HTTP server on 127.0.0.1:16062 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.228202 [INFO] agent: started state syncer
2019/11/27 02:24:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:10 [INFO]  raft: Node at 127.0.0.1:16066 [Candidate] entering Candidate state in term 2
WARNING: bootstrap = true: do not enable unless necessary
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.346273 [WARN] agent: Node name "Node 95df03fa-5646-0625-7cdf-6126a7d32601" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.346977 [DEBUG] tlsutil: Update with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.347196 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.347718 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:10.348008 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.357855 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.358512 [DEBUG] consul: Skipping self join check for "Node d42a6ba3-1f8a-ddc6-c68c-f60c30aebc2c" since the cluster is too small
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:10.358769 [INFO] consul: member 'Node d42a6ba3-1f8a-ddc6-c68c-f60c30aebc2c' joined, marking health alive
2019/11/27 02:24:11 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:11 [INFO]  raft: Node at 127.0.0.1:16066 [Leader] entering Leader state
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.160807 [INFO] consul: cluster leadership acquired
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.163066 [INFO] consul: New leader elected: Node 0ffbd3fd-a899-284d-6e97-5634143cd185
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:11.559693 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:11.561505 [DEBUG] consul: Skipping self join check for "Node c29e4987-e7cd-ef5f-eb2c-459faaf0b3bd" since the cluster is too small
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:11.561763 [INFO] consul: member 'Node c29e4987-e7cd-ef5f-eb2c-459faaf0b3bd' joined, marking health alive
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.756581 [INFO] agent: Synced node info
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.764147 [INFO] agent: Requesting shutdown
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.764552 [INFO] consul: shutting down server
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.765026 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.843029 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.932134 [INFO] manager: shutting down
2019/11/27 02:24:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:95df03fa-5646-0625-7cdf-6126a7d32601 Address:127.0.0.1:16072}]
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.933719 [INFO] agent: consul server down
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.933827 [INFO] agent: shutdown complete
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.933983 [INFO] agent: Stopping DNS server 127.0.0.1:16061 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.934341 [INFO] agent: Stopping DNS server 127.0.0.1:16061 (udp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.934695 [INFO] agent: Stopping HTTP server 127.0.0.1:16062 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.936755 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.937262 [ERR] consul: failed to establish leadership: raft is already shutdown
2019/11/27 02:24:11 [INFO]  raft: Node at 127.0.0.1:16072 [Follower] entering Follower state (Leader: "")
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.940086 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.941317 [INFO] agent: Endpoints down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:11.940789 [INFO] serf: EventMemberJoin: Node 95df03fa-5646-0625-7cdf-6126a7d32601.dc1 127.0.0.1
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:11.946227 [INFO] serf: EventMemberJoin: Node 95df03fa-5646-0625-7cdf-6126a7d32601 127.0.0.1
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.947694 [INFO] agent: Requesting shutdown
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.948023 [INFO] consul: shutting down server
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:11.948913 [INFO] consul: Adding LAN server Node 95df03fa-5646-0625-7cdf-6126a7d32601 (Addr: tcp/127.0.0.1:16072) (DC: dc1)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:11.949627 [INFO] consul: Handled member-join event for server "Node 95df03fa-5646-0625-7cdf-6126a7d32601.dc1" in area "wan"
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:11.950044 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:11.953721 [INFO] agent: Started DNS server 127.0.0.1:16067 (udp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:11.954623 [INFO] agent: Started DNS server 127.0.0.1:16067 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:11.957520 [INFO] agent: Started HTTP server on 127.0.0.1:16068 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:11.957699 [INFO] agent: started state syncer
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:11.963827 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:11.963948 [DEBUG] agent: Node info in sync
2019/11/27 02:24:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:11 [INFO]  raft: Node at 127.0.0.1:16072 [Candidate] entering Candidate state in term 2
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.098505 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.187669 [INFO] manager: shutting down
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.190319 [INFO] agent: consul server down
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.190656 [INFO] agent: shutdown complete
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.191078 [INFO] agent: Stopping DNS server 127.0.0.1:16049 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.191611 [INFO] agent: Stopping DNS server 127.0.0.1:16049 (udp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.192323 [INFO] agent: Stopping HTTP server 127.0.0.1:16050 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.193820 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.194041 [INFO] agent: Endpoints down
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.194731 [INFO] agent: Requesting shutdown
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.195055 [INFO] consul: shutting down server
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.195393 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.265153 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.365758 [INFO] manager: shutting down
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.369023 [INFO] agent: consul server down
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.369463 [INFO] agent: shutdown complete
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.370093 [INFO] agent: Stopping DNS server 127.0.0.1:16037 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.371301 [INFO] agent: Stopping DNS server 127.0.0.1:16037 (udp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.372644 [INFO] agent: Stopping HTTP server 127.0.0.1:16038 (tcp)
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.373184 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ValidateTiming - 2019/11/27 02:24:12.373338 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_ValidateTiming (11.36s)
2019/11/27 02:24:12 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:12 [INFO]  raft: Node at 127.0.0.1:16072 [Leader] entering Leader state
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:12.680927 [INFO] consul: cluster leadership acquired
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:12.681475 [INFO] consul: New leader elected: Node 95df03fa-5646-0625-7cdf-6126a7d32601
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:12.987373 [INFO] agent: Requesting shutdown
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:12.987800 [INFO] consul: shutting down server
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:12.988458 [WARN] serf: Shutdown without a Leave
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:13.098351 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.177371 [INFO] agent: Synced node info
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.177669 [DEBUG] agent: Node info in sync
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:13.179296 [INFO] manager: shutting down
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:13.180626 [INFO] agent: consul server down
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:13.180733 [INFO] agent: shutdown complete
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:13.180882 [INFO] agent: Stopping DNS server 127.0.0.1:16055 (tcp)
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:13.181192 [INFO] agent: Stopping DNS server 127.0.0.1:16055 (udp)
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:13.181638 [INFO] agent: Stopping HTTP server 127.0.0.1:16056 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.349628 [DEBUG] http: Request GET /v1/agent/self (160.006161ms) from=127.0.0.1:36350
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.430326 [DEBUG] http: Request GET /v1/agent/host (68.106481ms) from=127.0.0.1:36350
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.547751 [DEBUG] http: Request GET /v1/agent/self (111.18505ms) from=127.0.0.1:36350
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.556279 [DEBUG] agent: Node info in sync
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.563792 [DEBUG] http: Request GET /v1/agent/members?wan=1 (1.343716ms) from=127.0.0.1:36350
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.589414 [DEBUG] http: Request GET /v1/agent/metrics (747.36µs) from=127.0.0.1:36350
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.647125 [DEBUG] http: Request GET /v1/agent/metrics (807.696µs) from=127.0.0.1:36350
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.705933 [DEBUG] http: Request GET /v1/agent/metrics (2.314751ms) from=127.0.0.1:36356
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.779323 [DEBUG] http: Request GET /v1/agent/metrics (8.862323ms) from=127.0.0.1:36358
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.856928 [DEBUG] http: Request GET /v1/agent/metrics (676.691µs) from=127.0.0.1:36360
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:13.944572 [DEBUG] http: Request GET /v1/agent/metrics (677.691µs) from=127.0.0.1:36362
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.012598 [DEBUG] http: Request GET /v1/agent/metrics (2.540093ms) from=127.0.0.1:36364
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.070058 [DEBUG] http: Request GET /v1/agent/metrics (992.036µs) from=127.0.0.1:36366
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.129710 [DEBUG] http: Request GET /v1/agent/metrics (3.217784ms) from=127.0.0.1:36368
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.187448 [DEBUG] http: Request GET /v1/agent/metrics (1.407718ms) from=127.0.0.1:36368
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:14.190096 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:16056 (tcp)
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:14.190226 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_ProfilesExist - 2019/11/27 02:24:14.190280 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_ProfilesExist (8.90s)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.246865 [DEBUG] http: Request GET /v1/agent/metrics (1.487387ms) from=127.0.0.1:36372
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.301245 [DEBUG] http: Request GET /v1/agent/metrics (825.03µs) from=127.0.0.1:36372
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.361105 [DEBUG] http: Request GET /v1/agent/metrics (2.360753ms) from=127.0.0.1:36372
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.410233 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.410787 [DEBUG] consul: Skipping self join check for "Node 95df03fa-5646-0625-7cdf-6126a7d32601" since the cluster is too small
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.411048 [INFO] consul: member 'Node 95df03fa-5646-0625-7cdf-6126a7d32601' joined, marking health alive
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.419092 [DEBUG] http: Request GET /v1/agent/metrics (887.032µs) from=127.0.0.1:36378
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.477436 [DEBUG] http: Request GET /v1/agent/metrics (1.538389ms) from=127.0.0.1:36380
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.537094 [DEBUG] http: Request GET /v1/agent/metrics (1.38205ms) from=127.0.0.1:36382
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.603562 [DEBUG] http: Request GET /v1/agent/metrics (1.16971ms) from=127.0.0.1:36384
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.675352 [DEBUG] http: Request GET /v1/agent/metrics (682.024µs) from=127.0.0.1:36386
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.730277 [DEBUG] http: Request GET /v1/agent/metrics (860.365µs) from=127.0.0.1:36386
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.787938 [DEBUG] http: Request GET /v1/agent/metrics (1.442053ms) from=127.0.0.1:36390
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.843916 [DEBUG] http: Request GET /v1/agent/metrics (1.042371ms) from=127.0.0.1:36392
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.900936 [DEBUG] http: Request GET /v1/agent/metrics (2.188746ms) from=127.0.0.1:36392
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:14.958711 [DEBUG] http: Request GET /v1/agent/metrics (965.701µs) from=127.0.0.1:36396
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.017249 [DEBUG] http: Request GET /v1/agent/metrics (1.412718ms) from=127.0.0.1:36398
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.075131 [DEBUG] http: Request GET /v1/agent/metrics (1.27538ms) from=127.0.0.1:36400
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.134487 [DEBUG] http: Request GET /v1/agent/metrics (1.325048ms) from=127.0.0.1:36402
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.191049 [DEBUG] http: Request GET /v1/agent/metrics (876.365µs) from=127.0.0.1:36406
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.246117 [DEBUG] http: Request GET /v1/agent/metrics (1.10204ms) from=127.0.0.1:36406
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.308556 [DEBUG] http: Request GET /v1/agent/metrics (3.035777ms) from=127.0.0.1:36410
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.368010 [DEBUG] http: Request GET /v1/agent/metrics (1.297714ms) from=127.0.0.1:36412
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.424403 [DEBUG] http: Request GET /v1/agent/metrics (656.024µs) from=127.0.0.1:36416
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.481108 [DEBUG] http: Request GET /v1/agent/metrics (849.364µs) from=127.0.0.1:36418
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.539963 [DEBUG] http: Request GET /v1/agent/metrics (1.247712ms) from=127.0.0.1:36420
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.604768 [DEBUG] http: Request GET /v1/agent/metrics (894.699µs) from=127.0.0.1:36422
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.661883 [DEBUG] http: Request GET /v1/agent/metrics (880.699µs) from=127.0.0.1:36424
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.690070 [INFO] agent: Requesting shutdown
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.690188 [INFO] consul: shutting down server
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:15.690242 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:16.134027 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:16.287188 [INFO] manager: shutting down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:16.289302 [INFO] agent: consul server down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:16.289372 [INFO] agent: shutdown complete
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:16.289435 [INFO] agent: Stopping DNS server 127.0.0.1:16067 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:16.289602 [INFO] agent: Stopping DNS server 127.0.0.1:16067 (udp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:16.289770 [INFO] agent: Stopping HTTP server 127.0.0.1:16068 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.290199 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:16068 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.290283 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.290321 [INFO] agent: Endpoints down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.297148 [INFO] agent: Requesting shutdown
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.297256 [INFO] consul: shutting down server
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.297315 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.342497 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.398145 [INFO] manager: shutting down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.398841 [INFO] agent: consul server down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.398907 [INFO] agent: shutdown complete
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.398968 [INFO] agent: Stopping DNS server 127.0.0.1:16043 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.399128 [INFO] agent: Stopping DNS server 127.0.0.1:16043 (udp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.399287 [INFO] agent: Stopping HTTP server 127.0.0.1:16044 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.399769 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.399851 [INFO] agent: Endpoints down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.404605 [INFO] agent: Requesting shutdown
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.404711 [INFO] consul: shutting down server
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.404766 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.464765 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.520393 [INFO] manager: shutting down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.521142 [INFO] agent: consul server down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.521205 [INFO] agent: shutdown complete
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.521265 [INFO] agent: Stopping DNS server 127.0.0.1:16025 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.521425 [INFO] agent: Stopping DNS server 127.0.0.1:16025 (udp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.521596 [INFO] agent: Stopping HTTP server 127.0.0.1:16026 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.522148 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.522241 [INFO] agent: Endpoints down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.525056 [INFO] agent: Requesting shutdown
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.525152 [INFO] consul: shutting down server
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.525223 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.564715 [WARN] serf: Shutdown without a Leave
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.620403 [INFO] manager: shutting down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.621064 [INFO] agent: consul server down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.621123 [INFO] agent: shutdown complete
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.621179 [INFO] agent: Stopping DNS server 127.0.0.1:16019 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.621315 [INFO] agent: Stopping DNS server 127.0.0.1:16019 (udp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.621465 [INFO] agent: Stopping HTTP server 127.0.0.1:16020 (tcp)
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.621953 [INFO] agent: Waiting for endpoints to shut down
TestDebugCommand_CaptureTargets - 2019/11/27 02:24:17.622067 [INFO] agent: Endpoints down
--- PASS: TestDebugCommand_CaptureTargets (21.19s)
PASS
ok  	github.com/hashicorp/consul/command/debug	21.410s
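[editor's note] The TestDebugCommand_CaptureTargets run above works by repeatedly hitting the agent's own HTTP endpoints (/v1/agent/self, /v1/agent/host, /v1/agent/members?wan=1 and a stream of /v1/agent/metrics requests). As a rough illustration only, the following minimal Go sketch polls those same endpoints; it assumes an agent reachable on the default HTTP port 127.0.0.1:8500, whereas the test agents above bind random localhost ports.

package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func main() {
	base := "http://127.0.0.1:8500" // assumption: default agent HTTP address

	// The capture targets visible in the log above.
	paths := []string{
		"/v1/agent/self",
		"/v1/agent/host",
		"/v1/agent/members?wan=1",
		"/v1/agent/metrics",
	}

	client := &http.Client{Timeout: 5 * time.Second}
	for _, p := range paths {
		resp, err := client.Get(base + p)
		if err != nil {
			fmt.Println(p, "error:", err)
			continue
		}
		body, _ := io.ReadAll(resp.Body)
		resp.Body.Close()
		fmt.Printf("%s -> %s (%d bytes)\n", p, resp.Status, len(body))
	}
}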
=== RUN   TestEventCommand_noTabs
=== PAUSE TestEventCommand_noTabs
=== RUN   TestEventCommand
=== PAUSE TestEventCommand
=== CONT  TestEventCommand_noTabs
=== CONT  TestEventCommand
--- PASS: TestEventCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestEventCommand - 2019/11/27 02:23:56.510376 [WARN] agent: Node name "Node 75aaba75-be66-9a8f-c6df-e963c4fd450e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventCommand - 2019/11/27 02:23:56.513368 [DEBUG] tlsutil: Update with version 1
TestEventCommand - 2019/11/27 02:23:56.513456 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventCommand - 2019/11/27 02:23:56.513754 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestEventCommand - 2019/11/27 02:23:56.513853 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:23:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:75aaba75-be66-9a8f-c6df-e963c4fd450e Address:127.0.0.1:23506}]
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:23506 [Follower] entering Follower state (Leader: "")
TestEventCommand - 2019/11/27 02:23:57.532484 [INFO] serf: EventMemberJoin: Node 75aaba75-be66-9a8f-c6df-e963c4fd450e.dc1 127.0.0.1
TestEventCommand - 2019/11/27 02:23:57.536394 [INFO] serf: EventMemberJoin: Node 75aaba75-be66-9a8f-c6df-e963c4fd450e 127.0.0.1
TestEventCommand - 2019/11/27 02:23:57.537224 [INFO] consul: Adding LAN server Node 75aaba75-be66-9a8f-c6df-e963c4fd450e (Addr: tcp/127.0.0.1:23506) (DC: dc1)
TestEventCommand - 2019/11/27 02:23:57.537630 [INFO] consul: Handled member-join event for server "Node 75aaba75-be66-9a8f-c6df-e963c4fd450e.dc1" in area "wan"
TestEventCommand - 2019/11/27 02:23:57.538305 [INFO] agent: Started DNS server 127.0.0.1:23501 (tcp)
TestEventCommand - 2019/11/27 02:23:57.538766 [INFO] agent: Started DNS server 127.0.0.1:23501 (udp)
TestEventCommand - 2019/11/27 02:23:57.540862 [INFO] agent: Started HTTP server on 127.0.0.1:23502 (tcp)
TestEventCommand - 2019/11/27 02:23:57.541099 [INFO] agent: started state syncer
2019/11/27 02:23:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:23:57 [INFO]  raft: Node at 127.0.0.1:23506 [Candidate] entering Candidate state in term 2
2019/11/27 02:23:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:23:58 [INFO]  raft: Node at 127.0.0.1:23506 [Leader] entering Leader state
TestEventCommand - 2019/11/27 02:23:58.200455 [INFO] consul: cluster leadership acquired
TestEventCommand - 2019/11/27 02:23:58.201102 [INFO] consul: New leader elected: Node 75aaba75-be66-9a8f-c6df-e963c4fd450e
TestEventCommand - 2019/11/27 02:23:59.278025 [INFO] agent: Synced node info
TestEventCommand - 2019/11/27 02:23:59.284706 [DEBUG] http: Request GET /v1/agent/self (1.056349238s) from=127.0.0.1:43598
TestEventCommand - 2019/11/27 02:23:59.305294 [DEBUG] http: Request PUT /v1/event/fire/cmd (1.847734ms) from=127.0.0.1:43598
TestEventCommand - 2019/11/27 02:23:59.306984 [DEBUG] consul: User event: cmd
TestEventCommand - 2019/11/27 02:23:59.307622 [DEBUG] agent: new event: cmd (95d61a65-63ba-513a-ad99-fb1275b21a80)
TestEventCommand - 2019/11/27 02:23:59.308746 [INFO] agent: Requesting shutdown
TestEventCommand - 2019/11/27 02:23:59.308862 [INFO] consul: shutting down server
TestEventCommand - 2019/11/27 02:23:59.308945 [WARN] serf: Shutdown without a Leave
TestEventCommand - 2019/11/27 02:23:59.443581 [WARN] serf: Shutdown without a Leave
TestEventCommand - 2019/11/27 02:23:59.511673 [INFO] manager: shutting down
TestEventCommand - 2019/11/27 02:23:59.688516 [INFO] agent: consul server down
TestEventCommand - 2019/11/27 02:23:59.688615 [INFO] agent: shutdown complete
TestEventCommand - 2019/11/27 02:23:59.688679 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (tcp)
TestEventCommand - 2019/11/27 02:23:59.688848 [INFO] agent: Stopping DNS server 127.0.0.1:23501 (udp)
TestEventCommand - 2019/11/27 02:23:59.689062 [INFO] agent: Stopping HTTP server 127.0.0.1:23502 (tcp)
TestEventCommand - 2019/11/27 02:23:59.689752 [INFO] agent: Waiting for endpoints to shut down
TestEventCommand - 2019/11/27 02:23:59.689883 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestEventCommand - 2019/11/27 02:23:59.690131 [INFO] agent: Endpoints down
--- PASS: TestEventCommand (3.26s)
PASS
ok  	github.com/hashicorp/consul/command/event	3.423s
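[editor's note] The event test above fires a user event over the agent's HTTP API (the "Request PUT /v1/event/fire/cmd" line) and the agent logs the new event with its assigned ID. A minimal sketch of that call is shown below; the agent address, event name "cmd" and payload are illustrative assumptions, not taken from the test harness.

package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// assumption: local agent on the default HTTP port
	url := "http://127.0.0.1:8500/v1/event/fire/cmd"

	req, err := http.NewRequest(http.MethodPut, url, bytes.NewBufferString("payload"))
	if err != nil {
		panic(err)
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// The agent answers with details of the fired event, matching the
	// "new event: cmd (...)" line seen in the log above.
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(body))
}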
=== RUN   TestExecCommand_noTabs
=== PAUSE TestExecCommand_noTabs
=== RUN   TestExecCommand
=== PAUSE TestExecCommand
=== RUN   TestExecCommand_NoShell
=== PAUSE TestExecCommand_NoShell
=== RUN   TestExecCommand_CrossDC
--- SKIP: TestExecCommand_CrossDC (0.00s)
    exec_test.go:70: DM-skipped
=== RUN   TestExecCommand_Validate
=== PAUSE TestExecCommand_Validate
=== RUN   TestExecCommand_Sessions
=== PAUSE TestExecCommand_Sessions
=== RUN   TestExecCommand_Sessions_Foreign
=== PAUSE TestExecCommand_Sessions_Foreign
=== RUN   TestExecCommand_UploadDestroy
=== PAUSE TestExecCommand_UploadDestroy
=== RUN   TestExecCommand_StreamResults
=== PAUSE TestExecCommand_StreamResults
=== CONT  TestExecCommand_noTabs
=== CONT  TestExecCommand_Sessions_Foreign
=== CONT  TestExecCommand_Validate
=== CONT  TestExecCommand_StreamResults
--- PASS: TestExecCommand_Validate (0.00s)
=== CONT  TestExecCommand_Sessions
--- PASS: TestExecCommand_noTabs (0.00s)
=== CONT  TestExecCommand_UploadDestroy
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_Sessions_Foreign - 2019/11/27 02:23:59.973165 [WARN] agent: Node name "Node d890d31f-bc25-fbf0-2057-110875fa1bc8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_Sessions_Foreign - 2019/11/27 02:23:59.974040 [DEBUG] tlsutil: Update with version 1
TestExecCommand_Sessions_Foreign - 2019/11/27 02:23:59.974171 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions_Foreign - 2019/11/27 02:23:59.974471 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestExecCommand_Sessions_Foreign - 2019/11/27 02:23:59.974613 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_StreamResults - 2019/11/27 02:24:00.433854 [WARN] agent: Node name "Node a98719ca-28cc-829a-e976-7e58f4b137fd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_StreamResults - 2019/11/27 02:24:00.434386 [DEBUG] tlsutil: Update with version 1
TestExecCommand_StreamResults - 2019/11/27 02:24:00.434455 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_StreamResults - 2019/11/27 02:24:00.434613 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestExecCommand_StreamResults - 2019/11/27 02:24:00.434707 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_Sessions - 2019/11/27 02:24:00.570118 [WARN] agent: Node name "Node 60ed0d43-71cb-72b6-bcbc-a3b2ba28e0ea" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_Sessions - 2019/11/27 02:24:00.570535 [DEBUG] tlsutil: Update with version 1
TestExecCommand_Sessions - 2019/11/27 02:24:00.570608 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions - 2019/11/27 02:24:00.570767 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestExecCommand_Sessions - 2019/11/27 02:24:00.570867 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_UploadDestroy - 2019/11/27 02:24:00.612095 [WARN] agent: Node name "Node 5d0e029e-9eef-bf8f-385d-867d07f6dff3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_UploadDestroy - 2019/11/27 02:24:00.613565 [DEBUG] tlsutil: Update with version 1
TestExecCommand_UploadDestroy - 2019/11/27 02:24:00.613893 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_UploadDestroy - 2019/11/27 02:24:00.614179 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestExecCommand_UploadDestroy - 2019/11/27 02:24:00.615436 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:24:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d890d31f-bc25-fbf0-2057-110875fa1bc8 Address:127.0.0.1:37006}]
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37006 [Follower] entering Follower state (Leader: "")
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.052309 [INFO] serf: EventMemberJoin: Node d890d31f-bc25-fbf0-2057-110875fa1bc8.dc1 127.0.0.1
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.064725 [INFO] serf: EventMemberJoin: Node d890d31f-bc25-fbf0-2057-110875fa1bc8 127.0.0.1
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.066279 [INFO] consul: Adding LAN server Node d890d31f-bc25-fbf0-2057-110875fa1bc8 (Addr: tcp/127.0.0.1:37006) (DC: dc1)
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.067346 [INFO] agent: Started DNS server 127.0.0.1:37001 (udp)
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.067379 [INFO] consul: Handled member-join event for server "Node d890d31f-bc25-fbf0-2057-110875fa1bc8.dc1" in area "wan"
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.068495 [INFO] agent: Started DNS server 127.0.0.1:37001 (tcp)
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.072034 [INFO] agent: Started HTTP server on 127.0.0.1:37002 (tcp)
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.074316 [INFO] agent: started state syncer
2019/11/27 02:24:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37006 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a98719ca-28cc-829a-e976-7e58f4b137fd Address:127.0.0.1:37012}]
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37012 [Follower] entering Follower state (Leader: "")
TestExecCommand_StreamResults - 2019/11/27 02:24:02.161038 [INFO] serf: EventMemberJoin: Node a98719ca-28cc-829a-e976-7e58f4b137fd.dc1 127.0.0.1
TestExecCommand_StreamResults - 2019/11/27 02:24:02.168292 [INFO] serf: EventMemberJoin: Node a98719ca-28cc-829a-e976-7e58f4b137fd 127.0.0.1
TestExecCommand_StreamResults - 2019/11/27 02:24:02.171176 [INFO] agent: Started DNS server 127.0.0.1:37007 (udp)
TestExecCommand_StreamResults - 2019/11/27 02:24:02.171793 [INFO] consul: Handled member-join event for server "Node a98719ca-28cc-829a-e976-7e58f4b137fd.dc1" in area "wan"
TestExecCommand_StreamResults - 2019/11/27 02:24:02.171854 [INFO] consul: Adding LAN server Node a98719ca-28cc-829a-e976-7e58f4b137fd (Addr: tcp/127.0.0.1:37012) (DC: dc1)
TestExecCommand_StreamResults - 2019/11/27 02:24:02.172297 [INFO] agent: Started DNS server 127.0.0.1:37007 (tcp)
TestExecCommand_StreamResults - 2019/11/27 02:24:02.174178 [INFO] agent: Started HTTP server on 127.0.0.1:37008 (tcp)
TestExecCommand_StreamResults - 2019/11/27 02:24:02.174317 [INFO] agent: started state syncer
2019/11/27 02:24:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37012 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5d0e029e-9eef-bf8f-385d-867d07f6dff3 Address:127.0.0.1:37024}]
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37024 [Follower] entering Follower state (Leader: "")
2019/11/27 02:24:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:60ed0d43-71cb-72b6-bcbc-a3b2ba28e0ea Address:127.0.0.1:37018}]
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.264390 [INFO] serf: EventMemberJoin: Node 5d0e029e-9eef-bf8f-385d-867d07f6dff3.dc1 127.0.0.1
TestExecCommand_Sessions - 2019/11/27 02:24:02.265961 [INFO] serf: EventMemberJoin: Node 60ed0d43-71cb-72b6-bcbc-a3b2ba28e0ea.dc1 127.0.0.1
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.268180 [INFO] serf: EventMemberJoin: Node 5d0e029e-9eef-bf8f-385d-867d07f6dff3 127.0.0.1
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.269276 [INFO] consul: Adding LAN server Node 5d0e029e-9eef-bf8f-385d-867d07f6dff3 (Addr: tcp/127.0.0.1:37024) (DC: dc1)
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.269644 [INFO] agent: Started DNS server 127.0.0.1:37019 (udp)
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.269883 [INFO] consul: Handled member-join event for server "Node 5d0e029e-9eef-bf8f-385d-867d07f6dff3.dc1" in area "wan"
TestExecCommand_Sessions - 2019/11/27 02:24:02.269899 [INFO] serf: EventMemberJoin: Node 60ed0d43-71cb-72b6-bcbc-a3b2ba28e0ea 127.0.0.1
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.270064 [INFO] agent: Started DNS server 127.0.0.1:37019 (tcp)
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37018 [Follower] entering Follower state (Leader: "")
TestExecCommand_Sessions - 2019/11/27 02:24:02.271122 [INFO] consul: Handled member-join event for server "Node 60ed0d43-71cb-72b6-bcbc-a3b2ba28e0ea.dc1" in area "wan"
TestExecCommand_Sessions - 2019/11/27 02:24:02.271464 [INFO] consul: Adding LAN server Node 60ed0d43-71cb-72b6-bcbc-a3b2ba28e0ea (Addr: tcp/127.0.0.1:37018) (DC: dc1)
TestExecCommand_Sessions - 2019/11/27 02:24:02.274147 [INFO] agent: Started DNS server 127.0.0.1:37013 (udp)
TestExecCommand_Sessions - 2019/11/27 02:24:02.274400 [INFO] agent: Started DNS server 127.0.0.1:37013 (tcp)
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.277396 [INFO] agent: Started HTTP server on 127.0.0.1:37020 (tcp)
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.277510 [INFO] agent: started state syncer
TestExecCommand_Sessions - 2019/11/27 02:24:02.278671 [INFO] agent: Started HTTP server on 127.0.0.1:37014 (tcp)
TestExecCommand_Sessions - 2019/11/27 02:24:02.278907 [INFO] agent: started state syncer
2019/11/27 02:24:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37024 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37018 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:02 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37006 [Leader] entering Leader state
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.790847 [INFO] consul: cluster leadership acquired
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.791437 [INFO] consul: New leader elected: Node d890d31f-bc25-fbf0-2057-110875fa1bc8
2019/11/27 02:24:02 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37012 [Leader] entering Leader state
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.873728 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (7.230597ms) from=127.0.0.1:54808
TestExecCommand_StreamResults - 2019/11/27 02:24:02.875077 [INFO] consul: cluster leadership acquired
TestExecCommand_StreamResults - 2019/11/27 02:24:02.875499 [INFO] consul: New leader elected: Node a98719ca-28cc-829a-e976-7e58f4b137fd
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.904984 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.203044ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.941801 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (8.973328ms) from=127.0.0.1:54808
2019/11/27 02:24:02 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37024 [Leader] entering Leader state
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.966959 [INFO] consul: cluster leadership acquired
2019/11/27 02:24:02 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:02 [INFO]  raft: Node at 127.0.0.1:37018 [Leader] entering Leader state
TestExecCommand_Sessions - 2019/11/27 02:24:02.967542 [INFO] consul: cluster leadership acquired
TestExecCommand_Sessions - 2019/11/27 02:24:02.968002 [INFO] consul: New leader elected: Node 60ed0d43-71cb-72b6-bcbc-a3b2ba28e0ea
TestExecCommand_UploadDestroy - 2019/11/27 02:24:02.970283 [INFO] consul: New leader elected: Node 5d0e029e-9eef-bf8f-385d-867d07f6dff3
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:02.972729 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.245045ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.002374 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.08304ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.031589 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.713729ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.060365 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (863.365µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.092771 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.945738ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.124275 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (3.553796ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.152681 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (674.358µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.181149 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (651.357µs) from=127.0.0.1:54808
TestExecCommand_StreamResults - 2019/11/27 02:24:03.189247 [INFO] agent: Synced node info
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.211529 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (2.362087ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.240897 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (667.024µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.266644 [INFO] agent: Synced node info
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.266928 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.281071 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (738.027µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.318026 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (772.695µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.346873 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.217378ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.375166 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (697.692µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.403559 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (744.027µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.436852 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (5.754877ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.497295 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.043372ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.525844 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (765.361µs) from=127.0.0.1:54808
TestExecCommand_Sessions - 2019/11/27 02:24:03.535500 [INFO] agent: Synced node info
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.554462 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (925.7µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.582686 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (759.361µs) from=127.0.0.1:54808
TestExecCommand_UploadDestroy - 2019/11/27 02:24:03.611206 [INFO] agent: Synced node info
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.628428 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (7.017589ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.658422 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (777.695µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.687712 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (872.032µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.716859 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.01837ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.748000 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (820.697µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.777508 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (835.364µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.806045 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (647.69µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.835033 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (697.025µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.863667 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (780.029µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.893256 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.349382ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.924973 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (823.363µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.954351 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (766.695µs) from=127.0.0.1:54808
TestExecCommand_UploadDestroy - 2019/11/27 02:24:03.981942 [DEBUG] agent: Node info in sync
TestExecCommand_UploadDestroy - 2019/11/27 02:24:03.982074 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:03.983787 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (599.022µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.012265 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (852.031µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.041330 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (835.364µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.069826 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (874.699µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.109251 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (803.03µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.140412 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (782.695µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.169776 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (751.361µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.203567 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (952.368µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.232693 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (709.359µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.260908 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (692.025µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.293588 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (861.365µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.321948 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (838.698µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.351483 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.126041ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.380713 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.02337ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.413948 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.004036ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.453587 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (10.630388ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.483431 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (2.055408ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.512353 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.004036ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.541062 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (786.362µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.570179 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (942.701µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.601219 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.711729ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.630842 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.111041ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.667605 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (674.025µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.695991 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (718.359µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.723936 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (639.69µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.752052 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (755.694µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.780188 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (754.361µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.808225 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (656.024µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.836745 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.087373ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.846324 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.846876 [DEBUG] consul: Skipping self join check for "Node d890d31f-bc25-fbf0-2057-110875fa1bc8" since the cluster is too small
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.847090 [INFO] consul: member 'Node d890d31f-bc25-fbf0-2057-110875fa1bc8' joined, marking health alive
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.894968 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.044038ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.924066 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (703.359µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.952387 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (724.027µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:04.980469 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (657.69µs) from=127.0.0.1:54808
TestExecCommand_StreamResults - 2019/11/27 02:24:05.004977 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_StreamResults - 2019/11/27 02:24:05.005523 [DEBUG] consul: Skipping self join check for "Node a98719ca-28cc-829a-e976-7e58f4b137fd" since the cluster is too small
TestExecCommand_StreamResults - 2019/11/27 02:24:05.005670 [INFO] consul: member 'Node a98719ca-28cc-829a-e976-7e58f4b137fd' joined, marking health alive
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.009030 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (736.36µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.037627 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (831.03µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.067291 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.090373ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.107056 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (727.36µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.135132 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (606.688µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.162803 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (566.354µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.192869 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (2.171746ms) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.221492 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (674.358µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.249775 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (730.026µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.277622 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (629.356µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.305893 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (872.032µs) from=127.0.0.1:54808
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.335075 [DEBUG] http: Request GET /v1/health/service/consul?passing=1 (1.441386ms) from=127.0.0.1:54808
TestExecCommand_UploadDestroy - 2019/11/27 02:24:05.457907 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_UploadDestroy - 2019/11/27 02:24:05.458406 [DEBUG] consul: Skipping self join check for "Node 5d0e029e-9eef-bf8f-385d-867d07f6dff3" since the cluster is too small
TestExecCommand_UploadDestroy - 2019/11/27 02:24:05.458599 [INFO] consul: member 'Node 5d0e029e-9eef-bf8f-385d-867d07f6dff3' joined, marking health alive
TestExecCommand_Sessions - 2019/11/27 02:24:05.464501 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_Sessions - 2019/11/27 02:24:05.464949 [DEBUG] consul: Skipping self join check for "Node 60ed0d43-71cb-72b6-bcbc-a3b2ba28e0ea" since the cluster is too small
TestExecCommand_Sessions - 2019/11/27 02:24:05.465110 [INFO] consul: member 'Node 60ed0d43-71cb-72b6-bcbc-a3b2ba28e0ea' joined, marking health alive
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.742004 [DEBUG] http: Request PUT /v1/session/create (395.619428ms) from=127.0.0.1:54808
TestExecCommand_StreamResults - 2019/11/27 02:24:05.751089 [DEBUG] http: Request PUT /v1/session/create (409.713942ms) from=127.0.0.1:36212
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.754957 [DEBUG] http: Request GET /v1/session/info/de3e1e81-7a3a-e0b7-5d2c-a91746c4f9bb (1.27838ms) from=127.0.0.1:54820
TestExecCommand_StreamResults - 2019/11/27 02:24:05.764353 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/?keys=&wait=2000ms (436.683µs) from=127.0.0.1:36212
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.891236 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:05.891308 [DEBUG] agent: Node info in sync
TestExecCommand_StreamResults - 2019/11/27 02:24:05.903164 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestExecCommand_StreamResults - 2019/11/27 02:24:05.903309 [DEBUG] agent: Node info in sync
TestExecCommand_StreamResults - 2019/11/27 02:24:05.903441 [DEBUG] agent: Node info in sync
TestExecCommand_StreamResults - 2019/11/27 02:24:06.078368 [DEBUG] http: Request PUT /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/foo/ack?acquire=9ddbf0a0-85e8-9b08-c207-179aa3fee29c (312.120383ms) from=127.0.0.1:36218
TestExecCommand_UploadDestroy - 2019/11/27 02:24:06.080122 [DEBUG] http: Request PUT /v1/session/create (306.207501ms) from=127.0.0.1:51674
TestExecCommand_StreamResults - 2019/11/27 02:24:06.080195 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/?index=1&keys=&wait=2000ms (313.041083ms) from=127.0.0.1:36212
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.084182 [DEBUG] http: Request PUT /v1/session/destroy/de3e1e81-7a3a-e0b7-5d2c-a91746c4f9bb (317.615917ms) from=127.0.0.1:54808
TestExecCommand_Sessions - 2019/11/27 02:24:06.088103 [DEBUG] http: Request PUT /v1/session/create (337.101294ms) from=127.0.0.1:52054
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.092268 [DEBUG] http: Request GET /v1/session/info/de3e1e81-7a3a-e0b7-5d2c-a91746c4f9bb (2.070408ms) from=127.0.0.1:54830
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.097915 [INFO] agent: Requesting shutdown
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.098016 [INFO] consul: shutting down server
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.098081 [WARN] serf: Shutdown without a Leave
TestExecCommand_Sessions - 2019/11/27 02:24:06.101598 [DEBUG] http: Request GET /v1/session/info/e60b150a-2e11-4dc2-5dd0-22e96b6386e6 (839.697µs) from=127.0.0.1:52064
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.193544 [WARN] serf: Shutdown without a Leave
TestExecCommand_Sessions - 2019/11/27 02:24:06.428613 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestExecCommand_Sessions - 2019/11/27 02:24:06.428686 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions - 2019/11/27 02:24:06.428752 [DEBUG] agent: Node info in sync
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.432028 [INFO] manager: shutting down
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.432771 [INFO] agent: consul server down
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.432838 [INFO] agent: shutdown complete
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.432896 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (tcp)
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.433036 [INFO] agent: Stopping DNS server 127.0.0.1:37001 (udp)
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.433199 [INFO] agent: Stopping HTTP server 127.0.0.1:37002 (tcp)
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.434059 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_Sessions_Foreign - 2019/11/27 02:24:06.434205 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_Sessions_Foreign (6.63s)
=== CONT  TestExecCommand_NoShell
TestExecCommand_Sessions - 2019/11/27 02:24:06.439048 [DEBUG] http: Request PUT /v1/session/destroy/e60b150a-2e11-4dc2-5dd0-22e96b6386e6 (334.180852ms) from=127.0.0.1:52054
TestExecCommand_UploadDestroy - 2019/11/27 02:24:06.438891 [DEBUG] http: Request PUT /v1/kv/_rexec/72f1d046-5902-b000-2f2b-6a15e559fc29/job?acquire=72f1d046-5902-b000-2f2b-6a15e559fc29 (346.775312ms) from=127.0.0.1:51674
TestExecCommand_StreamResults - 2019/11/27 02:24:06.441518 [DEBUG] http: Request PUT /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/foo/exit?acquire=9ddbf0a0-85e8-9b08-c207-179aa3fee29c (354.362255ms) from=127.0.0.1:36222
TestExecCommand_StreamResults - 2019/11/27 02:24:06.443570 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/?index=12&keys=&wait=2000ms (352.77153ms) from=127.0.0.1:36212
TestExecCommand_Sessions - 2019/11/27 02:24:06.446087 [DEBUG] http: Request GET /v1/session/info/e60b150a-2e11-4dc2-5dd0-22e96b6386e6 (974.369µs) from=127.0.0.1:52066
TestExecCommand_StreamResults - 2019/11/27 02:24:06.448112 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/foo/exit (1.66706ms) from=127.0.0.1:36212
TestExecCommand_Sessions - 2019/11/27 02:24:06.448293 [INFO] agent: Requesting shutdown
TestExecCommand_Sessions - 2019/11/27 02:24:06.448368 [INFO] consul: shutting down server
TestExecCommand_Sessions - 2019/11/27 02:24:06.448414 [WARN] serf: Shutdown without a Leave
TestExecCommand_UploadDestroy - 2019/11/27 02:24:06.450832 [DEBUG] http: Request GET /v1/kv/_rexec/72f1d046-5902-b000-2f2b-6a15e559fc29/job (2.46709ms) from=127.0.0.1:51684
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand_NoShell - 2019/11/27 02:24:06.507335 [WARN] agent: Node name "Node 5dac3932-7be2-aec3-f7d5-76635c878594" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand_NoShell - 2019/11/27 02:24:06.507742 [DEBUG] tlsutil: Update with version 1
TestExecCommand_NoShell - 2019/11/27 02:24:06.507850 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_NoShell - 2019/11/27 02:24:06.507998 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestExecCommand_NoShell - 2019/11/27 02:24:06.508166 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_Sessions - 2019/11/27 02:24:06.656134 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/11/27 02:24:06.979758 [DEBUG] http: Request PUT /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/foo/random?acquire=9ddbf0a0-85e8-9b08-c207-179aa3fee29c (525.215151ms) from=127.0.0.1:36232
TestExecCommand_UploadDestroy - 2019/11/27 02:24:06.982008 [DEBUG] http: Request DELETE /v1/kv/_rexec/72f1d046-5902-b000-2f2b-6a15e559fc29?recurse= (527.339229ms) from=127.0.0.1:51674
TestExecCommand_Sessions - 2019/11/27 02:24:06.983288 [INFO] manager: shutting down
TestExecCommand_Sessions - 2019/11/27 02:24:06.984037 [INFO] agent: consul server down
TestExecCommand_Sessions - 2019/11/27 02:24:06.984090 [INFO] agent: shutdown complete
TestExecCommand_Sessions - 2019/11/27 02:24:06.984140 [INFO] agent: Stopping DNS server 127.0.0.1:37013 (tcp)
TestExecCommand_Sessions - 2019/11/27 02:24:06.984275 [INFO] agent: Stopping DNS server 127.0.0.1:37013 (udp)
TestExecCommand_Sessions - 2019/11/27 02:24:06.984418 [INFO] agent: Stopping HTTP server 127.0.0.1:37014 (tcp)
TestExecCommand_Sessions - 2019/11/27 02:24:06.985156 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_Sessions - 2019/11/27 02:24:06.985335 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_Sessions (7.18s)
=== CONT  TestExecCommand
TestExecCommand_UploadDestroy - 2019/11/27 02:24:06.990610 [DEBUG] http: Request GET /v1/kv/_rexec/72f1d046-5902-b000-2f2b-6a15e559fc29/job (399.681µs) from=127.0.0.1:51688
TestExecCommand_UploadDestroy - 2019/11/27 02:24:06.991767 [INFO] agent: Requesting shutdown
TestExecCommand_UploadDestroy - 2019/11/27 02:24:06.991947 [INFO] consul: shutting down server
TestExecCommand_UploadDestroy - 2019/11/27 02:24:06.992093 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/11/27 02:24:06.994967 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/?index=13&keys=&wait=2000ms (541.698753ms) from=127.0.0.1:36212
WARNING: bootstrap = true: do not enable unless necessary
TestExecCommand - 2019/11/27 02:24:07.084974 [WARN] agent: Node name "Node 4081159e-97b0-7740-7f98-3fd9269013a9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestExecCommand - 2019/11/27 02:24:07.085313 [DEBUG] tlsutil: Update with version 1
TestExecCommand - 2019/11/27 02:24:07.085376 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand - 2019/11/27 02:24:07.085550 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestExecCommand - 2019/11/27 02:24:07.085647 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestExecCommand_UploadDestroy - 2019/11/27 02:24:07.204337 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/11/27 02:24:07.300256 [DEBUG] http: Request PUT /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/foo/out/00000?acquire=9ddbf0a0-85e8-9b08-c207-179aa3fee29c (312.351722ms) from=127.0.0.1:36236
TestExecCommand_StreamResults - 2019/11/27 02:24:07.302043 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/?index=14&keys=&wait=2000ms (297.303839ms) from=127.0.0.1:36212
TestExecCommand_UploadDestroy - 2019/11/27 02:24:07.302062 [INFO] manager: shutting down
TestExecCommand_UploadDestroy - 2019/11/27 02:24:07.302977 [INFO] agent: consul server down
TestExecCommand_UploadDestroy - 2019/11/27 02:24:07.303991 [INFO] agent: shutdown complete
TestExecCommand_UploadDestroy - 2019/11/27 02:24:07.304313 [INFO] agent: Stopping DNS server 127.0.0.1:37019 (tcp)
TestExecCommand_UploadDestroy - 2019/11/27 02:24:07.304818 [INFO] agent: Stopping DNS server 127.0.0.1:37019 (udp)
TestExecCommand_UploadDestroy - 2019/11/27 02:24:07.305335 [INFO] agent: Stopping HTTP server 127.0.0.1:37020 (tcp)
TestExecCommand_StreamResults - 2019/11/27 02:24:07.307552 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/foo/out/00000 (1.292047ms) from=127.0.0.1:36212
TestExecCommand_UploadDestroy - 2019/11/27 02:24:07.308190 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_UploadDestroy - 2019/11/27 02:24:07.308553 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_UploadDestroy (7.50s)
TestExecCommand_StreamResults - 2019/11/27 02:24:07.647339 [DEBUG] http: Request PUT /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/foo/out/00001?acquire=9ddbf0a0-85e8-9b08-c207-179aa3fee29c (327.907621ms) from=127.0.0.1:36240
TestExecCommand_StreamResults - 2019/11/27 02:24:07.652521 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/?index=15&keys=&wait=2000ms (335.454896ms) from=127.0.0.1:36212
TestExecCommand_StreamResults - 2019/11/27 02:24:07.655887 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/foo/out/00001 (798.695µs) from=127.0.0.1:36212
TestExecCommand_StreamResults - 2019/11/27 02:24:07.657879 [INFO] agent: Requesting shutdown
TestExecCommand_StreamResults - 2019/11/27 02:24:07.657985 [INFO] consul: shutting down server
TestExecCommand_StreamResults - 2019/11/27 02:24:07.658043 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/11/27 02:24:07.947436 [WARN] serf: Shutdown without a Leave
TestExecCommand_StreamResults - 2019/11/27 02:24:08.154459 [INFO] manager: shutting down
TestExecCommand_StreamResults - 2019/11/27 02:24:08.155546 [INFO] agent: consul server down
TestExecCommand_StreamResults - 2019/11/27 02:24:08.155620 [INFO] agent: shutdown complete
TestExecCommand_StreamResults - 2019/11/27 02:24:08.155683 [INFO] agent: Stopping DNS server 127.0.0.1:37007 (tcp)
TestExecCommand_StreamResults - 2019/11/27 02:24:08.155846 [INFO] agent: Stopping DNS server 127.0.0.1:37007 (udp)
TestExecCommand_StreamResults - 2019/11/27 02:24:08.156030 [INFO] agent: Stopping HTTP server 127.0.0.1:37008 (tcp)
2019/11/27 02:24:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5dac3932-7be2-aec3-f7d5-76635c878594 Address:127.0.0.1:37030}]
2019/11/27 02:24:08 [INFO]  raft: Node at 127.0.0.1:37030 [Follower] entering Follower state (Leader: "")
TestExecCommand_NoShell - 2019/11/27 02:24:08.806031 [INFO] serf: EventMemberJoin: Node 5dac3932-7be2-aec3-f7d5-76635c878594.dc1 127.0.0.1
TestExecCommand_NoShell - 2019/11/27 02:24:08.813929 [INFO] serf: EventMemberJoin: Node 5dac3932-7be2-aec3-f7d5-76635c878594 127.0.0.1
TestExecCommand_NoShell - 2019/11/27 02:24:08.814811 [INFO] consul: Handled member-join event for server "Node 5dac3932-7be2-aec3-f7d5-76635c878594.dc1" in area "wan"
TestExecCommand_NoShell - 2019/11/27 02:24:08.815133 [INFO] consul: Adding LAN server Node 5dac3932-7be2-aec3-f7d5-76635c878594 (Addr: tcp/127.0.0.1:37030) (DC: dc1)
TestExecCommand_NoShell - 2019/11/27 02:24:08.815517 [INFO] agent: Started DNS server 127.0.0.1:37025 (udp)
TestExecCommand_NoShell - 2019/11/27 02:24:08.815745 [INFO] agent: Started DNS server 127.0.0.1:37025 (tcp)
TestExecCommand_NoShell - 2019/11/27 02:24:08.818287 [INFO] agent: Started HTTP server on 127.0.0.1:37026 (tcp)
TestExecCommand_NoShell - 2019/11/27 02:24:08.818428 [INFO] agent: started state syncer
2019/11/27 02:24:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:08 [INFO]  raft: Node at 127.0.0.1:37030 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:09 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4081159e-97b0-7740-7f98-3fd9269013a9 Address:127.0.0.1:37036}]
2019/11/27 02:24:09 [INFO]  raft: Node at 127.0.0.1:37036 [Follower] entering Follower state (Leader: "")
TestExecCommand_StreamResults - 2019/11/27 02:24:09.163448 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:37008 (tcp)
TestExecCommand_StreamResults - 2019/11/27 02:24:09.163514 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_StreamResults - 2019/11/27 02:24:09.163550 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_StreamResults (9.35s)
TestExecCommand - 2019/11/27 02:24:09.165186 [INFO] serf: EventMemberJoin: Node 4081159e-97b0-7740-7f98-3fd9269013a9.dc1 127.0.0.1
TestExecCommand - 2019/11/27 02:24:09.173009 [INFO] serf: EventMemberJoin: Node 4081159e-97b0-7740-7f98-3fd9269013a9 127.0.0.1
TestExecCommand - 2019/11/27 02:24:09.174794 [INFO] consul: Adding LAN server Node 4081159e-97b0-7740-7f98-3fd9269013a9 (Addr: tcp/127.0.0.1:37036) (DC: dc1)
TestExecCommand - 2019/11/27 02:24:09.175519 [INFO] consul: Handled member-join event for server "Node 4081159e-97b0-7740-7f98-3fd9269013a9.dc1" in area "wan"
TestExecCommand - 2019/11/27 02:24:09.177996 [INFO] agent: Started DNS server 127.0.0.1:37031 (tcp)
TestExecCommand - 2019/11/27 02:24:09.183312 [INFO] agent: Started DNS server 127.0.0.1:37031 (udp)
TestExecCommand - 2019/11/27 02:24:09.185472 [INFO] agent: Started HTTP server on 127.0.0.1:37032 (tcp)
TestExecCommand - 2019/11/27 02:24:09.185558 [INFO] agent: started state syncer
2019/11/27 02:24:09 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:09 [INFO]  raft: Node at 127.0.0.1:37036 [Candidate] entering Candidate state in term 2
TestExecCommand_StreamResults - 2019/11/27 02:24:09.722249 [DEBUG] http: Request GET /v1/kv/_rexec/9ddbf0a0-85e8-9b08-c207-179aa3fee29c/?index=16&keys=&wait=2000ms (2.061736485s) from=127.0.0.1:36212
2019/11/27 02:24:09 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:09 [INFO]  raft: Node at 127.0.0.1:37030 [Leader] entering Leader state
TestExecCommand_NoShell - 2019/11/27 02:24:09.956800 [INFO] consul: cluster leadership acquired
TestExecCommand_NoShell - 2019/11/27 02:24:09.957261 [INFO] consul: New leader elected: Node 5dac3932-7be2-aec3-f7d5-76635c878594
2019/11/27 02:24:10 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:10 [INFO]  raft: Node at 127.0.0.1:37036 [Leader] entering Leader state
TestExecCommand - 2019/11/27 02:24:10.569417 [INFO] consul: cluster leadership acquired
TestExecCommand - 2019/11/27 02:24:10.569896 [INFO] consul: New leader elected: Node 4081159e-97b0-7740-7f98-3fd9269013a9
TestExecCommand_NoShell - 2019/11/27 02:24:10.688462 [INFO] agent: Synced node info
TestExecCommand - 2019/11/27 02:24:11.156187 [INFO] agent: Synced node info
TestExecCommand_NoShell - 2019/11/27 02:24:11.414522 [DEBUG] agent: Node info in sync
TestExecCommand_NoShell - 2019/11/27 02:24:11.414636 [DEBUG] agent: Node info in sync
TestExecCommand - 2019/11/27 02:24:11.983997 [DEBUG] agent: Node info in sync
TestExecCommand - 2019/11/27 02:24:11.984118 [DEBUG] agent: Node info in sync
TestExecCommand_NoShell - 2019/11/27 02:24:12.188066 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand_NoShell - 2019/11/27 02:24:12.188486 [DEBUG] consul: Skipping self join check for "Node 5dac3932-7be2-aec3-f7d5-76635c878594" since the cluster is too small
TestExecCommand_NoShell - 2019/11/27 02:24:12.188653 [INFO] consul: member 'Node 5dac3932-7be2-aec3-f7d5-76635c878594' joined, marking health alive
TestExecCommand_NoShell - 2019/11/27 02:24:12.383897 [DEBUG] http: Request GET /v1/agent/self (7.092925ms) from=127.0.0.1:55352
TestExecCommand - 2019/11/27 02:24:12.555941 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestExecCommand - 2019/11/27 02:24:12.569042 [DEBUG] consul: Skipping self join check for "Node 4081159e-97b0-7740-7f98-3fd9269013a9" since the cluster is too small
TestExecCommand - 2019/11/27 02:24:12.569389 [INFO] consul: member 'Node 4081159e-97b0-7740-7f98-3fd9269013a9' joined, marking health alive
TestExecCommand_NoShell - 2019/11/27 02:24:12.644278 [DEBUG] http: Request PUT /v1/session/create (248.989737ms) from=127.0.0.1:55352
TestExecCommand - 2019/11/27 02:24:12.876283 [DEBUG] http: Request GET /v1/agent/self (5.560869ms) from=127.0.0.1:35900
TestExecCommand_NoShell - 2019/11/27 02:24:13.022217 [DEBUG] http: Request PUT /v1/kv/_rexec/27074dcb-8f85-80e0-044a-9299f32248ab/job?acquire=27074dcb-8f85-80e0-044a-9299f32248ab (375.274337ms) from=127.0.0.1:55352
TestExecCommand - 2019/11/27 02:24:13.177739 [DEBUG] http: Request PUT /v1/session/create (287.159793ms) from=127.0.0.1:35900
TestExecCommand_NoShell - 2019/11/27 02:24:13.228240 [DEBUG] http: Request PUT /v1/event/fire/_rexec (1.359717ms) from=127.0.0.1:55352
TestExecCommand_NoShell - 2019/11/27 02:24:13.229386 [DEBUG] consul: User event: _rexec
TestExecCommand_NoShell - 2019/11/27 02:24:13.230351 [DEBUG] agent: received remote exec event (ID: 81db89c5-4314-6579-1f45-581d3c514298)
TestExecCommand_NoShell - 2019/11/27 02:24:13.232080 [DEBUG] http: Request GET /v1/kv/_rexec/27074dcb-8f85-80e0-044a-9299f32248ab/?keys=&wait=1000ms (866.031µs) from=127.0.0.1:55352
TestExecCommand_NoShell - 2019/11/27 02:24:13.454551 [INFO] agent: remote exec ''
TestExecCommand - 2019/11/27 02:24:13.455352 [DEBUG] http: Request PUT /v1/kv/_rexec/71a2c0aa-df01-41db-25be-50ddbb9a888f/job?acquire=71a2c0aa-df01-41db-25be-50ddbb9a888f (268.785456ms) from=127.0.0.1:35900
TestExecCommand_NoShell - 2019/11/27 02:24:13.460773 [DEBUG] http: Request GET /v1/kv/_rexec/27074dcb-8f85-80e0-044a-9299f32248ab/?index=12&keys=&wait=1000ms (225.601217ms) from=127.0.0.1:55352
TestExecCommand - 2019/11/27 02:24:13.663345 [DEBUG] consul: User event: _rexec
TestExecCommand - 2019/11/27 02:24:13.663594 [DEBUG] agent: received remote exec event (ID: 92947ac1-1cf3-86cf-b661-7c2e8893a7a9)
TestExecCommand - 2019/11/27 02:24:13.664425 [DEBUG] http: Request PUT /v1/event/fire/_rexec (1.751063ms) from=127.0.0.1:35900
TestExecCommand - 2019/11/27 02:24:13.673588 [DEBUG] http: Request GET /v1/kv/_rexec/71a2c0aa-df01-41db-25be-50ddbb9a888f/?keys=&wait=1000ms (5.013849ms) from=127.0.0.1:35900
TestExecCommand - 2019/11/27 02:24:13.910449 [INFO] agent: remote exec 'uptime'
TestExecCommand - 2019/11/27 02:24:13.922610 [DEBUG] http: Request GET /v1/kv/_rexec/71a2c0aa-df01-41db-25be-50ddbb9a888f/?index=12&keys=&wait=1000ms (244.932254ms) from=127.0.0.1:35900
TestExecCommand_NoShell - 2019/11/27 02:24:13.928656 [DEBUG] http: Request GET /v1/kv/_rexec/27074dcb-8f85-80e0-044a-9299f32248ab/?index=13&keys=&wait=1000ms (460.525106ms) from=127.0.0.1:55352
TestExecCommand_NoShell - 2019/11/27 02:24:13.941620 [DEBUG] http: Request GET /v1/kv/_rexec/27074dcb-8f85-80e0-044a-9299f32248ab/Node%205dac3932-7be2-aec3-f7d5-76635c878594/out/00000 (3.615798ms) from=127.0.0.1:55352
TestExecCommand_NoShell - 2019/11/27 02:24:14.123999 [DEBUG] http: Request GET /v1/kv/_rexec/27074dcb-8f85-80e0-044a-9299f32248ab/?index=14&keys=&wait=1000ms (171.532247ms) from=127.0.0.1:55352
TestExecCommand_NoShell - 2019/11/27 02:24:14.131106 [DEBUG] http: Request GET /v1/kv/_rexec/27074dcb-8f85-80e0-044a-9299f32248ab/Node%205dac3932-7be2-aec3-f7d5-76635c878594/exit (1.705395ms) from=127.0.0.1:55352
TestExecCommand - 2019/11/27 02:24:14.211263 [DEBUG] http: Request GET /v1/kv/_rexec/71a2c0aa-df01-41db-25be-50ddbb9a888f/?index=13&keys=&wait=1000ms (276.4434ms) from=127.0.0.1:35900
TestExecCommand - 2019/11/27 02:24:14.225039 [DEBUG] http: Request GET /v1/kv/_rexec/71a2c0aa-df01-41db-25be-50ddbb9a888f/Node%204081159e-97b0-7740-7f98-3fd9269013a9/out/00000 (805.696µs) from=127.0.0.1:35900
TestExecCommand - 2019/11/27 02:24:14.411364 [DEBUG] http: Request GET /v1/kv/_rexec/71a2c0aa-df01-41db-25be-50ddbb9a888f/?index=14&keys=&wait=1000ms (180.759582ms) from=127.0.0.1:35900
TestExecCommand - 2019/11/27 02:24:14.415788 [DEBUG] http: Request GET /v1/kv/_rexec/71a2c0aa-df01-41db-25be-50ddbb9a888f/Node%204081159e-97b0-7740-7f98-3fd9269013a9/exit (1.071039ms) from=127.0.0.1:35900
TestExecCommand_NoShell - 2019/11/27 02:24:15.157620 [DEBUG] http: Request GET /v1/kv/_rexec/27074dcb-8f85-80e0-044a-9299f32248ab/?index=15&keys=&wait=1000ms (1.022609904s) from=127.0.0.1:55352
TestExecCommand_NoShell - 2019/11/27 02:24:15.318286 [DEBUG] http: Request PUT /v1/session/destroy/27074dcb-8f85-80e0-044a-9299f32248ab (178.5405ms) from=127.0.0.1:55410
TestExecCommand - 2019/11/27 02:24:15.449401 [DEBUG] http: Request GET /v1/kv/_rexec/71a2c0aa-df01-41db-25be-50ddbb9a888f/?index=15&keys=&wait=1000ms (1.029582823s) from=127.0.0.1:35900
TestExecCommand_NoShell - 2019/11/27 02:24:15.488560 [DEBUG] http: Request DELETE /v1/kv/_rexec/27074dcb-8f85-80e0-044a-9299f32248ab?recurse= (162.097235ms) from=127.0.0.1:55410
TestExecCommand - 2019/11/27 02:24:15.834202 [DEBUG] http: Request PUT /v1/session/destroy/71a2c0aa-df01-41db-25be-50ddbb9a888f (413.108041ms) from=127.0.0.1:35966
TestExecCommand_NoShell - 2019/11/27 02:24:16.132845 [DEBUG] http: Request PUT /v1/session/destroy/27074dcb-8f85-80e0-044a-9299f32248ab (642.315052ms) from=127.0.0.1:55410
TestExecCommand_NoShell - 2019/11/27 02:24:16.134264 [INFO] agent: Requesting shutdown
TestExecCommand_NoShell - 2019/11/27 02:24:16.134406 [INFO] consul: shutting down server
TestExecCommand_NoShell - 2019/11/27 02:24:16.134517 [WARN] serf: Shutdown without a Leave
TestExecCommand - 2019/11/27 02:24:16.199218 [DEBUG] http: Request DELETE /v1/kv/_rexec/71a2c0aa-df01-41db-25be-50ddbb9a888f?recurse= (362.699538ms) from=127.0.0.1:35966
TestExecCommand_NoShell - 2019/11/27 02:24:16.287082 [WARN] serf: Shutdown without a Leave
TestExecCommand_NoShell - 2019/11/27 02:24:16.364919 [INFO] manager: shutting down
TestExecCommand_NoShell - 2019/11/27 02:24:16.365853 [INFO] agent: consul server down
TestExecCommand_NoShell - 2019/11/27 02:24:16.365966 [INFO] agent: shutdown complete
TestExecCommand_NoShell - 2019/11/27 02:24:16.366054 [INFO] agent: Stopping DNS server 127.0.0.1:37025 (tcp)
TestExecCommand_NoShell - 2019/11/27 02:24:16.366341 [INFO] agent: Stopping DNS server 127.0.0.1:37025 (udp)
TestExecCommand - 2019/11/27 02:24:16.366342 [DEBUG] http: Request PUT /v1/session/destroy/71a2c0aa-df01-41db-25be-50ddbb9a888f (164.776665ms) from=127.0.0.1:35966
TestExecCommand_NoShell - 2019/11/27 02:24:16.366768 [INFO] agent: Stopping HTTP server 127.0.0.1:37026 (tcp)
TestExecCommand_NoShell - 2019/11/27 02:24:16.367616 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand_NoShell - 2019/11/27 02:24:16.367726 [INFO] agent: Endpoints down
--- PASS: TestExecCommand_NoShell (9.93s)
TestExecCommand - 2019/11/27 02:24:16.367908 [INFO] agent: Requesting shutdown
TestExecCommand - 2019/11/27 02:24:16.367972 [INFO] consul: shutting down server
TestExecCommand - 2019/11/27 02:24:16.368027 [WARN] serf: Shutdown without a Leave
TestExecCommand - 2019/11/27 02:24:16.420384 [WARN] serf: Shutdown without a Leave
TestExecCommand - 2019/11/27 02:24:16.476018 [INFO] manager: shutting down
TestExecCommand - 2019/11/27 02:24:16.477143 [INFO] agent: consul server down
TestExecCommand - 2019/11/27 02:24:16.477208 [INFO] agent: shutdown complete
TestExecCommand - 2019/11/27 02:24:16.477265 [INFO] agent: Stopping DNS server 127.0.0.1:37031 (tcp)
TestExecCommand - 2019/11/27 02:24:16.477437 [INFO] agent: Stopping DNS server 127.0.0.1:37031 (udp)
TestExecCommand - 2019/11/27 02:24:16.477624 [INFO] agent: Stopping HTTP server 127.0.0.1:37032 (tcp)
TestExecCommand - 2019/11/27 02:24:16.478264 [INFO] agent: Waiting for endpoints to shut down
TestExecCommand - 2019/11/27 02:24:16.478354 [INFO] agent: Endpoints down
--- PASS: TestExecCommand (9.49s)
PASS
ok  	github.com/hashicorp/consul/command/exec	16.839s
=== RUN   TestConfigUtil_Values
=== PAUSE TestConfigUtil_Values
=== RUN   TestConfigUtil_Visit
=== PAUSE TestConfigUtil_Visit
=== RUN   TestFlagMapValueSet
=== PAUSE TestFlagMapValueSet
=== RUN   TestAppendSliceValue_implements
=== PAUSE TestAppendSliceValue_implements
=== RUN   TestAppendSliceValueSet
=== PAUSE TestAppendSliceValueSet
=== RUN   TestHTTPFlagsSetToken
--- PASS: TestHTTPFlagsSetToken (0.00s)
=== CONT  TestConfigUtil_Values
=== CONT  TestAppendSliceValue_implements
--- PASS: TestAppendSliceValue_implements (0.00s)
=== CONT  TestFlagMapValueSet
=== CONT  TestAppendSliceValueSet
=== CONT  TestConfigUtil_Visit
=== RUN   TestFlagMapValueSet/missing_=
--- PASS: TestAppendSliceValueSet (0.00s)
--- PASS: TestConfigUtil_Values (0.00s)
=== RUN   TestFlagMapValueSet/sets
=== RUN   TestFlagMapValueSet/sets_multiple
=== RUN   TestFlagMapValueSet/overwrites
--- PASS: TestFlagMapValueSet (0.00s)
    --- PASS: TestFlagMapValueSet/missing_= (0.00s)
    --- PASS: TestFlagMapValueSet/sets (0.00s)
    --- PASS: TestFlagMapValueSet/sets_multiple (0.00s)
    --- PASS: TestFlagMapValueSet/overwrites (0.00s)
--- PASS: TestConfigUtil_Visit (0.02s)
PASS
ok  	github.com/hashicorp/consul/command/flags	0.066s
=== RUN   TestForceLeaveCommand_noTabs
=== PAUSE TestForceLeaveCommand_noTabs
=== RUN   TestForceLeaveCommand
=== PAUSE TestForceLeaveCommand
=== RUN   TestForceLeaveCommand_noAddrs
=== PAUSE TestForceLeaveCommand_noAddrs
=== CONT  TestForceLeaveCommand_noTabs
=== CONT  TestForceLeaveCommand_noAddrs
--- PASS: TestForceLeaveCommand_noTabs (0.00s)
--- PASS: TestForceLeaveCommand_noAddrs (0.00s)
=== CONT  TestForceLeaveCommand
WARNING: bootstrap = true: do not enable unless necessary
TestForceLeaveCommand - 2019/11/27 02:24:44.310428 [WARN] agent: Node name "Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestForceLeaveCommand - 2019/11/27 02:24:44.311621 [DEBUG] tlsutil: Update with version 1
TestForceLeaveCommand - 2019/11/27 02:24:44.311782 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/11/27 02:24:44.312083 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestForceLeaveCommand - 2019/11/27 02:24:44.312244 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:24:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:0beaa9ba-6890-1740-399b-bd0b2b7a45cf Address:127.0.0.1:35506}]
2019/11/27 02:24:45 [INFO]  raft: Node at 127.0.0.1:35506 [Follower] entering Follower state (Leader: "")
TestForceLeaveCommand - 2019/11/27 02:24:45.323689 [INFO] serf: EventMemberJoin: Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf.dc1 127.0.0.1
TestForceLeaveCommand - 2019/11/27 02:24:45.327238 [INFO] serf: EventMemberJoin: Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf 127.0.0.1
TestForceLeaveCommand - 2019/11/27 02:24:45.328115 [INFO] consul: Adding LAN server Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf (Addr: tcp/127.0.0.1:35506) (DC: dc1)
TestForceLeaveCommand - 2019/11/27 02:24:45.328474 [INFO] consul: Handled member-join event for server "Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf.dc1" in area "wan"
TestForceLeaveCommand - 2019/11/27 02:24:45.328992 [INFO] agent: Started DNS server 127.0.0.1:35501 (udp)
TestForceLeaveCommand - 2019/11/27 02:24:45.329057 [INFO] agent: Started DNS server 127.0.0.1:35501 (tcp)
TestForceLeaveCommand - 2019/11/27 02:24:45.331115 [INFO] agent: Started HTTP server on 127.0.0.1:35502 (tcp)
TestForceLeaveCommand - 2019/11/27 02:24:45.331247 [INFO] agent: started state syncer
2019/11/27 02:24:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:45 [INFO]  raft: Node at 127.0.0.1:35506 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:45 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:45 [INFO]  raft: Node at 127.0.0.1:35506 [Leader] entering Leader state
TestForceLeaveCommand - 2019/11/27 02:24:45.820304 [INFO] consul: cluster leadership acquired
TestForceLeaveCommand - 2019/11/27 02:24:45.821049 [INFO] consul: New leader elected: Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf
WARNING: bootstrap = true: do not enable unless necessary
TestForceLeaveCommand - 2019/11/27 02:24:45.881320 [WARN] agent: Node name "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestForceLeaveCommand - 2019/11/27 02:24:45.881788 [DEBUG] tlsutil: Update with version 1
TestForceLeaveCommand - 2019/11/27 02:24:45.881864 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/11/27 02:24:45.882037 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestForceLeaveCommand - 2019/11/27 02:24:45.882140 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestForceLeaveCommand - 2019/11/27 02:24:46.541824 [INFO] agent: Synced node info
TestForceLeaveCommand - 2019/11/27 02:24:46.541960 [DEBUG] agent: Node info in sync
2019/11/27 02:24:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4cfc4c93-13a8-c29a-6e38-93d4d566896d Address:127.0.0.1:35512}]
2019/11/27 02:24:47 [INFO]  raft: Node at 127.0.0.1:35512 [Follower] entering Follower state (Leader: "")
TestForceLeaveCommand - 2019/11/27 02:24:47.256166 [INFO] serf: EventMemberJoin: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d.dc1 127.0.0.1
TestForceLeaveCommand - 2019/11/27 02:24:47.262727 [INFO] serf: EventMemberJoin: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d 127.0.0.1
TestForceLeaveCommand - 2019/11/27 02:24:47.263710 [INFO] consul: Adding LAN server Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d (Addr: tcp/127.0.0.1:35512) (DC: dc1)
TestForceLeaveCommand - 2019/11/27 02:24:47.263965 [INFO] consul: Handled member-join event for server "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d.dc1" in area "wan"
TestForceLeaveCommand - 2019/11/27 02:24:47.265411 [INFO] agent: Started DNS server 127.0.0.1:35507 (tcp)
TestForceLeaveCommand - 2019/11/27 02:24:47.265492 [INFO] agent: Started DNS server 127.0.0.1:35507 (udp)
TestForceLeaveCommand - 2019/11/27 02:24:47.267614 [INFO] agent: Started HTTP server on 127.0.0.1:35508 (tcp)
TestForceLeaveCommand - 2019/11/27 02:24:47.267773 [INFO] agent: started state syncer
2019/11/27 02:24:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:47 [INFO]  raft: Node at 127.0.0.1:35512 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:47 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:47 [INFO]  raft: Node at 127.0.0.1:35512 [Leader] entering Leader state
TestForceLeaveCommand - 2019/11/27 02:24:47.852461 [INFO] consul: cluster leadership acquired
TestForceLeaveCommand - 2019/11/27 02:24:47.852908 [INFO] consul: New leader elected: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d
TestForceLeaveCommand - 2019/11/27 02:24:47.991047 [INFO] agent: (LAN) joining: [127.0.0.1:35504]
TestForceLeaveCommand - 2019/11/27 02:24:47.992131 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:35504
TestForceLeaveCommand - 2019/11/27 02:24:47.992225 [DEBUG] memberlist: Stream connection from=127.0.0.1:47448
TestForceLeaveCommand - 2019/11/27 02:24:47.996519 [INFO] serf: EventMemberJoin: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d 127.0.0.1
TestForceLeaveCommand - 2019/11/27 02:24:47.997304 [INFO] serf: EventMemberJoin: Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf 127.0.0.1
TestForceLeaveCommand - 2019/11/27 02:24:47.997389 [INFO] consul: Adding LAN server Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d (Addr: tcp/127.0.0.1:35512) (DC: dc1)
TestForceLeaveCommand - 2019/11/27 02:24:47.997645 [INFO] consul: New leader elected: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d
TestForceLeaveCommand - 2019/11/27 02:24:47.997892 [INFO] consul: Adding LAN server Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf (Addr: tcp/127.0.0.1:35506) (DC: dc1)
TestForceLeaveCommand - 2019/11/27 02:24:47.999124 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:35505
TestForceLeaveCommand - 2019/11/27 02:24:48.000489 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:35511
TestForceLeaveCommand - 2019/11/27 02:24:48.001956 [DEBUG] memberlist: Stream connection from=127.0.0.1:59934
TestForceLeaveCommand - 2019/11/27 02:24:48.004435 [INFO] serf: EventMemberJoin: Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf.dc1 127.0.0.1
TestForceLeaveCommand - 2019/11/27 02:24:48.005116 [INFO] consul: Handled member-join event for server "Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf.dc1" in area "wan"
TestForceLeaveCommand - 2019/11/27 02:24:48.007187 [DEBUG] memberlist: Stream connection from=127.0.0.1:55860
TestForceLeaveCommand - 2019/11/27 02:24:48.017062 [INFO] serf: EventMemberJoin: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d.dc1 127.0.0.1
TestForceLeaveCommand - 2019/11/27 02:24:48.007840 [INFO] agent: (LAN) joined: 1 Err: <nil>
TestForceLeaveCommand - 2019/11/27 02:24:48.020636 [DEBUG] agent: systemd notify failed: No socket
TestForceLeaveCommand - 2019/11/27 02:24:48.020687 [INFO] agent: Requesting shutdown
TestForceLeaveCommand - 2019/11/27 02:24:48.020736 [INFO] consul: shutting down server
TestForceLeaveCommand - 2019/11/27 02:24:48.020786 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/11/27 02:24:48.021241 [ERR] agent: failed to sync remote state: No cluster leader
TestForceLeaveCommand - 2019/11/27 02:24:48.021664 [DEBUG] consul: Successfully performed flood-join for "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d" at 127.0.0.1:35511
TestForceLeaveCommand - 2019/11/27 02:24:48.026477 [DEBUG] consul: Successfully performed flood-join for "Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf" at 127.0.0.1:35505
TestForceLeaveCommand - 2019/11/27 02:24:48.028346 [INFO] consul: Handled member-join event for server "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d.dc1" in area "wan"
TestForceLeaveCommand - 2019/11/27 02:24:48.259174 [DEBUG] serf: messageJoinType: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d.dc1
TestForceLeaveCommand - 2019/11/27 02:24:48.329760 [DEBUG] serf: messageJoinType: Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf.dc1
TestForceLeaveCommand - 2019/11/27 02:24:48.329898 [DEBUG] serf: messageJoinType: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d.dc1
TestForceLeaveCommand - 2019/11/27 02:24:48.507304 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/11/27 02:24:48.508039 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestForceLeaveCommand - 2019/11/27 02:24:48.508406 [ERR] consul: 'Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf' and 'Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d' are both in bootstrap mode. Only one node should be in bootstrap mode, not adding Raft peer.
TestForceLeaveCommand - 2019/11/27 02:24:48.508591 [INFO] consul: member 'Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf' joined, marking health alive
TestForceLeaveCommand - 2019/11/27 02:24:48.618507 [INFO] manager: shutting down
TestForceLeaveCommand - 2019/11/27 02:24:48.685353 [INFO] agent: consul server down
TestForceLeaveCommand - 2019/11/27 02:24:48.685435 [INFO] agent: shutdown complete
TestForceLeaveCommand - 2019/11/27 02:24:48.685490 [INFO] agent: Stopping DNS server 127.0.0.1:35507 (tcp)
TestForceLeaveCommand - 2019/11/27 02:24:48.685619 [INFO] agent: Stopping DNS server 127.0.0.1:35507 (udp)
TestForceLeaveCommand - 2019/11/27 02:24:48.685761 [INFO] agent: Stopping HTTP server 127.0.0.1:35508 (tcp)
TestForceLeaveCommand - 2019/11/27 02:24:48.685935 [INFO] agent: Waiting for endpoints to shut down
TestForceLeaveCommand - 2019/11/27 02:24:48.685995 [INFO] agent: Endpoints down
TestForceLeaveCommand - 2019/11/27 02:24:48.689334 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestForceLeaveCommand - 2019/11/27 02:24:48.693823 [INFO] agent: Force leaving node: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d
TestForceLeaveCommand - 2019/11/27 02:24:48.743336 [ERR] consul: 'Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d' and 'Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf' are both in bootstrap mode. Only one node should be in bootstrap mode, not adding Raft peer.
TestForceLeaveCommand - 2019/11/27 02:24:48.743471 [INFO] consul: member 'Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d' joined, marking health alive
TestForceLeaveCommand - 2019/11/27 02:24:48.832408 [DEBUG] memberlist: Failed ping: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d (timeout reached)
TestForceLeaveCommand - 2019/11/27 02:24:48.876506 [ERR] consul: 'Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d' and 'Node 0beaa9ba-6890-1740-399b-bd0b2b7a45cf' are both in bootstrap mode. Only one node should be in bootstrap mode, not adding Raft peer.
TestForceLeaveCommand - 2019/11/27 02:24:48.878437 [WARN] consul: error getting server health from "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:35512: connect: connection refused
TestForceLeaveCommand - 2019/11/27 02:24:49.319636 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestForceLeaveCommand - 2019/11/27 02:24:49.319729 [DEBUG] agent: Node info in sync
TestForceLeaveCommand - 2019/11/27 02:24:49.332099 [INFO] memberlist: Suspect Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d has failed, no acks received
TestForceLeaveCommand - 2019/11/27 02:24:49.877105 [WARN] consul: error getting server health from "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d": context deadline exceeded
TestForceLeaveCommand - 2019/11/27 02:24:50.324652 [DEBUG] http: Request PUT /v1/agent/force-leave/Node%204cfc4c93-13a8-c29a-6e38-93d4d566896d (1.630787059s) from=127.0.0.1:42780
TestForceLeaveCommand - 2019/11/27 02:24:50.626960 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestForceLeaveCommand - 2019/11/27 02:24:50.627031 [DEBUG] agent: Node info in sync
TestForceLeaveCommand - 2019/11/27 02:24:50.755968 [WARN] consul: error getting server health from "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:35512: connect: connection refused
TestForceLeaveCommand - 2019/11/27 02:24:50.828481 [DEBUG] memberlist: Failed ping: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d (timeout reached)
TestForceLeaveCommand - 2019/11/27 02:24:51.754378 [WARN] consul: error getting server health from "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d": context deadline exceeded
TestForceLeaveCommand - 2019/11/27 02:24:52.327997 [INFO] memberlist: Suspect Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d has failed, no acks received
TestForceLeaveCommand - 2019/11/27 02:24:52.755064 [WARN] consul: error getting server health from "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d": rpc error getting client: failed to get conn: dial tcp 127.0.0.1:0->127.0.0.1:35512: connect: connection refused
TestForceLeaveCommand - 2019/11/27 02:24:53.325040 [DEBUG] memberlist: Failed ping: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d.dc1 (timeout reached)
TestForceLeaveCommand - 2019/11/27 02:24:53.332836 [INFO] memberlist: Marking Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d as failed, suspect timeout reached (0 peer confirmations)
TestForceLeaveCommand - 2019/11/27 02:24:53.333318 [INFO] serf: EventMemberLeave: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d 127.0.0.1
TestForceLeaveCommand - 2019/11/27 02:24:53.333650 [INFO] consul: Removing LAN server Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d (Addr: tcp/127.0.0.1:35512) (DC: dc1)
TestForceLeaveCommand - 2019/11/27 02:24:53.334114 [INFO] consul: member 'Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d' left, deregistering
TestForceLeaveCommand - 2019/11/27 02:24:53.339893 [INFO] agent: Requesting shutdown
TestForceLeaveCommand - 2019/11/27 02:24:53.340028 [INFO] consul: shutting down server
TestForceLeaveCommand - 2019/11/27 02:24:53.340092 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/11/27 02:24:53.451475 [WARN] serf: Shutdown without a Leave
TestForceLeaveCommand - 2019/11/27 02:24:53.607191 [INFO] manager: shutting down
TestForceLeaveCommand - 2019/11/27 02:24:53.754409 [WARN] consul: error getting server health from "Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d": context deadline exceeded
2019/11/27 02:24:53 [WARN]  raft: could not get configuration for Stats: raft is already shutdown
TestForceLeaveCommand - 2019/11/27 02:24:53.829264 [DEBUG] memberlist: Failed ping: Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d (timeout reached)
TestForceLeaveCommand - 2019/11/27 02:24:53.864478 [ERR] consul: failed to reconcile member: {Node 4cfc4c93-13a8-c29a-6e38-93d4d566896d 127.0.0.1 35510 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:4cfc4c93-13a8-c29a-6e38-93d4d566896d port:35512 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:35511] left 1 5 2 2 5 4}: leadership lost while committing log
TestForceLeaveCommand - 2019/11/27 02:24:53.866808 [INFO] agent: consul server down
TestForceLeaveCommand - 2019/11/27 02:24:53.866993 [INFO] agent: shutdown complete
TestForceLeaveCommand - 2019/11/27 02:24:53.867926 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (tcp)
TestForceLeaveCommand - 2019/11/27 02:24:53.868533 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (udp)
TestForceLeaveCommand - 2019/11/27 02:24:53.869045 [INFO] agent: Stopping HTTP server 127.0.0.1:35502 (tcp)
TestForceLeaveCommand - 2019/11/27 02:24:53.870426 [INFO] agent: Waiting for endpoints to shut down
TestForceLeaveCommand - 2019/11/27 02:24:53.870511 [INFO] agent: Endpoints down
--- PASS: TestForceLeaveCommand (9.66s)
PASS
ok  	github.com/hashicorp/consul/command/forceleave	9.861s
?   	github.com/hashicorp/consul/command/helpers	[no test files]
=== RUN   TestInfoCommand_noTabs
=== PAUSE TestInfoCommand_noTabs
=== RUN   TestInfoCommand
=== PAUSE TestInfoCommand
=== CONT  TestInfoCommand_noTabs
=== CONT  TestInfoCommand
--- PASS: TestInfoCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestInfoCommand - 2019/11/27 02:24:49.059184 [WARN] agent: Node name "Node b68a32f0-5864-8258-2f18-6fc2a4594eed" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestInfoCommand - 2019/11/27 02:24:49.060126 [DEBUG] tlsutil: Update with version 1
TestInfoCommand - 2019/11/27 02:24:49.060207 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestInfoCommand - 2019/11/27 02:24:49.060519 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestInfoCommand - 2019/11/27 02:24:49.060647 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:24:49 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b68a32f0-5864-8258-2f18-6fc2a4594eed Address:127.0.0.1:14506}]
2019/11/27 02:24:49 [INFO]  raft: Node at 127.0.0.1:14506 [Follower] entering Follower state (Leader: "")
TestInfoCommand - 2019/11/27 02:24:49.738379 [INFO] serf: EventMemberJoin: Node b68a32f0-5864-8258-2f18-6fc2a4594eed.dc1 127.0.0.1
TestInfoCommand - 2019/11/27 02:24:49.749662 [INFO] serf: EventMemberJoin: Node b68a32f0-5864-8258-2f18-6fc2a4594eed 127.0.0.1
TestInfoCommand - 2019/11/27 02:24:49.750940 [INFO] consul: Adding LAN server Node b68a32f0-5864-8258-2f18-6fc2a4594eed (Addr: tcp/127.0.0.1:14506) (DC: dc1)
TestInfoCommand - 2019/11/27 02:24:49.751408 [INFO] consul: Handled member-join event for server "Node b68a32f0-5864-8258-2f18-6fc2a4594eed.dc1" in area "wan"
TestInfoCommand - 2019/11/27 02:24:49.753396 [INFO] agent: Started DNS server 127.0.0.1:14501 (udp)
TestInfoCommand - 2019/11/27 02:24:49.753810 [INFO] agent: Started DNS server 127.0.0.1:14501 (tcp)
TestInfoCommand - 2019/11/27 02:24:49.755833 [INFO] agent: Started HTTP server on 127.0.0.1:14502 (tcp)
TestInfoCommand - 2019/11/27 02:24:49.755981 [INFO] agent: started state syncer
2019/11/27 02:24:49 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:49 [INFO]  raft: Node at 127.0.0.1:14506 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:50 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:50 [INFO]  raft: Node at 127.0.0.1:14506 [Leader] entering Leader state
TestInfoCommand - 2019/11/27 02:24:50.185556 [INFO] consul: cluster leadership acquired
TestInfoCommand - 2019/11/27 02:24:50.185993 [INFO] consul: New leader elected: Node b68a32f0-5864-8258-2f18-6fc2a4594eed
TestInfoCommand - 2019/11/27 02:24:50.408273 [INFO] agent: Synced node info
TestInfoCommand - 2019/11/27 02:24:50.408385 [DEBUG] agent: Node info in sync
TestInfoCommand - 2019/11/27 02:24:50.523566 [DEBUG] http: Request GET /v1/agent/self (50.933511ms) from=127.0.0.1:49202
TestInfoCommand - 2019/11/27 02:24:50.545577 [INFO] agent: Requesting shutdown
TestInfoCommand - 2019/11/27 02:24:50.545820 [INFO] consul: shutting down server
TestInfoCommand - 2019/11/27 02:24:50.546067 [WARN] serf: Shutdown without a Leave
TestInfoCommand - 2019/11/27 02:24:50.684957 [WARN] serf: Shutdown without a Leave
TestInfoCommand - 2019/11/27 02:24:50.729533 [INFO] manager: shutting down
TestInfoCommand - 2019/11/27 02:24:50.929520 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestInfoCommand - 2019/11/27 02:24:50.929706 [INFO] agent: consul server down
TestInfoCommand - 2019/11/27 02:24:50.929759 [INFO] agent: shutdown complete
TestInfoCommand - 2019/11/27 02:24:50.929815 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (tcp)
TestInfoCommand - 2019/11/27 02:24:50.929948 [INFO] agent: Stopping DNS server 127.0.0.1:14501 (udp)
TestInfoCommand - 2019/11/27 02:24:50.930122 [INFO] agent: Stopping HTTP server 127.0.0.1:14502 (tcp)
TestInfoCommand - 2019/11/27 02:24:50.930920 [INFO] agent: Waiting for endpoints to shut down
TestInfoCommand - 2019/11/27 02:24:50.931002 [INFO] agent: Endpoints down
--- PASS: TestInfoCommand (2.01s)
PASS
ok  	github.com/hashicorp/consul/command/info	2.247s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== CONT  TestCommand_noTabs
--- PASS: TestCommand_noTabs (0.01s)
PASS
ok  	github.com/hashicorp/consul/command/intention	0.171s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand
=== PAUSE TestCommand
=== CONT  TestCommand_noTabs
=== CONT  TestCommand
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/1_args
=== RUN   TestCommand_Validation/3_args
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/1_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
--- PASS: TestCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand - 2019/11/27 02:24:53.752368 [WARN] agent: Node name "Node 462fe41f-8314-a02a-f88f-bcfa4b6c59ee" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand - 2019/11/27 02:24:53.753370 [DEBUG] tlsutil: Update with version 1
TestCommand - 2019/11/27 02:24:53.753536 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand - 2019/11/27 02:24:53.753969 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand - 2019/11/27 02:24:53.754159 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:24:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:462fe41f-8314-a02a-f88f-bcfa4b6c59ee Address:127.0.0.1:43006}]
2019/11/27 02:24:54 [INFO]  raft: Node at 127.0.0.1:43006 [Follower] entering Follower state (Leader: "")
TestCommand - 2019/11/27 02:24:54.784224 [INFO] serf: EventMemberJoin: Node 462fe41f-8314-a02a-f88f-bcfa4b6c59ee.dc1 127.0.0.1
TestCommand - 2019/11/27 02:24:54.793500 [INFO] serf: EventMemberJoin: Node 462fe41f-8314-a02a-f88f-bcfa4b6c59ee 127.0.0.1
TestCommand - 2019/11/27 02:24:54.796018 [INFO] consul: Handled member-join event for server "Node 462fe41f-8314-a02a-f88f-bcfa4b6c59ee.dc1" in area "wan"
TestCommand - 2019/11/27 02:24:54.796899 [INFO] consul: Adding LAN server Node 462fe41f-8314-a02a-f88f-bcfa4b6c59ee (Addr: tcp/127.0.0.1:43006) (DC: dc1)
TestCommand - 2019/11/27 02:24:54.797407 [INFO] agent: Started DNS server 127.0.0.1:43001 (tcp)
TestCommand - 2019/11/27 02:24:54.802408 [INFO] agent: Started DNS server 127.0.0.1:43001 (udp)
TestCommand - 2019/11/27 02:24:54.804685 [INFO] agent: Started HTTP server on 127.0.0.1:43002 (tcp)
TestCommand - 2019/11/27 02:24:54.804840 [INFO] agent: started state syncer
2019/11/27 02:24:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:24:54 [INFO]  raft: Node at 127.0.0.1:43006 [Candidate] entering Candidate state in term 2
2019/11/27 02:24:55 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:24:55 [INFO]  raft: Node at 127.0.0.1:43006 [Leader] entering Leader state
TestCommand - 2019/11/27 02:24:55.245161 [INFO] consul: cluster leadership acquired
TestCommand - 2019/11/27 02:24:55.245848 [INFO] consul: New leader elected: Node 462fe41f-8314-a02a-f88f-bcfa4b6c59ee
TestCommand - 2019/11/27 02:24:56.219129 [INFO] agent: Synced node info
TestCommand - 2019/11/27 02:24:56.223224 [DEBUG] http: Request POST /v1/connect/intentions (577.245885ms) from=127.0.0.1:37624
TestCommand - 2019/11/27 02:24:56.239861 [DEBUG] http: Request GET /v1/connect/intentions/check?destination=db&source=foo&source-type=consul (6.904917ms) from=127.0.0.1:37626
TestCommand - 2019/11/27 02:24:56.254615 [DEBUG] http: Request GET /v1/connect/intentions/check?destination=db&source=web&source-type=consul (1.161708ms) from=127.0.0.1:37628
TestCommand - 2019/11/27 02:24:56.256194 [INFO] agent: Requesting shutdown
TestCommand - 2019/11/27 02:24:56.256292 [INFO] consul: shutting down server
TestCommand - 2019/11/27 02:24:56.256344 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/11/27 02:24:56.362492 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/11/27 02:24:56.451426 [INFO] manager: shutting down
TestCommand - 2019/11/27 02:24:56.640305 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand - 2019/11/27 02:24:56.640618 [INFO] agent: consul server down
TestCommand - 2019/11/27 02:24:56.640682 [INFO] agent: shutdown complete
TestCommand - 2019/11/27 02:24:56.640743 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (tcp)
TestCommand - 2019/11/27 02:24:56.640982 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (udp)
TestCommand - 2019/11/27 02:24:56.641178 [INFO] agent: Stopping HTTP server 127.0.0.1:43002 (tcp)
TestCommand - 2019/11/27 02:24:56.642477 [INFO] agent: Waiting for endpoints to shut down
TestCommand - 2019/11/27 02:24:56.642569 [INFO] agent: Endpoints down
--- PASS: TestCommand (2.96s)
PASS
ok  	github.com/hashicorp/consul/command/intention/check	3.128s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand
=== PAUSE TestCommand
=== RUN   TestCommand_deny
=== PAUSE TestCommand_deny
=== RUN   TestCommand_meta
=== PAUSE TestCommand_meta
=== RUN   TestCommand_File
=== PAUSE TestCommand_File
=== RUN   TestCommand_FileNoExist
=== PAUSE TestCommand_FileNoExist
=== RUN   TestCommand_replace
=== PAUSE TestCommand_replace
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_File
=== CONT  TestCommand_replace
=== CONT  TestCommand_deny
--- PASS: TestCommand_noTabs (0.01s)
=== CONT  TestCommand_meta
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File - 2019/11/27 02:25:02.187667 [WARN] agent: Node name "Node 19ae22a4-f58c-1697-e840-f3488fb2ad4e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File - 2019/11/27 02:25:02.188728 [DEBUG] tlsutil: Update with version 1
TestCommand_File - 2019/11/27 02:25:02.188819 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File - 2019/11/27 02:25:02.189077 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_File - 2019/11/27 02:25:02.189217 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_meta - 2019/11/27 02:25:02.253778 [WARN] agent: Node name "Node 44e816a2-dc80-a09c-95a1-09db67db965f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_meta - 2019/11/27 02:25:02.254156 [DEBUG] tlsutil: Update with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_meta - 2019/11/27 02:25:02.254225 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_meta - 2019/11/27 02:25:02.254375 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_meta - 2019/11/27 02:25:02.254474 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_deny - 2019/11/27 02:25:02.254604 [WARN] agent: Node name "Node 99053f68-66ba-4ed2-717f-5793e17d7b7c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_deny - 2019/11/27 02:25:02.254955 [DEBUG] tlsutil: Update with version 1
TestCommand_deny - 2019/11/27 02:25:02.255018 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_deny - 2019/11/27 02:25:02.255213 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_deny - 2019/11/27 02:25:02.255306 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_replace - 2019/11/27 02:25:02.255611 [WARN] agent: Node name "Node e0dba059-f43d-815f-7845-0cd1c4b27fb5" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_replace - 2019/11/27 02:25:02.255992 [DEBUG] tlsutil: Update with version 1
TestCommand_replace - 2019/11/27 02:25:02.256051 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_replace - 2019/11/27 02:25:02.256227 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_replace - 2019/11/27 02:25:02.256349 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:25:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:19ae22a4-f58c-1697-e840-f3488fb2ad4e Address:127.0.0.1:40006}]
TestCommand_File - 2019/11/27 02:25:03.155864 [INFO] serf: EventMemberJoin: Node 19ae22a4-f58c-1697-e840-f3488fb2ad4e.dc1 127.0.0.1
2019/11/27 02:25:03 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
TestCommand_File - 2019/11/27 02:25:03.165441 [INFO] serf: EventMemberJoin: Node 19ae22a4-f58c-1697-e840-f3488fb2ad4e 127.0.0.1
TestCommand_File - 2019/11/27 02:25:03.168842 [INFO] consul: Handled member-join event for server "Node 19ae22a4-f58c-1697-e840-f3488fb2ad4e.dc1" in area "wan"
TestCommand_File - 2019/11/27 02:25:03.168930 [INFO] consul: Adding LAN server Node 19ae22a4-f58c-1697-e840-f3488fb2ad4e (Addr: tcp/127.0.0.1:40006) (DC: dc1)
TestCommand_File - 2019/11/27 02:25:03.169096 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestCommand_File - 2019/11/27 02:25:03.175415 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestCommand_File - 2019/11/27 02:25:03.181163 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
TestCommand_File - 2019/11/27 02:25:03.181416 [INFO] agent: started state syncer
2019/11/27 02:25:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:03 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:44e816a2-dc80-a09c-95a1-09db67db965f Address:127.0.0.1:40024}]
2019/11/27 02:25:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e0dba059-f43d-815f-7845-0cd1c4b27fb5 Address:127.0.0.1:40018}]
2019/11/27 02:25:03 [INFO]  raft: Node at 127.0.0.1:40018 [Follower] entering Follower state (Leader: "")
2019/11/27 02:25:03 [INFO]  raft: Node at 127.0.0.1:40024 [Follower] entering Follower state (Leader: "")
TestCommand_meta - 2019/11/27 02:25:03.422575 [INFO] serf: EventMemberJoin: Node 44e816a2-dc80-a09c-95a1-09db67db965f.dc1 127.0.0.1
TestCommand_replace - 2019/11/27 02:25:03.424494 [INFO] serf: EventMemberJoin: Node e0dba059-f43d-815f-7845-0cd1c4b27fb5.dc1 127.0.0.1
TestCommand_meta - 2019/11/27 02:25:03.426081 [INFO] serf: EventMemberJoin: Node 44e816a2-dc80-a09c-95a1-09db67db965f 127.0.0.1
TestCommand_meta - 2019/11/27 02:25:03.427063 [INFO] consul: Handled member-join event for server "Node 44e816a2-dc80-a09c-95a1-09db67db965f.dc1" in area "wan"
TestCommand_meta - 2019/11/27 02:25:03.427358 [INFO] consul: Adding LAN server Node 44e816a2-dc80-a09c-95a1-09db67db965f (Addr: tcp/127.0.0.1:40024) (DC: dc1)
2019/11/27 02:25:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:03 [INFO]  raft: Node at 127.0.0.1:40018 [Candidate] entering Candidate state in term 2
TestCommand_meta - 2019/11/27 02:25:03.467868 [INFO] agent: Started DNS server 127.0.0.1:40019 (udp)
TestCommand_meta - 2019/11/27 02:25:03.467964 [INFO] agent: Started DNS server 127.0.0.1:40019 (tcp)
TestCommand_replace - 2019/11/27 02:25:03.469151 [INFO] serf: EventMemberJoin: Node e0dba059-f43d-815f-7845-0cd1c4b27fb5 127.0.0.1
TestCommand_replace - 2019/11/27 02:25:03.470130 [INFO] consul: Handled member-join event for server "Node e0dba059-f43d-815f-7845-0cd1c4b27fb5.dc1" in area "wan"
TestCommand_meta - 2019/11/27 02:25:03.470210 [INFO] agent: Started HTTP server on 127.0.0.1:40020 (tcp)
TestCommand_meta - 2019/11/27 02:25:03.470386 [INFO] agent: started state syncer
TestCommand_replace - 2019/11/27 02:25:03.470480 [INFO] consul: Adding LAN server Node e0dba059-f43d-815f-7845-0cd1c4b27fb5 (Addr: tcp/127.0.0.1:40018) (DC: dc1)
TestCommand_replace - 2019/11/27 02:25:03.470621 [INFO] agent: Started DNS server 127.0.0.1:40013 (udp)
2019/11/27 02:25:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:03 [INFO]  raft: Node at 127.0.0.1:40024 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:99053f68-66ba-4ed2-717f-5793e17d7b7c Address:127.0.0.1:40012}]
2019/11/27 02:25:03 [INFO]  raft: Node at 127.0.0.1:40012 [Follower] entering Follower state (Leader: "")
TestCommand_replace - 2019/11/27 02:25:03.475434 [INFO] agent: Started DNS server 127.0.0.1:40013 (tcp)
TestCommand_deny - 2019/11/27 02:25:03.476615 [INFO] serf: EventMemberJoin: Node 99053f68-66ba-4ed2-717f-5793e17d7b7c.dc1 127.0.0.1
TestCommand_replace - 2019/11/27 02:25:03.477790 [INFO] agent: Started HTTP server on 127.0.0.1:40014 (tcp)
TestCommand_replace - 2019/11/27 02:25:03.477946 [INFO] agent: started state syncer
TestCommand_deny - 2019/11/27 02:25:03.489602 [INFO] serf: EventMemberJoin: Node 99053f68-66ba-4ed2-717f-5793e17d7b7c 127.0.0.1
TestCommand_deny - 2019/11/27 02:25:03.490931 [INFO] consul: Adding LAN server Node 99053f68-66ba-4ed2-717f-5793e17d7b7c (Addr: tcp/127.0.0.1:40012) (DC: dc1)
TestCommand_deny - 2019/11/27 02:25:03.491348 [INFO] consul: Handled member-join event for server "Node 99053f68-66ba-4ed2-717f-5793e17d7b7c.dc1" in area "wan"
TestCommand_deny - 2019/11/27 02:25:03.493829 [INFO] agent: Started DNS server 127.0.0.1:40007 (tcp)
TestCommand_deny - 2019/11/27 02:25:03.494385 [INFO] agent: Started DNS server 127.0.0.1:40007 (udp)
TestCommand_deny - 2019/11/27 02:25:03.496418 [INFO] agent: Started HTTP server on 127.0.0.1:40008 (tcp)
TestCommand_deny - 2019/11/27 02:25:03.496535 [INFO] agent: started state syncer
2019/11/27 02:25:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:03 [INFO]  raft: Node at 127.0.0.1:40012 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:03 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
TestCommand_File - 2019/11/27 02:25:03.841819 [INFO] consul: cluster leadership acquired
TestCommand_File - 2019/11/27 02:25:03.842379 [INFO] consul: New leader elected: Node 19ae22a4-f58c-1697-e840-f3488fb2ad4e
2019/11/27 02:25:04 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:04 [INFO]  raft: Node at 127.0.0.1:40024 [Leader] entering Leader state
2019/11/27 02:25:04 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:04 [INFO]  raft: Node at 127.0.0.1:40018 [Leader] entering Leader state
2019/11/27 02:25:04 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:04 [INFO]  raft: Node at 127.0.0.1:40012 [Leader] entering Leader state
TestCommand_meta - 2019/11/27 02:25:04.152510 [INFO] consul: cluster leadership acquired
TestCommand_meta - 2019/11/27 02:25:04.153115 [INFO] consul: New leader elected: Node 44e816a2-dc80-a09c-95a1-09db67db965f
TestCommand_deny - 2019/11/27 02:25:04.153422 [INFO] consul: cluster leadership acquired
TestCommand_deny - 2019/11/27 02:25:04.153792 [INFO] consul: New leader elected: Node 99053f68-66ba-4ed2-717f-5793e17d7b7c
TestCommand_replace - 2019/11/27 02:25:04.154048 [INFO] consul: cluster leadership acquired
TestCommand_replace - 2019/11/27 02:25:04.154410 [INFO] consul: New leader elected: Node e0dba059-f43d-815f-7845-0cd1c4b27fb5
TestCommand_File - 2019/11/27 02:25:04.253713 [INFO] agent: Synced node info
TestCommand_deny - 2019/11/27 02:25:04.562936 [INFO] agent: Synced node info
TestCommand_replace - 2019/11/27 02:25:04.564732 [INFO] agent: Synced node info
TestCommand_replace - 2019/11/27 02:25:04.564855 [DEBUG] agent: Node info in sync
TestCommand_replace - 2019/11/27 02:25:04.565791 [DEBUG] http: Request POST /v1/connect/intentions (369.579689ms) from=127.0.0.1:52208
TestCommand_replace - 2019/11/27 02:25:04.574284 [DEBUG] http: Request GET /v1/connect/intentions (4.459494ms) from=127.0.0.1:52216
TestCommand_File - 2019/11/27 02:25:04.665246 [DEBUG] http: Request POST /v1/connect/intentions (545.147034ms) from=127.0.0.1:49332
TestCommand_File - 2019/11/27 02:25:04.670454 [DEBUG] http: Request GET /v1/connect/intentions (1.309381ms) from=127.0.0.1:49346
TestCommand_File - 2019/11/27 02:25:04.674445 [INFO] agent: Requesting shutdown
TestCommand_File - 2019/11/27 02:25:04.674629 [INFO] consul: shutting down server
TestCommand_File - 2019/11/27 02:25:04.674721 [WARN] serf: Shutdown without a Leave
TestCommand_meta - 2019/11/27 02:25:04.754439 [INFO] agent: Synced node info
TestCommand_meta - 2019/11/27 02:25:04.754592 [DEBUG] agent: Node info in sync
TestCommand_meta - 2019/11/27 02:25:04.755485 [DEBUG] http: Request POST /v1/connect/intentions (394.034906ms) from=127.0.0.1:45722
TestCommand_meta - 2019/11/27 02:25:04.760065 [DEBUG] http: Request GET /v1/connect/intentions (1.148375ms) from=127.0.0.1:45734
TestCommand_meta - 2019/11/27 02:25:04.762626 [INFO] agent: Requesting shutdown
TestCommand_meta - 2019/11/27 02:25:04.762730 [INFO] consul: shutting down server
TestCommand_meta - 2019/11/27 02:25:04.762782 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/11/27 02:25:04.874898 [WARN] serf: Shutdown without a Leave
TestCommand_meta - 2019/11/27 02:25:04.930310 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/11/27 02:25:04.931926 [INFO] manager: shutting down
TestCommand_File - 2019/11/27 02:25:05.004462 [ERR] agent: failed to sync remote state: No cluster leader
TestCommand_meta - 2019/11/27 02:25:05.018959 [INFO] manager: shutting down
TestCommand_File - 2019/11/27 02:25:05.019167 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_meta - 2019/11/27 02:25:05.019587 [INFO] agent: consul server down
TestCommand_meta - 2019/11/27 02:25:05.019641 [INFO] agent: shutdown complete
TestCommand_meta - 2019/11/27 02:25:05.019693 [INFO] agent: Stopping DNS server 127.0.0.1:40019 (tcp)
TestCommand_meta - 2019/11/27 02:25:05.019606 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCommand_meta - 2019/11/27 02:25:05.019821 [INFO] agent: Stopping DNS server 127.0.0.1:40019 (udp)
TestCommand_meta - 2019/11/27 02:25:05.019960 [INFO] agent: Stopping HTTP server 127.0.0.1:40020 (tcp)
TestCommand_meta - 2019/11/27 02:25:05.020640 [INFO] agent: Waiting for endpoints to shut down
TestCommand_deny - 2019/11/27 02:25:05.020701 [DEBUG] http: Request POST /v1/connect/intentions (516.899013ms) from=127.0.0.1:50270
TestCommand_meta - 2019/11/27 02:25:05.020757 [INFO] agent: Endpoints down
--- PASS: TestCommand_meta (3.07s)
=== CONT  TestCommand_FileNoExist
TestCommand_File - 2019/11/27 02:25:05.021315 [INFO] agent: consul server down
TestCommand_File - 2019/11/27 02:25:05.021386 [INFO] agent: shutdown complete
TestCommand_File - 2019/11/27 02:25:05.021446 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestCommand_File - 2019/11/27 02:25:05.021631 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestCommand_File - 2019/11/27 02:25:05.021886 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestCommand_File - 2019/11/27 02:25:05.024849 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File - 2019/11/27 02:25:05.025131 [INFO] agent: Endpoints down
--- PASS: TestCommand_File (3.08s)
=== CONT  TestCommand
TestCommand_deny - 2019/11/27 02:25:05.045727 [DEBUG] http: Request GET /v1/connect/intentions (1.375717ms) from=127.0.0.1:50282
TestCommand_deny - 2019/11/27 02:25:05.048126 [INFO] agent: Requesting shutdown
TestCommand_deny - 2019/11/27 02:25:05.048244 [INFO] consul: shutting down server
TestCommand_deny - 2019/11/27 02:25:05.048298 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestCommand - 2019/11/27 02:25:05.179529 [WARN] agent: Node name "Node 142b862d-f73a-d958-b0a1-ddfb4d62ae41" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand - 2019/11/27 02:25:05.180032 [DEBUG] tlsutil: Update with version 1
TestCommand - 2019/11/27 02:25:05.180110 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand - 2019/11/27 02:25:05.180382 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand - 2019/11/27 02:25:05.180547 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_deny - 2019/11/27 02:25:05.184616 [WARN] serf: Shutdown without a Leave
TestCommand_replace - 2019/11/27 02:25:05.204259 [ERR] http: Request POST /v1/connect/intentions, error: duplicate intention found: ALLOW default/foo => default/bar (ID: b72e69c1-095e-7597-64d8-160cec2d2200, Precedence: 9) from=127.0.0.1:52218
TestCommand_replace - 2019/11/27 02:25:05.205012 [DEBUG] http: Request POST /v1/connect/intentions (603.739483ms) from=127.0.0.1:52218
TestCommand_replace - 2019/11/27 02:25:05.216053 [DEBUG] http: Request GET /v1/connect/intentions (1.331048ms) from=127.0.0.1:52226
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_FileNoExist - 2019/11/27 02:25:05.224115 [WARN] agent: Node name "Node a22850f2-b8d0-7f7b-df1d-b8bf3a7a9584" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_FileNoExist - 2019/11/27 02:25:05.224730 [DEBUG] tlsutil: Update with version 1
TestCommand_FileNoExist - 2019/11/27 02:25:05.224952 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_FileNoExist - 2019/11/27 02:25:05.225252 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_FileNoExist - 2019/11/27 02:25:05.225490 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_deny - 2019/11/27 02:25:05.350908 [INFO] manager: shutting down
TestCommand_deny - 2019/11/27 02:25:05.428864 [INFO] agent: consul server down
TestCommand_deny - 2019/11/27 02:25:05.428948 [INFO] agent: shutdown complete
TestCommand_deny - 2019/11/27 02:25:05.429010 [INFO] agent: Stopping DNS server 127.0.0.1:40007 (tcp)
TestCommand_deny - 2019/11/27 02:25:05.429223 [INFO] agent: Stopping DNS server 127.0.0.1:40007 (udp)
TestCommand_deny - 2019/11/27 02:25:05.429442 [INFO] agent: Stopping HTTP server 127.0.0.1:40008 (tcp)
TestCommand_deny - 2019/11/27 02:25:05.430250 [INFO] agent: Waiting for endpoints to shut down
TestCommand_deny - 2019/11/27 02:25:05.430380 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestCommand_deny - 2019/11/27 02:25:05.430565 [INFO] agent: Endpoints down
--- PASS: TestCommand_deny (3.49s)
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/-allow_and_-deny
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/-allow_and_-deny (0.00s)
TestCommand_replace - 2019/11/27 02:25:05.609202 [DEBUG] http: Request PUT /v1/connect/intentions/b72e69c1-095e-7597-64d8-160cec2d2200 (388.478704ms) from=127.0.0.1:52226
TestCommand_replace - 2019/11/27 02:25:05.612844 [DEBUG] http: Request GET /v1/connect/intentions (1.334381ms) from=127.0.0.1:52216
TestCommand_replace - 2019/11/27 02:25:05.615048 [INFO] agent: Requesting shutdown
TestCommand_replace - 2019/11/27 02:25:05.615156 [INFO] consul: shutting down server
TestCommand_replace - 2019/11/27 02:25:05.615221 [WARN] serf: Shutdown without a Leave
TestCommand_replace - 2019/11/27 02:25:05.767244 [WARN] serf: Shutdown without a Leave
TestCommand_replace - 2019/11/27 02:25:05.857323 [INFO] manager: shutting down
TestCommand_replace - 2019/11/27 02:25:05.984411 [INFO] agent: consul server down
TestCommand_replace - 2019/11/27 02:25:05.984490 [INFO] agent: shutdown complete
TestCommand_replace - 2019/11/27 02:25:05.984560 [INFO] agent: Stopping DNS server 127.0.0.1:40013 (tcp)
TestCommand_replace - 2019/11/27 02:25:05.984813 [INFO] agent: Stopping DNS server 127.0.0.1:40013 (udp)
TestCommand_replace - 2019/11/27 02:25:05.985069 [INFO] agent: Stopping HTTP server 127.0.0.1:40014 (tcp)
TestCommand_replace - 2019/11/27 02:25:05.985453 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestCommand_replace - 2019/11/27 02:25:05.986241 [INFO] agent: Waiting for endpoints to shut down
TestCommand_replace - 2019/11/27 02:25:05.986340 [INFO] agent: Endpoints down
--- PASS: TestCommand_replace (4.04s)
2019/11/27 02:25:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a22850f2-b8d0-7f7b-df1d-b8bf3a7a9584 Address:127.0.0.1:40030}]
2019/11/27 02:25:06 [INFO]  raft: Node at 127.0.0.1:40030 [Follower] entering Follower state (Leader: "")
2019/11/27 02:25:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:142b862d-f73a-d958-b0a1-ddfb4d62ae41 Address:127.0.0.1:40036}]
TestCommand_FileNoExist - 2019/11/27 02:25:06.232905 [INFO] serf: EventMemberJoin: Node a22850f2-b8d0-7f7b-df1d-b8bf3a7a9584.dc1 127.0.0.1
TestCommand - 2019/11/27 02:25:06.233555 [INFO] serf: EventMemberJoin: Node 142b862d-f73a-d958-b0a1-ddfb4d62ae41.dc1 127.0.0.1
TestCommand_FileNoExist - 2019/11/27 02:25:06.235926 [INFO] serf: EventMemberJoin: Node a22850f2-b8d0-7f7b-df1d-b8bf3a7a9584 127.0.0.1
2019/11/27 02:25:06 [INFO]  raft: Node at 127.0.0.1:40036 [Follower] entering Follower state (Leader: "")
TestCommand_FileNoExist - 2019/11/27 02:25:06.237983 [INFO] consul: Handled member-join event for server "Node a22850f2-b8d0-7f7b-df1d-b8bf3a7a9584.dc1" in area "wan"
TestCommand_FileNoExist - 2019/11/27 02:25:06.238455 [INFO] agent: Started DNS server 127.0.0.1:40025 (tcp)
TestCommand_FileNoExist - 2019/11/27 02:25:06.238523 [INFO] consul: Adding LAN server Node a22850f2-b8d0-7f7b-df1d-b8bf3a7a9584 (Addr: tcp/127.0.0.1:40030) (DC: dc1)
TestCommand - 2019/11/27 02:25:06.239055 [INFO] serf: EventMemberJoin: Node 142b862d-f73a-d958-b0a1-ddfb4d62ae41 127.0.0.1
TestCommand - 2019/11/27 02:25:06.240305 [INFO] consul: Adding LAN server Node 142b862d-f73a-d958-b0a1-ddfb4d62ae41 (Addr: tcp/127.0.0.1:40036) (DC: dc1)
TestCommand - 2019/11/27 02:25:06.240512 [INFO] consul: Handled member-join event for server "Node 142b862d-f73a-d958-b0a1-ddfb4d62ae41.dc1" in area "wan"
TestCommand - 2019/11/27 02:25:06.241240 [INFO] agent: Started DNS server 127.0.0.1:40031 (udp)
TestCommand - 2019/11/27 02:25:06.241346 [INFO] agent: Started DNS server 127.0.0.1:40031 (tcp)
TestCommand_FileNoExist - 2019/11/27 02:25:06.245108 [INFO] agent: Started DNS server 127.0.0.1:40025 (udp)
TestCommand_FileNoExist - 2019/11/27 02:25:06.247987 [INFO] agent: Started HTTP server on 127.0.0.1:40026 (tcp)
TestCommand_FileNoExist - 2019/11/27 02:25:06.248102 [INFO] agent: started state syncer
TestCommand - 2019/11/27 02:25:06.250807 [INFO] agent: Started HTTP server on 127.0.0.1:40032 (tcp)
TestCommand - 2019/11/27 02:25:06.250913 [INFO] agent: started state syncer
2019/11/27 02:25:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:06 [INFO]  raft: Node at 127.0.0.1:40030 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:06 [INFO]  raft: Node at 127.0.0.1:40036 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:07 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:07 [INFO]  raft: Node at 127.0.0.1:40030 [Leader] entering Leader state
TestCommand_FileNoExist - 2019/11/27 02:25:07.051253 [INFO] consul: cluster leadership acquired
TestCommand_FileNoExist - 2019/11/27 02:25:07.052043 [INFO] consul: New leader elected: Node a22850f2-b8d0-7f7b-df1d-b8bf3a7a9584
TestCommand_FileNoExist - 2019/11/27 02:25:07.092052 [INFO] agent: Requesting shutdown
TestCommand_FileNoExist - 2019/11/27 02:25:07.092165 [INFO] consul: shutting down server
TestCommand_FileNoExist - 2019/11/27 02:25:07.092222 [WARN] serf: Shutdown without a Leave
TestCommand_FileNoExist - 2019/11/27 02:25:07.092314 [ERR] agent: failed to sync remote state: No cluster leader
2019/11/27 02:25:07 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:07 [INFO]  raft: Node at 127.0.0.1:40036 [Leader] entering Leader state
TestCommand - 2019/11/27 02:25:07.117931 [INFO] consul: cluster leadership acquired
TestCommand - 2019/11/27 02:25:07.118434 [INFO] consul: New leader elected: Node 142b862d-f73a-d958-b0a1-ddfb4d62ae41
TestCommand_FileNoExist - 2019/11/27 02:25:07.195188 [WARN] serf: Shutdown without a Leave
TestCommand_FileNoExist - 2019/11/27 02:25:07.273103 [INFO] manager: shutting down
TestCommand_FileNoExist - 2019/11/27 02:25:07.417355 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestCommand_FileNoExist - 2019/11/27 02:25:07.417715 [INFO] agent: consul server down
TestCommand_FileNoExist - 2019/11/27 02:25:07.417774 [INFO] agent: shutdown complete
TestCommand_FileNoExist - 2019/11/27 02:25:07.417828 [INFO] agent: Stopping DNS server 127.0.0.1:40025 (tcp)
TestCommand_FileNoExist - 2019/11/27 02:25:07.417996 [INFO] agent: Stopping DNS server 127.0.0.1:40025 (udp)
TestCommand_FileNoExist - 2019/11/27 02:25:07.418191 [INFO] agent: Stopping HTTP server 127.0.0.1:40026 (tcp)
TestCommand_FileNoExist - 2019/11/27 02:25:07.418445 [INFO] agent: Waiting for endpoints to shut down
TestCommand_FileNoExist - 2019/11/27 02:25:07.418524 [INFO] agent: Endpoints down
--- PASS: TestCommand_FileNoExist (2.40s)
TestCommand - 2019/11/27 02:25:07.419211 [INFO] agent: Synced node info
TestCommand - 2019/11/27 02:25:07.419629 [DEBUG] http: Request POST /v1/connect/intentions (280.595135ms) from=127.0.0.1:42460
TestCommand - 2019/11/27 02:25:07.424479 [DEBUG] http: Request GET /v1/connect/intentions (1.39305ms) from=127.0.0.1:42462
TestCommand - 2019/11/27 02:25:07.426978 [INFO] agent: Requesting shutdown
TestCommand - 2019/11/27 02:25:07.427092 [INFO] consul: shutting down server
TestCommand - 2019/11/27 02:25:07.427147 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/11/27 02:25:07.584557 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/11/27 02:25:07.639794 [INFO] manager: shutting down
TestCommand - 2019/11/27 02:25:07.685292 [ERR] agent: failed to sync remote state: No cluster leader
TestCommand - 2019/11/27 02:25:07.839588 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand - 2019/11/27 02:25:07.839844 [INFO] agent: consul server down
TestCommand - 2019/11/27 02:25:07.839894 [INFO] agent: shutdown complete
TestCommand - 2019/11/27 02:25:07.839946 [INFO] agent: Stopping DNS server 127.0.0.1:40031 (tcp)
TestCommand - 2019/11/27 02:25:07.840076 [INFO] agent: Stopping DNS server 127.0.0.1:40031 (udp)
TestCommand - 2019/11/27 02:25:07.840216 [INFO] agent: Stopping HTTP server 127.0.0.1:40032 (tcp)
TestCommand - 2019/11/27 02:25:07.840827 [INFO] agent: Waiting for endpoints to shut down
TestCommand - 2019/11/27 02:25:07.841002 [INFO] agent: Endpoints down
--- PASS: TestCommand (2.82s)
PASS
ok  	github.com/hashicorp/consul/command/intention/create	6.229s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand
=== PAUSE TestCommand
=== CONT  TestCommand_noTabs
=== CONT  TestCommand
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/3_args
=== RUN   TestCommand_Validation/0_args
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
--- PASS: TestCommand_noTabs (0.03s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand - 2019/11/27 02:25:30.403014 [WARN] agent: Node name "Node a4b22b20-d828-a31e-9078-053ba0f14f4d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand - 2019/11/27 02:25:30.404422 [DEBUG] tlsutil: Update with version 1
TestCommand - 2019/11/27 02:25:30.404497 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand - 2019/11/27 02:25:30.404732 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand - 2019/11/27 02:25:30.404937 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:25:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:a4b22b20-d828-a31e-9078-053ba0f14f4d Address:127.0.0.1:26506}]
2019/11/27 02:25:31 [INFO]  raft: Node at 127.0.0.1:26506 [Follower] entering Follower state (Leader: "")
TestCommand - 2019/11/27 02:25:31.344211 [INFO] serf: EventMemberJoin: Node a4b22b20-d828-a31e-9078-053ba0f14f4d.dc1 127.0.0.1
TestCommand - 2019/11/27 02:25:31.348907 [INFO] serf: EventMemberJoin: Node a4b22b20-d828-a31e-9078-053ba0f14f4d 127.0.0.1
TestCommand - 2019/11/27 02:25:31.349980 [INFO] consul: Adding LAN server Node a4b22b20-d828-a31e-9078-053ba0f14f4d (Addr: tcp/127.0.0.1:26506) (DC: dc1)
TestCommand - 2019/11/27 02:25:31.354038 [INFO] agent: Started DNS server 127.0.0.1:26501 (udp)
TestCommand - 2019/11/27 02:25:31.354382 [INFO] consul: Handled member-join event for server "Node a4b22b20-d828-a31e-9078-053ba0f14f4d.dc1" in area "wan"
TestCommand - 2019/11/27 02:25:31.354918 [INFO] agent: Started DNS server 127.0.0.1:26501 (tcp)
TestCommand - 2019/11/27 02:25:31.357428 [INFO] agent: Started HTTP server on 127.0.0.1:26502 (tcp)
TestCommand - 2019/11/27 02:25:31.358052 [INFO] agent: started state syncer
2019/11/27 02:25:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:31 [INFO]  raft: Node at 127.0.0.1:26506 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:32 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:32 [INFO]  raft: Node at 127.0.0.1:26506 [Leader] entering Leader state
TestCommand - 2019/11/27 02:25:32.371785 [INFO] consul: cluster leadership acquired
TestCommand - 2019/11/27 02:25:32.372411 [INFO] consul: New leader elected: Node a4b22b20-d828-a31e-9078-053ba0f14f4d
TestCommand - 2019/11/27 02:25:32.839589 [INFO] agent: Synced node info
TestCommand - 2019/11/27 02:25:33.075290 [DEBUG] http: Request POST /v1/connect/intentions (565.863373ms) from=127.0.0.1:49208
TestCommand - 2019/11/27 02:25:33.100638 [DEBUG] http: Request GET /v1/connect/intentions (8.70398ms) from=127.0.0.1:49210
TestCommand - 2019/11/27 02:25:33.451539 [DEBUG] http: Request DELETE /v1/connect/intentions/11b26c36-f9f5-d017-46d7-c637636961fb (346.161128ms) from=127.0.0.1:49210
TestCommand - 2019/11/27 02:25:33.454962 [DEBUG] http: Request GET /v1/connect/intentions (1.309047ms) from=127.0.0.1:49208
TestCommand - 2019/11/27 02:25:33.456259 [INFO] agent: Requesting shutdown
TestCommand - 2019/11/27 02:25:33.456351 [INFO] consul: shutting down server
TestCommand - 2019/11/27 02:25:33.456399 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/11/27 02:25:33.537996 [WARN] serf: Shutdown without a Leave
TestCommand - 2019/11/27 02:25:33.600851 [INFO] manager: shutting down
TestCommand - 2019/11/27 02:25:33.649177 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand - 2019/11/27 02:25:33.649796 [INFO] agent: consul server down
TestCommand - 2019/11/27 02:25:33.649960 [INFO] agent: shutdown complete
TestCommand - 2019/11/27 02:25:33.650056 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (tcp)
TestCommand - 2019/11/27 02:25:33.650404 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (udp)
TestCommand - 2019/11/27 02:25:33.650642 [INFO] agent: Stopping HTTP server 127.0.0.1:26502 (tcp)
TestCommand - 2019/11/27 02:25:33.651549 [INFO] agent: Waiting for endpoints to shut down
TestCommand - 2019/11/27 02:25:33.651603 [INFO] agent: Endpoints down
--- PASS: TestCommand (3.36s)
PASS
ok  	github.com/hashicorp/consul/command/intention/delete	3.513s
=== RUN   TestFinder
=== PAUSE TestFinder
=== CONT  TestFinder
WARNING: bootstrap = true: do not enable unless necessary
TestFinder - 2019/11/27 02:25:34.152050 [WARN] agent: Node name "Node fd1d0eaa-4171-a5b1-6967-f8d476e3a4dc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestFinder - 2019/11/27 02:25:34.152748 [DEBUG] tlsutil: Update with version 1
TestFinder - 2019/11/27 02:25:34.153038 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestFinder - 2019/11/27 02:25:34.153270 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestFinder - 2019/11/27 02:25:34.153435 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:25:35 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:fd1d0eaa-4171-a5b1-6967-f8d476e3a4dc Address:127.0.0.1:17506}]
2019/11/27 02:25:35 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
TestFinder - 2019/11/27 02:25:35.020217 [INFO] serf: EventMemberJoin: Node fd1d0eaa-4171-a5b1-6967-f8d476e3a4dc.dc1 127.0.0.1
TestFinder - 2019/11/27 02:25:35.024185 [INFO] serf: EventMemberJoin: Node fd1d0eaa-4171-a5b1-6967-f8d476e3a4dc 127.0.0.1
TestFinder - 2019/11/27 02:25:35.025372 [INFO] consul: Adding LAN server Node fd1d0eaa-4171-a5b1-6967-f8d476e3a4dc (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestFinder - 2019/11/27 02:25:35.026120 [INFO] consul: Handled member-join event for server "Node fd1d0eaa-4171-a5b1-6967-f8d476e3a4dc.dc1" in area "wan"
TestFinder - 2019/11/27 02:25:35.028619 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestFinder - 2019/11/27 02:25:35.029179 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestFinder - 2019/11/27 02:25:35.031530 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestFinder - 2019/11/27 02:25:35.031868 [INFO] agent: started state syncer
2019/11/27 02:25:35 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:35 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:35 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:35 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestFinder - 2019/11/27 02:25:35.571737 [INFO] consul: cluster leadership acquired
TestFinder - 2019/11/27 02:25:35.572287 [INFO] consul: New leader elected: Node fd1d0eaa-4171-a5b1-6967-f8d476e3a4dc
TestFinder - 2019/11/27 02:25:35.929686 [DEBUG] http: Request POST /v1/connect/intentions (234.913788ms) from=127.0.0.1:45134
TestFinder - 2019/11/27 02:25:35.931558 [INFO] agent: Synced node info
TestFinder - 2019/11/27 02:25:35.931675 [DEBUG] agent: Node info in sync
TestFinder - 2019/11/27 02:25:36.037962 [DEBUG] http: Request GET /v1/connect/intentions (104.545096ms) from=127.0.0.1:45134
TestFinder - 2019/11/27 02:25:36.041496 [INFO] agent: Requesting shutdown
TestFinder - 2019/11/27 02:25:36.041593 [INFO] consul: shutting down server
TestFinder - 2019/11/27 02:25:36.041647 [WARN] serf: Shutdown without a Leave
TestFinder - 2019/11/27 02:25:36.304607 [WARN] serf: Shutdown without a Leave
TestFinder - 2019/11/27 02:25:36.382349 [INFO] manager: shutting down
TestFinder - 2019/11/27 02:25:36.449355 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestFinder - 2019/11/27 02:25:36.450008 [INFO] agent: consul server down
TestFinder - 2019/11/27 02:25:36.450073 [INFO] agent: shutdown complete
TestFinder - 2019/11/27 02:25:36.450133 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestFinder - 2019/11/27 02:25:36.450293 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestFinder - 2019/11/27 02:25:36.450470 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestFinder - 2019/11/27 02:25:36.451233 [INFO] agent: Waiting for endpoints to shut down
TestFinder - 2019/11/27 02:25:36.451372 [INFO] agent: Endpoints down
--- PASS: TestFinder (2.37s)
PASS
ok  	github.com/hashicorp/consul/command/intention/finder	2.521s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_id
=== PAUSE TestCommand_id
=== RUN   TestCommand_srcDst
=== PAUSE TestCommand_srcDst
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_Validation
--- PASS: TestCommand_noTabs (0.00s)
=== CONT  TestCommand_srcDst
=== CONT  TestCommand_id
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/3_args
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_id - 2019/11/27 02:25:38.288048 [WARN] agent: Node name "Node bfa643f8-22d5-2f1b-48b9-4a2bc7732c12" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_id - 2019/11/27 02:25:38.288862 [DEBUG] tlsutil: Update with version 1
TestCommand_id - 2019/11/27 02:25:38.288934 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_id - 2019/11/27 02:25:38.289152 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_id - 2019/11/27 02:25:38.289272 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_srcDst - 2019/11/27 02:25:38.289702 [WARN] agent: Node name "Node 5b9ff575-759d-0c66-7856-640ba361b65b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_srcDst - 2019/11/27 02:25:38.290234 [DEBUG] tlsutil: Update with version 1
TestCommand_srcDst - 2019/11/27 02:25:38.290407 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_srcDst - 2019/11/27 02:25:38.290743 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_srcDst - 2019/11/27 02:25:38.290944 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:25:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bfa643f8-22d5-2f1b-48b9-4a2bc7732c12 Address:127.0.0.1:40012}]
2019/11/27 02:25:39 [INFO]  raft: Node at 127.0.0.1:40012 [Follower] entering Follower state (Leader: "")
2019/11/27 02:25:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5b9ff575-759d-0c66-7856-640ba361b65b Address:127.0.0.1:40006}]
2019/11/27 02:25:39 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
TestCommand_id - 2019/11/27 02:25:39.142445 [INFO] serf: EventMemberJoin: Node bfa643f8-22d5-2f1b-48b9-4a2bc7732c12.dc1 127.0.0.1
TestCommand_id - 2019/11/27 02:25:39.154281 [INFO] serf: EventMemberJoin: Node bfa643f8-22d5-2f1b-48b9-4a2bc7732c12 127.0.0.1
TestCommand_srcDst - 2019/11/27 02:25:39.155589 [INFO] serf: EventMemberJoin: Node 5b9ff575-759d-0c66-7856-640ba361b65b.dc1 127.0.0.1
TestCommand_id - 2019/11/27 02:25:39.156968 [INFO] agent: Started DNS server 127.0.0.1:40007 (udp)
TestCommand_srcDst - 2019/11/27 02:25:39.160507 [INFO] serf: EventMemberJoin: Node 5b9ff575-759d-0c66-7856-640ba361b65b 127.0.0.1
TestCommand_id - 2019/11/27 02:25:39.164479 [INFO] consul: Adding LAN server Node bfa643f8-22d5-2f1b-48b9-4a2bc7732c12 (Addr: tcp/127.0.0.1:40012) (DC: dc1)
TestCommand_id - 2019/11/27 02:25:39.165126 [INFO] consul: Handled member-join event for server "Node bfa643f8-22d5-2f1b-48b9-4a2bc7732c12.dc1" in area "wan"
TestCommand_srcDst - 2019/11/27 02:25:39.169683 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestCommand_id - 2019/11/27 02:25:39.175049 [INFO] agent: Started DNS server 127.0.0.1:40007 (tcp)
TestCommand_srcDst - 2019/11/27 02:25:39.175610 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestCommand_id - 2019/11/27 02:25:39.177191 [INFO] agent: Started HTTP server on 127.0.0.1:40008 (tcp)
TestCommand_id - 2019/11/27 02:25:39.177349 [INFO] agent: started state syncer
TestCommand_srcDst - 2019/11/27 02:25:39.178078 [INFO] consul: Handled member-join event for server "Node 5b9ff575-759d-0c66-7856-640ba361b65b.dc1" in area "wan"
TestCommand_srcDst - 2019/11/27 02:25:39.178685 [INFO] consul: Adding LAN server Node 5b9ff575-759d-0c66-7856-640ba361b65b (Addr: tcp/127.0.0.1:40006) (DC: dc1)
2019/11/27 02:25:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:39 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
TestCommand_srcDst - 2019/11/27 02:25:39.186857 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
TestCommand_srcDst - 2019/11/27 02:25:39.187152 [INFO] agent: started state syncer
2019/11/27 02:25:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:39 [INFO]  raft: Node at 127.0.0.1:40012 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:40 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
2019/11/27 02:25:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:40 [INFO]  raft: Node at 127.0.0.1:40012 [Leader] entering Leader state
TestCommand_srcDst - 2019/11/27 02:25:40.383541 [INFO] consul: cluster leadership acquired
TestCommand_srcDst - 2019/11/27 02:25:40.384273 [INFO] consul: New leader elected: Node 5b9ff575-759d-0c66-7856-640ba361b65b
TestCommand_id - 2019/11/27 02:25:40.384506 [INFO] consul: cluster leadership acquired
TestCommand_id - 2019/11/27 02:25:40.384841 [INFO] consul: New leader elected: Node bfa643f8-22d5-2f1b-48b9-4a2bc7732c12
TestCommand_id - 2019/11/27 02:25:40.862489 [INFO] agent: Synced node info
TestCommand_id - 2019/11/27 02:25:40.862656 [DEBUG] agent: Node info in sync
TestCommand_id - 2019/11/27 02:25:40.867833 [DEBUG] http: Request POST /v1/connect/intentions (183.461265ms) from=127.0.0.1:50304
TestCommand_id - 2019/11/27 02:25:40.901490 [DEBUG] http: Request GET /v1/connect/intentions/0c9406c4-18a4-7ac2-af22-0911f3b10950 (2.934772ms) from=127.0.0.1:50306
TestCommand_id - 2019/11/27 02:25:40.905804 [INFO] agent: Requesting shutdown
TestCommand_id - 2019/11/27 02:25:40.905910 [INFO] consul: shutting down server
TestCommand_id - 2019/11/27 02:25:40.905961 [WARN] serf: Shutdown without a Leave
TestCommand_srcDst - 2019/11/27 02:25:40.928632 [INFO] agent: Synced node info
TestCommand_srcDst - 2019/11/27 02:25:40.929652 [DEBUG] http: Request POST /v1/connect/intentions (363.289067ms) from=127.0.0.1:49370
TestCommand_srcDst - 2019/11/27 02:25:40.937943 [DEBUG] http: Request GET /v1/connect/intentions (1.56939ms) from=127.0.0.1:49376
TestCommand_srcDst - 2019/11/27 02:25:40.941665 [DEBUG] http: Request GET /v1/connect/intentions/72925eb9-87b3-8a09-60de-c4dfc5fa63c6 (799.695µs) from=127.0.0.1:49376
TestCommand_srcDst - 2019/11/27 02:25:40.945421 [INFO] agent: Requesting shutdown
TestCommand_srcDst - 2019/11/27 02:25:40.945526 [INFO] consul: shutting down server
TestCommand_srcDst - 2019/11/27 02:25:40.945574 [WARN] serf: Shutdown without a Leave
TestCommand_id - 2019/11/27 02:25:41.004147 [WARN] serf: Shutdown without a Leave
TestCommand_srcDst - 2019/11/27 02:25:41.070886 [WARN] serf: Shutdown without a Leave
TestCommand_id - 2019/11/27 02:25:41.071920 [INFO] manager: shutting down
TestCommand_id - 2019/11/27 02:25:41.072391 [INFO] agent: consul server down
TestCommand_id - 2019/11/27 02:25:41.072445 [INFO] agent: shutdown complete
TestCommand_id - 2019/11/27 02:25:41.072505 [INFO] agent: Stopping DNS server 127.0.0.1:40007 (tcp)
TestCommand_id - 2019/11/27 02:25:41.072648 [INFO] agent: Stopping DNS server 127.0.0.1:40007 (udp)
TestCommand_id - 2019/11/27 02:25:41.072802 [INFO] agent: Stopping HTTP server 127.0.0.1:40008 (tcp)
TestCommand_id - 2019/11/27 02:25:41.073695 [INFO] agent: Waiting for endpoints to shut down
TestCommand_id - 2019/11/27 02:25:41.073864 [INFO] agent: Endpoints down
--- PASS: TestCommand_id (2.87s)
TestCommand_id - 2019/11/27 02:25:41.074267 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCommand_srcDst - 2019/11/27 02:25:41.140462 [INFO] manager: shutting down
TestCommand_srcDst - 2019/11/27 02:25:41.182170 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestCommand_srcDst - 2019/11/27 02:25:41.182843 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCommand_srcDst - 2019/11/27 02:25:41.183736 [INFO] agent: consul server down
TestCommand_srcDst - 2019/11/27 02:25:41.183865 [INFO] agent: shutdown complete
TestCommand_srcDst - 2019/11/27 02:25:41.184173 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestCommand_srcDst - 2019/11/27 02:25:41.184476 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestCommand_srcDst - 2019/11/27 02:25:41.184774 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestCommand_srcDst - 2019/11/27 02:25:41.185843 [INFO] agent: Waiting for endpoints to shut down
TestCommand_srcDst - 2019/11/27 02:25:41.185936 [INFO] agent: Endpoints down
--- PASS: TestCommand_srcDst (2.98s)
PASS
ok  	github.com/hashicorp/consul/command/intention/get	3.164s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_matchDst
=== PAUSE TestCommand_matchDst
=== RUN   TestCommand_matchSource
=== PAUSE TestCommand_matchSource
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_matchSource
=== CONT  TestCommand_matchDst
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/0_args
=== RUN   TestCommand_Validation/3_args
=== RUN   TestCommand_Validation/both_source_and_dest
--- PASS: TestCommand_noTabs (0.01s)
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/0_args (0.00s)
    --- PASS: TestCommand_Validation/3_args (0.00s)
    --- PASS: TestCommand_Validation/both_source_and_dest (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_matchSource - 2019/11/27 02:25:46.374101 [WARN] agent: Node name "Node f9f84a48-472e-74a2-9901-74c99d7f2046" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_matchSource - 2019/11/27 02:25:46.375722 [DEBUG] tlsutil: Update with version 1
TestCommand_matchSource - 2019/11/27 02:25:46.375948 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_matchSource - 2019/11/27 02:25:46.376467 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_matchSource - 2019/11/27 02:25:46.376845 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_matchDst - 2019/11/27 02:25:46.399813 [WARN] agent: Node name "Node 38194470-78e2-306b-2944-64a0c43aef3c" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_matchDst - 2019/11/27 02:25:46.400690 [DEBUG] tlsutil: Update with version 1
TestCommand_matchDst - 2019/11/27 02:25:46.400913 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_matchDst - 2019/11/27 02:25:46.401369 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_matchDst - 2019/11/27 02:25:46.401817 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:25:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f9f84a48-472e-74a2-9901-74c99d7f2046 Address:127.0.0.1:13006}]
2019/11/27 02:25:47 [INFO]  raft: Node at 127.0.0.1:13006 [Follower] entering Follower state (Leader: "")
TestCommand_matchSource - 2019/11/27 02:25:47.155548 [INFO] serf: EventMemberJoin: Node f9f84a48-472e-74a2-9901-74c99d7f2046.dc1 127.0.0.1
TestCommand_matchSource - 2019/11/27 02:25:47.160907 [INFO] serf: EventMemberJoin: Node f9f84a48-472e-74a2-9901-74c99d7f2046 127.0.0.1
TestCommand_matchSource - 2019/11/27 02:25:47.176102 [INFO] agent: Started DNS server 127.0.0.1:13001 (udp)
TestCommand_matchSource - 2019/11/27 02:25:47.179219 [INFO] consul: Adding LAN server Node f9f84a48-472e-74a2-9901-74c99d7f2046 (Addr: tcp/127.0.0.1:13006) (DC: dc1)
TestCommand_matchSource - 2019/11/27 02:25:47.179882 [INFO] consul: Handled member-join event for server "Node f9f84a48-472e-74a2-9901-74c99d7f2046.dc1" in area "wan"
TestCommand_matchSource - 2019/11/27 02:25:47.180412 [INFO] agent: Started DNS server 127.0.0.1:13001 (tcp)
TestCommand_matchSource - 2019/11/27 02:25:47.188673 [INFO] agent: Started HTTP server on 127.0.0.1:13002 (tcp)
TestCommand_matchSource - 2019/11/27 02:25:47.189139 [INFO] agent: started state syncer
2019/11/27 02:25:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:47 [INFO]  raft: Node at 127.0.0.1:13006 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:38194470-78e2-306b-2944-64a0c43aef3c Address:127.0.0.1:13012}]
2019/11/27 02:25:47 [INFO]  raft: Node at 127.0.0.1:13012 [Follower] entering Follower state (Leader: "")
TestCommand_matchDst - 2019/11/27 02:25:47.547826 [INFO] serf: EventMemberJoin: Node 38194470-78e2-306b-2944-64a0c43aef3c.dc1 127.0.0.1
TestCommand_matchDst - 2019/11/27 02:25:47.555389 [INFO] serf: EventMemberJoin: Node 38194470-78e2-306b-2944-64a0c43aef3c 127.0.0.1
TestCommand_matchDst - 2019/11/27 02:25:47.558046 [INFO] consul: Adding LAN server Node 38194470-78e2-306b-2944-64a0c43aef3c (Addr: tcp/127.0.0.1:13012) (DC: dc1)
TestCommand_matchDst - 2019/11/27 02:25:47.559497 [INFO] consul: Handled member-join event for server "Node 38194470-78e2-306b-2944-64a0c43aef3c.dc1" in area "wan"
TestCommand_matchDst - 2019/11/27 02:25:47.563357 [INFO] agent: Started DNS server 127.0.0.1:13007 (tcp)
TestCommand_matchDst - 2019/11/27 02:25:47.564612 [INFO] agent: Started DNS server 127.0.0.1:13007 (udp)
TestCommand_matchDst - 2019/11/27 02:25:47.569732 [INFO] agent: Started HTTP server on 127.0.0.1:13008 (tcp)
TestCommand_matchDst - 2019/11/27 02:25:47.569866 [INFO] agent: started state syncer
2019/11/27 02:25:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:25:47 [INFO]  raft: Node at 127.0.0.1:13012 [Candidate] entering Candidate state in term 2
2019/11/27 02:25:47 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:47 [INFO]  raft: Node at 127.0.0.1:13006 [Leader] entering Leader state
TestCommand_matchSource - 2019/11/27 02:25:47.694352 [INFO] consul: cluster leadership acquired
TestCommand_matchSource - 2019/11/27 02:25:47.694865 [INFO] consul: New leader elected: Node f9f84a48-472e-74a2-9901-74c99d7f2046
TestCommand_matchSource - 2019/11/27 02:25:48.006896 [INFO] agent: Synced node info
TestCommand_matchSource - 2019/11/27 02:25:48.007015 [DEBUG] agent: Node info in sync
2019/11/27 02:25:48 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:25:48 [INFO]  raft: Node at 127.0.0.1:13012 [Leader] entering Leader state
TestCommand_matchDst - 2019/11/27 02:25:48.083399 [INFO] consul: cluster leadership acquired
TestCommand_matchDst - 2019/11/27 02:25:48.083838 [INFO] consul: New leader elected: Node 38194470-78e2-306b-2944-64a0c43aef3c
TestCommand_matchSource - 2019/11/27 02:25:48.162827 [DEBUG] http: Request POST /v1/connect/intentions (276.115589ms) from=127.0.0.1:45016
TestCommand_matchDst - 2019/11/27 02:25:48.550125 [INFO] agent: Synced node info
TestCommand_matchDst - 2019/11/27 02:25:48.550280 [DEBUG] agent: Node info in sync
TestCommand_matchSource - 2019/11/27 02:25:48.550351 [DEBUG] http: Request POST /v1/connect/intentions (384.011466ms) from=127.0.0.1:45016
TestCommand_matchDst - 2019/11/27 02:25:48.551272 [DEBUG] http: Request POST /v1/connect/intentions (226.768149ms) from=127.0.0.1:46078
TestCommand_matchSource - 2019/11/27 02:25:48.806438 [DEBUG] http: Request POST /v1/connect/intentions (252.141727ms) from=127.0.0.1:45016
TestCommand_matchSource - 2019/11/27 02:25:48.861016 [DEBUG] http: Request GET /v1/connect/intentions/match?by=source&name=foo (3.168114ms) from=127.0.0.1:45020
TestCommand_matchSource - 2019/11/27 02:25:48.864016 [INFO] agent: Requesting shutdown
TestCommand_matchSource - 2019/11/27 02:25:48.864127 [INFO] consul: shutting down server
TestCommand_matchSource - 2019/11/27 02:25:48.864173 [WARN] serf: Shutdown without a Leave
TestCommand_matchSource - 2019/11/27 02:25:48.970421 [WARN] serf: Shutdown without a Leave
TestCommand_matchDst - 2019/11/27 02:25:48.975029 [DEBUG] http: Request POST /v1/connect/intentions (420.489443ms) from=127.0.0.1:46078
TestCommand_matchSource - 2019/11/27 02:25:49.059394 [INFO] manager: shutting down
TestCommand_matchSource - 2019/11/27 02:25:49.259555 [INFO] agent: consul server down
TestCommand_matchSource - 2019/11/27 02:25:49.259634 [INFO] agent: shutdown complete
TestCommand_matchSource - 2019/11/27 02:25:49.259690 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (tcp)
TestCommand_matchSource - 2019/11/27 02:25:49.259815 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (udp)
TestCommand_matchSource - 2019/11/27 02:25:49.259961 [INFO] agent: Stopping HTTP server 127.0.0.1:13002 (tcp)
TestCommand_matchSource - 2019/11/27 02:25:49.260566 [INFO] agent: Waiting for endpoints to shut down
TestCommand_matchSource - 2019/11/27 02:25:49.260675 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestCommand_matchSource - 2019/11/27 02:25:49.260817 [INFO] agent: Endpoints down
--- PASS: TestCommand_matchSource (3.02s)
TestCommand_matchDst - 2019/11/27 02:25:49.433075 [DEBUG] http: Request POST /v1/connect/intentions (455.167354ms) from=127.0.0.1:46078
TestCommand_matchDst - 2019/11/27 02:25:49.444053 [DEBUG] http: Request GET /v1/connect/intentions/match?by=destination&name=db (1.468386ms) from=127.0.0.1:46082
TestCommand_matchDst - 2019/11/27 02:25:49.447291 [INFO] agent: Requesting shutdown
TestCommand_matchDst - 2019/11/27 02:25:49.447400 [INFO] consul: shutting down server
TestCommand_matchDst - 2019/11/27 02:25:49.447449 [WARN] serf: Shutdown without a Leave
TestCommand_matchDst - 2019/11/27 02:25:49.648362 [WARN] serf: Shutdown without a Leave
TestCommand_matchDst - 2019/11/27 02:25:49.742482 [INFO] manager: shutting down
TestCommand_matchDst - 2019/11/27 02:25:49.926060 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestCommand_matchDst - 2019/11/27 02:25:49.927404 [INFO] agent: consul server down
TestCommand_matchDst - 2019/11/27 02:25:49.927627 [INFO] agent: shutdown complete
TestCommand_matchDst - 2019/11/27 02:25:49.927851 [INFO] agent: Stopping DNS server 127.0.0.1:13007 (tcp)
TestCommand_matchDst - 2019/11/27 02:25:49.928580 [INFO] agent: Stopping DNS server 127.0.0.1:13007 (udp)
TestCommand_matchDst - 2019/11/27 02:25:49.929100 [INFO] agent: Stopping HTTP server 127.0.0.1:13008 (tcp)
TestCommand_matchDst - 2019/11/27 02:25:49.930438 [INFO] agent: Waiting for endpoints to shut down
TestCommand_matchDst - 2019/11/27 02:25:49.930554 [INFO] agent: Endpoints down
--- PASS: TestCommand_matchDst (3.69s)
PASS
ok  	github.com/hashicorp/consul/command/intention/match	3.880s
=== RUN   TestJoinCommand_noTabs
=== PAUSE TestJoinCommand_noTabs
=== RUN   TestJoinCommandJoin_lan
=== PAUSE TestJoinCommandJoin_lan
=== RUN   TestJoinCommand_wan
=== PAUSE TestJoinCommand_wan
=== RUN   TestJoinCommand_noAddrs
=== PAUSE TestJoinCommand_noAddrs
=== CONT  TestJoinCommand_noTabs
=== CONT  TestJoinCommand_wan
--- PASS: TestJoinCommand_noTabs (0.01s)
=== CONT  TestJoinCommand_noAddrs
--- PASS: TestJoinCommand_noAddrs (0.01s)
=== CONT  TestJoinCommandJoin_lan
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommand_wan - 2019/11/27 02:26:09.811667 [WARN] agent: Node name "Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommandJoin_lan - 2019/11/27 02:26:09.812685 [WARN] agent: Node name "Node db5a845f-4682-74c6-6820-a5e54e4654c3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommandJoin_lan - 2019/11/27 02:26:09.813751 [DEBUG] tlsutil: Update with version 1
TestJoinCommandJoin_lan - 2019/11/27 02:26:09.813845 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommandJoin_lan - 2019/11/27 02:26:09.814117 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestJoinCommandJoin_lan - 2019/11/27 02:26:09.814238 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommand_wan - 2019/11/27 02:26:09.816286 [DEBUG] tlsutil: Update with version 1
TestJoinCommand_wan - 2019/11/27 02:26:09.816391 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommand_wan - 2019/11/27 02:26:09.817003 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestJoinCommand_wan - 2019/11/27 02:26:09.817140 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:26:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee Address:127.0.0.1:26506}]
TestJoinCommand_wan - 2019/11/27 02:26:11.385528 [INFO] serf: EventMemberJoin: Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee.dc1 127.0.0.1
2019/11/27 02:26:11 [INFO]  raft: Node at 127.0.0.1:26506 [Follower] entering Follower state (Leader: "")
TestJoinCommand_wan - 2019/11/27 02:26:11.402571 [INFO] serf: EventMemberJoin: Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee 127.0.0.1
TestJoinCommand_wan - 2019/11/27 02:26:11.405976 [INFO] agent: Started DNS server 127.0.0.1:26501 (udp)
TestJoinCommand_wan - 2019/11/27 02:26:11.406866 [INFO] consul: Adding LAN server Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee (Addr: tcp/127.0.0.1:26506) (DC: dc1)
TestJoinCommand_wan - 2019/11/27 02:26:11.407208 [INFO] consul: Handled member-join event for server "Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee.dc1" in area "wan"
TestJoinCommand_wan - 2019/11/27 02:26:11.408248 [INFO] agent: Started DNS server 127.0.0.1:26501 (tcp)
TestJoinCommand_wan - 2019/11/27 02:26:11.419349 [INFO] agent: Started HTTP server on 127.0.0.1:26502 (tcp)
TestJoinCommand_wan - 2019/11/27 02:26:11.419536 [INFO] agent: started state syncer
2019/11/27 02:26:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:db5a845f-4682-74c6-6820-a5e54e4654c3 Address:127.0.0.1:26512}]
2019/11/27 02:26:11 [INFO]  raft: Node at 127.0.0.1:26512 [Follower] entering Follower state (Leader: "")
TestJoinCommandJoin_lan - 2019/11/27 02:26:11.441367 [INFO] serf: EventMemberJoin: Node db5a845f-4682-74c6-6820-a5e54e4654c3.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/11/27 02:26:11.446523 [INFO] serf: EventMemberJoin: Node db5a845f-4682-74c6-6820-a5e54e4654c3 127.0.0.1
TestJoinCommandJoin_lan - 2019/11/27 02:26:11.447806 [INFO] consul: Adding LAN server Node db5a845f-4682-74c6-6820-a5e54e4654c3 (Addr: tcp/127.0.0.1:26512) (DC: dc1)
TestJoinCommandJoin_lan - 2019/11/27 02:26:11.448109 [INFO] consul: Handled member-join event for server "Node db5a845f-4682-74c6-6820-a5e54e4654c3.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/11/27 02:26:11.449365 [INFO] agent: Started DNS server 127.0.0.1:26507 (udp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:11.449463 [INFO] agent: Started DNS server 127.0.0.1:26507 (tcp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:11.451672 [INFO] agent: Started HTTP server on 127.0.0.1:26508 (tcp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:11.451934 [INFO] agent: started state syncer
2019/11/27 02:26:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:11 [INFO]  raft: Node at 127.0.0.1:26506 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:11 [INFO]  raft: Node at 127.0.0.1:26512 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:11 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:11 [INFO]  raft: Node at 127.0.0.1:26506 [Leader] entering Leader state
TestJoinCommand_wan - 2019/11/27 02:26:11.936185 [INFO] consul: cluster leadership acquired
TestJoinCommand_wan - 2019/11/27 02:26:11.936824 [INFO] consul: New leader elected: Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommand_wan - 2019/11/27 02:26:12.410916 [WARN] agent: Node name "Node 47e82a8f-c860-da44-d03d-4269da29251e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommand_wan - 2019/11/27 02:26:12.414621 [DEBUG] tlsutil: Update with version 1
TestJoinCommand_wan - 2019/11/27 02:26:12.414719 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommand_wan - 2019/11/27 02:26:12.414934 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestJoinCommand_wan - 2019/11/27 02:26:12.415116 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:26:12 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:12 [INFO]  raft: Node at 127.0.0.1:26512 [Leader] entering Leader state
TestJoinCommandJoin_lan - 2019/11/27 02:26:12.793695 [INFO] consul: cluster leadership acquired
TestJoinCommandJoin_lan - 2019/11/27 02:26:12.794293 [INFO] consul: New leader elected: Node db5a845f-4682-74c6-6820-a5e54e4654c3
TestJoinCommand_wan - 2019/11/27 02:26:12.892545 [INFO] agent: Synced node info
TestJoinCommand_wan - 2019/11/27 02:26:12.892675 [DEBUG] agent: Node info in sync
TestJoinCommandJoin_lan - 2019/11/27 02:26:13.247580 [INFO] agent: Synced node info
WARNING: bootstrap = true: do not enable unless necessary
TestJoinCommandJoin_lan - 2019/11/27 02:26:13.270620 [WARN] agent: Node name "Node edb3fea3-e86a-ec9c-e679-42369bb2a158" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestJoinCommandJoin_lan - 2019/11/27 02:26:13.271040 [DEBUG] tlsutil: Update with version 1
TestJoinCommandJoin_lan - 2019/11/27 02:26:13.271110 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestJoinCommandJoin_lan - 2019/11/27 02:26:13.271310 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestJoinCommandJoin_lan - 2019/11/27 02:26:13.271428 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:26:13 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:47e82a8f-c860-da44-d03d-4269da29251e Address:127.0.0.1:26518}]
2019/11/27 02:26:13 [INFO]  raft: Node at 127.0.0.1:26518 [Follower] entering Follower state (Leader: "")
TestJoinCommand_wan - 2019/11/27 02:26:13.774744 [INFO] serf: EventMemberJoin: Node 47e82a8f-c860-da44-d03d-4269da29251e.dc1 127.0.0.1
TestJoinCommand_wan - 2019/11/27 02:26:13.780531 [INFO] serf: EventMemberJoin: Node 47e82a8f-c860-da44-d03d-4269da29251e 127.0.0.1
TestJoinCommand_wan - 2019/11/27 02:26:13.781414 [INFO] consul: Adding LAN server Node 47e82a8f-c860-da44-d03d-4269da29251e (Addr: tcp/127.0.0.1:26518) (DC: dc1)
TestJoinCommand_wan - 2019/11/27 02:26:13.781970 [INFO] consul: Handled member-join event for server "Node 47e82a8f-c860-da44-d03d-4269da29251e.dc1" in area "wan"
TestJoinCommand_wan - 2019/11/27 02:26:13.782442 [INFO] agent: Started DNS server 127.0.0.1:26513 (tcp)
TestJoinCommand_wan - 2019/11/27 02:26:13.784990 [INFO] agent: Started DNS server 127.0.0.1:26513 (udp)
TestJoinCommand_wan - 2019/11/27 02:26:13.787141 [INFO] agent: Started HTTP server on 127.0.0.1:26514 (tcp)
TestJoinCommand_wan - 2019/11/27 02:26:13.787441 [INFO] agent: started state syncer
2019/11/27 02:26:13 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:13 [INFO]  raft: Node at 127.0.0.1:26518 [Candidate] entering Candidate state in term 2
TestJoinCommand_wan - 2019/11/27 02:26:14.317708 [DEBUG] agent: Node info in sync
2019/11/27 02:26:14 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:edb3fea3-e86a-ec9c-e679-42369bb2a158 Address:127.0.0.1:26524}]
2019/11/27 02:26:14 [INFO]  raft: Node at 127.0.0.1:26524 [Follower] entering Follower state (Leader: "")
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.342554 [INFO] serf: EventMemberJoin: Node edb3fea3-e86a-ec9c-e679-42369bb2a158.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.348218 [INFO] serf: EventMemberJoin: Node edb3fea3-e86a-ec9c-e679-42369bb2a158 127.0.0.1
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.349464 [INFO] consul: Handled member-join event for server "Node edb3fea3-e86a-ec9c-e679-42369bb2a158.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.349805 [INFO] consul: Adding LAN server Node edb3fea3-e86a-ec9c-e679-42369bb2a158 (Addr: tcp/127.0.0.1:26524) (DC: dc1)
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.352781 [INFO] agent: Started DNS server 127.0.0.1:26519 (tcp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.353137 [INFO] agent: Started DNS server 127.0.0.1:26519 (udp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.355203 [INFO] agent: Started HTTP server on 127.0.0.1:26520 (tcp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.355320 [INFO] agent: started state syncer
2019/11/27 02:26:14 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:14 [INFO]  raft: Node at 127.0.0.1:26524 [Candidate] entering Candidate state in term 2
TestJoinCommand_wan - 2019/11/27 02:26:14.453621 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestJoinCommand_wan - 2019/11/27 02:26:14.454160 [DEBUG] consul: Skipping self join check for "Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee" since the cluster is too small
TestJoinCommand_wan - 2019/11/27 02:26:14.454323 [INFO] consul: member 'Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee' joined, marking health alive
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.562721 [DEBUG] agent: Node info in sync
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.562836 [DEBUG] agent: Node info in sync
2019/11/27 02:26:14 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:14 [INFO]  raft: Node at 127.0.0.1:26518 [Leader] entering Leader state
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.614290 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.614847 [DEBUG] consul: Skipping self join check for "Node db5a845f-4682-74c6-6820-a5e54e4654c3" since the cluster is too small
TestJoinCommand_wan - 2019/11/27 02:26:14.614927 [INFO] consul: cluster leadership acquired
TestJoinCommandJoin_lan - 2019/11/27 02:26:14.615014 [INFO] consul: member 'Node db5a845f-4682-74c6-6820-a5e54e4654c3' joined, marking health alive
TestJoinCommand_wan - 2019/11/27 02:26:14.615290 [INFO] consul: New leader elected: Node 47e82a8f-c860-da44-d03d-4269da29251e
TestJoinCommand_wan - 2019/11/27 02:26:14.683553 [INFO] agent: (WAN) joining: [127.0.0.1:26517]
TestJoinCommand_wan - 2019/11/27 02:26:14.684473 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:26517
TestJoinCommand_wan - 2019/11/27 02:26:14.684549 [DEBUG] memberlist: Stream connection from=127.0.0.1:48522
TestJoinCommand_wan - 2019/11/27 02:26:14.693900 [INFO] serf: EventMemberJoin: Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee.dc1 127.0.0.1
TestJoinCommand_wan - 2019/11/27 02:26:14.694341 [INFO] serf: EventMemberJoin: Node 47e82a8f-c860-da44-d03d-4269da29251e.dc1 127.0.0.1
TestJoinCommand_wan - 2019/11/27 02:26:14.694946 [INFO] agent: (WAN) joined: 1 Err: <nil>
TestJoinCommand_wan - 2019/11/27 02:26:14.695066 [DEBUG] http: Request PUT /v1/agent/join/127.0.0.1:26517?wan=1 (11.512412ms) from=127.0.0.1:49230
TestJoinCommand_wan - 2019/11/27 02:26:14.695105 [INFO] consul: Handled member-join event for server "Node 2a42309a-bc4b-4ee2-64dc-f1e7b738b6ee.dc1" in area "wan"
TestJoinCommand_wan - 2019/11/27 02:26:14.696991 [INFO] consul: Handled member-join event for server "Node 47e82a8f-c860-da44-d03d-4269da29251e.dc1" in area "wan"
TestJoinCommand_wan - 2019/11/27 02:26:14.699423 [INFO] agent: Requesting shutdown
TestJoinCommand_wan - 2019/11/27 02:26:14.699509 [INFO] consul: shutting down server
TestJoinCommand_wan - 2019/11/27 02:26:14.699560 [WARN] serf: Shutdown without a Leave
TestJoinCommand_wan - 2019/11/27 02:26:14.701607 [ERR] agent: failed to sync remote state: No cluster leader
TestJoinCommand_wan - 2019/11/27 02:26:14.813210 [WARN] serf: Shutdown without a Leave
TestJoinCommand_wan - 2019/11/27 02:26:14.879971 [INFO] manager: shutting down
TestJoinCommand_wan - 2019/11/27 02:26:14.880166 [ERR] consul: failed to wait for barrier: raft is already shutdown
TestJoinCommand_wan - 2019/11/27 02:26:14.880686 [INFO] agent: consul server down
TestJoinCommand_wan - 2019/11/27 02:26:14.880743 [INFO] agent: shutdown complete
TestJoinCommand_wan - 2019/11/27 02:26:14.880799 [INFO] agent: Stopping DNS server 127.0.0.1:26513 (tcp)
TestJoinCommand_wan - 2019/11/27 02:26:14.880946 [INFO] agent: Stopping DNS server 127.0.0.1:26513 (udp)
TestJoinCommand_wan - 2019/11/27 02:26:14.881100 [INFO] agent: Stopping HTTP server 127.0.0.1:26514 (tcp)
TestJoinCommand_wan - 2019/11/27 02:26:14.881292 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommand_wan - 2019/11/27 02:26:14.881360 [INFO] agent: Endpoints down
TestJoinCommand_wan - 2019/11/27 02:26:14.881400 [INFO] agent: Requesting shutdown
TestJoinCommand_wan - 2019/11/27 02:26:14.881455 [INFO] consul: shutting down server
TestJoinCommand_wan - 2019/11/27 02:26:14.881495 [WARN] serf: Shutdown without a Leave
TestJoinCommand_wan - 2019/11/27 02:26:14.969186 [WARN] serf: Shutdown without a Leave
2019/11/27 02:26:15 [INFO]  raft: Election won. Tally: 1
TestJoinCommand_wan - 2019/11/27 02:26:15.035538 [INFO] manager: shutting down
2019/11/27 02:26:15 [INFO]  raft: Node at 127.0.0.1:26524 [Leader] entering Leader state
TestJoinCommand_wan - 2019/11/27 02:26:15.035976 [INFO] agent: consul server down
TestJoinCommand_wan - 2019/11/27 02:26:15.036036 [INFO] agent: shutdown complete
TestJoinCommand_wan - 2019/11/27 02:26:15.036098 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (tcp)
TestJoinCommand_wan - 2019/11/27 02:26:15.036255 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (udp)
TestJoinCommand_wan - 2019/11/27 02:26:15.036425 [INFO] agent: Stopping HTTP server 127.0.0.1:26502 (tcp)
TestJoinCommand_wan - 2019/11/27 02:26:15.037103 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommand_wan - 2019/11/27 02:26:15.037308 [INFO] agent: Endpoints down
--- PASS: TestJoinCommand_wan (5.42s)
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.037414 [INFO] consul: cluster leadership acquired
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.037850 [INFO] consul: New leader elected: Node edb3fea3-e86a-ec9c-e679-42369bb2a158
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.066889 [INFO] agent: (LAN) joining: [127.0.0.1:26522]
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.067753 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:26522
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.067815 [DEBUG] memberlist: Stream connection from=127.0.0.1:34682
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.076543 [INFO] serf: EventMemberJoin: Node edb3fea3-e86a-ec9c-e679-42369bb2a158 127.0.0.1
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.077108 [INFO] agent: (LAN) joined: 1 Err: <nil>
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.077182 [DEBUG] agent: systemd notify failed: No socket
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.077240 [DEBUG] http: Request PUT /v1/agent/join/127.0.0.1:26522 (10.371038ms) from=127.0.0.1:46260
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.077907 [INFO] agent: Requesting shutdown
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.077977 [INFO] consul: shutting down server
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.078021 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.078370 [ERR] agent: failed to sync remote state: No cluster leader
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.079994 [INFO] consul: Adding LAN server Node edb3fea3-e86a-ec9c-e679-42369bb2a158 (Addr: tcp/127.0.0.1:26524) (DC: dc1)
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.080420 [ERR] consul: 'Node edb3fea3-e86a-ec9c-e679-42369bb2a158' and 'Node db5a845f-4682-74c6-6820-a5e54e4654c3' are both in bootstrap mode. Only one node should be in bootstrap mode, not adding Raft peer.
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.080554 [INFO] consul: member 'Node edb3fea3-e86a-ec9c-e679-42369bb2a158' joined, marking health alive
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.084019 [DEBUG] memberlist: Initiating push/pull sync with: 127.0.0.1:26523
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.085411 [DEBUG] memberlist: Stream connection from=127.0.0.1:35172
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.087920 [INFO] serf: EventMemberJoin: Node db5a845f-4682-74c6-6820-a5e54e4654c3.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.088577 [INFO] serf: EventMemberJoin: Node db5a845f-4682-74c6-6820-a5e54e4654c3 127.0.0.1
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.092897 [INFO] serf: EventMemberJoin: Node edb3fea3-e86a-ec9c-e679-42369bb2a158.dc1 127.0.0.1
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.093378 [DEBUG] consul: Successfully performed flood-join for "Node edb3fea3-e86a-ec9c-e679-42369bb2a158" at 127.0.0.1:26523
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.093521 [INFO] consul: Handled member-join event for server "Node db5a845f-4682-74c6-6820-a5e54e4654c3.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.093557 [INFO] consul: Handled member-join event for server "Node edb3fea3-e86a-ec9c-e679-42369bb2a158.dc1" in area "wan"
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.202105 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.291139 [INFO] manager: shutting down
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.357728 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358023 [INFO] agent: consul server down
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358072 [INFO] agent: shutdown complete
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358128 [INFO] agent: Stopping DNS server 127.0.0.1:26519 (tcp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358278 [INFO] agent: Stopping DNS server 127.0.0.1:26519 (udp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358438 [INFO] agent: Stopping HTTP server 127.0.0.1:26520 (tcp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358661 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358724 [INFO] agent: Endpoints down
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358790 [INFO] agent: Requesting shutdown
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358849 [INFO] consul: shutting down server
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.358892 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.402060 [WARN] serf: Shutdown without a Leave
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.446599 [INFO] manager: shutting down
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.447114 [INFO] agent: consul server down
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.447171 [INFO] agent: shutdown complete
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.447227 [INFO] agent: Stopping DNS server 127.0.0.1:26507 (tcp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.447353 [INFO] agent: Stopping DNS server 127.0.0.1:26507 (udp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.447502 [INFO] agent: Stopping HTTP server 127.0.0.1:26508 (tcp)
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.447926 [INFO] agent: Waiting for endpoints to shut down
TestJoinCommandJoin_lan - 2019/11/27 02:26:15.448088 [INFO] agent: Endpoints down
--- PASS: TestJoinCommandJoin_lan (5.81s)
PASS
ok  	github.com/hashicorp/consul/command/join	6.035s
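The TestJoinCommandJoin_lan run above drives one agent into another's LAN gossip pool (the serf EventMemberJoin and flood-join lines). A minimal sketch of the same operation through the official Go client, github.com/hashicorp/consul/api, assuming two locally running agents; the addresses below are placeholders, not the ports from the log:

package main

import (
	"fmt"
	"log"

	"github.com/hashicorp/consul/api"
)

func main() {
	cfg := api.DefaultConfig()
	cfg.Address = "127.0.0.1:8500" // placeholder HTTP address of the first agent
	client, err := api.NewClient(cfg)
	if err != nil {
		log.Fatal(err)
	}

	// Ask the first agent to join the second agent's serf LAN port
	// (placeholder address); wan=false keeps this a LAN join.
	if err := client.Agent().Join("127.0.0.1:8301", false); err != nil {
		log.Fatal(err)
	}

	// List the LAN members the agent now knows about.
	members, err := client.Agent().Members(false)
	if err != nil {
		log.Fatal(err)
	}
	for _, m := range members {
		fmt.Println(m.Name, m.Addr)
	}
}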
=== RUN   TestKeygenCommand_noTabs
=== PAUSE TestKeygenCommand_noTabs
=== RUN   TestKeygenCommand
=== PAUSE TestKeygenCommand
=== CONT  TestKeygenCommand_noTabs
=== CONT  TestKeygenCommand
--- PASS: TestKeygenCommand_noTabs (0.00s)
--- PASS: TestKeygenCommand (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/keygen	0.052s
=== RUN   TestKeyringCommand_noTabs
=== PAUSE TestKeyringCommand_noTabs
=== RUN   TestKeyringCommand
=== PAUSE TestKeyringCommand
=== RUN   TestKeyringCommand_help
=== PAUSE TestKeyringCommand_help
=== RUN   TestKeyringCommand_failedConnection
=== PAUSE TestKeyringCommand_failedConnection
=== RUN   TestKeyringCommand_invalidRelayFactor
=== PAUSE TestKeyringCommand_invalidRelayFactor
=== CONT  TestKeyringCommand_noTabs
=== CONT  TestKeyringCommand_failedConnection
=== CONT  TestKeyringCommand_invalidRelayFactor
--- PASS: TestKeyringCommand_failedConnection (0.01s)
=== CONT  TestKeyringCommand
--- PASS: TestKeyringCommand_invalidRelayFactor (0.00s)
=== CONT  TestKeyringCommand_help
--- PASS: TestKeyringCommand_help (0.00s)
--- PASS: TestKeyringCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestKeyringCommand - 2019/11/27 02:26:19.264491 [WARN] agent: Node name "Node 74fd374a-9998-8612-c599-33aaa4d6a621" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKeyringCommand - 2019/11/27 02:26:19.266967 [DEBUG] tlsutil: Update with version 1
TestKeyringCommand - 2019/11/27 02:26:19.269601 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKeyringCommand - 2019/11/27 02:26:19.269896 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKeyringCommand - 2019/11/27 02:26:19.270017 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:26:20 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:74fd374a-9998-8612-c599-33aaa4d6a621 Address:127.0.0.1:13006}]
2019/11/27 02:26:20 [INFO]  raft: Node at 127.0.0.1:13006 [Follower] entering Follower state (Leader: "")
TestKeyringCommand - 2019/11/27 02:26:20.896195 [INFO] serf: EventMemberJoin: Node 74fd374a-9998-8612-c599-33aaa4d6a621.dc1 127.0.0.1
TestKeyringCommand - 2019/11/27 02:26:20.900140 [INFO] serf: EventMemberJoin: Node 74fd374a-9998-8612-c599-33aaa4d6a621 127.0.0.1
TestKeyringCommand - 2019/11/27 02:26:20.903344 [INFO] consul: Adding LAN server Node 74fd374a-9998-8612-c599-33aaa4d6a621 (Addr: tcp/127.0.0.1:13006) (DC: dc1)
TestKeyringCommand - 2019/11/27 02:26:20.903665 [INFO] consul: Handled member-join event for server "Node 74fd374a-9998-8612-c599-33aaa4d6a621.dc1" in area "wan"
TestKeyringCommand - 2019/11/27 02:26:20.905075 [INFO] agent: Started DNS server 127.0.0.1:13001 (tcp)
TestKeyringCommand - 2019/11/27 02:26:20.905486 [INFO] agent: Started DNS server 127.0.0.1:13001 (udp)
TestKeyringCommand - 2019/11/27 02:26:20.908104 [INFO] agent: Started HTTP server on 127.0.0.1:13002 (tcp)
TestKeyringCommand - 2019/11/27 02:26:20.908302 [INFO] agent: started state syncer
2019/11/27 02:26:20 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:20 [INFO]  raft: Node at 127.0.0.1:13006 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:21 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:21 [INFO]  raft: Node at 127.0.0.1:13006 [Leader] entering Leader state
TestKeyringCommand - 2019/11/27 02:26:21.514029 [INFO] consul: cluster leadership acquired
TestKeyringCommand - 2019/11/27 02:26:21.514999 [INFO] consul: New leader elected: Node 74fd374a-9998-8612-c599-33aaa4d6a621
TestKeyringCommand - 2019/11/27 02:26:21.637386 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/11/27 02:26:21.639267 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621.dc1
TestKeyringCommand - 2019/11/27 02:26:21.642877 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/11/27 02:26:21.644648 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621
TestKeyringCommand - 2019/11/27 02:26:21.649917 [DEBUG] http: Request GET /v1/operator/keyring (13.428814ms) from=127.0.0.1:45036
TestKeyringCommand - 2019/11/27 02:26:21.666313 [INFO] serf: Received install-key query
TestKeyringCommand - 2019/11/27 02:26:21.668880 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621.dc1
TestKeyringCommand - 2019/11/27 02:26:21.670496 [INFO] serf: Received install-key query
TestKeyringCommand - 2019/11/27 02:26:21.673439 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621
TestKeyringCommand - 2019/11/27 02:26:21.674618 [DEBUG] http: Request POST /v1/operator/keyring (9.422338ms) from=127.0.0.1:45038
TestKeyringCommand - 2019/11/27 02:26:21.679451 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/11/27 02:26:21.680764 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621.dc1
TestKeyringCommand - 2019/11/27 02:26:21.682170 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/11/27 02:26:21.683840 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621
TestKeyringCommand - 2019/11/27 02:26:21.685669 [DEBUG] http: Request GET /v1/operator/keyring (6.771909ms) from=127.0.0.1:45040
TestKeyringCommand - 2019/11/27 02:26:21.693245 [INFO] serf: Received use-key query
TestKeyringCommand - 2019/11/27 02:26:21.695347 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621.dc1
TestKeyringCommand - 2019/11/27 02:26:21.696871 [INFO] serf: Received use-key query
TestKeyringCommand - 2019/11/27 02:26:21.699088 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621
TestKeyringCommand - 2019/11/27 02:26:21.699956 [DEBUG] http: Request PUT /v1/operator/keyring (7.595939ms) from=127.0.0.1:45042
TestKeyringCommand - 2019/11/27 02:26:21.724568 [INFO] serf: Received remove-key query
TestKeyringCommand - 2019/11/27 02:26:21.727977 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621.dc1
TestKeyringCommand - 2019/11/27 02:26:21.729394 [INFO] serf: Received remove-key query
TestKeyringCommand - 2019/11/27 02:26:21.731186 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621
TestKeyringCommand - 2019/11/27 02:26:21.732108 [DEBUG] http: Request DELETE /v1/operator/keyring (8.57364ms) from=127.0.0.1:45044
TestKeyringCommand - 2019/11/27 02:26:21.746023 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/11/27 02:26:21.752706 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621.dc1
TestKeyringCommand - 2019/11/27 02:26:21.754812 [INFO] serf: Received list-keys query
TestKeyringCommand - 2019/11/27 02:26:21.756433 [DEBUG] serf: messageQueryResponseType: Node 74fd374a-9998-8612-c599-33aaa4d6a621
TestKeyringCommand - 2019/11/27 02:26:21.760548 [DEBUG] http: Request GET /v1/operator/keyring (15.111208ms) from=127.0.0.1:45046
TestKeyringCommand - 2019/11/27 02:26:21.763238 [INFO] agent: Requesting shutdown
TestKeyringCommand - 2019/11/27 02:26:21.763339 [INFO] consul: shutting down server
TestKeyringCommand - 2019/11/27 02:26:21.763391 [WARN] serf: Shutdown without a Leave
TestKeyringCommand - 2019/11/27 02:26:21.763769 [ERR] agent: failed to sync remote state: No cluster leader
TestKeyringCommand - 2019/11/27 02:26:21.948267 [WARN] serf: Shutdown without a Leave
TestKeyringCommand - 2019/11/27 02:26:22.079716 [INFO] manager: shutting down
TestKeyringCommand - 2019/11/27 02:26:22.080556 [INFO] agent: consul server down
TestKeyringCommand - 2019/11/27 02:26:22.080619 [INFO] agent: shutdown complete
TestKeyringCommand - 2019/11/27 02:26:22.080679 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (tcp)
TestKeyringCommand - 2019/11/27 02:26:22.080824 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (udp)
TestKeyringCommand - 2019/11/27 02:26:22.080897 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestKeyringCommand - 2019/11/27 02:26:22.080984 [INFO] agent: Stopping HTTP server 127.0.0.1:13002 (tcp)
TestKeyringCommand - 2019/11/27 02:26:22.082053 [INFO] agent: Waiting for endpoints to shut down
TestKeyringCommand - 2019/11/27 02:26:22.082239 [INFO] agent: Endpoints down
--- PASS: TestKeyringCommand (3.00s)
PASS
ok  	github.com/hashicorp/consul/command/keyring	3.205s
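The TestKeyringCommand run above exercises the four keyring operations behind GET/POST/PUT/DELETE /v1/operator/keyring (the list-keys, install-key, use-key and remove-key queries). A minimal sketch with the official Go client, assuming a local agent started with gossip encryption enabled; the key below is a placeholder, not one from the log:

package main

import (
	"fmt"
	"log"

	"github.com/hashicorp/consul/api"
)

func main() {
	client, err := api.NewClient(api.DefaultConfig()) // local agent, default address
	if err != nil {
		log.Fatal(err)
	}
	op := client.Operator()

	newKey := "3lg9DxVfKNzI8O0cXM1BWsBrGonJQMCfvo8QblDrvpM=" // placeholder base64 gossip key

	// GET /v1/operator/keyring — list the keys known to each gossip pool.
	keyring, err := op.KeyringList(nil)
	if err != nil {
		log.Fatal(err)
	}
	for _, resp := range keyring {
		fmt.Println(resp.Datacenter, resp.Keys)
	}

	// POST /v1/operator/keyring — install the new key into every pool.
	if err := op.KeyringInstall(newKey, nil); err != nil {
		log.Fatal(err)
	}
	// PUT would be op.KeyringUse(newKey, nil), the use-key query in the log;
	// it is skipped here so the key stays secondary and can be removed again.
	// DELETE /v1/operator/keyring — drop the still-secondary key.
	if err := op.KeyringRemove(newKey, nil); err != nil {
		log.Fatal(err)
	}
}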
=== RUN   TestKVCommand_noTabs
=== PAUSE TestKVCommand_noTabs
=== CONT  TestKVCommand_noTabs
--- PASS: TestKVCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/kv	0.045s
=== RUN   TestKVDeleteCommand_noTabs
=== PAUSE TestKVDeleteCommand_noTabs
=== RUN   TestKVDeleteCommand_Validation
=== PAUSE TestKVDeleteCommand_Validation
=== RUN   TestKVDeleteCommand
=== PAUSE TestKVDeleteCommand
=== RUN   TestKVDeleteCommand_Recurse
=== PAUSE TestKVDeleteCommand_Recurse
=== RUN   TestKVDeleteCommand_CAS
=== PAUSE TestKVDeleteCommand_CAS
=== CONT  TestKVDeleteCommand_noTabs
=== CONT  TestKVDeleteCommand_Recurse
=== CONT  TestKVDeleteCommand
=== CONT  TestKVDeleteCommand_Validation
--- PASS: TestKVDeleteCommand_noTabs (0.00s)
=== CONT  TestKVDeleteCommand_CAS
--- PASS: TestKVDeleteCommand_Validation (0.06s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVDeleteCommand_CAS - 2019/11/27 02:26:26.564592 [WARN] agent: Node name "Node 6fe1dc71-944e-ca7e-7cf3-14ade7988142" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVDeleteCommand_CAS - 2019/11/27 02:26:26.565466 [DEBUG] tlsutil: Update with version 1
TestKVDeleteCommand_CAS - 2019/11/27 02:26:26.565647 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVDeleteCommand_CAS - 2019/11/27 02:26:26.566105 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVDeleteCommand_CAS - 2019/11/27 02:26:26.566315 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVDeleteCommand - 2019/11/27 02:26:26.594777 [WARN] agent: Node name "Node d9e3e719-5287-c1c9-f283-1f52de6f6344" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVDeleteCommand - 2019/11/27 02:26:26.595241 [DEBUG] tlsutil: Update with version 1
TestKVDeleteCommand - 2019/11/27 02:26:26.595366 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVDeleteCommand - 2019/11/27 02:26:26.596563 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVDeleteCommand - 2019/11/27 02:26:26.596881 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:26.615634 [WARN] agent: Node name "Node e46efc26-35d5-94e9-c928-066238a01e23" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:26.616114 [DEBUG] tlsutil: Update with version 1
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:26.616197 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:26.616635 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:26.616821 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:26:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6fe1dc71-944e-ca7e-7cf3-14ade7988142 Address:127.0.0.1:43012}]
2019/11/27 02:26:27 [INFO]  raft: Node at 127.0.0.1:43012 [Follower] entering Follower state (Leader: "")
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.354696 [INFO] serf: EventMemberJoin: Node 6fe1dc71-944e-ca7e-7cf3-14ade7988142.dc1 127.0.0.1
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.359687 [INFO] serf: EventMemberJoin: Node 6fe1dc71-944e-ca7e-7cf3-14ade7988142 127.0.0.1
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.360627 [INFO] consul: Adding LAN server Node 6fe1dc71-944e-ca7e-7cf3-14ade7988142 (Addr: tcp/127.0.0.1:43012) (DC: dc1)
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.360999 [INFO] consul: Handled member-join event for server "Node 6fe1dc71-944e-ca7e-7cf3-14ade7988142.dc1" in area "wan"
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.361485 [INFO] agent: Started DNS server 127.0.0.1:43007 (tcp)
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.361545 [INFO] agent: Started DNS server 127.0.0.1:43007 (udp)
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.363530 [INFO] agent: Started HTTP server on 127.0.0.1:43008 (tcp)
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.363662 [INFO] agent: started state syncer
2019/11/27 02:26:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:27 [INFO]  raft: Node at 127.0.0.1:43012 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e46efc26-35d5-94e9-c928-066238a01e23 Address:127.0.0.1:43006}]
2019/11/27 02:26:27 [INFO]  raft: Node at 127.0.0.1:43006 [Follower] entering Follower state (Leader: "")
2019/11/27 02:26:27 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d9e3e719-5287-c1c9-f283-1f52de6f6344 Address:127.0.0.1:43018}]
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:27.506065 [INFO] serf: EventMemberJoin: Node e46efc26-35d5-94e9-c928-066238a01e23.dc1 127.0.0.1
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:27.511882 [INFO] serf: EventMemberJoin: Node e46efc26-35d5-94e9-c928-066238a01e23 127.0.0.1
2019/11/27 02:26:27 [INFO]  raft: Node at 127.0.0.1:43018 [Follower] entering Follower state (Leader: "")
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:27.513628 [INFO] consul: Handled member-join event for server "Node e46efc26-35d5-94e9-c928-066238a01e23.dc1" in area "wan"
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:27.513637 [INFO] consul: Adding LAN server Node e46efc26-35d5-94e9-c928-066238a01e23 (Addr: tcp/127.0.0.1:43006) (DC: dc1)
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:27.514334 [INFO] agent: Started DNS server 127.0.0.1:43001 (tcp)
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:27.514682 [INFO] agent: Started DNS server 127.0.0.1:43001 (udp)
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:27.516754 [INFO] agent: Started HTTP server on 127.0.0.1:43002 (tcp)
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:27.516849 [INFO] agent: started state syncer
TestKVDeleteCommand - 2019/11/27 02:26:27.517406 [INFO] serf: EventMemberJoin: Node d9e3e719-5287-c1c9-f283-1f52de6f6344.dc1 127.0.0.1
TestKVDeleteCommand - 2019/11/27 02:26:27.539782 [INFO] serf: EventMemberJoin: Node d9e3e719-5287-c1c9-f283-1f52de6f6344 127.0.0.1
TestKVDeleteCommand - 2019/11/27 02:26:27.543415 [INFO] consul: Handled member-join event for server "Node d9e3e719-5287-c1c9-f283-1f52de6f6344.dc1" in area "wan"
TestKVDeleteCommand - 2019/11/27 02:26:27.549437 [INFO] agent: Started DNS server 127.0.0.1:43013 (udp)
TestKVDeleteCommand - 2019/11/27 02:26:27.549936 [INFO] agent: Started DNS server 127.0.0.1:43013 (tcp)
TestKVDeleteCommand - 2019/11/27 02:26:27.553258 [INFO] consul: Adding LAN server Node d9e3e719-5287-c1c9-f283-1f52de6f6344 (Addr: tcp/127.0.0.1:43018) (DC: dc1)
TestKVDeleteCommand - 2019/11/27 02:26:27.559343 [INFO] agent: Started HTTP server on 127.0.0.1:43014 (tcp)
TestKVDeleteCommand - 2019/11/27 02:26:27.559508 [INFO] agent: started state syncer
2019/11/27 02:26:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:27 [INFO]  raft: Node at 127.0.0.1:43006 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:27 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:27 [INFO]  raft: Node at 127.0.0.1:43018 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:27 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:27 [INFO]  raft: Node at 127.0.0.1:43012 [Leader] entering Leader state
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.980526 [INFO] consul: cluster leadership acquired
TestKVDeleteCommand_CAS - 2019/11/27 02:26:27.981115 [INFO] consul: New leader elected: Node 6fe1dc71-944e-ca7e-7cf3-14ade7988142
2019/11/27 02:26:28 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:28 [INFO]  raft: Node at 127.0.0.1:43006 [Leader] entering Leader state
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:28.046331 [INFO] consul: cluster leadership acquired
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:28.047139 [INFO] consul: New leader elected: Node e46efc26-35d5-94e9-c928-066238a01e23
2019/11/27 02:26:28 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:28 [INFO]  raft: Node at 127.0.0.1:43018 [Leader] entering Leader state
TestKVDeleteCommand - 2019/11/27 02:26:28.136147 [INFO] consul: cluster leadership acquired
TestKVDeleteCommand - 2019/11/27 02:26:28.137366 [INFO] consul: New leader elected: Node d9e3e719-5287-c1c9-f283-1f52de6f6344
TestKVDeleteCommand_CAS - 2019/11/27 02:26:28.292733 [DEBUG] http: Request PUT /v1/kv/foo (256.178499ms) from=127.0.0.1:48554
TestKVDeleteCommand_CAS - 2019/11/27 02:26:28.292997 [INFO] agent: Synced node info
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:28.369451 [INFO] agent: Synced node info
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:28.373980 [DEBUG] http: Request PUT /v1/kv/foo/a (260.95567ms) from=127.0.0.1:37710
TestKVDeleteCommand - 2019/11/27 02:26:28.613841 [INFO] agent: Synced node info
TestKVDeleteCommand - 2019/11/27 02:26:28.613962 [DEBUG] agent: Node info in sync
TestKVDeleteCommand - 2019/11/27 02:26:28.617378 [DEBUG] http: Request PUT /v1/kv/foo (439.522391ms) from=127.0.0.1:47208
TestKVDeleteCommand_CAS - 2019/11/27 02:26:28.703852 [DEBUG] http: Request DELETE /v1/kv/foo?cas=1 (407.420576ms) from=127.0.0.1:48560
TestKVDeleteCommand_CAS - 2019/11/27 02:26:28.707766 [DEBUG] http: Request GET /v1/kv/foo (1.667726ms) from=127.0.0.1:48554
TestKVDeleteCommand_CAS - 2019/11/27 02:26:28.806650 [DEBUG] agent: Node info in sync
TestKVDeleteCommand_CAS - 2019/11/27 02:26:28.806841 [DEBUG] agent: Node info in sync
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.017779 [DEBUG] http: Request DELETE /v1/kv/foo?cas=4 (305.11625ms) from=127.0.0.1:48564
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.020443 [DEBUG] http: Request PUT /v1/kv/foo/b (644.095711ms) from=127.0.0.1:37710
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.022396 [DEBUG] http: Request GET /v1/kv/foo (224.674µs) from=127.0.0.1:48554
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.023704 [INFO] agent: Requesting shutdown
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.023775 [INFO] consul: shutting down server
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.023814 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.091734 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand - 2019/11/27 02:26:29.093844 [DEBUG] http: Request DELETE /v1/kv/foo (472.902586ms) from=127.0.0.1:47212
TestKVDeleteCommand - 2019/11/27 02:26:29.096423 [DEBUG] http: Request GET /v1/kv/foo (201.007µs) from=127.0.0.1:47208
TestKVDeleteCommand - 2019/11/27 02:26:29.097079 [INFO] agent: Requesting shutdown
TestKVDeleteCommand - 2019/11/27 02:26:29.097141 [INFO] consul: shutting down server
TestKVDeleteCommand - 2019/11/27 02:26:29.097184 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand - 2019/11/27 02:26:29.172054 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.173515 [INFO] manager: shutting down
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.183205 [INFO] agent: consul server down
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.183443 [INFO] agent: shutdown complete
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.183600 [INFO] agent: Stopping DNS server 127.0.0.1:43007 (tcp)
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.183854 [INFO] agent: Stopping DNS server 127.0.0.1:43007 (udp)
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.184034 [INFO] agent: Stopping HTTP server 127.0.0.1:43008 (tcp)
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.184801 [INFO] agent: Waiting for endpoints to shut down
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.184928 [INFO] agent: Endpoints down
--- PASS: TestKVDeleteCommand_CAS (2.73s)
TestKVDeleteCommand_CAS - 2019/11/27 02:26:29.204484 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.240419 [DEBUG] agent: Node info in sync
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.240704 [DEBUG] agent: Node info in sync
TestKVDeleteCommand - 2019/11/27 02:26:29.323604 [INFO] manager: shutting down
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.380581 [DEBUG] http: Request PUT /v1/kv/food (357.946805ms) from=127.0.0.1:37710
TestKVDeleteCommand - 2019/11/27 02:26:29.457023 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestKVDeleteCommand - 2019/11/27 02:26:29.457462 [INFO] agent: consul server down
TestKVDeleteCommand - 2019/11/27 02:26:29.457712 [INFO] agent: shutdown complete
TestKVDeleteCommand - 2019/11/27 02:26:29.457776 [INFO] agent: Stopping DNS server 127.0.0.1:43013 (tcp)
TestKVDeleteCommand - 2019/11/27 02:26:29.457951 [INFO] agent: Stopping DNS server 127.0.0.1:43013 (udp)
TestKVDeleteCommand - 2019/11/27 02:26:29.458124 [INFO] agent: Stopping HTTP server 127.0.0.1:43014 (tcp)
TestKVDeleteCommand - 2019/11/27 02:26:29.458843 [INFO] agent: Waiting for endpoints to shut down
TestKVDeleteCommand - 2019/11/27 02:26:29.459043 [INFO] agent: Endpoints down
--- PASS: TestKVDeleteCommand (3.01s)
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.692402 [DEBUG] http: Request DELETE /v1/kv/foo?recurse= (307.327327ms) from=127.0.0.1:37720
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.696234 [DEBUG] http: Request GET /v1/kv/foo/a (224.341µs) from=127.0.0.1:37710
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.697729 [DEBUG] http: Request GET /v1/kv/foo/b (176.006µs) from=127.0.0.1:37710
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.699030 [DEBUG] http: Request GET /v1/kv/food (183.674µs) from=127.0.0.1:37710
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.699739 [INFO] agent: Requesting shutdown
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.699824 [INFO] consul: shutting down server
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.699885 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.829388 [WARN] serf: Shutdown without a Leave
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.891053 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.891507 [DEBUG] consul: Skipping self join check for "Node e46efc26-35d5-94e9-c928-066238a01e23" since the cluster is too small
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.891644 [INFO] consul: member 'Node e46efc26-35d5-94e9-c928-066238a01e23' joined, marking health alive
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:29.967974 [INFO] manager: shutting down
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:30.059212 [ERR] consul: failed to reconcile member: {Node e46efc26-35d5-94e9-c928-066238a01e23 127.0.0.1 43004 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:e46efc26-35d5-94e9-c928-066238a01e23 port:43006 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:43005] alive 1 5 2 2 5 4}: leadership lost while committing log
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:30.060113 [INFO] agent: consul server down
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:30.060370 [INFO] agent: shutdown complete
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:30.060610 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (tcp)
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:30.061154 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (udp)
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:30.061590 [INFO] agent: Stopping HTTP server 127.0.0.1:43002 (tcp)
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:30.063052 [INFO] agent: Waiting for endpoints to shut down
TestKVDeleteCommand_Recurse - 2019/11/27 02:26:30.063151 [INFO] agent: Endpoints down
--- PASS: TestKVDeleteCommand_Recurse (3.62s)
PASS
ok  	github.com/hashicorp/consul/command/kv/del	3.932s
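The kv/del tests above cover the three delete variants visible in the requests: a plain DELETE /v1/kv/foo, a check-and-set delete (?cas=), and a recursive delete (?recurse=). A minimal sketch of the same calls through the Go client against a local agent; the key names mirror the test fixtures but are otherwise arbitrary:

package main

import (
	"fmt"
	"log"

	"github.com/hashicorp/consul/api"
)

func main() {
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}
	kv := client.KV()

	// PUT /v1/kv/foo
	if _, err := kv.Put(&api.KVPair{Key: "foo", Value: []byte("bar")}, nil); err != nil {
		log.Fatal(err)
	}

	// DELETE /v1/kv/foo?cas=<ModifyIndex> — only applies if the index still matches.
	pair, _, err := kv.Get("foo", nil)
	if err != nil {
		log.Fatal(err)
	}
	ok, _, err := kv.DeleteCAS(pair, nil)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("cas delete applied:", ok)

	// DELETE /v1/kv/foo?recurse= — remove every key whose name starts with the prefix.
	if _, err := kv.Put(&api.KVPair{Key: "foo/a", Value: []byte("1")}, nil); err != nil {
		log.Fatal(err)
	}
	if _, err := kv.DeleteTree("foo", nil); err != nil {
		log.Fatal(err)
	}
}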
=== RUN   TestKVExportCommand_noTabs
=== PAUSE TestKVExportCommand_noTabs
=== RUN   TestKVExportCommand
=== PAUSE TestKVExportCommand
=== CONT  TestKVExportCommand_noTabs
=== CONT  TestKVExportCommand
--- PASS: TestKVExportCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVExportCommand - 2019/11/27 02:26:28.964927 [WARN] agent: Node name "Node e013f225-1320-a5ff-10cd-e1a03f3ea13d" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVExportCommand - 2019/11/27 02:26:28.966161 [DEBUG] tlsutil: Update with version 1
TestKVExportCommand - 2019/11/27 02:26:28.966227 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVExportCommand - 2019/11/27 02:26:28.966959 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVExportCommand - 2019/11/27 02:26:28.967185 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:26:29 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e013f225-1320-a5ff-10cd-e1a03f3ea13d Address:127.0.0.1:13006}]
2019/11/27 02:26:29 [INFO]  raft: Node at 127.0.0.1:13006 [Follower] entering Follower state (Leader: "")
TestKVExportCommand - 2019/11/27 02:26:29.829025 [INFO] serf: EventMemberJoin: Node e013f225-1320-a5ff-10cd-e1a03f3ea13d.dc1 127.0.0.1
TestKVExportCommand - 2019/11/27 02:26:29.832873 [INFO] serf: EventMemberJoin: Node e013f225-1320-a5ff-10cd-e1a03f3ea13d 127.0.0.1
TestKVExportCommand - 2019/11/27 02:26:29.834984 [INFO] agent: Started DNS server 127.0.0.1:13001 (udp)
TestKVExportCommand - 2019/11/27 02:26:29.835452 [INFO] agent: Started DNS server 127.0.0.1:13001 (tcp)
TestKVExportCommand - 2019/11/27 02:26:29.835902 [INFO] consul: Adding LAN server Node e013f225-1320-a5ff-10cd-e1a03f3ea13d (Addr: tcp/127.0.0.1:13006) (DC: dc1)
TestKVExportCommand - 2019/11/27 02:26:29.836323 [INFO] consul: Handled member-join event for server "Node e013f225-1320-a5ff-10cd-e1a03f3ea13d.dc1" in area "wan"
TestKVExportCommand - 2019/11/27 02:26:29.842742 [INFO] agent: Started HTTP server on 127.0.0.1:13002 (tcp)
TestKVExportCommand - 2019/11/27 02:26:29.843108 [INFO] agent: started state syncer
2019/11/27 02:26:29 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:29 [INFO]  raft: Node at 127.0.0.1:13006 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:30 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:30 [INFO]  raft: Node at 127.0.0.1:13006 [Leader] entering Leader state
TestKVExportCommand - 2019/11/27 02:26:30.414263 [INFO] consul: cluster leadership acquired
TestKVExportCommand - 2019/11/27 02:26:30.414710 [INFO] consul: New leader elected: Node e013f225-1320-a5ff-10cd-e1a03f3ea13d
TestKVExportCommand - 2019/11/27 02:26:30.881930 [INFO] agent: Synced node info
TestKVExportCommand - 2019/11/27 02:26:30.883456 [DEBUG] http: Request PUT /v1/kv/foo/a (280.425031ms) from=127.0.0.1:45062
TestKVExportCommand - 2019/11/27 02:26:31.238968 [DEBUG] http: Request PUT /v1/kv/foo/b (352.247266ms) from=127.0.0.1:45062
TestKVExportCommand - 2019/11/27 02:26:31.537846 [DEBUG] http: Request PUT /v1/kv/foo/c (294.574202ms) from=127.0.0.1:45062
TestKVExportCommand - 2019/11/27 02:26:31.825329 [DEBUG] http: Request PUT /v1/kv/bar (284.586511ms) from=127.0.0.1:45062
TestKVExportCommand - 2019/11/27 02:26:31.829979 [DEBUG] http: Request GET /v1/kv/foo?recurse= (1.556389ms) from=127.0.0.1:45064
TestKVExportCommand - 2019/11/27 02:26:31.832602 [INFO] agent: Requesting shutdown
TestKVExportCommand - 2019/11/27 02:26:31.832712 [INFO] consul: shutting down server
TestKVExportCommand - 2019/11/27 02:26:31.832761 [WARN] serf: Shutdown without a Leave
TestKVExportCommand - 2019/11/27 02:26:31.914904 [WARN] serf: Shutdown without a Leave
TestKVExportCommand - 2019/11/27 02:26:32.156800 [INFO] manager: shutting down
TestKVExportCommand - 2019/11/27 02:26:32.235029 [ERR] agent: failed to sync remote state: No cluster leader
TestKVExportCommand - 2019/11/27 02:26:33.156758 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestKVExportCommand - 2019/11/27 02:26:33.157092 [INFO] agent: consul server down
TestKVExportCommand - 2019/11/27 02:26:33.157148 [INFO] agent: shutdown complete
TestKVExportCommand - 2019/11/27 02:26:33.157210 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (tcp)
TestKVExportCommand - 2019/11/27 02:26:33.157382 [INFO] agent: Stopping DNS server 127.0.0.1:13001 (udp)
TestKVExportCommand - 2019/11/27 02:26:33.157551 [INFO] agent: Stopping HTTP server 127.0.0.1:13002 (tcp)
TestKVExportCommand - 2019/11/27 02:26:33.158292 [INFO] agent: Waiting for endpoints to shut down
TestKVExportCommand - 2019/11/27 02:26:33.158386 [INFO] agent: Endpoints down
--- PASS: TestKVExportCommand (4.29s)
PASS
ok  	github.com/hashicorp/consul/command/kv/exp	4.435s
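TestKVExportCommand above writes a handful of keys and then reads the foo prefix back in a single request (GET /v1/kv/foo?recurse=). A minimal sketch of that recursive read with the Go client, printing values base64-encoded roughly as `consul kv export` does; it assumes a local agent that already holds the keys:

package main

import (
	"encoding/base64"
	"fmt"
	"log"

	"github.com/hashicorp/consul/api"
)

func main() {
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// GET /v1/kv/foo?recurse= — every key under the prefix in one response.
	pairs, _, err := client.KV().List("foo", nil)
	if err != nil {
		log.Fatal(err)
	}
	for _, p := range pairs {
		fmt.Printf("%s = %s\n", p.Key, base64.StdEncoding.EncodeToString(p.Value))
	}
}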
=== RUN   TestKVGetCommand_noTabs
=== PAUSE TestKVGetCommand_noTabs
=== RUN   TestKVGetCommand_Validation
=== PAUSE TestKVGetCommand_Validation
=== RUN   TestKVGetCommand
=== PAUSE TestKVGetCommand
=== RUN   TestKVGetCommand_Base64
=== PAUSE TestKVGetCommand_Base64
=== RUN   TestKVGetCommand_Missing
=== PAUSE TestKVGetCommand_Missing
=== RUN   TestKVGetCommand_Empty
=== PAUSE TestKVGetCommand_Empty
=== RUN   TestKVGetCommand_Detailed
=== PAUSE TestKVGetCommand_Detailed
=== RUN   TestKVGetCommand_Keys
=== PAUSE TestKVGetCommand_Keys
=== RUN   TestKVGetCommand_Recurse
=== PAUSE TestKVGetCommand_Recurse
=== RUN   TestKVGetCommand_RecurseBase64
=== PAUSE TestKVGetCommand_RecurseBase64
=== RUN   TestKVGetCommand_DetailedBase64
--- SKIP: TestKVGetCommand_DetailedBase64 (0.00s)
    kv_get_test.go:338: DM-skipped
=== CONT  TestKVGetCommand_noTabs
--- PASS: TestKVGetCommand_noTabs (0.00s)
=== CONT  TestKVGetCommand_RecurseBase64
=== CONT  TestKVGetCommand_Recurse
=== CONT  TestKVGetCommand_Keys
=== CONT  TestKVGetCommand_Detailed
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Recurse - 2019/11/27 02:26:52.839388 [WARN] agent: Node name "Node c413f205-f202-1975-b4a8-21e144b73538" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Recurse - 2019/11/27 02:26:52.840395 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Recurse - 2019/11/27 02:26:52.840498 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Recurse - 2019/11/27 02:26:52.840945 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVGetCommand_Recurse - 2019/11/27 02:26:52.841155 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:52.881591 [WARN] agent: Node name "Node b58f6e7d-e1db-5039-43b4-a11960f66ce4" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:52.882940 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:52.883695 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:52.884535 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:52.885296 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Detailed - 2019/11/27 02:26:52.917096 [WARN] agent: Node name "Node 6a9e1530-94ea-7b48-bbd6-4dd49e466770" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Keys - 2019/11/27 02:26:52.917307 [WARN] agent: Node name "Node 57f652c9-b2ee-8962-93b3-521133ed0027" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Detailed - 2019/11/27 02:26:52.917651 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Detailed - 2019/11/27 02:26:52.917736 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Detailed - 2019/11/27 02:26:52.918119 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVGetCommand_Detailed - 2019/11/27 02:26:52.918439 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Keys - 2019/11/27 02:26:52.917651 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Keys - 2019/11/27 02:26:52.927032 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Keys - 2019/11/27 02:26:52.927213 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVGetCommand_Keys - 2019/11/27 02:26:52.927313 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:26:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c413f205-f202-1975-b4a8-21e144b73538 Address:127.0.0.1:35512}]
2019/11/27 02:26:54 [INFO]  raft: Node at 127.0.0.1:35512 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Recurse - 2019/11/27 02:26:54.674694 [INFO] serf: EventMemberJoin: Node c413f205-f202-1975-b4a8-21e144b73538.dc1 127.0.0.1
TestKVGetCommand_Recurse - 2019/11/27 02:26:54.685292 [INFO] serf: EventMemberJoin: Node c413f205-f202-1975-b4a8-21e144b73538 127.0.0.1
TestKVGetCommand_Recurse - 2019/11/27 02:26:54.687030 [INFO] consul: Adding LAN server Node c413f205-f202-1975-b4a8-21e144b73538 (Addr: tcp/127.0.0.1:35512) (DC: dc1)
TestKVGetCommand_Recurse - 2019/11/27 02:26:54.688123 [INFO] consul: Handled member-join event for server "Node c413f205-f202-1975-b4a8-21e144b73538.dc1" in area "wan"
TestKVGetCommand_Recurse - 2019/11/27 02:26:54.694679 [INFO] agent: Started DNS server 127.0.0.1:35507 (udp)
TestKVGetCommand_Recurse - 2019/11/27 02:26:54.694771 [INFO] agent: Started DNS server 127.0.0.1:35507 (tcp)
TestKVGetCommand_Recurse - 2019/11/27 02:26:54.703836 [INFO] agent: Started HTTP server on 127.0.0.1:35508 (tcp)
TestKVGetCommand_Recurse - 2019/11/27 02:26:54.704004 [INFO] agent: started state syncer
2019/11/27 02:26:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:54 [INFO]  raft: Node at 127.0.0.1:35512 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6a9e1530-94ea-7b48-bbd6-4dd49e466770 Address:127.0.0.1:35524}]
2019/11/27 02:26:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:57f652c9-b2ee-8962-93b3-521133ed0027 Address:127.0.0.1:35518}]
2019/11/27 02:26:54 [INFO]  raft: Node at 127.0.0.1:35524 [Follower] entering Follower state (Leader: "")
2019/11/27 02:26:54 [INFO]  raft: Node at 127.0.0.1:35518 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Keys - 2019/11/27 02:26:54.863620 [INFO] serf: EventMemberJoin: Node 57f652c9-b2ee-8962-93b3-521133ed0027.dc1 127.0.0.1
2019/11/27 02:26:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b58f6e7d-e1db-5039-43b4-a11960f66ce4 Address:127.0.0.1:35506}]
2019/11/27 02:26:54 [INFO]  raft: Node at 127.0.0.1:35506 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Keys - 2019/11/27 02:26:54.872167 [INFO] serf: EventMemberJoin: Node 57f652c9-b2ee-8962-93b3-521133ed0027 127.0.0.1
TestKVGetCommand_Detailed - 2019/11/27 02:26:54.893868 [INFO] serf: EventMemberJoin: Node 6a9e1530-94ea-7b48-bbd6-4dd49e466770.dc1 127.0.0.1
TestKVGetCommand_Keys - 2019/11/27 02:26:54.894161 [INFO] consul: Handled member-join event for server "Node 57f652c9-b2ee-8962-93b3-521133ed0027.dc1" in area "wan"
TestKVGetCommand_Keys - 2019/11/27 02:26:54.894565 [INFO] consul: Adding LAN server Node 57f652c9-b2ee-8962-93b3-521133ed0027 (Addr: tcp/127.0.0.1:35518) (DC: dc1)
TestKVGetCommand_Keys - 2019/11/27 02:26:54.912811 [INFO] agent: Started DNS server 127.0.0.1:35513 (udp)
TestKVGetCommand_Keys - 2019/11/27 02:26:54.913196 [INFO] agent: Started DNS server 127.0.0.1:35513 (tcp)
TestKVGetCommand_Detailed - 2019/11/27 02:26:54.914575 [INFO] serf: EventMemberJoin: Node 6a9e1530-94ea-7b48-bbd6-4dd49e466770 127.0.0.1
2019/11/27 02:26:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:54 [INFO]  raft: Node at 127.0.0.1:35518 [Candidate] entering Candidate state in term 2
TestKVGetCommand_Keys - 2019/11/27 02:26:54.915189 [INFO] agent: Started HTTP server on 127.0.0.1:35514 (tcp)
TestKVGetCommand_Keys - 2019/11/27 02:26:54.915339 [INFO] agent: started state syncer
TestKVGetCommand_Detailed - 2019/11/27 02:26:54.916154 [INFO] consul: Adding LAN server Node 6a9e1530-94ea-7b48-bbd6-4dd49e466770 (Addr: tcp/127.0.0.1:35524) (DC: dc1)
TestKVGetCommand_Detailed - 2019/11/27 02:26:54.916243 [INFO] consul: Handled member-join event for server "Node 6a9e1530-94ea-7b48-bbd6-4dd49e466770.dc1" in area "wan"
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:54.918147 [INFO] serf: EventMemberJoin: Node b58f6e7d-e1db-5039-43b4-a11960f66ce4.dc1 127.0.0.1
2019/11/27 02:26:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:54 [INFO]  raft: Node at 127.0.0.1:35524 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:54 [INFO]  raft: Node at 127.0.0.1:35506 [Candidate] entering Candidate state in term 2
TestKVGetCommand_Detailed - 2019/11/27 02:26:54.959362 [INFO] agent: Started DNS server 127.0.0.1:35519 (tcp)
TestKVGetCommand_Detailed - 2019/11/27 02:26:54.959456 [INFO] agent: Started DNS server 127.0.0.1:35519 (udp)
TestKVGetCommand_Detailed - 2019/11/27 02:26:54.961635 [INFO] agent: Started HTTP server on 127.0.0.1:35520 (tcp)
TestKVGetCommand_Detailed - 2019/11/27 02:26:54.961826 [INFO] agent: started state syncer
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:54.980327 [INFO] serf: EventMemberJoin: Node b58f6e7d-e1db-5039-43b4-a11960f66ce4 127.0.0.1
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:54.984980 [INFO] agent: Started DNS server 127.0.0.1:35501 (udp)
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:54.993733 [INFO] agent: Started DNS server 127.0.0.1:35501 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:54.986571 [INFO] consul: Adding LAN server Node b58f6e7d-e1db-5039-43b4-a11960f66ce4 (Addr: tcp/127.0.0.1:35506) (DC: dc1)
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:54.987630 [INFO] consul: Handled member-join event for server "Node b58f6e7d-e1db-5039-43b4-a11960f66ce4.dc1" in area "wan"
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:55.003011 [INFO] agent: Started HTTP server on 127.0.0.1:35502 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:55.003536 [INFO] agent: started state syncer
2019/11/27 02:26:55 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:55 [INFO]  raft: Node at 127.0.0.1:35512 [Leader] entering Leader state
TestKVGetCommand_Recurse - 2019/11/27 02:26:55.401555 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Recurse - 2019/11/27 02:26:55.402156 [INFO] consul: New leader elected: Node c413f205-f202-1975-b4a8-21e144b73538
2019/11/27 02:26:55 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:55 [INFO]  raft: Node at 127.0.0.1:35524 [Leader] entering Leader state
2019/11/27 02:26:55 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:55 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:55 [INFO]  raft: Node at 127.0.0.1:35506 [Leader] entering Leader state
2019/11/27 02:26:55 [INFO]  raft: Node at 127.0.0.1:35518 [Leader] entering Leader state
TestKVGetCommand_Keys - 2019/11/27 02:26:55.634640 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Keys - 2019/11/27 02:26:55.635197 [INFO] consul: New leader elected: Node 57f652c9-b2ee-8962-93b3-521133ed0027
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:55.636177 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Detailed - 2019/11/27 02:26:55.636177 [INFO] consul: cluster leadership acquired
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:55.636508 [INFO] consul: New leader elected: Node b58f6e7d-e1db-5039-43b4-a11960f66ce4
TestKVGetCommand_Detailed - 2019/11/27 02:26:55.636508 [INFO] consul: New leader elected: Node 6a9e1530-94ea-7b48-bbd6-4dd49e466770
TestKVGetCommand_Recurse - 2019/11/27 02:26:55.837331 [DEBUG] http: Request PUT /v1/kv/foo/c (329.405421ms) from=127.0.0.1:35340
TestKVGetCommand_Recurse - 2019/11/27 02:26:55.848923 [INFO] agent: Synced node info
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.134304 [INFO] agent: Synced node info
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.134494 [DEBUG] agent: Node info in sync
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:56.137319 [INFO] agent: Synced node info
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.142931 [DEBUG] http: Request PUT /v1/kv/foo (458.447026ms) from=127.0.0.1:48970
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:56.147540 [DEBUG] http: Request PUT /v1/kv/foo/a (369.135171ms) from=127.0.0.1:42898
TestKVGetCommand_Keys - 2019/11/27 02:26:56.152037 [INFO] agent: Synced node info
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.183988 [DEBUG] http: Request GET /v1/kv/foo (16.050239ms) from=127.0.0.1:48976
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.186373 [INFO] agent: Requesting shutdown
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.186487 [INFO] consul: shutting down server
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.186540 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.388581 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.488565 [INFO] manager: shutting down
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.488588 [ERR] autopilot: failed to initialize config: raft is already shutdown
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.488949 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.488988 [INFO] agent: consul server down
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.489034 [INFO] agent: shutdown complete
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.489089 [INFO] agent: Stopping DNS server 127.0.0.1:35519 (tcp)
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.489245 [INFO] agent: Stopping DNS server 127.0.0.1:35519 (udp)
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.489427 [INFO] agent: Stopping HTTP server 127.0.0.1:35520 (tcp)
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.490253 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Detailed - 2019/11/27 02:26:56.490395 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Detailed (3.87s)
=== CONT  TestKVGetCommand_Empty
TestKVGetCommand_Recurse - 2019/11/27 02:26:56.578531 [DEBUG] http: Request PUT /v1/kv/foo/a (729.912378ms) from=127.0.0.1:35340
TestKVGetCommand_Keys - 2019/11/27 02:26:56.582241 [DEBUG] http: Request PUT /v1/kv/foo/bar (620.181129ms) from=127.0.0.1:39790
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Empty - 2019/11/27 02:26:56.603706 [WARN] agent: Node name "Node 28fa8a43-b889-4937-bec0-ec8272adbf51" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Empty - 2019/11/27 02:26:56.604139 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Empty - 2019/11/27 02:26:56.604203 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Empty - 2019/11/27 02:26:56.604336 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVGetCommand_Empty - 2019/11/27 02:26:56.604429 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:56.847447 [DEBUG] http: Request PUT /v1/kv/foo/b (678.247201ms) from=127.0.0.1:42898
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.037460 [DEBUG] http: Request PUT /v1/kv/foo/b (454.140537ms) from=127.0.0.1:35340
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.043912 [DEBUG] http: Request GET /v1/kv/foo?recurse= (1.794064ms) from=127.0.0.1:35350
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.048279 [INFO] agent: Requesting shutdown
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.048739 [INFO] consul: shutting down server
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.048903 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.188841 [DEBUG] agent: Node info in sync
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.188963 [DEBUG] agent: Node info in sync
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.221949 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.288529 [INFO] manager: shutting down
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.290541 [DEBUG] http: Request PUT /v1/kv/foo/c (440.635388ms) from=127.0.0.1:42898
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.292162 [INFO] agent: consul server down
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.292243 [INFO] agent: shutdown complete
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.292299 [INFO] agent: Stopping DNS server 127.0.0.1:35507 (tcp)
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.292439 [INFO] agent: Stopping DNS server 127.0.0.1:35507 (udp)
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.292610 [INFO] agent: Stopping HTTP server 127.0.0.1:35508 (tcp)
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.293290 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.293477 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestKVGetCommand_Recurse - 2019/11/27 02:26:57.293694 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Recurse (4.68s)
=== CONT  TestKVGetCommand
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.303011 [DEBUG] http: Request GET /v1/kv/foo?recurse= (1.480386ms) from=127.0.0.1:42906
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.305426 [INFO] agent: Requesting shutdown
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.305518 [INFO] consul: shutting down server
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.305560 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand - 2019/11/27 02:26:57.372815 [WARN] agent: Node name "Node d5ea61d2-c6a5-0f43-1273-ac1031cc83ee" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand - 2019/11/27 02:26:57.373246 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand - 2019/11/27 02:26:57.373403 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand - 2019/11/27 02:26:57.373638 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVGetCommand - 2019/11/27 02:26:57.373834 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.521896 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Keys - 2019/11/27 02:26:57.525605 [DEBUG] http: Request PUT /v1/kv/foo/baz (941.329919ms) from=127.0.0.1:39790
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.813225 [INFO] manager: shutting down
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.823441 [INFO] agent: consul server down
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.823537 [INFO] agent: shutdown complete
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.823603 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.824758 [INFO] agent: Stopping DNS server 127.0.0.1:35501 (udp)
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.825036 [INFO] agent: Stopping HTTP server 127.0.0.1:35502 (tcp)
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.827036 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.843553 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_RecurseBase64 (5.24s)
=== CONT  TestKVGetCommand_Missing
TestKVGetCommand_RecurseBase64 - 2019/11/27 02:26:57.859075 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Missing - 2019/11/27 02:26:57.910870 [WARN] agent: Node name "Node d889d12a-1de1-a2f8-2359-73202b14bb7a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Missing - 2019/11/27 02:26:57.911272 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Missing - 2019/11/27 02:26:57.911345 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Missing - 2019/11/27 02:26:57.911587 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVGetCommand_Missing - 2019/11/27 02:26:57.911789 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Keys - 2019/11/27 02:26:58.092658 [DEBUG] http: Request PUT /v1/kv/foo/zip (564.510807ms) from=127.0.0.1:39790
TestKVGetCommand_Keys - 2019/11/27 02:26:58.096913 [DEBUG] http: Request GET /v1/kv/foo/?keys=&separator=%2F (1.124707ms) from=127.0.0.1:39798
TestKVGetCommand_Keys - 2019/11/27 02:26:58.098444 [INFO] agent: Requesting shutdown
TestKVGetCommand_Keys - 2019/11/27 02:26:58.098559 [INFO] consul: shutting down server
TestKVGetCommand_Keys - 2019/11/27 02:26:58.098612 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Keys - 2019/11/27 02:26:58.177324 [WARN] serf: Shutdown without a Leave
2019/11/27 02:26:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:28fa8a43-b889-4937-bec0-ec8272adbf51 Address:127.0.0.1:35530}]
2019/11/27 02:26:58 [INFO]  raft: Node at 127.0.0.1:35530 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Empty - 2019/11/27 02:26:58.186090 [INFO] serf: EventMemberJoin: Node 28fa8a43-b889-4937-bec0-ec8272adbf51.dc1 127.0.0.1
TestKVGetCommand_Empty - 2019/11/27 02:26:58.194078 [INFO] serf: EventMemberJoin: Node 28fa8a43-b889-4937-bec0-ec8272adbf51 127.0.0.1
TestKVGetCommand_Empty - 2019/11/27 02:26:58.196420 [INFO] consul: Adding LAN server Node 28fa8a43-b889-4937-bec0-ec8272adbf51 (Addr: tcp/127.0.0.1:35530) (DC: dc1)
TestKVGetCommand_Empty - 2019/11/27 02:26:58.198450 [INFO] consul: Handled member-join event for server "Node 28fa8a43-b889-4937-bec0-ec8272adbf51.dc1" in area "wan"
TestKVGetCommand_Empty - 2019/11/27 02:26:58.203084 [INFO] agent: Started DNS server 127.0.0.1:35525 (tcp)
TestKVGetCommand_Empty - 2019/11/27 02:26:58.206725 [INFO] agent: Started DNS server 127.0.0.1:35525 (udp)
TestKVGetCommand_Empty - 2019/11/27 02:26:58.208979 [INFO] agent: Started HTTP server on 127.0.0.1:35526 (tcp)
TestKVGetCommand_Empty - 2019/11/27 02:26:58.209076 [INFO] agent: started state syncer
2019/11/27 02:26:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:58 [INFO]  raft: Node at 127.0.0.1:35530 [Candidate] entering Candidate state in term 2
TestKVGetCommand_Keys - 2019/11/27 02:26:58.285430 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKVGetCommand_Keys - 2019/11/27 02:26:58.285889 [DEBUG] consul: Skipping self join check for "Node 57f652c9-b2ee-8962-93b3-521133ed0027" since the cluster is too small
TestKVGetCommand_Keys - 2019/11/27 02:26:58.286046 [INFO] consul: member 'Node 57f652c9-b2ee-8962-93b3-521133ed0027' joined, marking health alive
TestKVGetCommand_Keys - 2019/11/27 02:26:58.287672 [INFO] manager: shutting down
TestKVGetCommand_Keys - 2019/11/27 02:26:58.414251 [ERR] agent: failed to sync remote state: No cluster leader
TestKVGetCommand_Keys - 2019/11/27 02:26:58.511163 [INFO] agent: consul server down
TestKVGetCommand_Keys - 2019/11/27 02:26:58.511238 [INFO] agent: shutdown complete
TestKVGetCommand_Keys - 2019/11/27 02:26:58.511301 [INFO] agent: Stopping DNS server 127.0.0.1:35513 (tcp)
TestKVGetCommand_Keys - 2019/11/27 02:26:58.511449 [INFO] agent: Stopping DNS server 127.0.0.1:35513 (udp)
TestKVGetCommand_Keys - 2019/11/27 02:26:58.511614 [INFO] agent: Stopping HTTP server 127.0.0.1:35514 (tcp)
TestKVGetCommand_Keys - 2019/11/27 02:26:58.512429 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Keys - 2019/11/27 02:26:58.512734 [ERR] consul: failed to reconcile member: {Node 57f652c9-b2ee-8962-93b3-521133ed0027 127.0.0.1 35516 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:57f652c9-b2ee-8962-93b3-521133ed0027 port:35518 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:35517] alive 1 5 2 2 5 4}: leadership lost while committing log
TestKVGetCommand_Keys - 2019/11/27 02:26:58.513026 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Keys (5.90s)
=== CONT  TestKVGetCommand_Validation
--- PASS: TestKVGetCommand_Validation (0.00s)
=== CONT  TestKVGetCommand_Base64
WARNING: bootstrap = true: do not enable unless necessary
TestKVGetCommand_Base64 - 2019/11/27 02:26:58.647126 [WARN] agent: Node name "Node abc0430f-a36f-bceb-1bd0-813e66e8bf30" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVGetCommand_Base64 - 2019/11/27 02:26:58.647542 [DEBUG] tlsutil: Update with version 1
TestKVGetCommand_Base64 - 2019/11/27 02:26:58.647608 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVGetCommand_Base64 - 2019/11/27 02:26:58.647834 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVGetCommand_Base64 - 2019/11/27 02:26:58.647927 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:26:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d5ea61d2-c6a5-0f43-1273-ac1031cc83ee Address:127.0.0.1:35536}]
2019/11/27 02:26:58 [INFO]  raft: Node at 127.0.0.1:35536 [Follower] entering Follower state (Leader: "")
TestKVGetCommand - 2019/11/27 02:26:58.779935 [INFO] serf: EventMemberJoin: Node d5ea61d2-c6a5-0f43-1273-ac1031cc83ee.dc1 127.0.0.1
TestKVGetCommand - 2019/11/27 02:26:58.797301 [INFO] serf: EventMemberJoin: Node d5ea61d2-c6a5-0f43-1273-ac1031cc83ee 127.0.0.1
TestKVGetCommand - 2019/11/27 02:26:58.798958 [INFO] consul: Handled member-join event for server "Node d5ea61d2-c6a5-0f43-1273-ac1031cc83ee.dc1" in area "wan"
TestKVGetCommand - 2019/11/27 02:26:58.799350 [INFO] consul: Adding LAN server Node d5ea61d2-c6a5-0f43-1273-ac1031cc83ee (Addr: tcp/127.0.0.1:35536) (DC: dc1)
TestKVGetCommand - 2019/11/27 02:26:58.804641 [INFO] agent: Started DNS server 127.0.0.1:35531 (tcp)
TestKVGetCommand - 2019/11/27 02:26:58.804719 [INFO] agent: Started DNS server 127.0.0.1:35531 (udp)
TestKVGetCommand - 2019/11/27 02:26:58.806798 [INFO] agent: Started HTTP server on 127.0.0.1:35532 (tcp)
TestKVGetCommand - 2019/11/27 02:26:58.806889 [INFO] agent: started state syncer
2019/11/27 02:26:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:58 [INFO]  raft: Node at 127.0.0.1:35536 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:59 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:59 [INFO]  raft: Node at 127.0.0.1:35530 [Leader] entering Leader state
TestKVGetCommand_Empty - 2019/11/27 02:26:59.223377 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Empty - 2019/11/27 02:26:59.223842 [INFO] consul: New leader elected: Node 28fa8a43-b889-4937-bec0-ec8272adbf51
2019/11/27 02:26:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d889d12a-1de1-a2f8-2359-73202b14bb7a Address:127.0.0.1:35542}]
2019/11/27 02:26:59 [INFO]  raft: Node at 127.0.0.1:35542 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Missing - 2019/11/27 02:26:59.543744 [INFO] serf: EventMemberJoin: Node d889d12a-1de1-a2f8-2359-73202b14bb7a.dc1 127.0.0.1
TestKVGetCommand_Missing - 2019/11/27 02:26:59.550734 [INFO] serf: EventMemberJoin: Node d889d12a-1de1-a2f8-2359-73202b14bb7a 127.0.0.1
TestKVGetCommand_Missing - 2019/11/27 02:26:59.552089 [INFO] consul: Adding LAN server Node d889d12a-1de1-a2f8-2359-73202b14bb7a (Addr: tcp/127.0.0.1:35542) (DC: dc1)
TestKVGetCommand_Missing - 2019/11/27 02:26:59.552786 [INFO] consul: Handled member-join event for server "Node d889d12a-1de1-a2f8-2359-73202b14bb7a.dc1" in area "wan"
TestKVGetCommand_Missing - 2019/11/27 02:26:59.555812 [INFO] agent: Started DNS server 127.0.0.1:35537 (tcp)
TestKVGetCommand_Missing - 2019/11/27 02:26:59.556013 [INFO] agent: Started DNS server 127.0.0.1:35537 (udp)
TestKVGetCommand_Missing - 2019/11/27 02:26:59.559841 [INFO] agent: Started HTTP server on 127.0.0.1:35538 (tcp)
TestKVGetCommand_Missing - 2019/11/27 02:26:59.559967 [INFO] agent: started state syncer
2019/11/27 02:26:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:26:59 [INFO]  raft: Node at 127.0.0.1:35542 [Candidate] entering Candidate state in term 2
2019/11/27 02:26:59 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:26:59 [INFO]  raft: Node at 127.0.0.1:35536 [Leader] entering Leader state
TestKVGetCommand - 2019/11/27 02:26:59.981180 [INFO] consul: cluster leadership acquired
TestKVGetCommand - 2019/11/27 02:26:59.981741 [INFO] consul: New leader elected: Node d5ea61d2-c6a5-0f43-1273-ac1031cc83ee
TestKVGetCommand_Empty - 2019/11/27 02:26:59.982682 [INFO] agent: Synced node info
TestKVGetCommand_Empty - 2019/11/27 02:26:59.983667 [DEBUG] http: Request PUT /v1/kv/empty (512.189604ms) from=127.0.0.1:51880
TestKVGetCommand_Empty - 2019/11/27 02:26:59.989241 [DEBUG] http: Request GET /v1/kv/empty (1.757729ms) from=127.0.0.1:51882
TestKVGetCommand_Empty - 2019/11/27 02:26:59.991318 [INFO] agent: Requesting shutdown
TestKVGetCommand_Empty - 2019/11/27 02:26:59.991401 [INFO] consul: shutting down server
TestKVGetCommand_Empty - 2019/11/27 02:26:59.991445 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Empty - 2019/11/27 02:27:00.054989 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Empty - 2019/11/27 02:27:00.133038 [INFO] manager: shutting down
TestKVGetCommand_Empty - 2019/11/27 02:27:00.134363 [INFO] agent: consul server down
TestKVGetCommand_Empty - 2019/11/27 02:27:00.134428 [INFO] agent: shutdown complete
TestKVGetCommand_Empty - 2019/11/27 02:27:00.134482 [INFO] agent: Stopping DNS server 127.0.0.1:35525 (tcp)
TestKVGetCommand_Empty - 2019/11/27 02:27:00.134671 [INFO] agent: Stopping DNS server 127.0.0.1:35525 (udp)
TestKVGetCommand_Empty - 2019/11/27 02:27:00.134903 [INFO] agent: Stopping HTTP server 127.0.0.1:35526 (tcp)
TestKVGetCommand_Empty - 2019/11/27 02:27:00.136038 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVGetCommand_Empty - 2019/11/27 02:27:00.136366 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVGetCommand_Empty - 2019/11/27 02:27:00.136605 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Empty - 2019/11/27 02:27:00.136761 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Empty (3.65s)
2019/11/27 02:27:00 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:abc0430f-a36f-bceb-1bd0-813e66e8bf30 Address:127.0.0.1:35548}]
TestKVGetCommand_Base64 - 2019/11/27 02:27:00.226208 [INFO] serf: EventMemberJoin: Node abc0430f-a36f-bceb-1bd0-813e66e8bf30.dc1 127.0.0.1
2019/11/27 02:27:00 [INFO]  raft: Node at 127.0.0.1:35548 [Follower] entering Follower state (Leader: "")
TestKVGetCommand_Base64 - 2019/11/27 02:27:00.229965 [INFO] serf: EventMemberJoin: Node abc0430f-a36f-bceb-1bd0-813e66e8bf30 127.0.0.1
TestKVGetCommand_Base64 - 2019/11/27 02:27:00.236049 [INFO] consul: Adding LAN server Node abc0430f-a36f-bceb-1bd0-813e66e8bf30 (Addr: tcp/127.0.0.1:35548) (DC: dc1)
TestKVGetCommand_Base64 - 2019/11/27 02:27:00.236668 [INFO] consul: Handled member-join event for server "Node abc0430f-a36f-bceb-1bd0-813e66e8bf30.dc1" in area "wan"
TestKVGetCommand_Base64 - 2019/11/27 02:27:00.238160 [INFO] agent: Started DNS server 127.0.0.1:35543 (tcp)
TestKVGetCommand_Base64 - 2019/11/27 02:27:00.238706 [INFO] agent: Started DNS server 127.0.0.1:35543 (udp)
TestKVGetCommand_Base64 - 2019/11/27 02:27:00.241541 [INFO] agent: Started HTTP server on 127.0.0.1:35544 (tcp)
TestKVGetCommand_Base64 - 2019/11/27 02:27:00.241646 [INFO] agent: started state syncer
2019/11/27 02:27:00 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:00 [INFO]  raft: Node at 127.0.0.1:35548 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:00 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:00 [INFO]  raft: Node at 127.0.0.1:35542 [Leader] entering Leader state
TestKVGetCommand_Missing - 2019/11/27 02:27:00.301834 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Missing - 2019/11/27 02:27:00.302591 [INFO] consul: New leader elected: Node d889d12a-1de1-a2f8-2359-73202b14bb7a
TestKVGetCommand_Missing - 2019/11/27 02:27:00.373500 [DEBUG] http: Request GET /v1/kv/not-a-real-key (453.016µs) from=127.0.0.1:33178
TestKVGetCommand_Missing - 2019/11/27 02:27:00.379818 [INFO] agent: Requesting shutdown
TestKVGetCommand_Missing - 2019/11/27 02:27:00.380203 [INFO] consul: shutting down server
TestKVGetCommand_Missing - 2019/11/27 02:27:00.380488 [WARN] serf: Shutdown without a Leave
TestKVGetCommand - 2019/11/27 02:27:01.224724 [INFO] agent: Synced node info
TestKVGetCommand - 2019/11/27 02:27:01.224863 [DEBUG] agent: Node info in sync
TestKVGetCommand_Missing - 2019/11/27 02:27:01.227104 [WARN] serf: Shutdown without a Leave
TestKVGetCommand - 2019/11/27 02:27:01.232654 [DEBUG] http: Request PUT /v1/kv/foo (1.069174802s) from=127.0.0.1:35798
TestKVGetCommand - 2019/11/27 02:27:01.239360 [DEBUG] http: Request GET /v1/kv/foo (728.693µs) from=127.0.0.1:35802
TestKVGetCommand - 2019/11/27 02:27:01.240675 [INFO] agent: Requesting shutdown
TestKVGetCommand - 2019/11/27 02:27:01.240759 [INFO] consul: shutting down server
TestKVGetCommand - 2019/11/27 02:27:01.240806 [WARN] serf: Shutdown without a Leave
TestKVGetCommand - 2019/11/27 02:27:01.678342 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Missing - 2019/11/27 02:27:01.679216 [INFO] manager: shutting down
TestKVGetCommand_Missing - 2019/11/27 02:27:01.777519 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestKVGetCommand_Missing - 2019/11/27 02:27:01.777762 [INFO] agent: consul server down
TestKVGetCommand_Missing - 2019/11/27 02:27:01.777812 [INFO] agent: shutdown complete
TestKVGetCommand_Missing - 2019/11/27 02:27:01.777862 [INFO] agent: Stopping DNS server 127.0.0.1:35537 (tcp)
TestKVGetCommand_Missing - 2019/11/27 02:27:01.778000 [INFO] agent: Stopping DNS server 127.0.0.1:35537 (udp)
TestKVGetCommand_Missing - 2019/11/27 02:27:01.778144 [INFO] agent: Stopping HTTP server 127.0.0.1:35538 (tcp)
TestKVGetCommand_Missing - 2019/11/27 02:27:01.778576 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Missing - 2019/11/27 02:27:01.778658 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Missing (3.93s)
TestKVGetCommand - 2019/11/27 02:27:01.779000 [INFO] manager: shutting down
TestKVGetCommand_Missing - 2019/11/27 02:27:01.777567 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestKVGetCommand_Missing - 2019/11/27 02:27:01.779790 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestKVGetCommand - 2019/11/27 02:27:02.088578 [INFO] agent: consul server down
TestKVGetCommand - 2019/11/27 02:27:02.088675 [INFO] agent: shutdown complete
TestKVGetCommand - 2019/11/27 02:27:02.088748 [INFO] agent: Stopping DNS server 127.0.0.1:35531 (tcp)
TestKVGetCommand - 2019/11/27 02:27:02.088918 [INFO] agent: Stopping DNS server 127.0.0.1:35531 (udp)
TestKVGetCommand - 2019/11/27 02:27:02.089149 [INFO] agent: Stopping HTTP server 127.0.0.1:35532 (tcp)
TestKVGetCommand - 2019/11/27 02:27:02.089984 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand - 2019/11/27 02:27:02.090106 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVGetCommand - 2019/11/27 02:27:02.090316 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand (4.80s)
2019/11/27 02:27:02 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:02 [INFO]  raft: Node at 127.0.0.1:35548 [Leader] entering Leader state
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.091370 [INFO] consul: cluster leadership acquired
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.091889 [INFO] consul: New leader elected: Node abc0430f-a36f-bceb-1bd0-813e66e8bf30
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.600785 [INFO] agent: Synced node info
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.600919 [DEBUG] agent: Node info in sync
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.602934 [DEBUG] http: Request PUT /v1/kv/foo (361.507559ms) from=127.0.0.1:47132
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.607103 [DEBUG] http: Request GET /v1/kv/foo (1.304047ms) from=127.0.0.1:47134
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.609209 [INFO] agent: Requesting shutdown
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.609300 [INFO] consul: shutting down server
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.609347 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.732784 [WARN] serf: Shutdown without a Leave
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.736563 [DEBUG] agent: Node info in sync
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.799320 [INFO] manager: shutting down
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.800335 [INFO] agent: consul server down
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.800395 [INFO] agent: shutdown complete
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.800450 [INFO] agent: Stopping DNS server 127.0.0.1:35543 (tcp)
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.800585 [INFO] agent: Stopping DNS server 127.0.0.1:35543 (udp)
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.800734 [INFO] agent: Stopping HTTP server 127.0.0.1:35544 (tcp)
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.801452 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.801499 [INFO] agent: Waiting for endpoints to shut down
TestKVGetCommand_Base64 - 2019/11/27 02:27:02.801631 [INFO] agent: Endpoints down
--- PASS: TestKVGetCommand_Base64 (4.29s)
PASS
ok  	github.com/hashicorp/consul/command/kv/get	10.492s
=== RUN   TestKVImportCommand_noTabs
=== PAUSE TestKVImportCommand_noTabs
=== RUN   TestKVImportCommand
=== PAUSE TestKVImportCommand
=== CONT  TestKVImportCommand_noTabs
=== CONT  TestKVImportCommand
--- PASS: TestKVImportCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestKVImportCommand - 2019/11/27 02:26:59.888929 [WARN] agent: Node name "Node 3fbf3f96-9f27-15ee-6e8c-8fde4567e5ea" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVImportCommand - 2019/11/27 02:26:59.890237 [DEBUG] tlsutil: Update with version 1
TestKVImportCommand - 2019/11/27 02:26:59.890317 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVImportCommand - 2019/11/27 02:26:59.890566 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVImportCommand - 2019/11/27 02:26:59.890789 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:27:02 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3fbf3f96-9f27-15ee-6e8c-8fde4567e5ea Address:127.0.0.1:40006}]
2019/11/27 02:27:02 [INFO]  raft: Node at 127.0.0.1:40006 [Follower] entering Follower state (Leader: "")
TestKVImportCommand - 2019/11/27 02:27:02.093974 [INFO] serf: EventMemberJoin: Node 3fbf3f96-9f27-15ee-6e8c-8fde4567e5ea.dc1 127.0.0.1
TestKVImportCommand - 2019/11/27 02:27:02.098146 [INFO] serf: EventMemberJoin: Node 3fbf3f96-9f27-15ee-6e8c-8fde4567e5ea 127.0.0.1
TestKVImportCommand - 2019/11/27 02:27:02.099080 [INFO] consul: Adding LAN server Node 3fbf3f96-9f27-15ee-6e8c-8fde4567e5ea (Addr: tcp/127.0.0.1:40006) (DC: dc1)
TestKVImportCommand - 2019/11/27 02:27:02.099612 [INFO] consul: Handled member-join event for server "Node 3fbf3f96-9f27-15ee-6e8c-8fde4567e5ea.dc1" in area "wan"
TestKVImportCommand - 2019/11/27 02:27:02.100010 [INFO] agent: Started DNS server 127.0.0.1:40001 (tcp)
TestKVImportCommand - 2019/11/27 02:27:02.100447 [INFO] agent: Started DNS server 127.0.0.1:40001 (udp)
TestKVImportCommand - 2019/11/27 02:27:02.102753 [INFO] agent: Started HTTP server on 127.0.0.1:40002 (tcp)
TestKVImportCommand - 2019/11/27 02:27:02.102890 [INFO] agent: started state syncer
2019/11/27 02:27:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:02 [INFO]  raft: Node at 127.0.0.1:40006 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:02 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:02 [INFO]  raft: Node at 127.0.0.1:40006 [Leader] entering Leader state
TestKVImportCommand - 2019/11/27 02:27:02.799866 [INFO] consul: cluster leadership acquired
TestKVImportCommand - 2019/11/27 02:27:02.800510 [INFO] consul: New leader elected: Node 3fbf3f96-9f27-15ee-6e8c-8fde4567e5ea
TestKVImportCommand - 2019/11/27 02:27:03.142322 [DEBUG] http: Request PUT /v1/kv/foo (208.535103ms) from=127.0.0.1:49458
TestKVImportCommand - 2019/11/27 02:27:03.149307 [INFO] agent: Synced node info
TestKVImportCommand - 2019/11/27 02:27:04.037466 [DEBUG] http: Request PUT /v1/kv/foo/a (888.52035ms) from=127.0.0.1:49458
TestKVImportCommand - 2019/11/27 02:27:04.058174 [DEBUG] http: Request GET /v1/kv/foo (1.806064ms) from=127.0.0.1:49460
TestKVImportCommand - 2019/11/27 02:27:04.063053 [DEBUG] http: Request GET /v1/kv/foo/a (1.764063ms) from=127.0.0.1:49460
TestKVImportCommand - 2019/11/27 02:27:04.064789 [INFO] agent: Requesting shutdown
TestKVImportCommand - 2019/11/27 02:27:04.064901 [INFO] consul: shutting down server
TestKVImportCommand - 2019/11/27 02:27:04.064955 [WARN] serf: Shutdown without a Leave
TestKVImportCommand - 2019/11/27 02:27:04.165714 [DEBUG] agent: Node info in sync
TestKVImportCommand - 2019/11/27 02:27:04.165829 [DEBUG] agent: Node info in sync
TestKVImportCommand - 2019/11/27 02:27:04.176954 [WARN] serf: Shutdown without a Leave
TestKVImportCommand - 2019/11/27 02:27:04.277090 [INFO] manager: shutting down
TestKVImportCommand - 2019/11/27 02:27:04.377138 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestKVImportCommand - 2019/11/27 02:27:04.377382 [INFO] agent: consul server down
TestKVImportCommand - 2019/11/27 02:27:04.377434 [INFO] agent: shutdown complete
TestKVImportCommand - 2019/11/27 02:27:04.377489 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (tcp)
TestKVImportCommand - 2019/11/27 02:27:04.377624 [INFO] agent: Stopping DNS server 127.0.0.1:40001 (udp)
TestKVImportCommand - 2019/11/27 02:27:04.377809 [INFO] agent: Stopping HTTP server 127.0.0.1:40002 (tcp)
TestKVImportCommand - 2019/11/27 02:27:04.378359 [INFO] agent: Waiting for endpoints to shut down
TestKVImportCommand - 2019/11/27 02:27:04.378466 [INFO] agent: Endpoints down
--- PASS: TestKVImportCommand (4.56s)
PASS
ok  	github.com/hashicorp/consul/command/kv/imp	4.687s
?   	github.com/hashicorp/consul/command/kv/impexp	[no test files]
=== RUN   TestKVPutCommand_noTabs
=== PAUSE TestKVPutCommand_noTabs
=== RUN   TestKVPutCommand_Validation
=== PAUSE TestKVPutCommand_Validation
=== RUN   TestKVPutCommand
=== PAUSE TestKVPutCommand
=== RUN   TestKVPutCommand_EmptyDataQuoted
--- SKIP: TestKVPutCommand_EmptyDataQuoted (0.00s)
    kv_put_test.go:108: DM-skipped
=== RUN   TestKVPutCommand_Base64
=== PAUSE TestKVPutCommand_Base64
=== RUN   TestKVPutCommand_File
=== PAUSE TestKVPutCommand_File
=== RUN   TestKVPutCommand_FileNoExist
=== PAUSE TestKVPutCommand_FileNoExist
=== RUN   TestKVPutCommand_Stdin
=== PAUSE TestKVPutCommand_Stdin
=== RUN   TestKVPutCommand_NegativeVal
=== PAUSE TestKVPutCommand_NegativeVal
=== RUN   TestKVPutCommand_Flags
=== PAUSE TestKVPutCommand_Flags
=== RUN   TestKVPutCommand_CAS
=== PAUSE TestKVPutCommand_CAS
=== CONT  TestKVPutCommand_noTabs
=== CONT  TestKVPutCommand_Stdin
--- PASS: TestKVPutCommand_noTabs (0.00s)
=== CONT  TestKVPutCommand_NegativeVal
=== CONT  TestKVPutCommand_CAS
=== CONT  TestKVPutCommand_Flags
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:07.169403 [WARN] agent: Node name "Node 09feaff4-334c-b7b1-859b-4e123149e56b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:07.194419 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:07.194527 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:07.194780 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:07.194957 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_Flags - 2019/11/27 02:27:07.229510 [WARN] agent: Node name "Node 539aa565-88ba-7406-ad43-99d5987854a2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_Flags - 2019/11/27 02:27:07.230035 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_Flags - 2019/11/27 02:27:07.230102 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_Stdin - 2019/11/27 02:27:07.239883 [WARN] agent: Node name "Node bd5ec137-571b-e00c-4a9d-7a9ced33ed8a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_Stdin - 2019/11/27 02:27:07.240263 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_Stdin - 2019/11/27 02:27:07.240328 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_Stdin - 2019/11/27 02:27:07.240497 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVPutCommand_Stdin - 2019/11/27 02:27:07.240601 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_Flags - 2019/11/27 02:27:07.241955 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVPutCommand_Flags - 2019/11/27 02:27:07.242109 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_CAS - 2019/11/27 02:27:07.247255 [WARN] agent: Node name "Node e079756d-569d-e09d-b1af-c311711eb1f9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_CAS - 2019/11/27 02:27:07.247648 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_CAS - 2019/11/27 02:27:07.247739 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_CAS - 2019/11/27 02:27:07.247927 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVPutCommand_CAS - 2019/11/27 02:27:07.248028 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:27:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:09feaff4-334c-b7b1-859b-4e123149e56b Address:127.0.0.1:32512}]
2019/11/27 02:27:08 [INFO]  raft: Node at 127.0.0.1:32512 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:08.270934 [INFO] serf: EventMemberJoin: Node 09feaff4-334c-b7b1-859b-4e123149e56b.dc1 127.0.0.1
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:08.274429 [INFO] serf: EventMemberJoin: Node 09feaff4-334c-b7b1-859b-4e123149e56b 127.0.0.1
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:08.275236 [INFO] consul: Handled member-join event for server "Node 09feaff4-334c-b7b1-859b-4e123149e56b.dc1" in area "wan"
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:08.275646 [INFO] consul: Adding LAN server Node 09feaff4-334c-b7b1-859b-4e123149e56b (Addr: tcp/127.0.0.1:32512) (DC: dc1)
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:08.275942 [INFO] agent: Started DNS server 127.0.0.1:32507 (tcp)
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:08.276372 [INFO] agent: Started DNS server 127.0.0.1:32507 (udp)
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:08.278664 [INFO] agent: Started HTTP server on 127.0.0.1:32508 (tcp)
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:08.279089 [INFO] agent: started state syncer
2019/11/27 02:27:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:08 [INFO]  raft: Node at 127.0.0.1:32512 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e079756d-569d-e09d-b1af-c311711eb1f9 Address:127.0.0.1:32518}]
2019/11/27 02:27:08 [INFO]  raft: Node at 127.0.0.1:32518 [Follower] entering Follower state (Leader: "")
2019/11/27 02:27:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:bd5ec137-571b-e00c-4a9d-7a9ced33ed8a Address:127.0.0.1:32506}]
2019/11/27 02:27:08 [INFO]  raft: Node at 127.0.0.1:32506 [Follower] entering Follower state (Leader: "")
2019/11/27 02:27:08 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:539aa565-88ba-7406-ad43-99d5987854a2 Address:127.0.0.1:32524}]
2019/11/27 02:27:08 [INFO]  raft: Node at 127.0.0.1:32524 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_CAS - 2019/11/27 02:27:08.389567 [INFO] serf: EventMemberJoin: Node e079756d-569d-e09d-b1af-c311711eb1f9.dc1 127.0.0.1
TestKVPutCommand_Stdin - 2019/11/27 02:27:08.391000 [INFO] serf: EventMemberJoin: Node bd5ec137-571b-e00c-4a9d-7a9ced33ed8a.dc1 127.0.0.1
TestKVPutCommand_Stdin - 2019/11/27 02:27:08.429834 [INFO] serf: EventMemberJoin: Node bd5ec137-571b-e00c-4a9d-7a9ced33ed8a 127.0.0.1
TestKVPutCommand_Flags - 2019/11/27 02:27:08.430377 [INFO] serf: EventMemberJoin: Node 539aa565-88ba-7406-ad43-99d5987854a2.dc1 127.0.0.1
TestKVPutCommand_Stdin - 2019/11/27 02:27:08.432557 [INFO] agent: Started DNS server 127.0.0.1:32501 (udp)
2019/11/27 02:27:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:08 [INFO]  raft: Node at 127.0.0.1:32524 [Candidate] entering Candidate state in term 2
TestKVPutCommand_Stdin - 2019/11/27 02:27:08.435468 [INFO] consul: Adding LAN server Node bd5ec137-571b-e00c-4a9d-7a9ced33ed8a (Addr: tcp/127.0.0.1:32506) (DC: dc1)
TestKVPutCommand_Stdin - 2019/11/27 02:27:08.435748 [INFO] consul: Handled member-join event for server "Node bd5ec137-571b-e00c-4a9d-7a9ced33ed8a.dc1" in area "wan"
TestKVPutCommand_Stdin - 2019/11/27 02:27:08.436604 [INFO] agent: Started DNS server 127.0.0.1:32501 (tcp)
TestKVPutCommand_Stdin - 2019/11/27 02:27:08.439324 [INFO] agent: Started HTTP server on 127.0.0.1:32502 (tcp)
TestKVPutCommand_Stdin - 2019/11/27 02:27:08.439441 [INFO] agent: started state syncer
2019/11/27 02:27:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:08 [INFO]  raft: Node at 127.0.0.1:32506 [Candidate] entering Candidate state in term 2
TestKVPutCommand_CAS - 2019/11/27 02:27:08.434270 [INFO] serf: EventMemberJoin: Node e079756d-569d-e09d-b1af-c311711eb1f9 127.0.0.1
TestKVPutCommand_CAS - 2019/11/27 02:27:08.444318 [INFO] consul: Handled member-join event for server "Node e079756d-569d-e09d-b1af-c311711eb1f9.dc1" in area "wan"
TestKVPutCommand_CAS - 2019/11/27 02:27:08.444609 [INFO] consul: Adding LAN server Node e079756d-569d-e09d-b1af-c311711eb1f9 (Addr: tcp/127.0.0.1:32518) (DC: dc1)
TestKVPutCommand_CAS - 2019/11/27 02:27:08.444890 [INFO] agent: Started DNS server 127.0.0.1:32513 (udp)
2019/11/27 02:27:08 [WARN]  raft: Heartbeat timeout from "" reached, starting election
TestKVPutCommand_CAS - 2019/11/27 02:27:08.445180 [INFO] agent: Started DNS server 127.0.0.1:32513 (tcp)
2019/11/27 02:27:08 [INFO]  raft: Node at 127.0.0.1:32518 [Candidate] entering Candidate state in term 2
TestKVPutCommand_CAS - 2019/11/27 02:27:08.447306 [INFO] agent: Started HTTP server on 127.0.0.1:32514 (tcp)
TestKVPutCommand_CAS - 2019/11/27 02:27:08.447413 [INFO] agent: started state syncer
TestKVPutCommand_Flags - 2019/11/27 02:27:08.448959 [INFO] serf: EventMemberJoin: Node 539aa565-88ba-7406-ad43-99d5987854a2 127.0.0.1
TestKVPutCommand_Flags - 2019/11/27 02:27:08.450260 [INFO] consul: Handled member-join event for server "Node 539aa565-88ba-7406-ad43-99d5987854a2.dc1" in area "wan"
TestKVPutCommand_Flags - 2019/11/27 02:27:08.450398 [INFO] consul: Adding LAN server Node 539aa565-88ba-7406-ad43-99d5987854a2 (Addr: tcp/127.0.0.1:32524) (DC: dc1)
TestKVPutCommand_Flags - 2019/11/27 02:27:08.451221 [INFO] agent: Started DNS server 127.0.0.1:32519 (tcp)
TestKVPutCommand_Flags - 2019/11/27 02:27:08.451654 [INFO] agent: Started DNS server 127.0.0.1:32519 (udp)
TestKVPutCommand_Flags - 2019/11/27 02:27:08.454294 [INFO] agent: Started HTTP server on 127.0.0.1:32520 (tcp)
TestKVPutCommand_Flags - 2019/11/27 02:27:08.454640 [INFO] agent: started state syncer
2019/11/27 02:27:09 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:09 [INFO]  raft: Node at 127.0.0.1:32524 [Leader] entering Leader state
2019/11/27 02:27:09 [INFO]  raft: Election won. Tally: 1
TestKVPutCommand_Flags - 2019/11/27 02:27:09.011198 [INFO] consul: cluster leadership acquired
TestKVPutCommand_Flags - 2019/11/27 02:27:09.011892 [INFO] consul: New leader elected: Node 539aa565-88ba-7406-ad43-99d5987854a2
2019/11/27 02:27:09 [INFO]  raft: Node at 127.0.0.1:32512 [Leader] entering Leader state
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.013642 [INFO] consul: cluster leadership acquired
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.014090 [INFO] consul: New leader elected: Node 09feaff4-334c-b7b1-859b-4e123149e56b
2019/11/27 02:27:09 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:09 [INFO]  raft: Node at 127.0.0.1:32518 [Leader] entering Leader state
2019/11/27 02:27:09 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:09 [INFO]  raft: Node at 127.0.0.1:32506 [Leader] entering Leader state
TestKVPutCommand_CAS - 2019/11/27 02:27:09.123179 [INFO] consul: cluster leadership acquired
TestKVPutCommand_CAS - 2019/11/27 02:27:09.123615 [INFO] consul: New leader elected: Node e079756d-569d-e09d-b1af-c311711eb1f9
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.123858 [INFO] consul: cluster leadership acquired
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.124157 [INFO] consul: New leader elected: Node bd5ec137-571b-e00c-4a9d-7a9ced33ed8a
TestKVPutCommand_Flags - 2019/11/27 02:27:09.445444 [INFO] agent: Synced node info
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.462815 [DEBUG] http: Request PUT /v1/kv/foo (365.174015ms) from=127.0.0.1:33138
TestKVPutCommand_Flags - 2019/11/27 02:27:09.465087 [DEBUG] http: Request PUT /v1/kv/foo?flags=12345 (278.166248ms) from=127.0.0.1:54220
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.472955 [INFO] agent: Synced node info
TestKVPutCommand_Flags - 2019/11/27 02:27:09.483265 [DEBUG] http: Request GET /v1/kv/foo (5.079848ms) from=127.0.0.1:54226
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.484385 [DEBUG] http: Request GET /v1/kv/foo (1.644392ms) from=127.0.0.1:33148
TestKVPutCommand_Flags - 2019/11/27 02:27:09.485812 [INFO] agent: Requesting shutdown
TestKVPutCommand_Flags - 2019/11/27 02:27:09.485952 [INFO] consul: shutting down server
TestKVPutCommand_Flags - 2019/11/27 02:27:09.486010 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.488357 [INFO] agent: Requesting shutdown
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.488704 [INFO] consul: shutting down server
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.488900 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_CAS - 2019/11/27 02:27:09.557420 [DEBUG] http: Request PUT /v1/kv/foo (333.535554ms) from=127.0.0.1:50716
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.559346 [INFO] agent: Synced node info
TestKVPutCommand_CAS - 2019/11/27 02:27:09.560043 [INFO] agent: Synced node info
TestKVPutCommand_CAS - 2019/11/27 02:27:09.560286 [DEBUG] agent: Node info in sync
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.654648 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Flags - 2019/11/27 02:27:09.656182 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.733215 [INFO] manager: shutting down
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.738216 [DEBUG] http: Request PUT /v1/kv/foo (346.261341ms) from=127.0.0.1:47722
TestKVPutCommand_Flags - 2019/11/27 02:27:09.738599 [INFO] manager: shutting down
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.743523 [DEBUG] http: Request GET /v1/kv/foo (2.051073ms) from=127.0.0.1:47730
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.745592 [INFO] agent: Requesting shutdown
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.745688 [INFO] consul: shutting down server
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.745742 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.866945 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.976924 [INFO] manager: shutting down
TestKVPutCommand_Flags - 2019/11/27 02:27:09.977161 [INFO] agent: consul server down
TestKVPutCommand_Flags - 2019/11/27 02:27:09.977233 [INFO] agent: shutdown complete
TestKVPutCommand_Flags - 2019/11/27 02:27:09.977295 [INFO] agent: Stopping DNS server 127.0.0.1:32519 (tcp)
TestKVPutCommand_Flags - 2019/11/27 02:27:09.977479 [INFO] agent: Stopping DNS server 127.0.0.1:32519 (udp)
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.977530 [INFO] agent: consul server down
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.977577 [INFO] agent: shutdown complete
TestKVPutCommand_Flags - 2019/11/27 02:27:09.977659 [INFO] agent: Stopping HTTP server 127.0.0.1:32520 (tcp)
TestKVPutCommand_Flags - 2019/11/27 02:27:09.978446 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_Flags - 2019/11/27 02:27:09.977194 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.977667 [INFO] agent: Stopping DNS server 127.0.0.1:32501 (tcp)
TestKVPutCommand_Flags - 2019/11/27 02:27:09.978680 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_Flags (3.05s)
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.978822 [INFO] agent: Stopping DNS server 127.0.0.1:32501 (udp)
=== CONT  TestKVPutCommand_FileNoExist
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.978989 [INFO] agent: Stopping HTTP server 127.0.0.1:32502 (tcp)
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.979151 [INFO] agent: consul server down
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.979197 [INFO] agent: shutdown complete
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.979248 [INFO] agent: Stopping DNS server 127.0.0.1:32507 (tcp)
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.979378 [INFO] agent: Stopping DNS server 127.0.0.1:32507 (udp)
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.979524 [INFO] agent: Stopping HTTP server 127.0.0.1:32508 (tcp)
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.979595 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.978002 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVPutCommand_Stdin - 2019/11/27 02:27:09.979787 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_Stdin (3.06s)
=== CONT  TestKVPutCommand
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.980178 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.980317 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVPutCommand_NegativeVal - 2019/11/27 02:27:09.980525 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_NegativeVal (3.06s)
=== CONT  TestKVPutCommand_File
--- PASS: TestKVPutCommand_FileNoExist (0.00s)
=== CONT  TestKVPutCommand_Validation
--- PASS: TestKVPutCommand_Validation (0.02s)
=== CONT  TestKVPutCommand_Base64
TestKVPutCommand_CAS - 2019/11/27 02:27:10.075535 [DEBUG] http: Request PUT /v1/kv/foo?cas=123 (508.365452ms) from=127.0.0.1:50724
TestKVPutCommand_CAS - 2019/11/27 02:27:10.087677 [DEBUG] http: Request GET /v1/kv/foo (755.027µs) from=127.0.0.1:50716
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_Base64 - 2019/11/27 02:27:10.142333 [WARN] agent: Node name "Node 640a38fb-2089-0772-5718-808270a2d433" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_Base64 - 2019/11/27 02:27:10.142756 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_Base64 - 2019/11/27 02:27:10.142828 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_Base64 - 2019/11/27 02:27:10.142992 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVPutCommand_Base64 - 2019/11/27 02:27:10.143112 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand_File - 2019/11/27 02:27:10.158770 [WARN] agent: Node name "Node f1d0853a-4d76-f577-c6b3-7b0cc54c42c7" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand_File - 2019/11/27 02:27:10.159183 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand_File - 2019/11/27 02:27:10.159251 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_File - 2019/11/27 02:27:10.159411 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVPutCommand_File - 2019/11/27 02:27:10.159518 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKVPutCommand - 2019/11/27 02:27:10.190837 [WARN] agent: Node name "Node 4d139748-56f7-6290-1be5-44fe333ddc1f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKVPutCommand - 2019/11/27 02:27:10.191784 [DEBUG] tlsutil: Update with version 1
TestKVPutCommand - 2019/11/27 02:27:10.192039 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand - 2019/11/27 02:27:10.192769 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKVPutCommand - 2019/11/27 02:27:10.193104 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKVPutCommand_CAS - 2019/11/27 02:27:10.480072 [DEBUG] http: Request PUT /v1/kv/foo?cas=5 (388.380174ms) from=127.0.0.1:50728
TestKVPutCommand_CAS - 2019/11/27 02:27:10.483923 [DEBUG] http: Request GET /v1/kv/foo (748.36µs) from=127.0.0.1:50716
TestKVPutCommand_CAS - 2019/11/27 02:27:10.485608 [INFO] agent: Requesting shutdown
TestKVPutCommand_CAS - 2019/11/27 02:27:10.485701 [INFO] consul: shutting down server
TestKVPutCommand_CAS - 2019/11/27 02:27:10.485746 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_CAS - 2019/11/27 02:27:10.565522 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_CAS - 2019/11/27 02:27:10.633137 [INFO] manager: shutting down
TestKVPutCommand_CAS - 2019/11/27 02:27:10.633682 [INFO] agent: consul server down
TestKVPutCommand_CAS - 2019/11/27 02:27:10.633739 [INFO] agent: shutdown complete
TestKVPutCommand_CAS - 2019/11/27 02:27:10.633801 [INFO] agent: Stopping DNS server 127.0.0.1:32513 (tcp)
TestKVPutCommand_CAS - 2019/11/27 02:27:10.633956 [INFO] agent: Stopping DNS server 127.0.0.1:32513 (udp)
TestKVPutCommand_CAS - 2019/11/27 02:27:10.634124 [INFO] agent: Stopping HTTP server 127.0.0.1:32514 (tcp)
TestKVPutCommand_CAS - 2019/11/27 02:27:10.635041 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_CAS - 2019/11/27 02:27:10.635174 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestKVPutCommand_CAS - 2019/11/27 02:27:10.635339 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_CAS (3.71s)
2019/11/27 02:27:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4d139748-56f7-6290-1be5-44fe333ddc1f Address:127.0.0.1:32530}]
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:32530 [Follower] entering Follower state (Leader: "")
2019/11/27 02:27:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:640a38fb-2089-0772-5718-808270a2d433 Address:127.0.0.1:32542}]
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:32542 [Follower] entering Follower state (Leader: "")
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.016473 [INFO] serf: EventMemberJoin: Node 640a38fb-2089-0772-5718-808270a2d433.dc1 127.0.0.1
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.020255 [INFO] serf: EventMemberJoin: Node 640a38fb-2089-0772-5718-808270a2d433 127.0.0.1
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.021082 [INFO] consul: Handled member-join event for server "Node 640a38fb-2089-0772-5718-808270a2d433.dc1" in area "wan"
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.021477 [INFO] consul: Adding LAN server Node 640a38fb-2089-0772-5718-808270a2d433 (Addr: tcp/127.0.0.1:32542) (DC: dc1)
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.022373 [INFO] agent: Started DNS server 127.0.0.1:32537 (udp)
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.022463 [INFO] agent: Started DNS server 127.0.0.1:32537 (tcp)
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.024421 [INFO] agent: Started HTTP server on 127.0.0.1:32538 (tcp)
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.024496 [INFO] agent: started state syncer
2019/11/27 02:27:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f1d0853a-4d76-f577-c6b3-7b0cc54c42c7 Address:127.0.0.1:32536}]
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:32536 [Follower] entering Follower state (Leader: "")
TestKVPutCommand - 2019/11/27 02:27:11.035836 [INFO] serf: EventMemberJoin: Node 4d139748-56f7-6290-1be5-44fe333ddc1f.dc1 127.0.0.1
TestKVPutCommand_File - 2019/11/27 02:27:11.037768 [INFO] serf: EventMemberJoin: Node f1d0853a-4d76-f577-c6b3-7b0cc54c42c7.dc1 127.0.0.1
TestKVPutCommand - 2019/11/27 02:27:11.039626 [INFO] serf: EventMemberJoin: Node 4d139748-56f7-6290-1be5-44fe333ddc1f 127.0.0.1
TestKVPutCommand - 2019/11/27 02:27:11.040905 [INFO] agent: Started DNS server 127.0.0.1:32525 (udp)
TestKVPutCommand_File - 2019/11/27 02:27:11.041150 [INFO] serf: EventMemberJoin: Node f1d0853a-4d76-f577-c6b3-7b0cc54c42c7 127.0.0.1
TestKVPutCommand_File - 2019/11/27 02:27:11.042861 [INFO] agent: Started DNS server 127.0.0.1:32531 (udp)
TestKVPutCommand_File - 2019/11/27 02:27:11.043321 [INFO] consul: Handled member-join event for server "Node f1d0853a-4d76-f577-c6b3-7b0cc54c42c7.dc1" in area "wan"
TestKVPutCommand - 2019/11/27 02:27:11.043345 [INFO] consul: Adding LAN server Node 4d139748-56f7-6290-1be5-44fe333ddc1f (Addr: tcp/127.0.0.1:32530) (DC: dc1)
TestKVPutCommand - 2019/11/27 02:27:11.043584 [INFO] consul: Handled member-join event for server "Node 4d139748-56f7-6290-1be5-44fe333ddc1f.dc1" in area "wan"
TestKVPutCommand_File - 2019/11/27 02:27:11.043868 [INFO] consul: Adding LAN server Node f1d0853a-4d76-f577-c6b3-7b0cc54c42c7 (Addr: tcp/127.0.0.1:32536) (DC: dc1)
TestKVPutCommand - 2019/11/27 02:27:11.044053 [INFO] agent: Started DNS server 127.0.0.1:32525 (tcp)
TestKVPutCommand_File - 2019/11/27 02:27:11.044639 [INFO] agent: Started DNS server 127.0.0.1:32531 (tcp)
TestKVPutCommand - 2019/11/27 02:27:11.045958 [INFO] agent: Started HTTP server on 127.0.0.1:32526 (tcp)
TestKVPutCommand - 2019/11/27 02:27:11.046035 [INFO] agent: started state syncer
TestKVPutCommand_File - 2019/11/27 02:27:11.046591 [INFO] agent: Started HTTP server on 127.0.0.1:32532 (tcp)
TestKVPutCommand_File - 2019/11/27 02:27:11.046668 [INFO] agent: started state syncer
2019/11/27 02:27:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:32530 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:32536 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:32542 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:11 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:32530 [Leader] entering Leader state
2019/11/27 02:27:11 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:32536 [Leader] entering Leader state
TestKVPutCommand - 2019/11/27 02:27:11.632453 [INFO] consul: cluster leadership acquired
TestKVPutCommand - 2019/11/27 02:27:11.632908 [INFO] consul: New leader elected: Node 4d139748-56f7-6290-1be5-44fe333ddc1f
2019/11/27 02:27:11 [INFO]  raft: Election won. Tally: 1
TestKVPutCommand_File - 2019/11/27 02:27:11.633244 [INFO] consul: cluster leadership acquired
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:32542 [Leader] entering Leader state
TestKVPutCommand_File - 2019/11/27 02:27:11.633629 [INFO] consul: New leader elected: Node f1d0853a-4d76-f577-c6b3-7b0cc54c42c7
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.633857 [INFO] consul: cluster leadership acquired
TestKVPutCommand_Base64 - 2019/11/27 02:27:11.634206 [INFO] consul: New leader elected: Node 640a38fb-2089-0772-5718-808270a2d433
TestKVPutCommand_File - 2019/11/27 02:27:11.989388 [INFO] agent: Synced node info
TestKVPutCommand_File - 2019/11/27 02:27:11.989659 [DEBUG] http: Request PUT /v1/kv/foo (257.854521ms) from=127.0.0.1:46394
TestKVPutCommand_File - 2019/11/27 02:27:11.993612 [DEBUG] http: Request GET /v1/kv/foo (906.699µs) from=127.0.0.1:46398
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.002982 [INFO] agent: Synced node info
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.003930 [DEBUG] http: Request PUT /v1/kv/foo (204.247612ms) from=127.0.0.1:43580
TestKVPutCommand_File - 2019/11/27 02:27:12.006598 [INFO] agent: Requesting shutdown
TestKVPutCommand_File - 2019/11/27 02:27:12.006927 [INFO] consul: shutting down server
TestKVPutCommand_File - 2019/11/27 02:27:12.007086 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.009786 [DEBUG] http: Request GET /v1/kv/foo (1.087705ms) from=127.0.0.1:43584
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.011357 [INFO] agent: Requesting shutdown
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.011464 [INFO] consul: shutting down server
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.011520 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.165336 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_File - 2019/11/27 02:27:12.165336 [WARN] serf: Shutdown without a Leave
TestKVPutCommand - 2019/11/27 02:27:12.167999 [INFO] agent: Synced node info
TestKVPutCommand - 2019/11/27 02:27:12.169494 [DEBUG] http: Request PUT /v1/kv/foo (501.821881ms) from=127.0.0.1:52900
TestKVPutCommand - 2019/11/27 02:27:12.175127 [DEBUG] http: Request GET /v1/kv/foo (959.034µs) from=127.0.0.1:52910
TestKVPutCommand - 2019/11/27 02:27:12.177351 [INFO] agent: Requesting shutdown
TestKVPutCommand - 2019/11/27 02:27:12.177464 [INFO] consul: shutting down server
TestKVPutCommand - 2019/11/27 02:27:12.177520 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_File - 2019/11/27 02:27:12.265413 [INFO] manager: shutting down
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.265413 [INFO] manager: shutting down
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.266135 [INFO] agent: consul server down
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.266206 [INFO] agent: shutdown complete
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.266274 [INFO] agent: Stopping DNS server 127.0.0.1:32537 (tcp)
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.266445 [INFO] agent: Stopping DNS server 127.0.0.1:32537 (udp)
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.266740 [INFO] agent: Stopping HTTP server 127.0.0.1:32538 (tcp)
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.267497 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.267640 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.267955 [ERR] consul: failed to establish leadership: raft is already shutdown
TestKVPutCommand_Base64 - 2019/11/27 02:27:12.267968 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_Base64 (2.27s)
TestKVPutCommand - 2019/11/27 02:27:12.268504 [WARN] serf: Shutdown without a Leave
TestKVPutCommand_File - 2019/11/27 02:27:12.302470 [ERR] agent: failed to sync remote state: No cluster leader
TestKVPutCommand - 2019/11/27 02:27:12.354316 [INFO] manager: shutting down
TestKVPutCommand - 2019/11/27 02:27:12.354695 [INFO] agent: consul server down
TestKVPutCommand - 2019/11/27 02:27:12.354753 [INFO] agent: shutdown complete
TestKVPutCommand - 2019/11/27 02:27:12.354809 [INFO] agent: Stopping DNS server 127.0.0.1:32525 (tcp)
TestKVPutCommand - 2019/11/27 02:27:12.354962 [INFO] agent: Stopping DNS server 127.0.0.1:32525 (udp)
TestKVPutCommand - 2019/11/27 02:27:12.355146 [INFO] agent: Stopping HTTP server 127.0.0.1:32526 (tcp)
TestKVPutCommand - 2019/11/27 02:27:12.355903 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand - 2019/11/27 02:27:12.356105 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand (2.38s)
TestKVPutCommand_File - 2019/11/27 02:27:12.454451 [INFO] agent: consul server down
TestKVPutCommand_File - 2019/11/27 02:27:12.454532 [INFO] agent: shutdown complete
TestKVPutCommand_File - 2019/11/27 02:27:12.454610 [INFO] agent: Stopping DNS server 127.0.0.1:32531 (tcp)
TestKVPutCommand_File - 2019/11/27 02:27:12.454748 [INFO] agent: Stopping DNS server 127.0.0.1:32531 (udp)
TestKVPutCommand_File - 2019/11/27 02:27:12.454903 [INFO] agent: Stopping HTTP server 127.0.0.1:32532 (tcp)
TestKVPutCommand_File - 2019/11/27 02:27:12.455551 [INFO] agent: Waiting for endpoints to shut down
TestKVPutCommand_File - 2019/11/27 02:27:12.455675 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestKVPutCommand_File - 2019/11/27 02:27:12.455877 [INFO] agent: Endpoints down
--- PASS: TestKVPutCommand_File (2.48s)
PASS
ok  	github.com/hashicorp/consul/command/kv/put	5.687s
=== RUN   TestLeaveCommand_noTabs
=== PAUSE TestLeaveCommand_noTabs
=== RUN   TestLeaveCommand
=== PAUSE TestLeaveCommand
=== RUN   TestLeaveCommand_FailOnNonFlagArgs
=== PAUSE TestLeaveCommand_FailOnNonFlagArgs
=== CONT  TestLeaveCommand_noTabs
=== CONT  TestLeaveCommand
--- PASS: TestLeaveCommand_noTabs (0.00s)
=== CONT  TestLeaveCommand_FailOnNonFlagArgs
WARNING: bootstrap = true: do not enable unless necessary
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:09.423130 [WARN] agent: Node name "Node cb756770-9b7f-9b56-61a4-e6649adaffe8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:09.424236 [DEBUG] tlsutil: Update with version 1
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:09.424319 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:09.424585 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:09.424721 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLeaveCommand - 2019/11/27 02:27:09.433779 [WARN] agent: Node name "Node 53f59d08-86bf-cf7a-5c59-0faa16cef314" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLeaveCommand - 2019/11/27 02:27:09.434969 [DEBUG] tlsutil: Update with version 1
TestLeaveCommand - 2019/11/27 02:27:09.435164 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLeaveCommand - 2019/11/27 02:27:09.435660 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLeaveCommand - 2019/11/27 02:27:09.435871 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:27:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cb756770-9b7f-9b56-61a4-e6649adaffe8 Address:127.0.0.1:26512}]
2019/11/27 02:27:10 [INFO]  raft: Node at 127.0.0.1:26512 [Follower] entering Follower state (Leader: "")
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:10.484506 [INFO] serf: EventMemberJoin: Node cb756770-9b7f-9b56-61a4-e6649adaffe8.dc1 127.0.0.1
2019/11/27 02:27:10 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:53f59d08-86bf-cf7a-5c59-0faa16cef314 Address:127.0.0.1:26506}]
2019/11/27 02:27:10 [INFO]  raft: Node at 127.0.0.1:26506 [Follower] entering Follower state (Leader: "")
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:10.490438 [INFO] serf: EventMemberJoin: Node cb756770-9b7f-9b56-61a4-e6649adaffe8 127.0.0.1
TestLeaveCommand - 2019/11/27 02:27:10.499562 [INFO] serf: EventMemberJoin: Node 53f59d08-86bf-cf7a-5c59-0faa16cef314.dc1 127.0.0.1
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:10.504782 [INFO] agent: Started DNS server 127.0.0.1:26507 (udp)
TestLeaveCommand - 2019/11/27 02:27:10.508793 [INFO] serf: EventMemberJoin: Node 53f59d08-86bf-cf7a-5c59-0faa16cef314 127.0.0.1
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:10.522207 [INFO] agent: Started DNS server 127.0.0.1:26507 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:10.525436 [INFO] agent: Started HTTP server on 127.0.0.1:26508 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:10.525582 [INFO] agent: started state syncer
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:10.526166 [INFO] consul: Handled member-join event for server "Node cb756770-9b7f-9b56-61a4-e6649adaffe8.dc1" in area "wan"
TestLeaveCommand - 2019/11/27 02:27:10.535056 [INFO] agent: Started DNS server 127.0.0.1:26501 (udp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:10.536001 [INFO] consul: Adding LAN server Node cb756770-9b7f-9b56-61a4-e6649adaffe8 (Addr: tcp/127.0.0.1:26512) (DC: dc1)
TestLeaveCommand - 2019/11/27 02:27:10.536412 [INFO] consul: Handled member-join event for server "Node 53f59d08-86bf-cf7a-5c59-0faa16cef314.dc1" in area "wan"
TestLeaveCommand - 2019/11/27 02:27:10.536885 [INFO] agent: Started DNS server 127.0.0.1:26501 (tcp)
TestLeaveCommand - 2019/11/27 02:27:10.536989 [INFO] consul: Adding LAN server Node 53f59d08-86bf-cf7a-5c59-0faa16cef314 (Addr: tcp/127.0.0.1:26506) (DC: dc1)
TestLeaveCommand - 2019/11/27 02:27:10.539412 [INFO] agent: Started HTTP server on 127.0.0.1:26502 (tcp)
TestLeaveCommand - 2019/11/27 02:27:10.539508 [INFO] agent: started state syncer
2019/11/27 02:27:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:10 [INFO]  raft: Node at 127.0.0.1:26512 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:10 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:10 [INFO]  raft: Node at 127.0.0.1:26506 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:11 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:26512 [Leader] entering Leader state
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.012025 [INFO] consul: cluster leadership acquired
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.012634 [INFO] consul: New leader elected: Node cb756770-9b7f-9b56-61a4-e6649adaffe8
2019/11/27 02:27:11 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:11 [INFO]  raft: Node at 127.0.0.1:26506 [Leader] entering Leader state
TestLeaveCommand - 2019/11/27 02:27:11.013684 [INFO] consul: cluster leadership acquired
TestLeaveCommand - 2019/11/27 02:27:11.014134 [INFO] consul: New leader elected: Node 53f59d08-86bf-cf7a-5c59-0faa16cef314
TestLeaveCommand - 2019/11/27 02:27:11.035222 [INFO] consul: server starting leave
TestLeaveCommand - 2019/11/27 02:27:11.201793 [INFO] serf: EventMemberLeave: Node 53f59d08-86bf-cf7a-5c59-0faa16cef314.dc1 127.0.0.1
TestLeaveCommand - 2019/11/27 02:27:11.202118 [INFO] consul: Handled member-leave event for server "Node 53f59d08-86bf-cf7a-5c59-0faa16cef314.dc1" in area "wan"
TestLeaveCommand - 2019/11/27 02:27:11.202178 [INFO] manager: shutting down
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.304569 [INFO] agent: Requesting shutdown
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.304688 [INFO] consul: shutting down server
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.304738 [WARN] serf: Shutdown without a Leave
TestLeaveCommand - 2019/11/27 02:27:11.390280 [INFO] agent: Synced node info
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.390918 [WARN] serf: Shutdown without a Leave
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.477431 [INFO] manager: shutting down
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.479342 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.479466 [WARN] agent: Syncing node info failed. leadership lost while committing log
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.479516 [ERR] agent: failed to sync remote state: leadership lost while committing log
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.479569 [INFO] agent: consul server down
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.479610 [INFO] agent: shutdown complete
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.479657 [INFO] agent: Stopping DNS server 127.0.0.1:26507 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.479991 [INFO] agent: Stopping DNS server 127.0.0.1:26507 (udp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.480204 [INFO] agent: Stopping HTTP server 127.0.0.1:26508 (tcp)
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.480429 [INFO] agent: Waiting for endpoints to shut down
TestLeaveCommand_FailOnNonFlagArgs - 2019/11/27 02:27:11.480500 [INFO] agent: Endpoints down
--- PASS: TestLeaveCommand_FailOnNonFlagArgs (2.16s)
TestLeaveCommand - 2019/11/27 02:27:12.667124 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLeaveCommand - 2019/11/27 02:27:12.667548 [DEBUG] consul: Skipping self join check for "Node 53f59d08-86bf-cf7a-5c59-0faa16cef314" since the cluster is too small
TestLeaveCommand - 2019/11/27 02:27:12.667698 [INFO] consul: member 'Node 53f59d08-86bf-cf7a-5c59-0faa16cef314' joined, marking health alive
TestLeaveCommand - 2019/11/27 02:27:12.955888 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLeaveCommand - 2019/11/27 02:27:12.956011 [DEBUG] agent: Node info in sync
TestLeaveCommand - 2019/11/27 02:27:12.956102 [DEBUG] agent: Node info in sync
TestLeaveCommand - 2019/11/27 02:27:14.202356 [INFO] serf: EventMemberLeave: Node 53f59d08-86bf-cf7a-5c59-0faa16cef314 127.0.0.1
TestLeaveCommand - 2019/11/27 02:27:14.202725 [INFO] consul: Removing LAN server Node 53f59d08-86bf-cf7a-5c59-0faa16cef314 (Addr: tcp/127.0.0.1:26506) (DC: dc1)
TestLeaveCommand - 2019/11/27 02:27:14.202914 [WARN] consul: deregistering self (Node 53f59d08-86bf-cf7a-5c59-0faa16cef314) should be done by follower
TestLeaveCommand - 2019/11/27 02:27:15.557113 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/11/27 02:27:17.202885 [INFO] consul: Waiting 5s to drain RPC traffic
TestLeaveCommand - 2019/11/27 02:27:17.557153 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/11/27 02:27:19.557124 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/11/27 02:27:21.557259 [ERR] autopilot: Error updating cluster health: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/11/27 02:27:21.557262 [ERR] autopilot: Error promoting servers: error getting server raft protocol versions: No servers found
TestLeaveCommand - 2019/11/27 02:27:22.203324 [INFO] agent: Requesting shutdown
TestLeaveCommand - 2019/11/27 02:27:22.203464 [INFO] consul: shutting down server
TestLeaveCommand - 2019/11/27 02:27:22.298663 [INFO] agent: consul server down
TestLeaveCommand - 2019/11/27 02:27:22.298740 [INFO] agent: shutdown complete
TestLeaveCommand - 2019/11/27 02:27:22.298826 [DEBUG] http: Request PUT /v1/agent/leave (11.263589524s) from=127.0.0.1:49324
TestLeaveCommand - 2019/11/27 02:27:22.299534 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (tcp)
TestLeaveCommand - 2019/11/27 02:27:22.299728 [INFO] agent: Stopping DNS server 127.0.0.1:26501 (udp)
TestLeaveCommand - 2019/11/27 02:27:22.299898 [INFO] agent: Stopping HTTP server 127.0.0.1:26502 (tcp)
TestLeaveCommand - 2019/11/27 02:27:22.300398 [INFO] agent: Waiting for endpoints to shut down
TestLeaveCommand - 2019/11/27 02:27:22.300491 [INFO] agent: Endpoints down
--- PASS: TestLeaveCommand (12.98s)
PASS
ok  	github.com/hashicorp/consul/command/leave	13.114s
=== RUN   TestLockCommand_noTabs
=== PAUSE TestLockCommand_noTabs
=== RUN   TestLockCommand_BadArgs
--- SKIP: TestLockCommand_BadArgs (0.00s)
    lock_test.go:37: DM-skipped
=== RUN   TestLockCommand
--- SKIP: TestLockCommand (0.00s)
    lock_test.go:45: DM-skipped
=== RUN   TestLockCommand_NoShell
=== PAUSE TestLockCommand_NoShell
=== RUN   TestLockCommand_TryLock
=== PAUSE TestLockCommand_TryLock
=== RUN   TestLockCommand_TrySemaphore
=== PAUSE TestLockCommand_TrySemaphore
=== RUN   TestLockCommand_MonitorRetry_Lock_Default
=== PAUSE TestLockCommand_MonitorRetry_Lock_Default
=== RUN   TestLockCommand_MonitorRetry_Semaphore_Default
=== PAUSE TestLockCommand_MonitorRetry_Semaphore_Default
=== RUN   TestLockCommand_MonitorRetry_Lock_Arg
=== PAUSE TestLockCommand_MonitorRetry_Lock_Arg
=== RUN   TestLockCommand_MonitorRetry_Semaphore_Arg
=== PAUSE TestLockCommand_MonitorRetry_Semaphore_Arg
=== RUN   TestLockCommand_ChildExitCode
=== PAUSE TestLockCommand_ChildExitCode
=== CONT  TestLockCommand_noTabs
=== CONT  TestLockCommand_MonitorRetry_Semaphore_Default
=== CONT  TestLockCommand_MonitorRetry_Semaphore_Arg
=== CONT  TestLockCommand_MonitorRetry_Lock_Arg
--- PASS: TestLockCommand_noTabs (0.01s)
=== CONT  TestLockCommand_ChildExitCode
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:38.406821 [WARN] agent: Node name "Node 52535d58-fe28-9a99-056f-8e774c9117d3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:38.407758 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:38.408033 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:38.408273 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:38.408376 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_ChildExitCode - 2019/11/27 02:27:38.442385 [WARN] agent: Node name "Node 4c20faca-cbf4-71ff-74c8-4242ae12e0e2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_ChildExitCode - 2019/11/27 02:27:38.442927 [DEBUG] tlsutil: Update with version 1
TestLockCommand_ChildExitCode - 2019/11/27 02:27:38.443013 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_ChildExitCode - 2019/11/27 02:27:38.443199 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLockCommand_ChildExitCode - 2019/11/27 02:27:38.443371 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:38.473258 [WARN] agent: Node name "Node 37c231fc-3c49-c013-c0b5-66832f0a6e5b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:38.473688 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:38.473759 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:38.473918 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:38.474019 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:38.475143 [WARN] agent: Node name "Node b0acc3b7-ac8a-1e2e-49ea-4ab11223dcc6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:38.475785 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:38.475927 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:38.476292 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:38.476461 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:27:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:4c20faca-cbf4-71ff-74c8-4242ae12e0e2 Address:127.0.0.1:17524}]
2019/11/27 02:27:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:37c231fc-3c49-c013-c0b5-66832f0a6e5b Address:127.0.0.1:17506}]
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:39.815606 [INFO] serf: EventMemberJoin: Node 37c231fc-3c49-c013-c0b5-66832f0a6e5b.dc1 127.0.0.1
2019/11/27 02:27:39 [INFO]  raft: Node at 127.0.0.1:17524 [Follower] entering Follower state (Leader: "")
2019/11/27 02:27:39 [INFO]  raft: Node at 127.0.0.1:17506 [Follower] entering Follower state (Leader: "")
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:39.826085 [INFO] serf: EventMemberJoin: Node 37c231fc-3c49-c013-c0b5-66832f0a6e5b 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:39.843752 [INFO] agent: Started DNS server 127.0.0.1:17501 (udp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:39.844528 [INFO] agent: Started DNS server 127.0.0.1:17501 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:39.847584 [INFO] consul: Adding LAN server Node 37c231fc-3c49-c013-c0b5-66832f0a6e5b (Addr: tcp/127.0.0.1:17506) (DC: dc1)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:39.849386 [INFO] consul: Handled member-join event for server "Node 37c231fc-3c49-c013-c0b5-66832f0a6e5b.dc1" in area "wan"
TestLockCommand_ChildExitCode - 2019/11/27 02:27:39.852419 [INFO] serf: EventMemberJoin: Node 4c20faca-cbf4-71ff-74c8-4242ae12e0e2.dc1 127.0.0.1
2019/11/27 02:27:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:52535d58-fe28-9a99-056f-8e774c9117d3 Address:127.0.0.1:17512}]
TestLockCommand_ChildExitCode - 2019/11/27 02:27:39.860031 [INFO] serf: EventMemberJoin: Node 4c20faca-cbf4-71ff-74c8-4242ae12e0e2 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:39.861583 [INFO] agent: Started HTTP server on 127.0.0.1:17502 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:39.861753 [INFO] agent: started state syncer
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:39.865485 [INFO] serf: EventMemberJoin: Node 52535d58-fe28-9a99-056f-8e774c9117d3.dc1 127.0.0.1
2019/11/27 02:27:39 [INFO]  raft: Node at 127.0.0.1:17512 [Follower] entering Follower state (Leader: "")
TestLockCommand_ChildExitCode - 2019/11/27 02:27:39.868930 [INFO] consul: Adding LAN server Node 4c20faca-cbf4-71ff-74c8-4242ae12e0e2 (Addr: tcp/127.0.0.1:17524) (DC: dc1)
TestLockCommand_ChildExitCode - 2019/11/27 02:27:39.869274 [INFO] consul: Handled member-join event for server "Node 4c20faca-cbf4-71ff-74c8-4242ae12e0e2.dc1" in area "wan"
2019/11/27 02:27:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:39 [INFO]  raft: Node at 127.0.0.1:17506 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:39 [INFO]  raft: Node at 127.0.0.1:17524 [Candidate] entering Candidate state in term 2
TestLockCommand_ChildExitCode - 2019/11/27 02:27:39.887323 [INFO] agent: Started DNS server 127.0.0.1:17519 (udp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:39.889835 [INFO] serf: EventMemberJoin: Node 52535d58-fe28-9a99-056f-8e774c9117d3 127.0.0.1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:39.892101 [INFO] agent: Started DNS server 127.0.0.1:17507 (udp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:39.892581 [INFO] consul: Adding LAN server Node 52535d58-fe28-9a99-056f-8e774c9117d3 (Addr: tcp/127.0.0.1:17512) (DC: dc1)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:39.892808 [INFO] consul: Handled member-join event for server "Node 52535d58-fe28-9a99-056f-8e774c9117d3.dc1" in area "wan"
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:39.893316 [INFO] agent: Started DNS server 127.0.0.1:17507 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:39.895197 [INFO] agent: Started HTTP server on 127.0.0.1:17508 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:39.895306 [INFO] agent: started state syncer
TestLockCommand_ChildExitCode - 2019/11/27 02:27:39.896886 [INFO] agent: Started DNS server 127.0.0.1:17519 (tcp)
TestLockCommand_ChildExitCode - 2019/11/27 02:27:39.900244 [INFO] agent: Started HTTP server on 127.0.0.1:17520 (tcp)
TestLockCommand_ChildExitCode - 2019/11/27 02:27:39.900376 [INFO] agent: started state syncer
2019/11/27 02:27:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:39 [INFO]  raft: Node at 127.0.0.1:17512 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:40 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:b0acc3b7-ac8a-1e2e-49ea-4ab11223dcc6 Address:127.0.0.1:17518}]
2019/11/27 02:27:40 [INFO]  raft: Node at 127.0.0.1:17518 [Follower] entering Follower state (Leader: "")
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.008170 [INFO] serf: EventMemberJoin: Node b0acc3b7-ac8a-1e2e-49ea-4ab11223dcc6.dc1 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.013632 [INFO] serf: EventMemberJoin: Node b0acc3b7-ac8a-1e2e-49ea-4ab11223dcc6 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.014891 [INFO] consul: Adding LAN server Node b0acc3b7-ac8a-1e2e-49ea-4ab11223dcc6 (Addr: tcp/127.0.0.1:17518) (DC: dc1)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.015251 [INFO] consul: Handled member-join event for server "Node b0acc3b7-ac8a-1e2e-49ea-4ab11223dcc6.dc1" in area "wan"
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.015739 [INFO] agent: Started DNS server 127.0.0.1:17513 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.016076 [INFO] agent: Started DNS server 127.0.0.1:17513 (udp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.019219 [INFO] agent: Started HTTP server on 127.0.0.1:17514 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.019357 [INFO] agent: started state syncer
2019/11/27 02:27:40 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:40 [INFO]  raft: Node at 127.0.0.1:17518 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:40 [INFO]  raft: Node at 127.0.0.1:17524 [Leader] entering Leader state
2019/11/27 02:27:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:40 [INFO]  raft: Node at 127.0.0.1:17506 [Leader] entering Leader state
TestLockCommand_ChildExitCode - 2019/11/27 02:27:40.525388 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:40.525404 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:40.525896 [INFO] consul: New leader elected: Node 37c231fc-3c49-c013-c0b5-66832f0a6e5b
TestLockCommand_ChildExitCode - 2019/11/27 02:27:40.526125 [INFO] consul: New leader elected: Node 4c20faca-cbf4-71ff-74c8-4242ae12e0e2
2019/11/27 02:27:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:40 [INFO]  raft: Node at 127.0.0.1:17512 [Leader] entering Leader state
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:40.633307 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:40.633785 [INFO] consul: New leader elected: Node 52535d58-fe28-9a99-056f-8e774c9117d3
2019/11/27 02:27:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:40 [INFO]  raft: Node at 127.0.0.1:17518 [Leader] entering Leader state
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.969263 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:40.970164 [INFO] consul: New leader elected: Node b0acc3b7-ac8a-1e2e-49ea-4ab11223dcc6
TestLockCommand_ChildExitCode - 2019/11/27 02:27:41.353602 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:41.508963 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:41.511285 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:41.620070 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:41.620556 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:41.809911 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:41.821837 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:41.821982 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:42.005629 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:42.333626 [DEBUG] agent: Node info in sync
TestLockCommand_ChildExitCode - 2019/11/27 02:27:43.377984 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:43.378364 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_ChildExitCode - 2019/11/27 02:27:43.378620 [DEBUG] consul: Skipping self join check for "Node 4c20faca-cbf4-71ff-74c8-4242ae12e0e2" since the cluster is too small
TestLockCommand_ChildExitCode - 2019/11/27 02:27:43.378776 [INFO] consul: member 'Node 4c20faca-cbf4-71ff-74c8-4242ae12e0e2' joined, marking health alive
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:43.378782 [DEBUG] consul: Skipping self join check for "Node 37c231fc-3c49-c013-c0b5-66832f0a6e5b" since the cluster is too small
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:43.381095 [INFO] consul: member 'Node 37c231fc-3c49-c013-c0b5-66832f0a6e5b' joined, marking health alive
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:43.493866 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:43.494721 [DEBUG] consul: Skipping self join check for "Node 52535d58-fe28-9a99-056f-8e774c9117d3" since the cluster is too small
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:43.495006 [INFO] consul: member 'Node 52535d58-fe28-9a99-056f-8e774c9117d3' joined, marking health alive
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:43.586668 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:43.587227 [DEBUG] consul: Skipping self join check for "Node b0acc3b7-ac8a-1e2e-49ea-4ab11223dcc6" since the cluster is too small
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:43.587404 [INFO] consul: member 'Node b0acc3b7-ac8a-1e2e-49ea-4ab11223dcc6' joined, marking health alive
=== RUN   TestLockCommand_ChildExitCode/clean_exit
TestLockCommand_ChildExitCode - 2019/11/27 02:27:43.634915 [DEBUG] http: Request GET /v1/agent/self (28.300006ms) from=127.0.0.1:38992
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:43.635198 [DEBUG] http: Request GET /v1/agent/self (28.062664ms) from=127.0.0.1:45268
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:43.718672 [DEBUG] http: Request GET /v1/agent/self (7.637605ms) from=127.0.0.1:46998
TestLockCommand_ChildExitCode - 2019/11/27 02:27:43.718816 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand_ChildExitCode - 2019/11/27 02:27:43.718875 [DEBUG] agent: Node info in sync
TestLockCommand_ChildExitCode - 2019/11/27 02:27:43.718960 [DEBUG] agent: Node info in sync
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:43.810249 [DEBUG] http: Request GET /v1/agent/self (16.01657ms) from=127.0.0.1:59070
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:43.943478 [DEBUG] http: Request PUT /v1/session/create (211.66219ms) from=127.0.0.1:46998
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:43.943878 [DEBUG] http: Request PUT /v1/session/create (280.231293ms) from=127.0.0.1:45268
TestLockCommand_ChildExitCode - 2019/11/27 02:27:43.948500 [DEBUG] http: Request PUT /v1/session/create (284.522779ms) from=127.0.0.1:38992
TestLockCommand_ChildExitCode - 2019/11/27 02:27:43.955060 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (321.011µs) from=127.0.0.1:38992
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:44.130316 [DEBUG] http: Request PUT /v1/session/create (300.294673ms) from=127.0.0.1:59070
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:44.134295 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (210.674µs) from=127.0.0.1:59070
TestLockCommand_ChildExitCode - 2019/11/27 02:27:44.210017 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=66b4787e-5119-8e5f-5385-d456f46bb440&flags=3304740253564472344 (251.061589ms) from=127.0.0.1:38992
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:44.210247 [DEBUG] http: Request PUT /v1/kv/test/prefix/ab591bba-8428-07df-ccb3-54477e5eda3d?acquire=ab591bba-8428-07df-ccb3-54477e5eda3d&flags=16210313421097356768 (261.450959ms) from=127.0.0.1:45268
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:44.213521 [DEBUG] http: Request PUT /v1/kv/test/prefix/1529aabe-1ba7-2a2e-b2b7-ceb27990f8d4?acquire=1529aabe-1ba7-2a2e-b2b7-ceb27990f8d4&flags=16210313421097356768 (256.295776ms) from=127.0.0.1:46998
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:44.241658 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse=&wait=15000ms (7.88428ms) from=127.0.0.1:46998
TestLockCommand_ChildExitCode - 2019/11/27 02:27:44.244880 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (1.626391ms) from=127.0.0.1:38992
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:44.257702 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse=&wait=15000ms (28.586349ms) from=127.0.0.1:45268
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:44.410054 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=a4ed46aa-5516-4d00-f74e-1064459b203d&flags=3304740253564472344 (274.477421ms) from=127.0.0.1:59070
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:44.424261 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (4.269151ms) from=127.0.0.1:59070
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:44.534149 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=0&flags=16210313421097356768 (271.667655ms) from=127.0.0.1:45268
TestLockCommand_ChildExitCode - 2019/11/27 02:27:44.540112 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=66b4787e-5119-8e5f-5385-d456f46bb440 (290.796001ms) from=127.0.0.1:39002
TestLockCommand_ChildExitCode - 2019/11/27 02:27:44.545993 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (282.661712ms) from=127.0.0.1:38992
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:44.550381 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=0&flags=16210313421097356768 (273.152708ms) from=127.0.0.1:46998
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:44.590032 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&recurse= (11.495742ms) from=127.0.0.1:45268
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:44.597846 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&recurse= (2.274747ms) from=127.0.0.1:46998
TestLockCommand_ChildExitCode - 2019/11/27 02:27:44.606672 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (53.377897ms) from=127.0.0.1:39002
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:44.703867 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.267379ms) from=127.0.0.1:45280
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:44.712875 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (849.363µs) from=127.0.0.1:47008
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:44.965902 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=13&flags=16210313421097356768 (250.396899ms) from=127.0.0.1:47008
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:44.965903 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=a4ed46aa-5516-4d00-f74e-1064459b203d (266.990488ms) from=127.0.0.1:59074
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:44.967729 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (536.445065ms) from=127.0.0.1:59070
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:44.968413 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&index=13&recurse= (366.793035ms) from=127.0.0.1:46998
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:44.971163 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=13&flags=16210313421097356768 (263.267023ms) from=127.0.0.1:45280
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:44.973621 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&index=13&recurse= (371.949219ms) from=127.0.0.1:45268
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:44.975186 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.092705ms) from=127.0.0.1:59074
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.164781 [DEBUG] http: Request DELETE /v1/kv/test/prefix/1529aabe-1ba7-2a2e-b2b7-ceb27990f8d4 (194.966929ms) from=127.0.0.1:47008
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.167193 [DEBUG] http: Request PUT /v1/session/destroy/66b4787e-5119-8e5f-5385-d456f46bb440 (599.705979ms) from=127.0.0.1:38992
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.167933 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (548.902841ms) from=127.0.0.1:39002
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.169278 [DEBUG] http: Request DELETE /v1/kv/test/prefix/ab591bba-8428-07df-ccb3-54477e5eda3d (180.147068ms) from=127.0.0.1:45280
=== RUN   TestLockCommand_ChildExitCode/error_exit
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.173118 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse= (1.032704ms) from=127.0.0.1:45280
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.175055 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse= (1.263711ms) from=127.0.0.1:47008
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.194883 [DEBUG] http: Request GET /v1/agent/self (6.307557ms) from=127.0.0.1:39012
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.409818 [DEBUG] http: Request PUT /v1/session/create (201.20315ms) from=127.0.0.1:39012
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.411537 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (430.045949ms) from=127.0.0.1:59074
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.413058 [DEBUG] http: Request PUT /v1/session/destroy/a4ed46aa-5516-4d00-f74e-1064459b203d (439.868965ms) from=127.0.0.1:59080
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.414522 [DEBUG] http: Request PUT /v1/session/destroy/ab591bba-8428-07df-ccb3-54477e5eda3d (238.380805ms) from=127.0.0.1:45268
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.416604 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (254.676µs) from=127.0.0.1:39012
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.418219 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.418299 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.418359 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.518910 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.610028 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=14 (432.4437ms) from=127.0.0.1:45280
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.610200 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=4acdec98-a79d-13b7-9f34-7e05387090f0&flags=3304740253564472344 (192.261166ms) from=127.0.0.1:39012
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.612814 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.612930 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.613002 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.613550 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.614344 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.614405 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.614462 [INFO] agent: Stopping DNS server 127.0.0.1:17513 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.614614 [INFO] agent: Stopping DNS server 127.0.0.1:17513 (udp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.614832 [INFO] agent: Stopping HTTP server 127.0.0.1:17514 (tcp)
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.615629 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Lock_Arg - 2019/11/27 02:27:45.615884 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Lock_Arg (7.35s)
=== CONT  TestLockCommand_TrySemaphore
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.627515 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=14 (449.166962ms) from=127.0.0.1:47008
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.631530 [DEBUG] http: Request PUT /v1/session/destroy/1529aabe-1ba7-2a2e-b2b7-ceb27990f8d4 (458.223284ms) from=127.0.0.1:46998
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.632748 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.632911 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.633061 [WARN] serf: Shutdown without a Leave
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.638240 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (7.816611ms) from=127.0.0.1:39012
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.774332 [WARN] serf: Shutdown without a Leave
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_TrySemaphore - 2019/11/27 02:27:45.778201 [WARN] agent: Node name "Node 3d7599df-d385-c820-bd6f-85e252f73ad8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_TrySemaphore - 2019/11/27 02:27:45.778603 [DEBUG] tlsutil: Update with version 1
TestLockCommand_TrySemaphore - 2019/11/27 02:27:45.778675 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_TrySemaphore - 2019/11/27 02:27:45.778828 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLockCommand_TrySemaphore - 2019/11/27 02:27:45.778966 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.787538 [WARN] serf: Shutdown without a Leave
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.888546 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=4acdec98-a79d-13b7-9f34-7e05387090f0 (221.887219ms) from=127.0.0.1:39014
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.889939 [INFO] manager: shutting down
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.889954 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=17 (237.208763ms) from=127.0.0.1:39012
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.890918 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.890974 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.891031 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.891199 [INFO] agent: Stopping DNS server 127.0.0.1:17501 (udp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.891366 [INFO] agent: Stopping HTTP server 127.0.0.1:17502 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.892799 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_ChildExitCode - 2019/11/27 02:27:45.894903 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (786.362µs) from=127.0.0.1:39014
TestLockCommand_MonitorRetry_Semaphore_Default - 2019/11/27 02:27:45.896077 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Semaphore_Default (7.63s)
=== CONT  TestLockCommand_MonitorRetry_Lock_Default
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.976331 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.977233 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.977296 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.977353 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.977506 [INFO] agent: Stopping DNS server 127.0.0.1:17507 (udp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.977658 [INFO] agent: Stopping HTTP server 127.0.0.1:17508 (tcp)
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.978242 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_MonitorRetry_Semaphore_Arg - 2019/11/27 02:27:45.978515 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_MonitorRetry_Semaphore_Arg (7.72s)
=== CONT  TestLockCommand_TryLock
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:46.051787 [WARN] agent: Node name "Node ea17209f-8cdf-3840-06b2-ceb7d7ba9660" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:46.052401 [DEBUG] tlsutil: Update with version 1
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:46.052570 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:46.052891 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:46.053074 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_TryLock - 2019/11/27 02:27:46.066434 [WARN] agent: Node name "Node 91b863e9-532c-3c40-3d6f-cd5a5fb75947" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_TryLock - 2019/11/27 02:27:46.066931 [DEBUG] tlsutil: Update with version 1
TestLockCommand_TryLock - 2019/11/27 02:27:46.067004 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_TryLock - 2019/11/27 02:27:46.067255 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLockCommand_TryLock - 2019/11/27 02:27:46.067363 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_ChildExitCode - 2019/11/27 02:27:46.277458 [DEBUG] http: Request PUT /v1/session/destroy/4acdec98-a79d-13b7-9f34-7e05387090f0 (383.267286ms) from=127.0.0.1:39012
TestLockCommand_ChildExitCode - 2019/11/27 02:27:46.278702 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=18 (351.39482ms) from=127.0.0.1:39014
=== RUN   TestLockCommand_ChildExitCode/not_propagated
TestLockCommand_ChildExitCode - 2019/11/27 02:27:46.292696 [DEBUG] http: Request GET /v1/agent/self (6.385227ms) from=127.0.0.1:39016
TestLockCommand_ChildExitCode - 2019/11/27 02:27:46.533759 [DEBUG] http: Request PUT /v1/session/create (227.883097ms) from=127.0.0.1:39016
TestLockCommand_ChildExitCode - 2019/11/27 02:27:46.535988 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (249.343µs) from=127.0.0.1:39016
TestLockCommand_ChildExitCode - 2019/11/27 02:27:46.688334 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=8800aad3-840f-207b-c7da-562761222369&flags=3304740253564472344 (150.808026ms) from=127.0.0.1:39016
TestLockCommand_ChildExitCode - 2019/11/27 02:27:46.699731 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (4.611164ms) from=127.0.0.1:39016
2019/11/27 02:27:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3d7599df-d385-c820-bd6f-85e252f73ad8 Address:127.0.0.1:17530}]
2019/11/27 02:27:46 [INFO]  raft: Node at 127.0.0.1:17530 [Follower] entering Follower state (Leader: "")
TestLockCommand_TrySemaphore - 2019/11/27 02:27:46.851289 [INFO] serf: EventMemberJoin: Node 3d7599df-d385-c820-bd6f-85e252f73ad8.dc1 127.0.0.1
TestLockCommand_TrySemaphore - 2019/11/27 02:27:46.865719 [INFO] serf: EventMemberJoin: Node 3d7599df-d385-c820-bd6f-85e252f73ad8 127.0.0.1
TestLockCommand_TrySemaphore - 2019/11/27 02:27:46.866610 [INFO] consul: Adding LAN server Node 3d7599df-d385-c820-bd6f-85e252f73ad8 (Addr: tcp/127.0.0.1:17530) (DC: dc1)
TestLockCommand_TrySemaphore - 2019/11/27 02:27:46.867312 [INFO] consul: Handled member-join event for server "Node 3d7599df-d385-c820-bd6f-85e252f73ad8.dc1" in area "wan"
TestLockCommand_TrySemaphore - 2019/11/27 02:27:46.868321 [INFO] agent: Started DNS server 127.0.0.1:17525 (tcp)
TestLockCommand_TrySemaphore - 2019/11/27 02:27:46.868394 [INFO] agent: Started DNS server 127.0.0.1:17525 (udp)
TestLockCommand_TrySemaphore - 2019/11/27 02:27:46.870412 [INFO] agent: Started HTTP server on 127.0.0.1:17526 (tcp)
TestLockCommand_TrySemaphore - 2019/11/27 02:27:46.870519 [INFO] agent: started state syncer
2019/11/27 02:27:46 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:46 [INFO]  raft: Node at 127.0.0.1:17530 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:46 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ea17209f-8cdf-3840-06b2-ceb7d7ba9660 Address:127.0.0.1:17536}]
2019/11/27 02:27:47 [INFO]  raft: Node at 127.0.0.1:17536 [Follower] entering Follower state (Leader: "")
2019/11/27 02:27:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:91b863e9-532c-3c40-3d6f-cd5a5fb75947 Address:127.0.0.1:17542}]
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.002057 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=8800aad3-840f-207b-c7da-562761222369 (283.804751ms) from=127.0.0.1:39022
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.002647 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=22 (298.677947ms) from=127.0.0.1:39016
2019/11/27 02:27:47 [INFO]  raft: Node at 127.0.0.1:17542 [Follower] entering Follower state (Leader: "")
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.006321 [INFO] serf: EventMemberJoin: Node ea17209f-8cdf-3840-06b2-ceb7d7ba9660.dc1 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.009622 [INFO] serf: EventMemberJoin: Node ea17209f-8cdf-3840-06b2-ceb7d7ba9660 127.0.0.1
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.010285 [INFO] consul: Adding LAN server Node ea17209f-8cdf-3840-06b2-ceb7d7ba9660 (Addr: tcp/127.0.0.1:17536) (DC: dc1)
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.010290 [INFO] consul: Handled member-join event for server "Node ea17209f-8cdf-3840-06b2-ceb7d7ba9660.dc1" in area "wan"
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.010866 [INFO] agent: Started DNS server 127.0.0.1:17531 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.010940 [INFO] agent: Started DNS server 127.0.0.1:17531 (udp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.013255 [INFO] agent: Started HTTP server on 127.0.0.1:17532 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.013390 [INFO] agent: started state syncer
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.015273 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.352714ms) from=127.0.0.1:39016
TestLockCommand_TryLock - 2019/11/27 02:27:47.016997 [INFO] serf: EventMemberJoin: Node 91b863e9-532c-3c40-3d6f-cd5a5fb75947.dc1 127.0.0.1
TestLockCommand_TryLock - 2019/11/27 02:27:47.020269 [INFO] serf: EventMemberJoin: Node 91b863e9-532c-3c40-3d6f-cd5a5fb75947 127.0.0.1
TestLockCommand_TryLock - 2019/11/27 02:27:47.021097 [INFO] consul: Adding LAN server Node 91b863e9-532c-3c40-3d6f-cd5a5fb75947 (Addr: tcp/127.0.0.1:17542) (DC: dc1)
TestLockCommand_TryLock - 2019/11/27 02:27:47.021419 [INFO] consul: Handled member-join event for server "Node 91b863e9-532c-3c40-3d6f-cd5a5fb75947.dc1" in area "wan"
TestLockCommand_TryLock - 2019/11/27 02:27:47.021805 [INFO] agent: Started DNS server 127.0.0.1:17537 (udp)
TestLockCommand_TryLock - 2019/11/27 02:27:47.022003 [INFO] agent: Started DNS server 127.0.0.1:17537 (tcp)
TestLockCommand_TryLock - 2019/11/27 02:27:47.023790 [INFO] agent: Started HTTP server on 127.0.0.1:17538 (tcp)
TestLockCommand_TryLock - 2019/11/27 02:27:47.023908 [INFO] agent: started state syncer
2019/11/27 02:27:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:47 [INFO]  raft: Node at 127.0.0.1:17536 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:47 [INFO]  raft: Node at 127.0.0.1:17542 [Candidate] entering Candidate state in term 2
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.154690 [DEBUG] http: Request PUT /v1/session/destroy/8800aad3-840f-207b-c7da-562761222369 (142.994748ms) from=127.0.0.1:39022
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.287881 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=23 (270.038595ms) from=127.0.0.1:39016
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.289327 [INFO] agent: Requesting shutdown
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.289407 [INFO] consul: shutting down server
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.289470 [WARN] serf: Shutdown without a Leave
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.407668 [WARN] serf: Shutdown without a Leave
2019/11/27 02:27:47 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:47 [INFO]  raft: Node at 127.0.0.1:17530 [Leader] entering Leader state
TestLockCommand_TrySemaphore - 2019/11/27 02:27:47.409539 [INFO] consul: cluster leadership acquired
TestLockCommand_TrySemaphore - 2019/11/27 02:27:47.410004 [INFO] consul: New leader elected: Node 3d7599df-d385-c820-bd6f-85e252f73ad8
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.476992 [INFO] manager: shutting down
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.477701 [INFO] agent: consul server down
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.477760 [INFO] agent: shutdown complete
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.477816 [INFO] agent: Stopping DNS server 127.0.0.1:17519 (tcp)
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.478063 [INFO] agent: Stopping DNS server 127.0.0.1:17519 (udp)
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.478256 [INFO] agent: Stopping HTTP server 127.0.0.1:17520 (tcp)
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.479428 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_ChildExitCode - 2019/11/27 02:27:47.479614 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_ChildExitCode (9.21s)
    --- PASS: TestLockCommand_ChildExitCode/clean_exit (1.57s)
    --- PASS: TestLockCommand_ChildExitCode/error_exit (1.11s)
    --- PASS: TestLockCommand_ChildExitCode/not_propagated (1.01s)
=== CONT  TestLockCommand_NoShell
2019/11/27 02:27:47 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:47 [INFO]  raft: Node at 127.0.0.1:17536 [Leader] entering Leader state
2019/11/27 02:27:47 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:47 [INFO]  raft: Node at 127.0.0.1:17542 [Leader] entering Leader state
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.570953 [INFO] consul: cluster leadership acquired
TestLockCommand_TryLock - 2019/11/27 02:27:47.571054 [INFO] consul: cluster leadership acquired
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:47.571469 [INFO] consul: New leader elected: Node ea17209f-8cdf-3840-06b2-ceb7d7ba9660
TestLockCommand_TryLock - 2019/11/27 02:27:47.571526 [INFO] consul: New leader elected: Node 91b863e9-532c-3c40-3d6f-cd5a5fb75947
WARNING: bootstrap = true: do not enable unless necessary
TestLockCommand_NoShell - 2019/11/27 02:27:47.630943 [WARN] agent: Node name "Node f9454198-1691-6eb2-de8d-78c3aaed6fa6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestLockCommand_NoShell - 2019/11/27 02:27:47.631320 [DEBUG] tlsutil: Update with version 1
TestLockCommand_NoShell - 2019/11/27 02:27:47.631379 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_NoShell - 2019/11/27 02:27:47.631581 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestLockCommand_NoShell - 2019/11/27 02:27:47.631672 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestLockCommand_TrySemaphore - 2019/11/27 02:27:47.710495 [INFO] agent: Synced node info
TestLockCommand_TryLock - 2019/11/27 02:27:47.953447 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:48.132324 [INFO] agent: Synced node info
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:48.132462 [DEBUG] agent: Node info in sync
2019/11/27 02:27:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:f9454198-1691-6eb2-de8d-78c3aaed6fa6 Address:127.0.0.1:17548}]
TestLockCommand_NoShell - 2019/11/27 02:27:48.567421 [INFO] serf: EventMemberJoin: Node f9454198-1691-6eb2-de8d-78c3aaed6fa6.dc1 127.0.0.1
2019/11/27 02:27:48 [INFO]  raft: Node at 127.0.0.1:17548 [Follower] entering Follower state (Leader: "")
TestLockCommand_NoShell - 2019/11/27 02:27:48.572765 [INFO] serf: EventMemberJoin: Node f9454198-1691-6eb2-de8d-78c3aaed6fa6 127.0.0.1
TestLockCommand_NoShell - 2019/11/27 02:27:48.573604 [INFO] consul: Adding LAN server Node f9454198-1691-6eb2-de8d-78c3aaed6fa6 (Addr: tcp/127.0.0.1:17548) (DC: dc1)
TestLockCommand_NoShell - 2019/11/27 02:27:48.573606 [INFO] consul: Handled member-join event for server "Node f9454198-1691-6eb2-de8d-78c3aaed6fa6.dc1" in area "wan"
TestLockCommand_NoShell - 2019/11/27 02:27:48.574220 [INFO] agent: Started DNS server 127.0.0.1:17543 (udp)
TestLockCommand_NoShell - 2019/11/27 02:27:48.574299 [INFO] agent: Started DNS server 127.0.0.1:17543 (tcp)
TestLockCommand_NoShell - 2019/11/27 02:27:48.576517 [INFO] agent: Started HTTP server on 127.0.0.1:17544 (tcp)
TestLockCommand_NoShell - 2019/11/27 02:27:48.576645 [INFO] agent: started state syncer
2019/11/27 02:27:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:48 [INFO]  raft: Node at 127.0.0.1:17548 [Candidate] entering Candidate state in term 2
TestLockCommand_TrySemaphore - 2019/11/27 02:27:48.864554 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_TrySemaphore - 2019/11/27 02:27:48.864986 [DEBUG] consul: Skipping self join check for "Node 3d7599df-d385-c820-bd6f-85e252f73ad8" since the cluster is too small
TestLockCommand_TrySemaphore - 2019/11/27 02:27:48.865134 [INFO] consul: member 'Node 3d7599df-d385-c820-bd6f-85e252f73ad8' joined, marking health alive
TestLockCommand_TryLock - 2019/11/27 02:27:49.119607 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_TryLock - 2019/11/27 02:27:49.120105 [DEBUG] consul: Skipping self join check for "Node 91b863e9-532c-3c40-3d6f-cd5a5fb75947" since the cluster is too small
TestLockCommand_TryLock - 2019/11/27 02:27:49.120263 [INFO] consul: member 'Node 91b863e9-532c-3c40-3d6f-cd5a5fb75947' joined, marking health alive
TestLockCommand_TrySemaphore - 2019/11/27 02:27:49.135979 [DEBUG] http: Request GET /v1/agent/self (8.82798ms) from=127.0.0.1:35458
2019/11/27 02:27:49 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:49 [INFO]  raft: Node at 127.0.0.1:17548 [Leader] entering Leader state
TestLockCommand_NoShell - 2019/11/27 02:27:49.367295 [INFO] consul: cluster leadership acquired
TestLockCommand_NoShell - 2019/11/27 02:27:49.367783 [INFO] consul: New leader elected: Node f9454198-1691-6eb2-de8d-78c3aaed6fa6
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:49.564927 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:49.565343 [DEBUG] consul: Skipping self join check for "Node ea17209f-8cdf-3840-06b2-ceb7d7ba9660" since the cluster is too small
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:49.565502 [INFO] consul: member 'Node ea17209f-8cdf-3840-06b2-ceb7d7ba9660' joined, marking health alive
TestLockCommand_TryLock - 2019/11/27 02:27:49.587768 [DEBUG] http: Request GET /v1/agent/self (7.305926ms) from=127.0.0.1:38404
TestLockCommand_TrySemaphore - 2019/11/27 02:27:49.665429 [DEBUG] http: Request PUT /v1/session/create (516.69169ms) from=127.0.0.1:35458
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:49.764552 [DEBUG] http: Request GET /v1/agent/self (6.651236ms) from=127.0.0.1:55230
TestLockCommand_NoShell - 2019/11/27 02:27:49.831144 [INFO] agent: Synced node info
TestLockCommand_TrySemaphore - 2019/11/27 02:27:49.834893 [DEBUG] http: Request PUT /v1/kv/test/prefix/b2fb3948-76f6-8923-4652-d7b2cd41730f?acquire=b2fb3948-76f6-8923-4652-d7b2cd41730f&flags=16210313421097356768 (164.01216ms) from=127.0.0.1:35458
TestLockCommand_TryLock - 2019/11/27 02:27:49.837674 [DEBUG] http: Request PUT /v1/session/create (239.090161ms) from=127.0.0.1:38404
TestLockCommand_TrySemaphore - 2019/11/27 02:27:49.843024 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse=&wait=10000ms (2.116741ms) from=127.0.0.1:35458
TestLockCommand_TryLock - 2019/11/27 02:27:49.849553 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=10000ms (1.59339ms) from=127.0.0.1:38404
TestLockCommand_NoShell - 2019/11/27 02:27:50.002042 [DEBUG] agent: Node info in sync
TestLockCommand_NoShell - 2019/11/27 02:27:50.002157 [DEBUG] agent: Node info in sync
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.023436 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=0&flags=16210313421097356768 (177.168627ms) from=127.0.0.1:35458
TestLockCommand_TryLock - 2019/11/27 02:27:50.023842 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=1374f5e0-aa8a-9065-05c2-8facd345aa0e&flags=3304740253564472344 (167.67429ms) from=127.0.0.1:38404
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:50.040199 [DEBUG] http: Request PUT /v1/session/create (263.695701ms) from=127.0.0.1:55230
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:50.055522 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (386.68µs) from=127.0.0.1:55230
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.057491 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&recurse= (4.846839ms) from=127.0.0.1:35458
TestLockCommand_TryLock - 2019/11/27 02:27:50.062155 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (6.966248ms) from=127.0.0.1:38404
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.083770 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.558722ms) from=127.0.0.1:35466
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.373357 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.373440 [DEBUG] agent: Node info in sync
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.373530 [DEBUG] agent: Node info in sync
TestLockCommand_TryLock - 2019/11/27 02:27:50.466186 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=1374f5e0-aa8a-9065-05c2-8facd345aa0e (390.07719ms) from=127.0.0.1:38408
TestLockCommand_TryLock - 2019/11/27 02:27:50.470371 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (395.916398ms) from=127.0.0.1:38404
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:50.480459 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=98ba548e-d6d5-a02d-8f38-a94aa4a74ae7&flags=3304740253564472344 (422.222665ms) from=127.0.0.1:55230
TestLockCommand_TryLock - 2019/11/27 02:27:50.482742 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (2.55309ms) from=127.0.0.1:38408
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:50.505887 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (3.227782ms) from=127.0.0.1:55230
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.510461 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?cas=13&flags=16210313421097356768 (410.477248ms) from=127.0.0.1:35466
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.510922 [DEBUG] http: Request GET /v1/kv/test/prefix?consistent=&index=13&recurse= (448.258924ms) from=127.0.0.1:35458
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:50.754069 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=98ba548e-d6d5-a02d-8f38-a94aa4a74ae7 (233.476627ms) from=127.0.0.1:55236
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:50.758353 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (244.968702ms) from=127.0.0.1:55230
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:50.768879 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (999.035µs) from=127.0.0.1:55236
TestLockCommand_TryLock - 2019/11/27 02:27:50.778013 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (269.893921ms) from=127.0.0.1:38408
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:50.779772 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:50.779837 [DEBUG] agent: Node info in sync
TestLockCommand_TryLock - 2019/11/27 02:27:50.780998 [INFO] agent: Requesting shutdown
TestLockCommand_TryLock - 2019/11/27 02:27:50.781097 [INFO] consul: shutting down server
TestLockCommand_TryLock - 2019/11/27 02:27:50.781151 [WARN] serf: Shutdown without a Leave
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.784381 [DEBUG] http: Request DELETE /v1/kv/test/prefix/b2fb3948-76f6-8923-4652-d7b2cd41730f (271.096297ms) from=127.0.0.1:35466
TestLockCommand_TrySemaphore - 2019/11/27 02:27:50.790197 [DEBUG] http: Request GET /v1/kv/test/prefix?recurse= (884.365µs) from=127.0.0.1:35466
TestLockCommand_TryLock - 2019/11/27 02:27:50.862045 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestLockCommand_TryLock - 2019/11/27 02:27:50.862121 [DEBUG] agent: Node info in sync
TestLockCommand_TryLock - 2019/11/27 02:27:50.862198 [DEBUG] agent: Node info in sync
TestLockCommand_TryLock - 2019/11/27 02:27:50.975567 [WARN] serf: Shutdown without a Leave
TestLockCommand_TryLock - 2019/11/27 02:27:51.063141 [INFO] manager: shutting down
TestLockCommand_TryLock - 2019/11/27 02:27:51.066988 [INFO] agent: consul server down
TestLockCommand_TryLock - 2019/11/27 02:27:51.067059 [INFO] agent: shutdown complete
TestLockCommand_TryLock - 2019/11/27 02:27:51.067149 [INFO] agent: Stopping DNS server 127.0.0.1:17537 (tcp)
TestLockCommand_TryLock - 2019/11/27 02:27:51.067319 [INFO] agent: Stopping DNS server 127.0.0.1:17537 (udp)
TestLockCommand_TryLock - 2019/11/27 02:27:51.067485 [INFO] agent: Stopping HTTP server 127.0.0.1:17538 (tcp)
TestLockCommand_TryLock - 2019/11/27 02:27:51.068130 [ERR] consul.session: Apply failed: leadership lost while committing log
TestLockCommand_TryLock - 2019/11/27 02:27:51.068259 [ERR] http: Request PUT /v1/session/destroy/1374f5e0-aa8a-9065-05c2-8facd345aa0e, error: leadership lost while committing log from=127.0.0.1:38404
TestLockCommand_TryLock - 2019/11/27 02:27:51.069049 [DEBUG] http: Request PUT /v1/session/destroy/1374f5e0-aa8a-9065-05c2-8facd345aa0e (557.299131ms) from=127.0.0.1:38404
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.074522 [DEBUG] http: Request PUT /v1/session/destroy/98ba548e-d6d5-a02d-8f38-a94aa4a74ae7 (311.652738ms) from=127.0.0.1:55230
TestLockCommand_NoShell - 2019/11/27 02:27:51.252815 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestLockCommand_NoShell - 2019/11/27 02:27:51.253414 [DEBUG] consul: Skipping self join check for "Node f9454198-1691-6eb2-de8d-78c3aaed6fa6" since the cluster is too small
TestLockCommand_NoShell - 2019/11/27 02:27:51.253597 [INFO] consul: member 'Node f9454198-1691-6eb2-de8d-78c3aaed6fa6' joined, marking health alive
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.255289 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (483.612513ms) from=127.0.0.1:55236
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.257302 [INFO] agent: Requesting shutdown
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.257391 [INFO] consul: shutting down server
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.257457 [WARN] serf: Shutdown without a Leave
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.258438 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=14 (465.516203ms) from=127.0.0.1:35466
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.260546 [INFO] agent: Requesting shutdown
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.260635 [INFO] consul: shutting down server
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.260706 [WARN] serf: Shutdown without a Leave
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.261876 [DEBUG] http: Request PUT /v1/session/destroy/b2fb3948-76f6-8923-4652-d7b2cd41730f (474.730864ms) from=127.0.0.1:35458
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.353221 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.353848 [WARN] serf: Shutdown without a Leave
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.442034 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.442750 [INFO] agent: consul server down
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.442816 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.442871 [INFO] agent: Stopping DNS server 127.0.0.1:17531 (tcp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.443054 [INFO] agent: Stopping DNS server 127.0.0.1:17531 (udp)
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.443226 [INFO] agent: Stopping HTTP server 127.0.0.1:17532 (tcp)
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.443442 [INFO] manager: shutting down
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.443942 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.444084 [INFO] agent: consul server down
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.444139 [INFO] agent: shutdown complete
TestLockCommand_MonitorRetry_Lock_Default - 2019/11/27 02:27:51.444088 [INFO] agent: Endpoints down
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.444193 [INFO] agent: Stopping DNS server 127.0.0.1:17525 (tcp)
--- PASS: TestLockCommand_MonitorRetry_Lock_Default (5.55s)
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.444348 [INFO] agent: Stopping DNS server 127.0.0.1:17525 (udp)
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.444542 [INFO] agent: Stopping HTTP server 127.0.0.1:17526 (tcp)
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.445206 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_TrySemaphore - 2019/11/27 02:27:51.445353 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_TrySemaphore (5.83s)
TestLockCommand_NoShell - 2019/11/27 02:27:51.483704 [DEBUG] http: Request GET /v1/agent/self (17.416285ms) from=127.0.0.1:49754
TestLockCommand_TryLock - 2019/11/27 02:27:51.568055 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_TryLock - 2019/11/27 02:27:51.568133 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_TryLock (5.59s)
TestLockCommand_NoShell - 2019/11/27 02:27:51.720409 [DEBUG] http: Request PUT /v1/session/create (225.486009ms) from=127.0.0.1:49754
TestLockCommand_NoShell - 2019/11/27 02:27:51.724785 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?wait=15000ms (271.676µs) from=127.0.0.1:49754
TestLockCommand_NoShell - 2019/11/27 02:27:51.930828 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?acquire=becc5378-46b5-396e-929b-810766b1da94&flags=3304740253564472344 (203.08088ms) from=127.0.0.1:49754
TestLockCommand_NoShell - 2019/11/27 02:27:51.941379 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent= (3.528459ms) from=127.0.0.1:49754
TestLockCommand_NoShell - 2019/11/27 02:27:52.097841 [DEBUG] http: Request PUT /v1/kv/test/prefix/.lock?flags=3304740253564472344&release=becc5378-46b5-396e-929b-810766b1da94 (148.697615ms) from=127.0.0.1:49758
TestLockCommand_NoShell - 2019/11/27 02:27:52.099988 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock?consistent=&index=12 (153.790462ms) from=127.0.0.1:49754
TestLockCommand_NoShell - 2019/11/27 02:27:52.103535 [DEBUG] http: Request GET /v1/kv/test/prefix/.lock (1.104039ms) from=127.0.0.1:49754
TestLockCommand_NoShell - 2019/11/27 02:27:52.245343 [DEBUG] http: Request PUT /v1/session/destroy/becc5378-46b5-396e-929b-810766b1da94 (142.555396ms) from=127.0.0.1:49758
TestLockCommand_NoShell - 2019/11/27 02:27:52.477874 [DEBUG] http: Request DELETE /v1/kv/test/prefix/.lock?cas=13 (366.537685ms) from=127.0.0.1:49754
TestLockCommand_NoShell - 2019/11/27 02:27:52.480078 [INFO] agent: Requesting shutdown
TestLockCommand_NoShell - 2019/11/27 02:27:52.480216 [INFO] consul: shutting down server
TestLockCommand_NoShell - 2019/11/27 02:27:52.480343 [WARN] serf: Shutdown without a Leave
TestLockCommand_NoShell - 2019/11/27 02:27:53.145387 [WARN] serf: Shutdown without a Leave
TestLockCommand_NoShell - 2019/11/27 02:27:53.285203 [INFO] manager: shutting down
TestLockCommand_NoShell - 2019/11/27 02:27:53.286103 [INFO] agent: consul server down
TestLockCommand_NoShell - 2019/11/27 02:27:53.286181 [INFO] agent: shutdown complete
TestLockCommand_NoShell - 2019/11/27 02:27:53.286246 [INFO] agent: Stopping DNS server 127.0.0.1:17543 (tcp)
TestLockCommand_NoShell - 2019/11/27 02:27:53.286406 [INFO] agent: Stopping DNS server 127.0.0.1:17543 (udp)
TestLockCommand_NoShell - 2019/11/27 02:27:53.286576 [INFO] agent: Stopping HTTP server 127.0.0.1:17544 (tcp)
TestLockCommand_NoShell - 2019/11/27 02:27:53.287729 [INFO] agent: Waiting for endpoints to shut down
TestLockCommand_NoShell - 2019/11/27 02:27:53.287872 [INFO] agent: Endpoints down
--- PASS: TestLockCommand_NoShell (5.81s)
PASS
ok  	github.com/hashicorp/consul/command/lock	15.276s
=== RUN   TestMaintCommand_noTabs
=== PAUSE TestMaintCommand_noTabs
=== RUN   TestMaintCommand_ConflictingArgs
=== PAUSE TestMaintCommand_ConflictingArgs
=== RUN   TestMaintCommand_NoArgs
=== PAUSE TestMaintCommand_NoArgs
=== RUN   TestMaintCommand_EnableNodeMaintenance
=== PAUSE TestMaintCommand_EnableNodeMaintenance
=== RUN   TestMaintCommand_DisableNodeMaintenance
=== PAUSE TestMaintCommand_DisableNodeMaintenance
=== RUN   TestMaintCommand_EnableServiceMaintenance
=== PAUSE TestMaintCommand_EnableServiceMaintenance
=== RUN   TestMaintCommand_DisableServiceMaintenance
=== PAUSE TestMaintCommand_DisableServiceMaintenance
=== RUN   TestMaintCommand_ServiceMaintenance_NoService
=== PAUSE TestMaintCommand_ServiceMaintenance_NoService
=== CONT  TestMaintCommand_noTabs
=== CONT  TestMaintCommand_EnableServiceMaintenance
=== CONT  TestMaintCommand_ServiceMaintenance_NoService
=== CONT  TestMaintCommand_DisableServiceMaintenance
--- PASS: TestMaintCommand_noTabs (0.00s)
=== CONT  TestMaintCommand_DisableNodeMaintenance
WARNING: bootstrap = true: do not enable unless necessary
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:40.321007 [WARN] agent: Node name "Node cf0b6736-822e-58c7-d0a4-840b32214db1" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:40.322055 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:40.322153 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:40.322384 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:40.322512 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:40.312206 [WARN] agent: Node name "Node 2212cb5a-cd4d-8f68-2bbe-f99f849dc41a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:40.327320 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:40.327394 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:40.341261 [WARN] agent: Node name "Node 033518fa-f6ba-ec61-fd49-e309642f150a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:40.341778 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:40.341965 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:40.349051 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:40.349229 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:40.349507 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:40.349640 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:40.379091 [WARN] agent: Node name "Node 651c163c-e4ec-e078-2978-39e52b7d9b2f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:40.379680 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:40.379877 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:40.380244 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:40.380504 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:27:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:033518fa-f6ba-ec61-fd49-e309642f150a Address:127.0.0.1:34018}]
2019/11/27 02:27:42 [INFO]  raft: Node at 127.0.0.1:34018 [Follower] entering Follower state (Leader: "")
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:42.054947 [INFO] serf: EventMemberJoin: Node 033518fa-f6ba-ec61-fd49-e309642f150a.dc1 127.0.0.1
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:42.061221 [INFO] serf: EventMemberJoin: Node 033518fa-f6ba-ec61-fd49-e309642f150a 127.0.0.1
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:42.062498 [INFO] consul: Adding LAN server Node 033518fa-f6ba-ec61-fd49-e309642f150a (Addr: tcp/127.0.0.1:34018) (DC: dc1)
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:42.062842 [INFO] consul: Handled member-join event for server "Node 033518fa-f6ba-ec61-fd49-e309642f150a.dc1" in area "wan"
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:42.063438 [INFO] agent: Started DNS server 127.0.0.1:34013 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:42.063506 [INFO] agent: Started DNS server 127.0.0.1:34013 (udp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:42.066037 [INFO] agent: Started HTTP server on 127.0.0.1:34014 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:42.066443 [INFO] agent: started state syncer
2019/11/27 02:27:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:42 [INFO]  raft: Node at 127.0.0.1:34018 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cf0b6736-822e-58c7-d0a4-840b32214db1 Address:127.0.0.1:34006}]
2019/11/27 02:27:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:651c163c-e4ec-e078-2978-39e52b7d9b2f Address:127.0.0.1:34024}]
2019/11/27 02:27:42 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2212cb5a-cd4d-8f68-2bbe-f99f849dc41a Address:127.0.0.1:34012}]
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:42.167668 [INFO] serf: EventMemberJoin: Node cf0b6736-822e-58c7-d0a4-840b32214db1.dc1 127.0.0.1
2019/11/27 02:27:42 [INFO]  raft: Node at 127.0.0.1:34024 [Follower] entering Follower state (Leader: "")
2019/11/27 02:27:42 [INFO]  raft: Node at 127.0.0.1:34006 [Follower] entering Follower state (Leader: "")
2019/11/27 02:27:42 [INFO]  raft: Node at 127.0.0.1:34012 [Follower] entering Follower state (Leader: "")
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:42.186616 [INFO] serf: EventMemberJoin: Node 651c163c-e4ec-e078-2978-39e52b7d9b2f.dc1 127.0.0.1
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:42.191055 [INFO] serf: EventMemberJoin: Node cf0b6736-822e-58c7-d0a4-840b32214db1 127.0.0.1
2019/11/27 02:27:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:42 [INFO]  raft: Node at 127.0.0.1:34006 [Candidate] entering Candidate state in term 2
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:42.221207 [INFO] serf: EventMemberJoin: Node 651c163c-e4ec-e078-2978-39e52b7d9b2f 127.0.0.1
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:42.224600 [INFO] consul: Handled member-join event for server "Node cf0b6736-822e-58c7-d0a4-840b32214db1.dc1" in area "wan"
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:42.225271 [INFO] consul: Adding LAN server Node cf0b6736-822e-58c7-d0a4-840b32214db1 (Addr: tcp/127.0.0.1:34006) (DC: dc1)
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:42.227489 [INFO] consul: Adding LAN server Node 651c163c-e4ec-e078-2978-39e52b7d9b2f (Addr: tcp/127.0.0.1:34024) (DC: dc1)
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:42.228881 [INFO] serf: EventMemberJoin: Node 2212cb5a-cd4d-8f68-2bbe-f99f849dc41a.dc1 127.0.0.1
2019/11/27 02:27:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:42.226428 [INFO] agent: Started DNS server 127.0.0.1:34001 (udp)
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:42.231123 [INFO] agent: Started DNS server 127.0.0.1:34001 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:42.227856 [INFO] consul: Handled member-join event for server "Node 651c163c-e4ec-e078-2978-39e52b7d9b2f.dc1" in area "wan"
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:42.228437 [INFO] agent: Started DNS server 127.0.0.1:34019 (udp)
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:42.231278 [INFO] agent: Started DNS server 127.0.0.1:34019 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:42.233220 [INFO] agent: Started HTTP server on 127.0.0.1:34020 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:42.233327 [INFO] agent: started state syncer
2019/11/27 02:27:42 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:42 [INFO]  raft: Node at 127.0.0.1:34024 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:42 [INFO]  raft: Node at 127.0.0.1:34012 [Candidate] entering Candidate state in term 2
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:42.237392 [INFO] agent: Started HTTP server on 127.0.0.1:34002 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:42.237704 [INFO] agent: started state syncer
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:42.246449 [INFO] serf: EventMemberJoin: Node 2212cb5a-cd4d-8f68-2bbe-f99f849dc41a 127.0.0.1
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:42.248049 [INFO] consul: Adding LAN server Node 2212cb5a-cd4d-8f68-2bbe-f99f849dc41a (Addr: tcp/127.0.0.1:34012) (DC: dc1)
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:42.248344 [INFO] consul: Handled member-join event for server "Node 2212cb5a-cd4d-8f68-2bbe-f99f849dc41a.dc1" in area "wan"
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:42.249135 [INFO] agent: Started DNS server 127.0.0.1:34007 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:42.249210 [INFO] agent: Started DNS server 127.0.0.1:34007 (udp)
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:42.251122 [INFO] agent: Started HTTP server on 127.0.0.1:34008 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:42.251519 [INFO] agent: started state syncer
2019/11/27 02:27:43 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:43 [INFO]  raft: Node at 127.0.0.1:34018 [Leader] entering Leader state
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:43.376302 [INFO] consul: cluster leadership acquired
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:43.376866 [INFO] consul: New leader elected: Node 033518fa-f6ba-ec61-fd49-e309642f150a
2019/11/27 02:27:43 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:43 [INFO]  raft: Node at 127.0.0.1:34012 [Leader] entering Leader state
2019/11/27 02:27:43 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:43 [INFO]  raft: Node at 127.0.0.1:34024 [Leader] entering Leader state
2019/11/27 02:27:43 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:43 [INFO]  raft: Node at 127.0.0.1:34006 [Leader] entering Leader state
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.489108 [INFO] consul: cluster leadership acquired
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.489552 [INFO] consul: New leader elected: Node 651c163c-e4ec-e078-2978-39e52b7d9b2f
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:43.489813 [INFO] consul: cluster leadership acquired
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:43.490279 [INFO] consul: New leader elected: Node cf0b6736-822e-58c7-d0a4-840b32214db1
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:43.490509 [INFO] consul: cluster leadership acquired
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:43.490872 [INFO] consul: New leader elected: Node 2212cb5a-cd4d-8f68-2bbe-f99f849dc41a
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:43.778519 [INFO] agent: Synced node info
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:43.805224 [DEBUG] http: Request GET /v1/agent/self (207.787718ms) from=127.0.0.1:55318
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:43.823867 [DEBUG] http: Request PUT /v1/agent/service/maintenance/redis?enable=true&reason=broken (802.362µs) from=127.0.0.1:55318
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:43.825364 [INFO] agent: Requesting shutdown
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:43.825548 [INFO] consul: shutting down server
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:43.825683 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:43.942703 [INFO] agent: Synced node info
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:43.954065 [DEBUG] http: Request GET /v1/agent/self (382.436593ms) from=127.0.0.1:59536
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:43.963476 [INFO] agent: Synced service "test"
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:43.963577 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:43.963673 [DEBUG] agent: Service "test" in sync
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:43.963721 [DEBUG] agent: Node info in sync
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:43.961780 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.954761 [INFO] agent: Synced service "test"
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.969158 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.969547 [DEBUG] agent: Service "test" in sync
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.969715 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:43.969817 [DEBUG] http: Request GET /v1/agent/self (404.371372ms) from=127.0.0.1:47206
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.974173 [DEBUG] http: Request GET /v1/agent/self (320.129711ms) from=127.0.0.1:58732
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:43.987428 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:43.987560 [DEBUG] http: Request PUT /v1/agent/maintenance?enable=false (139.672µs) from=127.0.0.1:59536
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.987450 [DEBUG] agent: Service "test" in sync
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.987934 [DEBUG] agent: Node info in sync
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.988003 [DEBUG] http: Request PUT /v1/agent/service/maintenance/test?enable=false (588.021µs) from=127.0.0.1:58732
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.988506 [INFO] agent: Requesting shutdown
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.988563 [INFO] consul: shutting down server
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:43.988603 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:43.991324 [INFO] agent: Requesting shutdown
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:43.991399 [INFO] consul: shutting down server
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:43.991442 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.121530 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:44.123970 [INFO] agent: Service "test" entered maintenance mode
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:44.124076 [DEBUG] agent: Service "test" in sync
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:44.125704 [INFO] manager: shutting down
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.127825 [WARN] serf: Shutdown without a Leave
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.215736 [INFO] manager: shutting down
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.217857 [INFO] manager: shutting down
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:44.218319 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:44.219104 [INFO] agent: consul server down
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:44.219299 [INFO] agent: shutdown complete
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:44.220414 [INFO] agent: Stopping DNS server 127.0.0.1:34013 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:44.220990 [INFO] agent: Stopping DNS server 127.0.0.1:34013 (udp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:44.221661 [INFO] agent: Stopping HTTP server 127.0.0.1:34014 (tcp)
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:44.223284 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_ServiceMaintenance_NoService - 2019/11/27 02:27:44.223530 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_ServiceMaintenance_NoService (4.06s)
=== CONT  TestMaintCommand_NoArgs
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_NoArgs - 2019/11/27 02:27:44.329756 [WARN] agent: Node name "Node 84bc0f61-89d8-3b01-257c-9aee79cfa4b0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_NoArgs - 2019/11/27 02:27:44.330277 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_NoArgs - 2019/11/27 02:27:44.330356 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_NoArgs - 2019/11/27 02:27:44.330571 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMaintCommand_NoArgs - 2019/11/27 02:27:44.330713 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.530394 [INFO] agent: consul server down
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.530473 [INFO] agent: shutdown complete
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.530554 [INFO] agent: Stopping DNS server 127.0.0.1:34007 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.530693 [INFO] agent: Stopping DNS server 127.0.0.1:34007 (udp)
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.530853 [INFO] agent: Stopping HTTP server 127.0.0.1:34008 (tcp)
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.531327 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.531422 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestMaintCommand_DisableNodeMaintenance - 2019/11/27 02:27:44.531590 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_DisableNodeMaintenance (4.37s)
=== CONT  TestMaintCommand_EnableNodeMaintenance
WARNING: bootstrap = true: do not enable unless necessary
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:44.754990 [WARN] agent: Node name "Node 487f0702-0f75-64ad-ffe1-11eb9a113dfd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:44.755677 [DEBUG] tlsutil: Update with version 1
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:44.755855 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:44.756089 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:44.756274 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.821216 [INFO] agent: consul server down
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.821302 [INFO] agent: shutdown complete
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.821365 [INFO] agent: Stopping DNS server 127.0.0.1:34019 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.821565 [INFO] agent: Stopping DNS server 127.0.0.1:34019 (udp)
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.821850 [INFO] agent: Stopping HTTP server 127.0.0.1:34020 (tcp)
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.823087 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.823302 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_DisableServiceMaintenance (4.66s)
=== CONT  TestMaintCommand_ConflictingArgs
TestMaintCommand_DisableServiceMaintenance - 2019/11/27 02:27:44.823567 [ERR] consul: failed to establish leadership: leadership lost while committing log
--- PASS: TestMaintCommand_ConflictingArgs (0.00s)
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:44.978179 [INFO] agent: Synced check "_service_maintenance:test"
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:44.978253 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:44.978339 [DEBUG] http: Request PUT /v1/agent/service/maintenance/test?enable=true&reason=broken (990.289194ms) from=127.0.0.1:47206
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:44.978917 [DEBUG] agent: Service "test" in sync
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:44.979657 [INFO] agent: Requesting shutdown
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:44.979730 [INFO] consul: shutting down server
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:44.979773 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.086487 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.163441 [INFO] manager: shutting down
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.163680 [WARN] agent: Syncing check "_service_maintenance:test" failed. raft is already shutdown
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.163764 [ERR] agent: failed to sync remote state: raft is already shutdown
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.167679 [INFO] agent: consul server down
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.167802 [INFO] agent: shutdown complete
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.167882 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.168133 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (udp)
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.168357 [INFO] agent: Stopping HTTP server 127.0.0.1:34002 (tcp)
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.169248 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.169459 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestMaintCommand_EnableServiceMaintenance - 2019/11/27 02:27:45.170120 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_EnableServiceMaintenance (5.01s)
2019/11/27 02:27:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:84bc0f61-89d8-3b01-257c-9aee79cfa4b0 Address:127.0.0.1:34030}]
2019/11/27 02:27:45 [INFO]  raft: Node at 127.0.0.1:34030 [Follower] entering Follower state (Leader: "")
TestMaintCommand_NoArgs - 2019/11/27 02:27:45.615976 [INFO] serf: EventMemberJoin: Node 84bc0f61-89d8-3b01-257c-9aee79cfa4b0.dc1 127.0.0.1
TestMaintCommand_NoArgs - 2019/11/27 02:27:45.632059 [INFO] serf: EventMemberJoin: Node 84bc0f61-89d8-3b01-257c-9aee79cfa4b0 127.0.0.1
TestMaintCommand_NoArgs - 2019/11/27 02:27:45.633847 [INFO] agent: Started DNS server 127.0.0.1:34025 (udp)
TestMaintCommand_NoArgs - 2019/11/27 02:27:45.634484 [INFO] consul: Adding LAN server Node 84bc0f61-89d8-3b01-257c-9aee79cfa4b0 (Addr: tcp/127.0.0.1:34030) (DC: dc1)
TestMaintCommand_NoArgs - 2019/11/27 02:27:45.634772 [INFO] consul: Handled member-join event for server "Node 84bc0f61-89d8-3b01-257c-9aee79cfa4b0.dc1" in area "wan"
TestMaintCommand_NoArgs - 2019/11/27 02:27:45.635359 [INFO] agent: Started DNS server 127.0.0.1:34025 (tcp)
TestMaintCommand_NoArgs - 2019/11/27 02:27:45.638053 [INFO] agent: Started HTTP server on 127.0.0.1:34026 (tcp)
TestMaintCommand_NoArgs - 2019/11/27 02:27:45.638243 [INFO] agent: started state syncer
2019/11/27 02:27:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:45 [INFO]  raft: Node at 127.0.0.1:34030 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:45 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:487f0702-0f75-64ad-ffe1-11eb9a113dfd Address:127.0.0.1:34036}]
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:45.785863 [INFO] serf: EventMemberJoin: Node 487f0702-0f75-64ad-ffe1-11eb9a113dfd.dc1 127.0.0.1
2019/11/27 02:27:45 [INFO]  raft: Node at 127.0.0.1:34036 [Follower] entering Follower state (Leader: "")
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:45.800135 [INFO] serf: EventMemberJoin: Node 487f0702-0f75-64ad-ffe1-11eb9a113dfd 127.0.0.1
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:45.804305 [INFO] consul: Adding LAN server Node 487f0702-0f75-64ad-ffe1-11eb9a113dfd (Addr: tcp/127.0.0.1:34036) (DC: dc1)
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:45.806572 [INFO] consul: Handled member-join event for server "Node 487f0702-0f75-64ad-ffe1-11eb9a113dfd.dc1" in area "wan"
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:45.811507 [INFO] agent: Started DNS server 127.0.0.1:34031 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:45.811602 [INFO] agent: Started DNS server 127.0.0.1:34031 (udp)
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:45.817158 [INFO] agent: Started HTTP server on 127.0.0.1:34032 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:45.817627 [INFO] agent: started state syncer
2019/11/27 02:27:45 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:45 [INFO]  raft: Node at 127.0.0.1:34036 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:46 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:46 [INFO]  raft: Node at 127.0.0.1:34030 [Leader] entering Leader state
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.279450 [INFO] consul: cluster leadership acquired
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.279936 [INFO] consul: New leader elected: Node 84bc0f61-89d8-3b01-257c-9aee79cfa4b0
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.380674 [INFO] agent: Service "test" entered maintenance mode
2019/11/27 02:27:46 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:46 [INFO]  raft: Node at 127.0.0.1:34036 [Leader] entering Leader state
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:46.453607 [INFO] consul: cluster leadership acquired
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:46.454226 [INFO] consul: New leader elected: Node 487f0702-0f75-64ad-ffe1-11eb9a113dfd
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.458419 [INFO] agent: Node entered maintenance mode
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.610433 [INFO] agent: Synced service "test"
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.610873 [DEBUG] agent: Check "_service_maintenance:test" in sync
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:46.769873 [INFO] agent: Synced node info
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:46.769997 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:46.775140 [DEBUG] http: Request GET /v1/agent/self (126.119815ms) from=127.0.0.1:58686
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.844510 [INFO] agent: Synced check "_node_maintenance"
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.844588 [DEBUG] agent: Node info in sync
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.844734 [DEBUG] agent: Service "test" in sync
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.844791 [DEBUG] agent: Check "_service_maintenance:test" in sync
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.844833 [DEBUG] agent: Check "_node_maintenance" in sync
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.844875 [DEBUG] agent: Node info in sync
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.854209 [DEBUG] http: Request GET /v1/agent/self (387.737778ms) from=127.0.0.1:51544
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.874242 [DEBUG] http: Request GET /v1/agent/checks (618.356µs) from=127.0.0.1:51544
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.876882 [INFO] agent: Requesting shutdown
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.876982 [INFO] consul: shutting down server
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.877031 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:46.919645 [INFO] agent: Node entered maintenance mode
TestMaintCommand_NoArgs - 2019/11/27 02:27:46.996622 [WARN] serf: Shutdown without a Leave
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.093655 [INFO] manager: shutting down
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.094986 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.094999 [INFO] agent: consul server down
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.095261 [INFO] agent: shutdown complete
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.095325 [ERR] consul: failed to establish leadership: raft is already shutdown
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.095327 [INFO] agent: Stopping DNS server 127.0.0.1:34025 (tcp)
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.095554 [INFO] agent: Stopping DNS server 127.0.0.1:34025 (udp)
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.095732 [INFO] agent: Stopping HTTP server 127.0.0.1:34026 (tcp)
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.096252 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_NoArgs - 2019/11/27 02:27:47.096343 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_NoArgs (2.87s)
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.409289 [INFO] agent: Synced check "_node_maintenance"
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.409366 [DEBUG] agent: Node info in sync
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.409440 [DEBUG] http: Request PUT /v1/agent/maintenance?enable=true&reason=broken (622.303778ms) from=127.0.0.1:58686
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.411920 [INFO] agent: Requesting shutdown
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.412016 [INFO] consul: shutting down server
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.412066 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.475699 [WARN] serf: Shutdown without a Leave
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.552295 [INFO] manager: shutting down
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.553229 [INFO] agent: consul server down
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.553294 [INFO] agent: shutdown complete
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.553366 [INFO] agent: Stopping DNS server 127.0.0.1:34031 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.553574 [INFO] agent: Stopping DNS server 127.0.0.1:34031 (udp)
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.553792 [INFO] agent: Stopping HTTP server 127.0.0.1:34032 (tcp)
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.554404 [INFO] agent: Waiting for endpoints to shut down
TestMaintCommand_EnableNodeMaintenance - 2019/11/27 02:27:47.554518 [INFO] agent: Endpoints down
--- PASS: TestMaintCommand_EnableNodeMaintenance (3.02s)
PASS
ok  	github.com/hashicorp/consul/command/maint	7.573s
=== RUN   TestMembersCommand_noTabs
=== PAUSE TestMembersCommand_noTabs
=== RUN   TestMembersCommand
=== PAUSE TestMembersCommand
=== RUN   TestMembersCommand_WAN
=== PAUSE TestMembersCommand_WAN
=== RUN   TestMembersCommand_statusFilter
=== PAUSE TestMembersCommand_statusFilter
=== RUN   TestMembersCommand_statusFilter_failed
=== PAUSE TestMembersCommand_statusFilter_failed
=== CONT  TestMembersCommand_noTabs
=== CONT  TestMembersCommand_statusFilter
=== CONT  TestMembersCommand_statusFilter_failed
=== CONT  TestMembersCommand
=== CONT  TestMembersCommand_WAN
--- PASS: TestMembersCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand_statusFilter - 2019/11/27 02:27:48.877531 [WARN] agent: Node name "Node 01fbdfe8-4552-5d09-c863-acce0ed6373b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand_statusFilter - 2019/11/27 02:27:48.878769 [DEBUG] tlsutil: Update with version 1
TestMembersCommand_statusFilter - 2019/11/27 02:27:48.878928 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMembersCommand_statusFilter - 2019/11/27 02:27:48.879454 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMembersCommand_statusFilter - 2019/11/27 02:27:48.879628 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand_WAN - 2019/11/27 02:27:48.903178 [WARN] agent: Node name "Node e6b7a178-66ff-93eb-dc92-6a76a28d9af6" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand_WAN - 2019/11/27 02:27:48.903679 [DEBUG] tlsutil: Update with version 1
TestMembersCommand_WAN - 2019/11/27 02:27:48.903848 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMembersCommand_WAN - 2019/11/27 02:27:48.904119 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMembersCommand_WAN - 2019/11/27 02:27:48.904375 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand - 2019/11/27 02:27:48.942918 [WARN] agent: Node name "Node 477069ea-bb50-352e-f3c0-63cb762d83c3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand - 2019/11/27 02:27:48.943387 [DEBUG] tlsutil: Update with version 1
TestMembersCommand - 2019/11/27 02:27:48.943461 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMembersCommand - 2019/11/27 02:27:48.943635 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMembersCommand - 2019/11/27 02:27:48.943760 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:48.951022 [WARN] agent: Node name "Node ce00427f-595b-957f-2e73-2ca31dec6704" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:48.951842 [DEBUG] tlsutil: Update with version 1
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:48.952355 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:48.953690 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:48.953945 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:27:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:477069ea-bb50-352e-f3c0-63cb762d83c3 Address:127.0.0.1:52012}]
2019/11/27 02:27:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:01fbdfe8-4552-5d09-c863-acce0ed6373b Address:127.0.0.1:52006}]
2019/11/27 02:27:50 [INFO]  raft: Node at 127.0.0.1:52012 [Follower] entering Follower state (Leader: "")
2019/11/27 02:27:50 [INFO]  raft: Node at 127.0.0.1:52006 [Follower] entering Follower state (Leader: "")
2019/11/27 02:27:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e6b7a178-66ff-93eb-dc92-6a76a28d9af6 Address:127.0.0.1:52024}]
TestMembersCommand_statusFilter - 2019/11/27 02:27:50.130283 [INFO] serf: EventMemberJoin: Node 01fbdfe8-4552-5d09-c863-acce0ed6373b.dc1 127.0.0.1
2019/11/27 02:27:50 [INFO]  raft: Node at 127.0.0.1:52024 [Follower] entering Follower state (Leader: "")
TestMembersCommand_WAN - 2019/11/27 02:27:50.130683 [INFO] serf: EventMemberJoin: Node e6b7a178-66ff-93eb-dc92-6a76a28d9af6.dc1 127.0.0.1
2019/11/27 02:27:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ce00427f-595b-957f-2e73-2ca31dec6704 Address:127.0.0.1:52018}]
2019/11/27 02:27:50 [INFO]  raft: Node at 127.0.0.1:52018 [Follower] entering Follower state (Leader: "")
TestMembersCommand - 2019/11/27 02:27:50.132932 [INFO] serf: EventMemberJoin: Node 477069ea-bb50-352e-f3c0-63cb762d83c3.dc1 127.0.0.1
TestMembersCommand - 2019/11/27 02:27:50.142368 [INFO] serf: EventMemberJoin: Node 477069ea-bb50-352e-f3c0-63cb762d83c3 127.0.0.1
TestMembersCommand_WAN - 2019/11/27 02:27:50.144429 [INFO] serf: EventMemberJoin: Node e6b7a178-66ff-93eb-dc92-6a76a28d9af6 127.0.0.1
TestMembersCommand - 2019/11/27 02:27:50.146052 [INFO] consul: Adding LAN server Node 477069ea-bb50-352e-f3c0-63cb762d83c3 (Addr: tcp/127.0.0.1:52012) (DC: dc1)
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:50.155390 [INFO] serf: EventMemberJoin: Node ce00427f-595b-957f-2e73-2ca31dec6704.dc1 127.0.0.1
TestMembersCommand - 2019/11/27 02:27:50.155017 [INFO] consul: Handled member-join event for server "Node 477069ea-bb50-352e-f3c0-63cb762d83c3.dc1" in area "wan"
2019/11/27 02:27:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:50 [INFO]  raft: Node at 127.0.0.1:52018 [Candidate] entering Candidate state in term 2
TestMembersCommand_WAN - 2019/11/27 02:27:50.195053 [INFO] consul: Adding LAN server Node e6b7a178-66ff-93eb-dc92-6a76a28d9af6 (Addr: tcp/127.0.0.1:52024) (DC: dc1)
TestMembersCommand_WAN - 2019/11/27 02:27:50.195659 [INFO] consul: Handled member-join event for server "Node e6b7a178-66ff-93eb-dc92-6a76a28d9af6.dc1" in area "wan"
TestMembersCommand_statusFilter - 2019/11/27 02:27:50.195792 [INFO] serf: EventMemberJoin: Node 01fbdfe8-4552-5d09-c863-acce0ed6373b 127.0.0.1
2019/11/27 02:27:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:50 [INFO]  raft: Node at 127.0.0.1:52006 [Candidate] entering Candidate state in term 2
TestMembersCommand_statusFilter - 2019/11/27 02:27:50.202911 [INFO] consul: Adding LAN server Node 01fbdfe8-4552-5d09-c863-acce0ed6373b (Addr: tcp/127.0.0.1:52006) (DC: dc1)
TestMembersCommand_statusFilter - 2019/11/27 02:27:50.203171 [INFO] consul: Handled member-join event for server "Node 01fbdfe8-4552-5d09-c863-acce0ed6373b.dc1" in area "wan"
TestMembersCommand_statusFilter - 2019/11/27 02:27:50.203679 [INFO] agent: Started DNS server 127.0.0.1:52001 (tcp)
2019/11/27 02:27:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:50 [INFO]  raft: Node at 127.0.0.1:52024 [Candidate] entering Candidate state in term 2
TestMembersCommand_statusFilter - 2019/11/27 02:27:50.205862 [INFO] agent: Started DNS server 127.0.0.1:52001 (udp)
TestMembersCommand - 2019/11/27 02:27:50.207449 [INFO] agent: Started DNS server 127.0.0.1:52007 (udp)
TestMembersCommand - 2019/11/27 02:27:50.207802 [INFO] agent: Started DNS server 127.0.0.1:52007 (tcp)
TestMembersCommand_statusFilter - 2019/11/27 02:27:50.208208 [INFO] agent: Started HTTP server on 127.0.0.1:52002 (tcp)
TestMembersCommand_statusFilter - 2019/11/27 02:27:50.208323 [INFO] agent: started state syncer
2019/11/27 02:27:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:50 [INFO]  raft: Node at 127.0.0.1:52012 [Candidate] entering Candidate state in term 2
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:50.213809 [INFO] serf: EventMemberJoin: Node ce00427f-595b-957f-2e73-2ca31dec6704 127.0.0.1
TestMembersCommand_WAN - 2019/11/27 02:27:50.214680 [INFO] agent: Started DNS server 127.0.0.1:52019 (udp)
TestMembersCommand_WAN - 2019/11/27 02:27:50.215018 [INFO] agent: Started DNS server 127.0.0.1:52019 (tcp)
TestMembersCommand - 2019/11/27 02:27:50.215200 [INFO] agent: Started HTTP server on 127.0.0.1:52008 (tcp)
TestMembersCommand - 2019/11/27 02:27:50.215704 [INFO] agent: started state syncer
TestMembersCommand_WAN - 2019/11/27 02:27:50.217073 [INFO] agent: Started HTTP server on 127.0.0.1:52020 (tcp)
TestMembersCommand_WAN - 2019/11/27 02:27:50.217186 [INFO] agent: started state syncer
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:50.217983 [INFO] consul: Adding LAN server Node ce00427f-595b-957f-2e73-2ca31dec6704 (Addr: tcp/127.0.0.1:52018) (DC: dc1)
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:50.218221 [INFO] consul: Handled member-join event for server "Node ce00427f-595b-957f-2e73-2ca31dec6704.dc1" in area "wan"
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:50.218677 [INFO] agent: Started DNS server 127.0.0.1:52013 (udp)
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:50.218743 [INFO] agent: Started DNS server 127.0.0.1:52013 (tcp)
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:50.220679 [INFO] agent: Started HTTP server on 127.0.0.1:52014 (tcp)
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:50.220765 [INFO] agent: started state syncer
2019/11/27 02:27:51 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:51 [INFO]  raft: Node at 127.0.0.1:52024 [Leader] entering Leader state
2019/11/27 02:27:51 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:51 [INFO]  raft: Node at 127.0.0.1:52018 [Leader] entering Leader state
TestMembersCommand_WAN - 2019/11/27 02:27:51.077008 [INFO] consul: cluster leadership acquired
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.077060 [INFO] consul: cluster leadership acquired
TestMembersCommand_WAN - 2019/11/27 02:27:51.077643 [INFO] consul: New leader elected: Node e6b7a178-66ff-93eb-dc92-6a76a28d9af6
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.077789 [INFO] consul: New leader elected: Node ce00427f-595b-957f-2e73-2ca31dec6704
2019/11/27 02:27:51 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:51 [INFO]  raft: Node at 127.0.0.1:52006 [Leader] entering Leader state
2019/11/27 02:27:51 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:51 [INFO]  raft: Node at 127.0.0.1:52012 [Leader] entering Leader state
TestMembersCommand - 2019/11/27 02:27:51.164746 [INFO] consul: cluster leadership acquired
TestMembersCommand - 2019/11/27 02:27:51.165216 [INFO] consul: New leader elected: Node 477069ea-bb50-352e-f3c0-63cb762d83c3
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.165605 [INFO] consul: cluster leadership acquired
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.166178 [INFO] consul: New leader elected: Node 01fbdfe8-4552-5d09-c863-acce0ed6373b
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.264007 [DEBUG] http: Request GET /v1/agent/members?segment=_all (3.673797ms) from=127.0.0.1:60050
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.271494 [INFO] agent: Requesting shutdown
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.271597 [INFO] consul: shutting down server
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.271647 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.273112 [ERR] agent: failed to sync remote state: No cluster leader
TestMembersCommand - 2019/11/27 02:27:51.400601 [DEBUG] http: Request GET /v1/agent/members?segment=_all (1.454385ms) from=127.0.0.1:58798
TestMembersCommand - 2019/11/27 02:27:51.404168 [INFO] agent: Requesting shutdown
TestMembersCommand - 2019/11/27 02:27:51.404278 [INFO] consul: shutting down server
TestMembersCommand - 2019/11/27 02:27:51.404359 [WARN] serf: Shutdown without a Leave
TestMembersCommand - 2019/11/27 02:27:51.404539 [ERR] agent: failed to sync remote state: No cluster leader
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.440765 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.452542 [DEBUG] http: Request GET /v1/agent/members?segment=_all (5.268521ms) from=127.0.0.1:49440
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.455063 [INFO] agent: Requesting shutdown
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.455165 [INFO] consul: shutting down server
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.455212 [WARN] serf: Shutdown without a Leave
TestMembersCommand_WAN - 2019/11/27 02:27:51.476666 [DEBUG] http: Request GET /v1/agent/members?segment=_all&wan=1 (916.032µs) from=127.0.0.1:40924
TestMembersCommand_WAN - 2019/11/27 02:27:51.480332 [INFO] agent: Requesting shutdown
TestMembersCommand_WAN - 2019/11/27 02:27:51.480459 [INFO] consul: shutting down server
TestMembersCommand_WAN - 2019/11/27 02:27:51.480523 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.547666 [WARN] serf: Shutdown without a Leave
TestMembersCommand - 2019/11/27 02:27:51.547666 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.549505 [INFO] manager: shutting down
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.550839 [INFO] agent: Synced node info
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.590221 [DEBUG] agent: Node info in sync
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.590350 [DEBUG] agent: Node info in sync
TestMembersCommand_WAN - 2019/11/27 02:27:51.630833 [WARN] serf: Shutdown without a Leave
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.630969 [INFO] manager: shutting down
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.632329 [INFO] agent: consul server down
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.632391 [INFO] agent: shutdown complete
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.632442 [INFO] agent: Stopping DNS server 127.0.0.1:52013 (tcp)
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.632587 [INFO] agent: Stopping DNS server 127.0.0.1:52013 (udp)
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.632742 [INFO] agent: Stopping HTTP server 127.0.0.1:52014 (tcp)
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.633225 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.633323 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.634012 [ERR] consul: failed to establish leadership: raft is already shutdown
TestMembersCommand - 2019/11/27 02:27:51.634590 [INFO] manager: shutting down
TestMembersCommand_statusFilter_failed - 2019/11/27 02:27:51.634668 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand_statusFilter_failed (2.88s)
TestMembersCommand_WAN - 2019/11/27 02:27:51.635591 [INFO] agent: Synced node info
TestMembersCommand - 2019/11/27 02:27:51.718722 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestMembersCommand - 2019/11/27 02:27:51.719140 [ERR] consul: failed to establish leadership: raft is already shutdown
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.719550 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestMembersCommand - 2019/11/27 02:27:51.719872 [INFO] agent: consul server down
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.719922 [ERR] consul: failed to establish leadership: raft is already shutdown
TestMembersCommand - 2019/11/27 02:27:51.719948 [INFO] agent: shutdown complete
TestMembersCommand - 2019/11/27 02:27:51.720115 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (tcp)
TestMembersCommand - 2019/11/27 02:27:51.720352 [INFO] agent: Stopping DNS server 127.0.0.1:52007 (udp)
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.720352 [INFO] agent: consul server down
TestMembersCommand - 2019/11/27 02:27:51.720556 [INFO] agent: Stopping HTTP server 127.0.0.1:52008 (tcp)
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.720577 [INFO] agent: shutdown complete
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.720628 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (tcp)
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.720780 [INFO] agent: Stopping DNS server 127.0.0.1:52001 (udp)
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.720952 [INFO] agent: Stopping HTTP server 127.0.0.1:52002 (tcp)
TestMembersCommand - 2019/11/27 02:27:51.721136 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand - 2019/11/27 02:27:51.721500 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand (2.97s)
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.722558 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand_statusFilter - 2019/11/27 02:27:51.722699 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand_statusFilter (2.97s)
TestMembersCommand_WAN - 2019/11/27 02:27:51.727408 [INFO] manager: shutting down
TestMembersCommand_WAN - 2019/11/27 02:27:51.796756 [INFO] agent: consul server down
TestMembersCommand_WAN - 2019/11/27 02:27:51.796870 [INFO] agent: shutdown complete
TestMembersCommand_WAN - 2019/11/27 02:27:51.796937 [INFO] agent: Stopping DNS server 127.0.0.1:52019 (tcp)
TestMembersCommand_WAN - 2019/11/27 02:27:51.797125 [INFO] agent: Stopping DNS server 127.0.0.1:52019 (udp)
TestMembersCommand_WAN - 2019/11/27 02:27:51.797380 [INFO] agent: Stopping HTTP server 127.0.0.1:52020 (tcp)
TestMembersCommand_WAN - 2019/11/27 02:27:51.797938 [INFO] agent: Waiting for endpoints to shut down
TestMembersCommand_WAN - 2019/11/27 02:27:51.798120 [INFO] agent: Endpoints down
--- PASS: TestMembersCommand_WAN (3.04s)
TestMembersCommand_WAN - 2019/11/27 02:27:51.798223 [ERR] autopilot: failed to initialize config: leadership lost while committing log
PASS
ok  	github.com/hashicorp/consul/command/members	3.322s
=== RUN   TestMonitorCommand_exitsOnSignalBeforeLinesArrive
=== PAUSE TestMonitorCommand_exitsOnSignalBeforeLinesArrive
=== CONT  TestMonitorCommand_exitsOnSignalBeforeLinesArrive
WARNING: bootstrap = true: do not enable unless necessary
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:58.769070 [WARN] agent: Node name "Node 3fac97b3-0897-a392-dfe3-ad810afeb43a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:58.770055 [DEBUG] tlsutil: Update with version 1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:58.770137 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:58.770435 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:58.770553 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:27:59 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3fac97b3-0897-a392-dfe3-ad810afeb43a Address:127.0.0.1:19006}]
2019/11/27 02:27:59 [INFO]  raft: Node at 127.0.0.1:19006 [Follower] entering Follower state (Leader: "")
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.468038 [INFO] serf: EventMemberJoin: Node 3fac97b3-0897-a392-dfe3-ad810afeb43a.dc1 127.0.0.1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.472945 [INFO] serf: EventMemberJoin: Node 3fac97b3-0897-a392-dfe3-ad810afeb43a 127.0.0.1
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.475422 [INFO] consul: Adding LAN server Node 3fac97b3-0897-a392-dfe3-ad810afeb43a (Addr: tcp/127.0.0.1:19006) (DC: dc1)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.475892 [INFO] consul: Handled member-join event for server "Node 3fac97b3-0897-a392-dfe3-ad810afeb43a.dc1" in area "wan"
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.480064 [INFO] agent: Started DNS server 127.0.0.1:19001 (udp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.480585 [INFO] agent: Started DNS server 127.0.0.1:19001 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.482762 [INFO] agent: Started HTTP server on 127.0.0.1:19002 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.489246 [INFO] agent: started state syncer
2019/11/27 02:27:59 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:27:59 [INFO]  raft: Node at 127.0.0.1:19006 [Candidate] entering Candidate state in term 2
2019/11/27 02:27:59 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:27:59 [INFO]  raft: Node at 127.0.0.1:19006 [Leader] entering Leader state
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.932192 [INFO] consul: cluster leadership acquired
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.932642 [INFO] consul: New leader elected: Node 3fac97b3-0897-a392-dfe3-ad810afeb43a
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.952082 [INFO] agent: Requesting shutdown
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.952256 [INFO] consul: shutting down server
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.952337 [WARN] serf: Shutdown without a Leave
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:27:59.952721 [ERR] agent: failed to sync remote state: No cluster leader
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:00.073575 [WARN] serf: Shutdown without a Leave
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:00.195889 [INFO] manager: shutting down
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:00.318076 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:00.318361 [INFO] agent: consul server down
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:00.318424 [INFO] agent: shutdown complete
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:00.318484 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:00.318698 [INFO] agent: Stopping DNS server 127.0.0.1:19001 (udp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:00.318882 [INFO] agent: Stopping HTTP server 127.0.0.1:19002 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:01.319337 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:19002 (tcp)
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:01.319427 [INFO] agent: Waiting for endpoints to shut down
TestMonitorCommand_exitsOnSignalBeforeLinesArrive - 2019/11/27 02:28:01.319472 [INFO] agent: Endpoints down
--- PASS: TestMonitorCommand_exitsOnSignalBeforeLinesArrive (2.63s)
PASS
ok  	github.com/hashicorp/consul/command/monitor	2.772s
=== RUN   TestOperatorCommand_noTabs
=== PAUSE TestOperatorCommand_noTabs
=== CONT  TestOperatorCommand_noTabs
--- PASS: TestOperatorCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/operator	0.046s
=== RUN   TestOperatorAutopilotCommand_noTabs
=== PAUSE TestOperatorAutopilotCommand_noTabs
=== CONT  TestOperatorAutopilotCommand_noTabs
--- PASS: TestOperatorAutopilotCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/operator/autopilot	0.040s
=== RUN   TestOperatorAutopilotGetConfigCommand_noTabs
=== PAUSE TestOperatorAutopilotGetConfigCommand_noTabs
=== RUN   TestOperatorAutopilotGetConfigCommand
=== PAUSE TestOperatorAutopilotGetConfigCommand
=== CONT  TestOperatorAutopilotGetConfigCommand_noTabs
=== CONT  TestOperatorAutopilotGetConfigCommand
--- PASS: TestOperatorAutopilotGetConfigCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:37.552055 [WARN] agent: Node name "Node 531bc3ae-6bf8-227c-df45-583c852a70bc" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:37.554169 [DEBUG] tlsutil: Update with version 1
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:37.555169 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:37.556520 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:37.557638 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:28:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:531bc3ae-6bf8-227c-df45-583c852a70bc Address:127.0.0.1:43006}]
2019/11/27 02:28:39 [INFO]  raft: Node at 127.0.0.1:43006 [Follower] entering Follower state (Leader: "")
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:39.420825 [INFO] serf: EventMemberJoin: Node 531bc3ae-6bf8-227c-df45-583c852a70bc.dc1 127.0.0.1
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:39.424462 [INFO] serf: EventMemberJoin: Node 531bc3ae-6bf8-227c-df45-583c852a70bc 127.0.0.1
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:39.426621 [INFO] agent: Started DNS server 127.0.0.1:43001 (udp)
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:39.427855 [INFO] consul: Adding LAN server Node 531bc3ae-6bf8-227c-df45-583c852a70bc (Addr: tcp/127.0.0.1:43006) (DC: dc1)
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:39.428197 [INFO] consul: Handled member-join event for server "Node 531bc3ae-6bf8-227c-df45-583c852a70bc.dc1" in area "wan"
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:39.428774 [INFO] agent: Started DNS server 127.0.0.1:43001 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:39.430907 [INFO] agent: Started HTTP server on 127.0.0.1:43002 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:39.431190 [INFO] agent: started state syncer
2019/11/27 02:28:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:28:39 [INFO]  raft: Node at 127.0.0.1:43006 [Candidate] entering Candidate state in term 2
2019/11/27 02:28:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:28:40 [INFO]  raft: Node at 127.0.0.1:43006 [Leader] entering Leader state
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:40.394457 [INFO] consul: cluster leadership acquired
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:40.395149 [INFO] consul: New leader elected: Node 531bc3ae-6bf8-227c-df45-583c852a70bc
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:40.930553 [INFO] agent: Synced node info
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:41.538549 [DEBUG] agent: Node info in sync
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:41.538677 [DEBUG] agent: Node info in sync
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.316324 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.316941 [DEBUG] consul: Skipping self join check for "Node 531bc3ae-6bf8-227c-df45-583c852a70bc" since the cluster is too small
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.317107 [INFO] consul: member 'Node 531bc3ae-6bf8-227c-df45-583c852a70bc' joined, marking health alive
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.634052 [DEBUG] http: Request GET /v1/operator/autopilot/configuration (4.342153ms) from=127.0.0.1:37862
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.653153 [INFO] agent: Requesting shutdown
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.653408 [INFO] consul: shutting down server
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.653560 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.715403 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.772650 [INFO] manager: shutting down
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.773133 [INFO] agent: consul server down
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.773177 [INFO] agent: shutdown complete
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.773229 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.773356 [INFO] agent: Stopping DNS server 127.0.0.1:43001 (udp)
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.773524 [INFO] agent: Stopping HTTP server 127.0.0.1:43002 (tcp)
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.773967 [INFO] agent: Waiting for endpoints to shut down
TestOperatorAutopilotGetConfigCommand - 2019/11/27 02:28:42.774113 [INFO] agent: Endpoints down
--- PASS: TestOperatorAutopilotGetConfigCommand (5.40s)
PASS
ok  	github.com/hashicorp/consul/command/operator/autopilot/get	5.554s
=== RUN   TestOperatorAutopilotSetConfigCommand_noTabs
=== PAUSE TestOperatorAutopilotSetConfigCommand_noTabs
=== RUN   TestOperatorAutopilotSetConfigCommand
=== PAUSE TestOperatorAutopilotSetConfigCommand
=== CONT  TestOperatorAutopilotSetConfigCommand_noTabs
=== CONT  TestOperatorAutopilotSetConfigCommand
--- PASS: TestOperatorAutopilotSetConfigCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:36.689912 [WARN] agent: Node name "Node 1a6d619a-799a-f08b-c9d3-5730f761ecaa" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:36.691914 [DEBUG] tlsutil: Update with version 1
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:36.692217 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:36.692554 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:36.697077 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:28:37 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1a6d619a-799a-f08b-c9d3-5730f761ecaa Address:127.0.0.1:50506}]
2019/11/27 02:28:37 [INFO]  raft: Node at 127.0.0.1:50506 [Follower] entering Follower state (Leader: "")
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:37.766160 [INFO] serf: EventMemberJoin: Node 1a6d619a-799a-f08b-c9d3-5730f761ecaa.dc1 127.0.0.1
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:37.775600 [INFO] serf: EventMemberJoin: Node 1a6d619a-799a-f08b-c9d3-5730f761ecaa 127.0.0.1
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:37.780975 [INFO] agent: Started DNS server 127.0.0.1:50501 (udp)
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:37.782335 [INFO] consul: Adding LAN server Node 1a6d619a-799a-f08b-c9d3-5730f761ecaa (Addr: tcp/127.0.0.1:50506) (DC: dc1)
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:37.782368 [INFO] agent: Started DNS server 127.0.0.1:50501 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:37.781460 [INFO] consul: Handled member-join event for server "Node 1a6d619a-799a-f08b-c9d3-5730f761ecaa.dc1" in area "wan"
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:37.797032 [INFO] agent: Started HTTP server on 127.0.0.1:50502 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:37.797243 [INFO] agent: started state syncer
2019/11/27 02:28:37 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:28:37 [INFO]  raft: Node at 127.0.0.1:50506 [Candidate] entering Candidate state in term 2
2019/11/27 02:28:39 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:28:39 [INFO]  raft: Node at 127.0.0.1:50506 [Leader] entering Leader state
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:39.184159 [INFO] consul: cluster leadership acquired
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:39.185052 [INFO] consul: New leader elected: Node 1a6d619a-799a-f08b-c9d3-5730f761ecaa
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:39.494385 [INFO] agent: Synced node info
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:39.494501 [DEBUG] agent: Node info in sync
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.039143 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.039614 [DEBUG] consul: Skipping self join check for "Node 1a6d619a-799a-f08b-c9d3-5730f761ecaa" since the cluster is too small
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.039755 [INFO] consul: member 'Node 1a6d619a-799a-f08b-c9d3-5730f761ecaa' joined, marking health alive
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.201101 [DEBUG] http: Request GET /v1/operator/autopilot/configuration (4.742501ms) from=127.0.0.1:54896
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.396043 [DEBUG] http: Request PUT /v1/operator/autopilot/configuration?cas=5 (186.26126ms) from=127.0.0.1:54896
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.397544 [INFO] agent: Requesting shutdown
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.397647 [INFO] consul: shutting down server
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.397697 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.537640 [WARN] serf: Shutdown without a Leave
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.596162 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.596264 [DEBUG] agent: Node info in sync
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.615569 [INFO] manager: shutting down
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.616124 [INFO] agent: consul server down
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.616183 [INFO] agent: shutdown complete
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.616239 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.616378 [INFO] agent: Stopping DNS server 127.0.0.1:50501 (udp)
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.616546 [INFO] agent: Stopping HTTP server 127.0.0.1:50502 (tcp)
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.617230 [INFO] agent: Waiting for endpoints to shut down
TestOperatorAutopilotSetConfigCommand - 2019/11/27 02:28:41.617439 [INFO] agent: Endpoints down
--- PASS: TestOperatorAutopilotSetConfigCommand (5.04s)
PASS
ok  	github.com/hashicorp/consul/command/operator/autopilot/set	5.349s
=== RUN   TestOperatorRaftCommand_noTabs
=== PAUSE TestOperatorRaftCommand_noTabs
=== CONT  TestOperatorRaftCommand_noTabs
--- PASS: TestOperatorRaftCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/operator/raft	0.078s
=== RUN   TestOperatorRaftListPeersCommand_noTabs
=== PAUSE TestOperatorRaftListPeersCommand_noTabs
=== RUN   TestOperatorRaftListPeersCommand
=== PAUSE TestOperatorRaftListPeersCommand
=== CONT  TestOperatorRaftListPeersCommand_noTabs
=== CONT  TestOperatorRaftListPeersCommand
--- PASS: TestOperatorRaftListPeersCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:43.933340 [WARN] agent: Node name "Node 8d98315f-6b83-2245-a975-c90c7021e381" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:43.934331 [DEBUG] tlsutil: Update with version 1
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:43.934413 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:43.934715 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:43.934839 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:28:44 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8d98315f-6b83-2245-a975-c90c7021e381 Address:127.0.0.1:16006}]
2019/11/27 02:28:44 [INFO]  raft: Node at 127.0.0.1:16006 [Follower] entering Follower state (Leader: "")
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:44.642867 [INFO] serf: EventMemberJoin: Node 8d98315f-6b83-2245-a975-c90c7021e381.dc1 127.0.0.1
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:44.646767 [INFO] serf: EventMemberJoin: Node 8d98315f-6b83-2245-a975-c90c7021e381 127.0.0.1
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:44.649158 [INFO] agent: Started DNS server 127.0.0.1:16001 (udp)
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:44.649934 [INFO] consul: Handled member-join event for server "Node 8d98315f-6b83-2245-a975-c90c7021e381.dc1" in area "wan"
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:44.650588 [INFO] agent: Started DNS server 127.0.0.1:16001 (tcp)
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:44.650705 [INFO] consul: Adding LAN server Node 8d98315f-6b83-2245-a975-c90c7021e381 (Addr: tcp/127.0.0.1:16006) (DC: dc1)
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:44.652898 [INFO] agent: Started HTTP server on 127.0.0.1:16002 (tcp)
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:44.653156 [INFO] agent: started state syncer
2019/11/27 02:28:44 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:28:44 [INFO]  raft: Node at 127.0.0.1:16006 [Candidate] entering Candidate state in term 2
2019/11/27 02:28:45 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:28:45 [INFO]  raft: Node at 127.0.0.1:16006 [Leader] entering Leader state
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:45.642966 [INFO] consul: cluster leadership acquired
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:45.643521 [INFO] consul: New leader elected: Node 8d98315f-6b83-2245-a975-c90c7021e381
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:45.949588 [INFO] agent: Synced node info
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.114534 [DEBUG] http: Request GET /v1/operator/raft/configuration (90.540538ms) from=127.0.0.1:40904
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.118564 [INFO] agent: Requesting shutdown
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.118663 [INFO] consul: shutting down server
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.118709 [WARN] serf: Shutdown without a Leave
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.259748 [WARN] serf: Shutdown without a Leave
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.370812 [INFO] manager: shutting down
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.581963 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.582407 [INFO] agent: consul server down
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.582544 [INFO] agent: shutdown complete
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.582695 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (tcp)
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.582932 [INFO] agent: Stopping DNS server 127.0.0.1:16001 (udp)
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.583182 [INFO] agent: Stopping HTTP server 127.0.0.1:16002 (tcp)
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.583852 [INFO] agent: Waiting for endpoints to shut down
TestOperatorRaftListPeersCommand - 2019/11/27 02:28:46.584035 [INFO] agent: Endpoints down
--- PASS: TestOperatorRaftListPeersCommand (2.79s)
PASS
ok  	github.com/hashicorp/consul/command/operator/raft/listpeers	3.063s
=== RUN   TestOperatorRaftRemovePeerCommand_noTabs
=== PAUSE TestOperatorRaftRemovePeerCommand_noTabs
=== RUN   TestOperatorRaftRemovePeerCommand
=== PAUSE TestOperatorRaftRemovePeerCommand
=== CONT  TestOperatorRaftRemovePeerCommand_noTabs
=== CONT  TestOperatorRaftRemovePeerCommand
--- PASS: TestOperatorRaftRemovePeerCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:51.613979 [WARN] agent: Node name "Node 07670b0d-bf29-b5cb-93db-d3a0b16706a9" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:51.615085 [DEBUG] tlsutil: Update with version 1
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:51.615174 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:51.615376 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:51.615500 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:28:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:07670b0d-bf29-b5cb-93db-d3a0b16706a9 Address:127.0.0.1:31006}]
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.475425 [INFO] serf: EventMemberJoin: Node 07670b0d-bf29-b5cb-93db-d3a0b16706a9.dc1 127.0.0.1
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.479190 [INFO] serf: EventMemberJoin: Node 07670b0d-bf29-b5cb-93db-d3a0b16706a9 127.0.0.1
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.481597 [INFO] agent: Started DNS server 127.0.0.1:31001 (udp)
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.482116 [INFO] agent: Started DNS server 127.0.0.1:31001 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.484161 [INFO] agent: Started HTTP server on 127.0.0.1:31002 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.484323 [INFO] agent: started state syncer
2019/11/27 02:28:52 [INFO]  raft: Node at 127.0.0.1:31006 [Follower] entering Follower state (Leader: "")
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.487382 [INFO] consul: Handled member-join event for server "Node 07670b0d-bf29-b5cb-93db-d3a0b16706a9.dc1" in area "wan"
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.488708 [INFO] consul: Adding LAN server Node 07670b0d-bf29-b5cb-93db-d3a0b16706a9 (Addr: tcp/127.0.0.1:31006) (DC: dc1)
2019/11/27 02:28:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:28:52 [INFO]  raft: Node at 127.0.0.1:31006 [Candidate] entering Candidate state in term 2
2019/11/27 02:28:52 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:28:52 [INFO]  raft: Node at 127.0.0.1:31006 [Leader] entering Leader state
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.970824 [INFO] consul: cluster leadership acquired
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:52.971416 [INFO] consul: New leader elected: Node 07670b0d-bf29-b5cb-93db-d3a0b16706a9
=== RUN   TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_directly
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.259737 [ERR] http: Request DELETE /v1/operator/raft/peer?address=nope, error: address "nope" was not found in the Raft configuration from=127.0.0.1:37018
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.260288 [INFO] agent: Synced node info
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.261013 [DEBUG] http: Request DELETE /v1/operator/raft/peer?address=nope (129.84726ms) from=127.0.0.1:37018
=== RUN   TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_with_-id
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.415269 [ERR] http: Request DELETE /v1/operator/raft/peer?id=nope, error: id "nope" was not found in the Raft configuration from=127.0.0.1:37020
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.418821 [DEBUG] http: Request DELETE /v1/operator/raft/peer?id=nope (151.340354ms) from=127.0.0.1:37020
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.423213 [INFO] agent: Requesting shutdown
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.423503 [INFO] consul: shutting down server
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.423889 [WARN] serf: Shutdown without a Leave
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.559323 [WARN] serf: Shutdown without a Leave
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.649739 [INFO] manager: shutting down
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.848208 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.848513 [INFO] agent: consul server down
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.848571 [INFO] agent: shutdown complete
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.848628 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.848773 [INFO] agent: Stopping DNS server 127.0.0.1:31001 (udp)
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.848948 [INFO] agent: Stopping HTTP server 127.0.0.1:31002 (tcp)
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.849663 [INFO] agent: Waiting for endpoints to shut down
TestOperatorRaftRemovePeerCommand - 2019/11/27 02:28:53.849758 [INFO] agent: Endpoints down
--- PASS: TestOperatorRaftRemovePeerCommand (2.31s)
    --- PASS: TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_directly (0.14s)
    --- PASS: TestOperatorRaftRemovePeerCommand/Test_the_remove-peer_subcommand_with_-id (0.16s)
PASS
ok  	github.com/hashicorp/consul/command/operator/raft/removepeer	2.489s
=== RUN   TestReloadCommand_noTabs
=== PAUSE TestReloadCommand_noTabs
=== RUN   TestReloadCommand
=== PAUSE TestReloadCommand
=== CONT  TestReloadCommand_noTabs
=== CONT  TestReloadCommand
--- PASS: TestReloadCommand_noTabs (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestReloadCommand - 2019/11/27 02:29:18.076359 [WARN] agent: Node name "Node 2daff32c-cca4-7bdf-1192-a75398707f21" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestReloadCommand - 2019/11/27 02:29:18.077353 [DEBUG] tlsutil: Update with version 1
TestReloadCommand - 2019/11/27 02:29:18.077521 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestReloadCommand - 2019/11/27 02:29:18.077756 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestReloadCommand - 2019/11/27 02:29:18.089606 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:29:18 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2daff32c-cca4-7bdf-1192-a75398707f21 Address:127.0.0.1:10006}]
2019/11/27 02:29:18 [INFO]  raft: Node at 127.0.0.1:10006 [Follower] entering Follower state (Leader: "")
TestReloadCommand - 2019/11/27 02:29:18.898448 [INFO] serf: EventMemberJoin: Node 2daff32c-cca4-7bdf-1192-a75398707f21.dc1 127.0.0.1
TestReloadCommand - 2019/11/27 02:29:18.903002 [INFO] serf: EventMemberJoin: Node 2daff32c-cca4-7bdf-1192-a75398707f21 127.0.0.1
TestReloadCommand - 2019/11/27 02:29:18.904143 [INFO] consul: Handled member-join event for server "Node 2daff32c-cca4-7bdf-1192-a75398707f21.dc1" in area "wan"
TestReloadCommand - 2019/11/27 02:29:18.905042 [INFO] agent: Started DNS server 127.0.0.1:10001 (tcp)
TestReloadCommand - 2019/11/27 02:29:18.905557 [INFO] agent: Started DNS server 127.0.0.1:10001 (udp)
TestReloadCommand - 2019/11/27 02:29:18.907980 [INFO] agent: Started HTTP server on 127.0.0.1:10002 (tcp)
TestReloadCommand - 2019/11/27 02:29:18.908138 [INFO] agent: started state syncer
TestReloadCommand - 2019/11/27 02:29:18.909529 [INFO] consul: Adding LAN server Node 2daff32c-cca4-7bdf-1192-a75398707f21 (Addr: tcp/127.0.0.1:10006) (DC: dc1)
2019/11/27 02:29:18 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:29:18 [INFO]  raft: Node at 127.0.0.1:10006 [Candidate] entering Candidate state in term 2
2019/11/27 02:29:19 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:29:19 [INFO]  raft: Node at 127.0.0.1:10006 [Leader] entering Leader state
TestReloadCommand - 2019/11/27 02:29:19.359459 [INFO] consul: cluster leadership acquired
TestReloadCommand - 2019/11/27 02:29:19.360073 [INFO] consul: New leader elected: Node 2daff32c-cca4-7bdf-1192-a75398707f21
TestReloadCommand - 2019/11/27 02:29:19.402263 [DEBUG] http: Request PUT /v1/agent/reload (97.337µs) from=127.0.0.1:36112
TestReloadCommand - 2019/11/27 02:29:19.403135 [INFO] agent: Requesting shutdown
TestReloadCommand - 2019/11/27 02:29:19.403211 [INFO] consul: shutting down server
TestReloadCommand - 2019/11/27 02:29:19.403257 [WARN] serf: Shutdown without a Leave
TestReloadCommand - 2019/11/27 02:29:19.403581 [ERR] agent: failed to sync remote state: No cluster leader
TestReloadCommand - 2019/11/27 02:29:19.490890 [WARN] serf: Shutdown without a Leave
TestReloadCommand - 2019/11/27 02:29:19.602094 [INFO] manager: shutting down
TestReloadCommand - 2019/11/27 02:29:19.713453 [INFO] agent: consul server down
TestReloadCommand - 2019/11/27 02:29:19.713529 [INFO] agent: shutdown complete
TestReloadCommand - 2019/11/27 02:29:19.713581 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (tcp)
TestReloadCommand - 2019/11/27 02:29:19.713704 [INFO] agent: Stopping DNS server 127.0.0.1:10001 (udp)
TestReloadCommand - 2019/11/27 02:29:19.713839 [INFO] agent: Stopping HTTP server 127.0.0.1:10002 (tcp)
TestReloadCommand - 2019/11/27 02:29:19.714245 [INFO] agent: Waiting for endpoints to shut down
TestReloadCommand - 2019/11/27 02:29:19.714314 [ERR] consul: failed to wait for barrier: leadership lost while committing log
TestReloadCommand - 2019/11/27 02:29:19.714432 [INFO] agent: Endpoints down
--- PASS: TestReloadCommand (1.71s)
PASS
ok  	github.com/hashicorp/consul/command/reload	1.839s
=== RUN   TestRTTCommand_noTabs
=== PAUSE TestRTTCommand_noTabs
=== RUN   TestRTTCommand_BadArgs
=== PAUSE TestRTTCommand_BadArgs
=== RUN   TestRTTCommand_LAN
=== PAUSE TestRTTCommand_LAN
=== RUN   TestRTTCommand_WAN
=== PAUSE TestRTTCommand_WAN
=== CONT  TestRTTCommand_noTabs
--- PASS: TestRTTCommand_noTabs (0.00s)
=== CONT  TestRTTCommand_WAN
=== CONT  TestRTTCommand_LAN
=== CONT  TestRTTCommand_BadArgs
=== RUN   TestRTTCommand_BadArgs/#00
=== RUN   TestRTTCommand_BadArgs/node1_node2_node3
=== RUN   TestRTTCommand_BadArgs/-wan_node1_node2
=== RUN   TestRTTCommand_BadArgs/-wan_node1.dc1_node2
=== RUN   TestRTTCommand_BadArgs/-wan_node1_node2.dc1
--- PASS: TestRTTCommand_BadArgs (0.03s)
    --- PASS: TestRTTCommand_BadArgs/#00 (0.00s)
    --- PASS: TestRTTCommand_BadArgs/node1_node2_node3 (0.00s)
    --- PASS: TestRTTCommand_BadArgs/-wan_node1_node2 (0.00s)
    --- PASS: TestRTTCommand_BadArgs/-wan_node1.dc1_node2 (0.02s)
    --- PASS: TestRTTCommand_BadArgs/-wan_node1_node2.dc1 (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestRTTCommand_WAN - 2019/11/27 02:29:21.509165 [WARN] agent: Node name "Node ae910a31-c761-a953-f244-b262d1038ff8" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRTTCommand_WAN - 2019/11/27 02:29:21.510435 [DEBUG] tlsutil: Update with version 1
TestRTTCommand_WAN - 2019/11/27 02:29:21.510517 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRTTCommand_WAN - 2019/11/27 02:29:21.510860 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestRTTCommand_WAN - 2019/11/27 02:29:21.511033 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestRTTCommand_LAN - 2019/11/27 02:29:21.571540 [WARN] agent: Node name "Node cea0017e-f473-df1c-b2de-2c106a67fc3a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestRTTCommand_LAN - 2019/11/27 02:29:21.572215 [DEBUG] tlsutil: Update with version 1
TestRTTCommand_LAN - 2019/11/27 02:29:21.572293 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestRTTCommand_LAN - 2019/11/27 02:29:21.572622 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestRTTCommand_LAN - 2019/11/27 02:29:21.574911 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:29:22 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:ae910a31-c761-a953-f244-b262d1038ff8 Address:127.0.0.1:49006}]
2019/11/27 02:29:22 [INFO]  raft: Node at 127.0.0.1:49006 [Follower] entering Follower state (Leader: "")
TestRTTCommand_WAN - 2019/11/27 02:29:22.801567 [INFO] serf: EventMemberJoin: Node ae910a31-c761-a953-f244-b262d1038ff8.dc1 127.0.0.1
TestRTTCommand_WAN - 2019/11/27 02:29:22.810357 [INFO] serf: EventMemberJoin: Node ae910a31-c761-a953-f244-b262d1038ff8 127.0.0.1
TestRTTCommand_WAN - 2019/11/27 02:29:22.814552 [INFO] consul: Handled member-join event for server "Node ae910a31-c761-a953-f244-b262d1038ff8.dc1" in area "wan"
TestRTTCommand_WAN - 2019/11/27 02:29:22.814624 [INFO] consul: Adding LAN server Node ae910a31-c761-a953-f244-b262d1038ff8 (Addr: tcp/127.0.0.1:49006) (DC: dc1)
TestRTTCommand_WAN - 2019/11/27 02:29:22.815409 [INFO] agent: Started DNS server 127.0.0.1:49001 (tcp)
TestRTTCommand_WAN - 2019/11/27 02:29:22.816319 [INFO] agent: Started DNS server 127.0.0.1:49001 (udp)
TestRTTCommand_WAN - 2019/11/27 02:29:22.818890 [INFO] agent: Started HTTP server on 127.0.0.1:49002 (tcp)
TestRTTCommand_WAN - 2019/11/27 02:29:22.819076 [INFO] agent: started state syncer
2019/11/27 02:29:22 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:29:22 [INFO]  raft: Node at 127.0.0.1:49006 [Candidate] entering Candidate state in term 2
2019/11/27 02:29:23 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:cea0017e-f473-df1c-b2de-2c106a67fc3a Address:127.0.0.1:49012}]
2019/11/27 02:29:23 [INFO]  raft: Node at 127.0.0.1:49012 [Follower] entering Follower state (Leader: "")
TestRTTCommand_LAN - 2019/11/27 02:29:23.150739 [INFO] serf: EventMemberJoin: Node cea0017e-f473-df1c-b2de-2c106a67fc3a.dc1 127.0.0.1
TestRTTCommand_LAN - 2019/11/27 02:29:23.154366 [INFO] serf: EventMemberJoin: Node cea0017e-f473-df1c-b2de-2c106a67fc3a 127.0.0.1
TestRTTCommand_LAN - 2019/11/27 02:29:23.155918 [INFO] agent: Started DNS server 127.0.0.1:49007 (udp)
TestRTTCommand_LAN - 2019/11/27 02:29:23.157027 [INFO] consul: Handled member-join event for server "Node cea0017e-f473-df1c-b2de-2c106a67fc3a.dc1" in area "wan"
TestRTTCommand_LAN - 2019/11/27 02:29:23.157219 [INFO] agent: Started DNS server 127.0.0.1:49007 (tcp)
TestRTTCommand_LAN - 2019/11/27 02:29:23.157358 [INFO] consul: Adding LAN server Node cea0017e-f473-df1c-b2de-2c106a67fc3a (Addr: tcp/127.0.0.1:49012) (DC: dc1)
TestRTTCommand_LAN - 2019/11/27 02:29:23.159298 [INFO] agent: Started HTTP server on 127.0.0.1:49008 (tcp)
TestRTTCommand_LAN - 2019/11/27 02:29:23.159439 [INFO] agent: started state syncer
2019/11/27 02:29:23 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:29:23 [INFO]  raft: Node at 127.0.0.1:49012 [Candidate] entering Candidate state in term 2
2019/11/27 02:29:23 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:29:23 [INFO]  raft: Node at 127.0.0.1:49006 [Leader] entering Leader state
TestRTTCommand_WAN - 2019/11/27 02:29:23.474691 [INFO] consul: cluster leadership acquired
TestRTTCommand_WAN - 2019/11/27 02:29:23.475272 [INFO] consul: New leader elected: Node ae910a31-c761-a953-f244-b262d1038ff8
2019/11/27 02:29:23 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:29:23 [INFO]  raft: Node at 127.0.0.1:49012 [Leader] entering Leader state
TestRTTCommand_LAN - 2019/11/27 02:29:23.768851 [INFO] consul: cluster leadership acquired
TestRTTCommand_LAN - 2019/11/27 02:29:23.769298 [INFO] consul: New leader elected: Node cea0017e-f473-df1c-b2de-2c106a67fc3a
TestRTTCommand_LAN - 2019/11/27 02:29:24.145334 [INFO] agent: Synced node info
TestRTTCommand_LAN - 2019/11/27 02:29:24.145466 [DEBUG] agent: Node info in sync
TestRTTCommand_WAN - 2019/11/27 02:29:24.676403 [INFO] agent: Synced node info
TestRTTCommand_WAN - 2019/11/27 02:29:24.711259 [DEBUG] http: Request GET /v1/coordinate/datacenters (6.314556ms) from=127.0.0.1:52412
TestRTTCommand_WAN - 2019/11/27 02:29:24.953807 [DEBUG] http: Request GET /v1/agent/self (227.203025ms) from=127.0.0.1:52414
TestRTTCommand_WAN - 2019/11/27 02:29:24.966608 [DEBUG] http: Request GET /v1/coordinate/datacenters (1.14004ms) from=127.0.0.1:52414
TestRTTCommand_WAN - 2019/11/27 02:29:25.017120 [DEBUG] http: Request GET /v1/coordinate/datacenters (4.452158ms) from=127.0.0.1:52416
TestRTTCommand_WAN - 2019/11/27 02:29:25.020164 [INFO] agent: Requesting shutdown
TestRTTCommand_WAN - 2019/11/27 02:29:25.020410 [INFO] consul: shutting down server
TestRTTCommand_WAN - 2019/11/27 02:29:25.020549 [WARN] serf: Shutdown without a Leave
TestRTTCommand_WAN - 2019/11/27 02:29:25.135029 [WARN] serf: Shutdown without a Leave
TestRTTCommand_WAN - 2019/11/27 02:29:25.201898 [INFO] manager: shutting down
TestRTTCommand_WAN - 2019/11/27 02:29:25.202649 [INFO] agent: consul server down
TestRTTCommand_WAN - 2019/11/27 02:29:25.202708 [INFO] agent: shutdown complete
TestRTTCommand_WAN - 2019/11/27 02:29:25.202762 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (tcp)
TestRTTCommand_WAN - 2019/11/27 02:29:25.202906 [INFO] agent: Stopping DNS server 127.0.0.1:49001 (udp)
TestRTTCommand_WAN - 2019/11/27 02:29:25.203063 [INFO] agent: Stopping HTTP server 127.0.0.1:49002 (tcp)
TestRTTCommand_WAN - 2019/11/27 02:29:25.203906 [INFO] agent: Waiting for endpoints to shut down
TestRTTCommand_WAN - 2019/11/27 02:29:25.206439 [INFO] agent: Endpoints down
--- PASS: TestRTTCommand_WAN (3.81s)
TestRTTCommand_LAN - 2019/11/27 02:29:25.210426 [DEBUG] http: Request GET /v1/coordinate/nodes (1.098039ms) from=127.0.0.1:58108
TestRTTCommand_WAN - 2019/11/27 02:29:25.231509 [ERR] consul: failed to establish leadership: error generating CA root certificate: raft is already shutdown
TestRTTCommand_LAN - 2019/11/27 02:29:25.241067 [DEBUG] http: Request GET /v1/coordinate/nodes (1.172375ms) from=127.0.0.1:58110
TestRTTCommand_LAN - 2019/11/27 02:29:25.246946 [DEBUG] agent: Node info in sync
TestRTTCommand_LAN - 2019/11/27 02:29:25.272955 [DEBUG] http: Request GET /v1/coordinate/nodes (1.289713ms) from=127.0.0.1:58112
TestRTTCommand_LAN - 2019/11/27 02:29:25.303156 [DEBUG] http: Request GET /v1/coordinate/nodes (724.692µs) from=127.0.0.1:58114
TestRTTCommand_LAN - 2019/11/27 02:29:25.333054 [DEBUG] http: Request GET /v1/coordinate/nodes (765.694µs) from=127.0.0.1:58116
TestRTTCommand_LAN - 2019/11/27 02:29:25.363331 [DEBUG] http: Request GET /v1/coordinate/nodes (761.694µs) from=127.0.0.1:58118
TestRTTCommand_LAN - 2019/11/27 02:29:25.393268 [DEBUG] http: Request GET /v1/coordinate/nodes (744.026µs) from=127.0.0.1:58120
TestRTTCommand_LAN - 2019/11/27 02:29:25.423525 [DEBUG] http: Request GET /v1/coordinate/nodes (561.02µs) from=127.0.0.1:58122
TestRTTCommand_LAN - 2019/11/27 02:29:25.454839 [DEBUG] http: Request GET /v1/coordinate/nodes (603.355µs) from=127.0.0.1:58124
TestRTTCommand_LAN - 2019/11/27 02:29:25.484874 [DEBUG] http: Request GET /v1/coordinate/nodes (744.693µs) from=127.0.0.1:58126
TestRTTCommand_LAN - 2019/11/27 02:29:25.515184 [DEBUG] http: Request GET /v1/coordinate/nodes (1.117373ms) from=127.0.0.1:58128
TestRTTCommand_LAN - 2019/11/27 02:29:25.716854 [DEBUG] http: Request GET /v1/agent/self (195.128224ms) from=127.0.0.1:58130
TestRTTCommand_LAN - 2019/11/27 02:29:25.727822 [DEBUG] http: Request GET /v1/coordinate/nodes (716.025µs) from=127.0.0.1:58130
TestRTTCommand_LAN - 2019/11/27 02:29:25.733851 [DEBUG] http: Request GET /v1/coordinate/nodes (605.355µs) from=127.0.0.1:58132
TestRTTCommand_LAN - 2019/11/27 02:29:25.735504 [INFO] agent: Requesting shutdown
TestRTTCommand_LAN - 2019/11/27 02:29:25.735574 [INFO] consul: shutting down server
TestRTTCommand_LAN - 2019/11/27 02:29:25.735614 [WARN] serf: Shutdown without a Leave
TestRTTCommand_LAN - 2019/11/27 02:29:25.850202 [WARN] serf: Shutdown without a Leave
TestRTTCommand_LAN - 2019/11/27 02:29:25.913827 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestRTTCommand_LAN - 2019/11/27 02:29:25.914433 [DEBUG] consul: Skipping self join check for "Node cea0017e-f473-df1c-b2de-2c106a67fc3a" since the cluster is too small
TestRTTCommand_LAN - 2019/11/27 02:29:25.914697 [INFO] consul: member 'Node cea0017e-f473-df1c-b2de-2c106a67fc3a' joined, marking health alive
TestRTTCommand_LAN - 2019/11/27 02:29:26.046061 [INFO] manager: shutting down
TestRTTCommand_LAN - 2019/11/27 02:29:26.168676 [ERR] consul: failed to reconcile member: {Node cea0017e-f473-df1c-b2de-2c106a67fc3a 127.0.0.1 49010 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:cea0017e-f473-df1c-b2de-2c106a67fc3a port:49012 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:49011] alive 1 5 2 2 5 4}: leadership lost while committing log
TestRTTCommand_LAN - 2019/11/27 02:29:26.168925 [INFO] agent: consul server down
TestRTTCommand_LAN - 2019/11/27 02:29:26.168979 [INFO] agent: shutdown complete
TestRTTCommand_LAN - 2019/11/27 02:29:26.169034 [INFO] agent: Stopping DNS server 127.0.0.1:49007 (tcp)
TestRTTCommand_LAN - 2019/11/27 02:29:26.169181 [INFO] agent: Stopping DNS server 127.0.0.1:49007 (udp)
TestRTTCommand_LAN - 2019/11/27 02:29:26.169335 [INFO] agent: Stopping HTTP server 127.0.0.1:49008 (tcp)
TestRTTCommand_LAN - 2019/11/27 02:29:26.172032 [INFO] agent: Waiting for endpoints to shut down
TestRTTCommand_LAN - 2019/11/27 02:29:26.172127 [INFO] agent: Endpoints down
--- PASS: TestRTTCommand_LAN (4.77s)
PASS
ok  	github.com/hashicorp/consul/command/rtt	4.973s
=== RUN   TestDevModeHasNoServices
=== PAUSE TestDevModeHasNoServices
=== RUN   TestStructsToAgentService
=== PAUSE TestStructsToAgentService
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== CONT  TestDevModeHasNoServices
=== CONT  TestCommand_noTabs
--- PASS: TestCommand_noTabs (0.00s)
=== CONT  TestStructsToAgentService
=== RUN   TestStructsToAgentService/Basic_service_with_port
=== PAUSE TestStructsToAgentService/Basic_service_with_port
=== RUN   TestStructsToAgentService/Service_with_a_check
=== PAUSE TestStructsToAgentService/Service_with_a_check
=== RUN   TestStructsToAgentService/Service_with_checks
=== PAUSE TestStructsToAgentService/Service_with_checks
=== RUN   TestStructsToAgentService/Proxy_service
=== PAUSE TestStructsToAgentService/Proxy_service
=== CONT  TestStructsToAgentService/Basic_service_with_port
=== CONT  TestStructsToAgentService/Proxy_service
=== CONT  TestStructsToAgentService/Service_with_checks
=== CONT  TestStructsToAgentService/Service_with_a_check
--- PASS: TestStructsToAgentService (0.00s)
    --- PASS: TestStructsToAgentService/Basic_service_with_port (0.00s)
    --- PASS: TestStructsToAgentService/Proxy_service (0.00s)
    --- PASS: TestStructsToAgentService/Service_with_checks (0.00s)
    --- PASS: TestStructsToAgentService/Service_with_a_check (0.01s)
--- PASS: TestDevModeHasNoServices (0.08s)
PASS
ok  	github.com/hashicorp/consul/command/services	0.242s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_File_id
=== PAUSE TestCommand_File_id
=== RUN   TestCommand_File_nameOnly
=== PAUSE TestCommand_File_nameOnly
=== RUN   TestCommand_Flag
=== PAUSE TestCommand_Flag
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_File_nameOnly
=== CONT  TestCommand_Flag
=== CONT  TestCommand_File_id
--- PASS: TestCommand_noTabs (0.03s)
=== CONT  TestCommand_Validation
=== RUN   TestCommand_Validation/no_args_or_id
=== RUN   TestCommand_Validation/args_and_-id
--- PASS: TestCommand_Validation (0.03s)
    --- PASS: TestCommand_Validation/no_args_or_id (0.00s)
    --- PASS: TestCommand_Validation/args_and_-id (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File_id - 2019/11/27 02:29:30.557720 [WARN] agent: Node name "Node 8ecf4cf8-738c-f092-62a5-70274015e2cd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File_id - 2019/11/27 02:29:30.561092 [DEBUG] tlsutil: Update with version 1
TestCommand_File_id - 2019/11/27 02:29:30.561348 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File_id - 2019/11/27 02:29:30.561762 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_File_id - 2019/11/27 02:29:30.563261 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_Flag - 2019/11/27 02:29:30.568141 [WARN] agent: Node name "Node 16cc1ef4-dba9-d0b3-6b35-b30782d8497a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_Flag - 2019/11/27 02:29:30.568579 [DEBUG] tlsutil: Update with version 1
TestCommand_Flag - 2019/11/27 02:29:30.568651 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_Flag - 2019/11/27 02:29:30.598743 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_Flag - 2019/11/27 02:29:30.599425 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File_nameOnly - 2019/11/27 02:29:30.613985 [WARN] agent: Node name "Node e5ee49f5-c5ca-7aa6-8735-e8840939f3bd" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File_nameOnly - 2019/11/27 02:29:30.616443 [DEBUG] tlsutil: Update with version 1
TestCommand_File_nameOnly - 2019/11/27 02:29:30.616867 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File_nameOnly - 2019/11/27 02:29:30.617187 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_File_nameOnly - 2019/11/27 02:29:30.617412 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:29:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:16cc1ef4-dba9-d0b3-6b35-b30782d8497a Address:127.0.0.1:38512}]
2019/11/27 02:29:31 [INFO]  raft: Node at 127.0.0.1:38512 [Follower] entering Follower state (Leader: "")
2019/11/27 02:29:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e5ee49f5-c5ca-7aa6-8735-e8840939f3bd Address:127.0.0.1:38506}]
2019/11/27 02:29:31 [INFO]  raft: Node at 127.0.0.1:38506 [Follower] entering Follower state (Leader: "")
2019/11/27 02:29:31 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:8ecf4cf8-738c-f092-62a5-70274015e2cd Address:127.0.0.1:38518}]
2019/11/27 02:29:31 [INFO]  raft: Node at 127.0.0.1:38518 [Follower] entering Follower state (Leader: "")
TestCommand_File_nameOnly - 2019/11/27 02:29:31.354154 [INFO] serf: EventMemberJoin: Node e5ee49f5-c5ca-7aa6-8735-e8840939f3bd.dc1 127.0.0.1
TestCommand_File_id - 2019/11/27 02:29:31.354753 [INFO] serf: EventMemberJoin: Node 8ecf4cf8-738c-f092-62a5-70274015e2cd.dc1 127.0.0.1
TestCommand_File_id - 2019/11/27 02:29:31.359309 [INFO] serf: EventMemberJoin: Node 8ecf4cf8-738c-f092-62a5-70274015e2cd 127.0.0.1
TestCommand_File_nameOnly - 2019/11/27 02:29:31.362081 [INFO] serf: EventMemberJoin: Node e5ee49f5-c5ca-7aa6-8735-e8840939f3bd 127.0.0.1
TestCommand_Flag - 2019/11/27 02:29:31.361399 [INFO] serf: EventMemberJoin: Node 16cc1ef4-dba9-d0b3-6b35-b30782d8497a.dc1 127.0.0.1
TestCommand_File_id - 2019/11/27 02:29:31.364551 [INFO] consul: Adding LAN server Node 8ecf4cf8-738c-f092-62a5-70274015e2cd (Addr: tcp/127.0.0.1:38518) (DC: dc1)
TestCommand_File_nameOnly - 2019/11/27 02:29:31.365033 [INFO] consul: Adding LAN server Node e5ee49f5-c5ca-7aa6-8735-e8840939f3bd (Addr: tcp/127.0.0.1:38506) (DC: dc1)
TestCommand_File_nameOnly - 2019/11/27 02:29:31.365264 [INFO] consul: Handled member-join event for server "Node e5ee49f5-c5ca-7aa6-8735-e8840939f3bd.dc1" in area "wan"
TestCommand_File_id - 2019/11/27 02:29:31.365664 [INFO] consul: Handled member-join event for server "Node 8ecf4cf8-738c-f092-62a5-70274015e2cd.dc1" in area "wan"
TestCommand_File_nameOnly - 2019/11/27 02:29:31.366260 [INFO] agent: Started DNS server 127.0.0.1:38501 (udp)
TestCommand_File_nameOnly - 2019/11/27 02:29:31.366331 [INFO] agent: Started DNS server 127.0.0.1:38501 (tcp)
TestCommand_File_id - 2019/11/27 02:29:31.367739 [INFO] agent: Started DNS server 127.0.0.1:38513 (tcp)
TestCommand_File_id - 2019/11/27 02:29:31.368488 [INFO] agent: Started DNS server 127.0.0.1:38513 (udp)
TestCommand_File_nameOnly - 2019/11/27 02:29:31.368763 [INFO] agent: Started HTTP server on 127.0.0.1:38502 (tcp)
TestCommand_File_nameOnly - 2019/11/27 02:29:31.369095 [INFO] agent: started state syncer
2019/11/27 02:29:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:29:31 [INFO]  raft: Node at 127.0.0.1:38506 [Candidate] entering Candidate state in term 2
TestCommand_File_id - 2019/11/27 02:29:31.399028 [INFO] agent: Started HTTP server on 127.0.0.1:38514 (tcp)
TestCommand_File_id - 2019/11/27 02:29:31.399158 [INFO] agent: started state syncer
TestCommand_Flag - 2019/11/27 02:29:31.399680 [INFO] serf: EventMemberJoin: Node 16cc1ef4-dba9-d0b3-6b35-b30782d8497a 127.0.0.1
TestCommand_Flag - 2019/11/27 02:29:31.401666 [INFO] consul: Adding LAN server Node 16cc1ef4-dba9-d0b3-6b35-b30782d8497a (Addr: tcp/127.0.0.1:38512) (DC: dc1)
TestCommand_Flag - 2019/11/27 02:29:31.401863 [INFO] consul: Handled member-join event for server "Node 16cc1ef4-dba9-d0b3-6b35-b30782d8497a.dc1" in area "wan"
TestCommand_Flag - 2019/11/27 02:29:31.401932 [INFO] agent: Started DNS server 127.0.0.1:38507 (tcp)
TestCommand_Flag - 2019/11/27 02:29:31.402667 [INFO] agent: Started DNS server 127.0.0.1:38507 (udp)
TestCommand_Flag - 2019/11/27 02:29:31.408602 [INFO] agent: Started HTTP server on 127.0.0.1:38508 (tcp)
TestCommand_Flag - 2019/11/27 02:29:31.408736 [INFO] agent: started state syncer
2019/11/27 02:29:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:29:31 [INFO]  raft: Node at 127.0.0.1:38518 [Candidate] entering Candidate state in term 2
2019/11/27 02:29:31 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:29:31 [INFO]  raft: Node at 127.0.0.1:38512 [Candidate] entering Candidate state in term 2
2019/11/27 02:29:32 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:29:32 [INFO]  raft: Node at 127.0.0.1:38506 [Leader] entering Leader state
TestCommand_File_nameOnly - 2019/11/27 02:29:32.414214 [INFO] consul: cluster leadership acquired
TestCommand_File_nameOnly - 2019/11/27 02:29:32.414975 [INFO] consul: New leader elected: Node e5ee49f5-c5ca-7aa6-8735-e8840939f3bd
2019/11/27 02:29:32 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:29:32 [INFO]  raft: Node at 127.0.0.1:38518 [Leader] entering Leader state
TestCommand_File_id - 2019/11/27 02:29:32.490558 [INFO] consul: cluster leadership acquired
TestCommand_File_id - 2019/11/27 02:29:32.491130 [INFO] consul: New leader elected: Node 8ecf4cf8-738c-f092-62a5-70274015e2cd
2019/11/27 02:29:32 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:29:32 [INFO]  raft: Node at 127.0.0.1:38512 [Leader] entering Leader state
TestCommand_Flag - 2019/11/27 02:29:32.492724 [INFO] consul: cluster leadership acquired
TestCommand_Flag - 2019/11/27 02:29:32.493181 [INFO] consul: New leader elected: Node 16cc1ef4-dba9-d0b3-6b35-b30782d8497a
TestCommand_File_nameOnly - 2019/11/27 02:29:32.780521 [INFO] agent: Synced service "web"
TestCommand_File_nameOnly - 2019/11/27 02:29:32.780608 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:32.780711 [DEBUG] http: Request PUT /v1/agent/service/register (360.938408ms) from=127.0.0.1:48280
TestCommand_Flag - 2019/11/27 02:29:32.857601 [INFO] agent: Synced node info
TestCommand_File_id - 2019/11/27 02:29:32.935565 [INFO] agent: Synced node info
TestCommand_File_nameOnly - 2019/11/27 02:29:33.091827 [INFO] agent: Synced service "web"
TestCommand_File_nameOnly - 2019/11/27 02:29:33.091906 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/11/27 02:29:33.170909 [INFO] agent: Synced service "web"
TestCommand_Flag - 2019/11/27 02:29:33.171009 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/11/27 02:29:33.171087 [DEBUG] http: Request PUT /v1/agent/service/register (501.993055ms) from=127.0.0.1:45966
TestCommand_File_nameOnly - 2019/11/27 02:29:33.176112 [DEBUG] agent: Service "web" in sync
TestCommand_Flag - 2019/11/27 02:29:33.259673 [DEBUG] agent: Service "web" in sync
TestCommand_File_id - 2019/11/27 02:29:33.347706 [INFO] agent: Synced service "web"
TestCommand_File_id - 2019/11/27 02:29:33.347794 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/11/27 02:29:33.347879 [DEBUG] http: Request PUT /v1/agent/service/register (617.766142ms) from=127.0.0.1:35852
TestCommand_File_nameOnly - 2019/11/27 02:29:33.513870 [INFO] agent: Synced service "db"
TestCommand_File_nameOnly - 2019/11/27 02:29:33.513948 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.514024 [DEBUG] http: Request PUT /v1/agent/service/register (729.07707ms) from=127.0.0.1:48280
TestCommand_File_nameOnly - 2019/11/27 02:29:33.521372 [DEBUG] agent: Service "web" in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.521448 [DEBUG] agent: Service "db" in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.521482 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.521560 [DEBUG] agent: Service "web" in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.521602 [DEBUG] agent: Service "db" in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.521632 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.606780 [DEBUG] agent: removed service "web"
TestCommand_Flag - 2019/11/27 02:29:33.616609 [INFO] agent: Synced service "db"
TestCommand_Flag - 2019/11/27 02:29:33.616782 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/11/27 02:29:33.616866 [DEBUG] http: Request PUT /v1/agent/service/register (443.632661ms) from=127.0.0.1:45966
TestCommand_Flag - 2019/11/27 02:29:33.621529 [DEBUG] agent: removed service "web"
TestCommand_Flag - 2019/11/27 02:29:33.621840 [DEBUG] agent: Service "db" in sync
TestCommand_File_id - 2019/11/27 02:29:33.791566 [INFO] agent: Synced service "web"
TestCommand_File_id - 2019/11/27 02:29:33.791660 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.882834 [INFO] agent: Deregistered service "web"
TestCommand_File_nameOnly - 2019/11/27 02:29:33.882934 [DEBUG] agent: Service "db" in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.882970 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.883046 [DEBUG] http: Request PUT /v1/agent/service/deregister/web (277.157117ms) from=127.0.0.1:48286
TestCommand_File_nameOnly - 2019/11/27 02:29:33.886620 [DEBUG] http: Request GET /v1/agent/services (1.725061ms) from=127.0.0.1:48280
TestCommand_File_id - 2019/11/27 02:29:33.890394 [DEBUG] agent: Service "web" in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.888511 [DEBUG] agent: Service "db" in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.893000 [DEBUG] agent: Node info in sync
TestCommand_File_nameOnly - 2019/11/27 02:29:33.895967 [INFO] agent: Requesting shutdown
TestCommand_File_nameOnly - 2019/11/27 02:29:33.896070 [INFO] consul: shutting down server
TestCommand_File_nameOnly - 2019/11/27 02:29:33.896120 [WARN] serf: Shutdown without a Leave
TestCommand_File_nameOnly - 2019/11/27 02:29:33.979003 [WARN] serf: Shutdown without a Leave
TestCommand_Flag - 2019/11/27 02:29:33.983569 [INFO] agent: Deregistered service "web"
TestCommand_Flag - 2019/11/27 02:29:33.983725 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/11/27 02:29:33.983853 [DEBUG] http: Request PUT /v1/agent/service/deregister/web (362.98448ms) from=127.0.0.1:45972
TestCommand_Flag - 2019/11/27 02:29:33.989755 [DEBUG] http: Request GET /v1/agent/services (1.298045ms) from=127.0.0.1:45966
TestCommand_Flag - 2019/11/27 02:29:33.997991 [INFO] agent: Requesting shutdown
TestCommand_Flag - 2019/11/27 02:29:33.998099 [INFO] consul: shutting down server
TestCommand_Flag - 2019/11/27 02:29:33.998150 [WARN] serf: Shutdown without a Leave
TestCommand_File_nameOnly - 2019/11/27 02:29:34.067871 [INFO] manager: shutting down
TestCommand_File_nameOnly - 2019/11/27 02:29:34.135870 [INFO] agent: consul server down
TestCommand_File_nameOnly - 2019/11/27 02:29:34.135952 [INFO] agent: shutdown complete
TestCommand_File_nameOnly - 2019/11/27 02:29:34.136017 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (tcp)
TestCommand_File_nameOnly - 2019/11/27 02:29:34.136159 [INFO] agent: Stopping DNS server 127.0.0.1:38501 (udp)
TestCommand_File_nameOnly - 2019/11/27 02:29:34.136302 [INFO] agent: Stopping HTTP server 127.0.0.1:38502 (tcp)
TestCommand_File_nameOnly - 2019/11/27 02:29:34.136998 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File_nameOnly - 2019/11/27 02:29:34.137118 [ERR] consul: failed to establish leadership: error configuring provider: leadership lost while committing log
TestCommand_File_nameOnly - 2019/11/27 02:29:34.137283 [INFO] agent: Endpoints down
--- PASS: TestCommand_File_nameOnly (3.77s)
TestCommand_Flag - 2019/11/27 02:29:34.137583 [WARN] serf: Shutdown without a Leave
TestCommand_Flag - 2019/11/27 02:29:34.223509 [INFO] manager: shutting down
TestCommand_File_id - 2019/11/27 02:29:34.225366 [INFO] agent: Synced service "db"
TestCommand_File_id - 2019/11/27 02:29:34.230743 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/11/27 02:29:34.231006 [DEBUG] http: Request PUT /v1/agent/service/register (880.763758ms) from=127.0.0.1:35852
TestCommand_File_id - 2019/11/27 02:29:34.231221 [DEBUG] agent: Service "web" in sync
TestCommand_File_id - 2019/11/27 02:29:34.236876 [DEBUG] agent: Service "db" in sync
TestCommand_File_id - 2019/11/27 02:29:34.236913 [DEBUG] agent: Node info in sync
TestCommand_Flag - 2019/11/27 02:29:34.251279 [ERR] consul: failed to establish leadership: error generating CA root certificate: leadership lost while committing log
TestCommand_Flag - 2019/11/27 02:29:34.252108 [INFO] agent: consul server down
TestCommand_Flag - 2019/11/27 02:29:34.252323 [INFO] agent: shutdown complete
TestCommand_Flag - 2019/11/27 02:29:34.252497 [INFO] agent: Stopping DNS server 127.0.0.1:38507 (tcp)
TestCommand_Flag - 2019/11/27 02:29:34.252915 [INFO] agent: Stopping DNS server 127.0.0.1:38507 (udp)
TestCommand_Flag - 2019/11/27 02:29:34.253373 [INFO] agent: Stopping HTTP server 127.0.0.1:38508 (tcp)
TestCommand_Flag - 2019/11/27 02:29:34.255197 [INFO] agent: Waiting for endpoints to shut down
TestCommand_Flag - 2019/11/27 02:29:34.255397 [INFO] agent: Endpoints down
--- PASS: TestCommand_Flag (3.88s)
TestCommand_File_id - 2019/11/27 02:29:34.311055 [DEBUG] agent: removed service "web"
TestCommand_File_id - 2019/11/27 02:29:34.580962 [INFO] agent: Deregistered service "web"
TestCommand_File_id - 2019/11/27 02:29:34.581056 [DEBUG] agent: Service "db" in sync
TestCommand_File_id - 2019/11/27 02:29:34.581090 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/11/27 02:29:34.581186 [DEBUG] http: Request PUT /v1/agent/service/deregister/web (270.867895ms) from=127.0.0.1:35858
TestCommand_File_id - 2019/11/27 02:29:34.581792 [DEBUG] agent: Service "db" in sync
TestCommand_File_id - 2019/11/27 02:29:34.581846 [DEBUG] agent: Node info in sync
TestCommand_File_id - 2019/11/27 02:29:34.584261 [DEBUG] http: Request GET /v1/agent/services (1.037037ms) from=127.0.0.1:35852
TestCommand_File_id - 2019/11/27 02:29:34.587418 [INFO] agent: Requesting shutdown
TestCommand_File_id - 2019/11/27 02:29:34.587517 [INFO] consul: shutting down server
TestCommand_File_id - 2019/11/27 02:29:34.587560 [WARN] serf: Shutdown without a Leave
TestCommand_File_id - 2019/11/27 02:29:34.659467 [WARN] serf: Shutdown without a Leave
TestCommand_File_id - 2019/11/27 02:29:34.724361 [INFO] manager: shutting down
TestCommand_File_id - 2019/11/27 02:29:34.801478 [INFO] agent: consul server down
TestCommand_File_id - 2019/11/27 02:29:34.801558 [INFO] agent: shutdown complete
TestCommand_File_id - 2019/11/27 02:29:34.801615 [INFO] agent: Stopping DNS server 127.0.0.1:38513 (tcp)
TestCommand_File_id - 2019/11/27 02:29:34.801821 [INFO] agent: Stopping DNS server 127.0.0.1:38513 (udp)
TestCommand_File_id - 2019/11/27 02:29:34.801982 [INFO] agent: Stopping HTTP server 127.0.0.1:38514 (tcp)
TestCommand_File_id - 2019/11/27 02:29:34.802606 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File_id - 2019/11/27 02:29:34.802718 [ERR] connect: Apply failed leadership lost while committing log
TestCommand_File_id - 2019/11/27 02:29:34.802780 [ERR] consul: failed to establish leadership: leadership lost while committing log
TestCommand_File_id - 2019/11/27 02:29:34.802969 [INFO] agent: Endpoints down
--- PASS: TestCommand_File_id (4.43s)
PASS
ok  	github.com/hashicorp/consul/command/services/deregister	4.646s
=== RUN   TestCommand_noTabs
=== PAUSE TestCommand_noTabs
=== RUN   TestCommand_Validation
=== PAUSE TestCommand_Validation
=== RUN   TestCommand_File
=== PAUSE TestCommand_File
=== RUN   TestCommand_Flags
=== PAUSE TestCommand_Flags
=== CONT  TestCommand_noTabs
=== CONT  TestCommand_Validation
--- PASS: TestCommand_noTabs (0.00s)
=== CONT  TestCommand_Flags
=== RUN   TestCommand_Validation/no_args_or_id
=== RUN   TestCommand_Validation/args_and_-name
--- PASS: TestCommand_Validation (0.01s)
    --- PASS: TestCommand_Validation/no_args_or_id (0.00s)
    --- PASS: TestCommand_Validation/args_and_-name (0.00s)
=== CONT  TestCommand_File
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_Flags - 2019/11/27 02:29:38.543635 [WARN] agent: Node name "Node 500cf9a9-ad9b-e72a-cd29-79731f83218e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_Flags - 2019/11/27 02:29:38.545211 [DEBUG] tlsutil: Update with version 1
TestCommand_Flags - 2019/11/27 02:29:38.545304 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_Flags - 2019/11/27 02:29:38.545786 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_Flags - 2019/11/27 02:29:38.546204 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestCommand_File - 2019/11/27 02:29:38.550077 [WARN] agent: Node name "Node 08c88165-b19d-860a-b8d9-ebc787fcf4ef" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestCommand_File - 2019/11/27 02:29:38.550645 [DEBUG] tlsutil: Update with version 1
TestCommand_File - 2019/11/27 02:29:38.550722 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestCommand_File - 2019/11/27 02:29:38.551094 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestCommand_File - 2019/11/27 02:29:38.551260 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:29:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:500cf9a9-ad9b-e72a-cd29-79731f83218e Address:127.0.0.1:53506}]
2019/11/27 02:29:39 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:08c88165-b19d-860a-b8d9-ebc787fcf4ef Address:127.0.0.1:53512}]
2019/11/27 02:29:39 [INFO]  raft: Node at 127.0.0.1:53512 [Follower] entering Follower state (Leader: "")
2019/11/27 02:29:39 [INFO]  raft: Node at 127.0.0.1:53506 [Follower] entering Follower state (Leader: "")
TestCommand_File - 2019/11/27 02:29:39.448802 [INFO] serf: EventMemberJoin: Node 08c88165-b19d-860a-b8d9-ebc787fcf4ef.dc1 127.0.0.1
TestCommand_File - 2019/11/27 02:29:39.457117 [INFO] serf: EventMemberJoin: Node 08c88165-b19d-860a-b8d9-ebc787fcf4ef 127.0.0.1
TestCommand_File - 2019/11/27 02:29:39.477092 [INFO] agent: Started DNS server 127.0.0.1:53507 (udp)
2019/11/27 02:29:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:29:39 [INFO]  raft: Node at 127.0.0.1:53512 [Candidate] entering Candidate state in term 2
TestCommand_File - 2019/11/27 02:29:39.480772 [INFO] consul: Handled member-join event for server "Node 08c88165-b19d-860a-b8d9-ebc787fcf4ef.dc1" in area "wan"
TestCommand_File - 2019/11/27 02:29:39.487893 [INFO] agent: Started DNS server 127.0.0.1:53507 (tcp)
TestCommand_File - 2019/11/27 02:29:39.489872 [INFO] agent: Started HTTP server on 127.0.0.1:53508 (tcp)
TestCommand_File - 2019/11/27 02:29:39.490045 [INFO] agent: started state syncer
TestCommand_Flags - 2019/11/27 02:29:39.490740 [INFO] serf: EventMemberJoin: Node 500cf9a9-ad9b-e72a-cd29-79731f83218e.dc1 127.0.0.1
TestCommand_File - 2019/11/27 02:29:39.491061 [INFO] consul: Adding LAN server Node 08c88165-b19d-860a-b8d9-ebc787fcf4ef (Addr: tcp/127.0.0.1:53512) (DC: dc1)
2019/11/27 02:29:39 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:29:39 [INFO]  raft: Node at 127.0.0.1:53506 [Candidate] entering Candidate state in term 2
TestCommand_Flags - 2019/11/27 02:29:39.512494 [INFO] serf: EventMemberJoin: Node 500cf9a9-ad9b-e72a-cd29-79731f83218e 127.0.0.1
TestCommand_Flags - 2019/11/27 02:29:39.513881 [INFO] consul: Adding LAN server Node 500cf9a9-ad9b-e72a-cd29-79731f83218e (Addr: tcp/127.0.0.1:53506) (DC: dc1)
TestCommand_Flags - 2019/11/27 02:29:39.514979 [INFO] consul: Handled member-join event for server "Node 500cf9a9-ad9b-e72a-cd29-79731f83218e.dc1" in area "wan"
TestCommand_Flags - 2019/11/27 02:29:39.517545 [INFO] agent: Started DNS server 127.0.0.1:53501 (tcp)
TestCommand_Flags - 2019/11/27 02:29:39.517882 [INFO] agent: Started DNS server 127.0.0.1:53501 (udp)
TestCommand_Flags - 2019/11/27 02:29:39.519831 [INFO] agent: Started HTTP server on 127.0.0.1:53502 (tcp)
TestCommand_Flags - 2019/11/27 02:29:39.520311 [INFO] agent: started state syncer
2019/11/27 02:29:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:29:40 [INFO]  raft: Node at 127.0.0.1:53512 [Leader] entering Leader state
TestCommand_File - 2019/11/27 02:29:40.335622 [INFO] consul: cluster leadership acquired
TestCommand_File - 2019/11/27 02:29:40.336341 [INFO] consul: New leader elected: Node 08c88165-b19d-860a-b8d9-ebc787fcf4ef
2019/11/27 02:29:40 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:29:40 [INFO]  raft: Node at 127.0.0.1:53506 [Leader] entering Leader state
TestCommand_Flags - 2019/11/27 02:29:40.413891 [INFO] consul: cluster leadership acquired
TestCommand_Flags - 2019/11/27 02:29:40.414332 [INFO] consul: New leader elected: Node 500cf9a9-ad9b-e72a-cd29-79731f83218e
TestCommand_Flags - 2019/11/27 02:29:40.814860 [INFO] agent: Synced node info
TestCommand_File - 2019/11/27 02:29:40.815366 [INFO] agent: Synced service "web"
TestCommand_File - 2019/11/27 02:29:40.815423 [DEBUG] agent: Node info in sync
TestCommand_File - 2019/11/27 02:29:40.815505 [DEBUG] agent: Service "web" in sync
TestCommand_File - 2019/11/27 02:29:40.815543 [DEBUG] agent: Node info in sync
TestCommand_File - 2019/11/27 02:29:40.815626 [DEBUG] agent: Service "web" in sync
TestCommand_File - 2019/11/27 02:29:40.815661 [DEBUG] agent: Node info in sync
TestCommand_File - 2019/11/27 02:29:40.815748 [DEBUG] http: Request PUT /v1/agent/service/register (325.786163ms) from=127.0.0.1:39958
TestCommand_File - 2019/11/27 02:29:40.821226 [DEBUG] http: Request GET /v1/agent/services (2.088407ms) from=127.0.0.1:39962
TestCommand_File - 2019/11/27 02:29:40.824375 [INFO] agent: Requesting shutdown
TestCommand_File - 2019/11/27 02:29:40.824492 [INFO] consul: shutting down server
TestCommand_File - 2019/11/27 02:29:40.824538 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/11/27 02:29:40.902026 [WARN] serf: Shutdown without a Leave
TestCommand_File - 2019/11/27 02:29:40.978731 [INFO] manager: shutting down
TestCommand_File - 2019/11/27 02:29:40.979440 [INFO] agent: consul server down
TestCommand_File - 2019/11/27 02:29:40.979511 [INFO] agent: shutdown complete
TestCommand_File - 2019/11/27 02:29:40.979573 [INFO] agent: Stopping DNS server 127.0.0.1:53507 (tcp)
TestCommand_File - 2019/11/27 02:29:40.979730 [INFO] agent: Stopping DNS server 127.0.0.1:53507 (udp)
TestCommand_File - 2019/11/27 02:29:40.979917 [INFO] agent: Stopping HTTP server 127.0.0.1:53508 (tcp)
TestCommand_File - 2019/11/27 02:29:40.980693 [INFO] agent: Waiting for endpoints to shut down
TestCommand_File - 2019/11/27 02:29:40.981204 [INFO] agent: Endpoints down
--- PASS: TestCommand_File (2.55s)
TestCommand_File - 2019/11/27 02:29:40.981341 [ERR] consul: failed to establish leadership: raft is already shutdown
TestCommand_Flags - 2019/11/27 02:29:41.136067 [INFO] agent: Synced service "web"
TestCommand_Flags - 2019/11/27 02:29:41.136171 [DEBUG] agent: Node info in sync
TestCommand_Flags - 2019/11/27 02:29:41.136441 [DEBUG] http: Request PUT /v1/agent/service/register (637.468495ms) from=127.0.0.1:43430
TestCommand_Flags - 2019/11/27 02:29:41.140982 [DEBUG] http: Request GET /v1/agent/services (1.082038ms) from=127.0.0.1:43434
TestCommand_Flags - 2019/11/27 02:29:41.144014 [INFO] agent: Requesting shutdown
TestCommand_Flags - 2019/11/27 02:29:41.144159 [INFO] consul: shutting down server
TestCommand_Flags - 2019/11/27 02:29:41.144480 [WARN] serf: Shutdown without a Leave
TestCommand_Flags - 2019/11/27 02:29:41.202521 [WARN] serf: Shutdown without a Leave
TestCommand_Flags - 2019/11/27 02:29:41.313596 [INFO] manager: shutting down
TestCommand_Flags - 2019/11/27 02:29:41.445797 [INFO] agent: consul server down
TestCommand_Flags - 2019/11/27 02:29:41.445879 [INFO] agent: shutdown complete
TestCommand_Flags - 2019/11/27 02:29:41.445952 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (tcp)
TestCommand_Flags - 2019/11/27 02:29:41.446107 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (udp)
TestCommand_Flags - 2019/11/27 02:29:41.446272 [INFO] agent: Stopping HTTP server 127.0.0.1:53502 (tcp)
TestCommand_Flags - 2019/11/27 02:29:41.447022 [INFO] agent: Waiting for endpoints to shut down
TestCommand_Flags - 2019/11/27 02:29:41.447215 [INFO] agent: Endpoints down
--- PASS: TestCommand_Flags (3.03s)
PASS
ok  	github.com/hashicorp/consul/command/services/register	3.184s
=== RUN   TestSnapshotCommand_noTabs
=== PAUSE TestSnapshotCommand_noTabs
=== CONT  TestSnapshotCommand_noTabs
--- PASS: TestSnapshotCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot	0.042s
=== RUN   TestSnapshotInspectCommand_noTabs
=== PAUSE TestSnapshotInspectCommand_noTabs
=== RUN   TestSnapshotInspectCommand_Validation
=== PAUSE TestSnapshotInspectCommand_Validation
=== RUN   TestSnapshotInspectCommand
=== PAUSE TestSnapshotInspectCommand
=== CONT  TestSnapshotInspectCommand_noTabs
--- PASS: TestSnapshotInspectCommand_noTabs (0.00s)
=== CONT  TestSnapshotInspectCommand
=== CONT  TestSnapshotInspectCommand_Validation
--- PASS: TestSnapshotInspectCommand_Validation (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestSnapshotInspectCommand - 2019/11/27 02:30:02.573246 [WARN] agent: Node name "Node 6445e506-c1f8-a5df-a30f-1f276fc1b6ad" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSnapshotInspectCommand - 2019/11/27 02:30:02.574053 [DEBUG] tlsutil: Update with version 1
TestSnapshotInspectCommand - 2019/11/27 02:30:02.574123 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestSnapshotInspectCommand - 2019/11/27 02:30:02.574310 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestSnapshotInspectCommand - 2019/11/27 02:30:02.574448 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:30:03 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6445e506-c1f8-a5df-a30f-1f276fc1b6ad Address:127.0.0.1:53506}]
2019/11/27 02:30:03 [INFO]  raft: Node at 127.0.0.1:53506 [Follower] entering Follower state (Leader: "")
TestSnapshotInspectCommand - 2019/11/27 02:30:03.615826 [INFO] serf: EventMemberJoin: Node 6445e506-c1f8-a5df-a30f-1f276fc1b6ad.dc1 127.0.0.1
TestSnapshotInspectCommand - 2019/11/27 02:30:03.620763 [INFO] serf: EventMemberJoin: Node 6445e506-c1f8-a5df-a30f-1f276fc1b6ad 127.0.0.1
TestSnapshotInspectCommand - 2019/11/27 02:30:03.625639 [INFO] consul: Adding LAN server Node 6445e506-c1f8-a5df-a30f-1f276fc1b6ad (Addr: tcp/127.0.0.1:53506) (DC: dc1)
TestSnapshotInspectCommand - 2019/11/27 02:30:03.626519 [INFO] consul: Handled member-join event for server "Node 6445e506-c1f8-a5df-a30f-1f276fc1b6ad.dc1" in area "wan"
TestSnapshotInspectCommand - 2019/11/27 02:30:03.626733 [INFO] agent: Started DNS server 127.0.0.1:53501 (tcp)
TestSnapshotInspectCommand - 2019/11/27 02:30:03.627074 [INFO] agent: Started DNS server 127.0.0.1:53501 (udp)
TestSnapshotInspectCommand - 2019/11/27 02:30:03.629278 [INFO] agent: Started HTTP server on 127.0.0.1:53502 (tcp)
TestSnapshotInspectCommand - 2019/11/27 02:30:03.629435 [INFO] agent: started state syncer
2019/11/27 02:30:03 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:03 [INFO]  raft: Node at 127.0.0.1:53506 [Candidate] entering Candidate state in term 2
2019/11/27 02:30:04 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:04 [INFO]  raft: Node at 127.0.0.1:53506 [Leader] entering Leader state
TestSnapshotInspectCommand - 2019/11/27 02:30:04.088629 [INFO] consul: cluster leadership acquired
TestSnapshotInspectCommand - 2019/11/27 02:30:04.089209 [INFO] consul: New leader elected: Node 6445e506-c1f8-a5df-a30f-1f276fc1b6ad
TestSnapshotInspectCommand - 2019/11/27 02:30:05.090058 [INFO] agent: Synced node info
TestSnapshotInspectCommand - 2019/11/27 02:30:05.090172 [DEBUG] agent: Node info in sync
TestSnapshotInspectCommand - 2019/11/27 02:30:06.310985 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSnapshotInspectCommand - 2019/11/27 02:30:06.311415 [DEBUG] consul: Skipping self join check for "Node 6445e506-c1f8-a5df-a30f-1f276fc1b6ad" since the cluster is too small
TestSnapshotInspectCommand - 2019/11/27 02:30:06.311554 [INFO] consul: member 'Node 6445e506-c1f8-a5df-a30f-1f276fc1b6ad' joined, marking health alive
2019/11/27 02:30:06 [INFO] consul.fsm: snapshot created in 171.673µs
2019/11/27 02:30:06 [INFO]  raft: Starting snapshot up to 9
2019/11/27 02:30:06 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotInspectCommand-agent363280015/raft/snapshots/2-9-1574821806521.tmp
2019/11/27 02:30:07 [INFO]  raft: Snapshot to 9 complete
TestSnapshotInspectCommand - 2019/11/27 02:30:07.095064 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestSnapshotInspectCommand - 2019/11/27 02:30:07.095155 [DEBUG] agent: Node info in sync
TestSnapshotInspectCommand - 2019/11/27 02:30:07.146589 [DEBUG] http: Request GET /v1/snapshot (2.961333047s) from=127.0.0.1:43436
TestSnapshotInspectCommand - 2019/11/27 02:30:07.155697 [INFO] agent: Requesting shutdown
TestSnapshotInspectCommand - 2019/11/27 02:30:07.155809 [INFO] consul: shutting down server
TestSnapshotInspectCommand - 2019/11/27 02:30:07.155865 [WARN] serf: Shutdown without a Leave
TestSnapshotInspectCommand - 2019/11/27 02:30:07.265802 [WARN] serf: Shutdown without a Leave
TestSnapshotInspectCommand - 2019/11/27 02:30:07.388261 [INFO] manager: shutting down
TestSnapshotInspectCommand - 2019/11/27 02:30:07.389235 [INFO] agent: consul server down
TestSnapshotInspectCommand - 2019/11/27 02:30:07.389432 [INFO] agent: shutdown complete
TestSnapshotInspectCommand - 2019/11/27 02:30:07.389551 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (tcp)
TestSnapshotInspectCommand - 2019/11/27 02:30:07.390061 [INFO] agent: Stopping DNS server 127.0.0.1:53501 (udp)
TestSnapshotInspectCommand - 2019/11/27 02:30:07.390266 [INFO] agent: Stopping HTTP server 127.0.0.1:53502 (tcp)
TestSnapshotInspectCommand - 2019/11/27 02:30:07.391371 [INFO] agent: Waiting for endpoints to shut down
TestSnapshotInspectCommand - 2019/11/27 02:30:07.391497 [INFO] agent: Endpoints down
--- PASS: TestSnapshotInspectCommand (4.91s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot/inspect	5.125s
=== RUN   TestSnapshotRestoreCommand_noTabs
=== PAUSE TestSnapshotRestoreCommand_noTabs
=== RUN   TestSnapshotRestoreCommand_Validation
=== PAUSE TestSnapshotRestoreCommand_Validation
=== RUN   TestSnapshotRestoreCommand
=== PAUSE TestSnapshotRestoreCommand
=== CONT  TestSnapshotRestoreCommand_noTabs
=== CONT  TestSnapshotRestoreCommand
--- PASS: TestSnapshotRestoreCommand_noTabs (0.00s)
=== CONT  TestSnapshotRestoreCommand_Validation
--- PASS: TestSnapshotRestoreCommand_Validation (0.01s)
WARNING: bootstrap = true: do not enable unless necessary
TestSnapshotRestoreCommand - 2019/11/27 02:30:04.759623 [WARN] agent: Node name "Node 29dff750-bc64-cd4b-b077-71a1d77025c2" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSnapshotRestoreCommand - 2019/11/27 02:30:04.760782 [DEBUG] tlsutil: Update with version 1
TestSnapshotRestoreCommand - 2019/11/27 02:30:04.760990 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestSnapshotRestoreCommand - 2019/11/27 02:30:04.761355 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestSnapshotRestoreCommand - 2019/11/27 02:30:04.761748 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:30:06 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:29dff750-bc64-cd4b-b077-71a1d77025c2 Address:127.0.0.1:46006}]
2019/11/27 02:30:06 [INFO]  raft: Node at 127.0.0.1:46006 [Follower] entering Follower state (Leader: "")
TestSnapshotRestoreCommand - 2019/11/27 02:30:06.271286 [INFO] serf: EventMemberJoin: Node 29dff750-bc64-cd4b-b077-71a1d77025c2.dc1 127.0.0.1
TestSnapshotRestoreCommand - 2019/11/27 02:30:06.275245 [INFO] serf: EventMemberJoin: Node 29dff750-bc64-cd4b-b077-71a1d77025c2 127.0.0.1
TestSnapshotRestoreCommand - 2019/11/27 02:30:06.277525 [INFO] agent: Started DNS server 127.0.0.1:46001 (udp)
TestSnapshotRestoreCommand - 2019/11/27 02:30:06.277977 [INFO] consul: Adding LAN server Node 29dff750-bc64-cd4b-b077-71a1d77025c2 (Addr: tcp/127.0.0.1:46006) (DC: dc1)
TestSnapshotRestoreCommand - 2019/11/27 02:30:06.278108 [INFO] consul: Handled member-join event for server "Node 29dff750-bc64-cd4b-b077-71a1d77025c2.dc1" in area "wan"
TestSnapshotRestoreCommand - 2019/11/27 02:30:06.278680 [INFO] agent: Started DNS server 127.0.0.1:46001 (tcp)
TestSnapshotRestoreCommand - 2019/11/27 02:30:06.282755 [INFO] agent: Started HTTP server on 127.0.0.1:46002 (tcp)
TestSnapshotRestoreCommand - 2019/11/27 02:30:06.282928 [INFO] agent: started state syncer
2019/11/27 02:30:06 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:06 [INFO]  raft: Node at 127.0.0.1:46006 [Candidate] entering Candidate state in term 2
2019/11/27 02:30:07 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:07 [INFO]  raft: Node at 127.0.0.1:46006 [Leader] entering Leader state
TestSnapshotRestoreCommand - 2019/11/27 02:30:07.077363 [INFO] consul: cluster leadership acquired
TestSnapshotRestoreCommand - 2019/11/27 02:30:07.078048 [INFO] consul: New leader elected: Node 29dff750-bc64-cd4b-b077-71a1d77025c2
TestSnapshotRestoreCommand - 2019/11/27 02:30:07.533387 [INFO] agent: Synced node info
TestSnapshotRestoreCommand - 2019/11/27 02:30:08.344258 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSnapshotRestoreCommand - 2019/11/27 02:30:08.344854 [DEBUG] consul: Skipping self join check for "Node 29dff750-bc64-cd4b-b077-71a1d77025c2" since the cluster is too small
TestSnapshotRestoreCommand - 2019/11/27 02:30:08.345039 [INFO] consul: member 'Node 29dff750-bc64-cd4b-b077-71a1d77025c2' joined, marking health alive
2019/11/27 02:30:08 [INFO] consul.fsm: snapshot created in 179.34µs
2019/11/27 02:30:08 [INFO]  raft: Starting snapshot up to 9
2019/11/27 02:30:08 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotRestoreCommand-agent670745526/raft/snapshots/2-9-1574821808477.tmp
2019/11/27 02:30:08 [INFO]  raft: Snapshot to 9 complete
TestSnapshotRestoreCommand - 2019/11/27 02:30:08.757090 [DEBUG] http: Request GET /v1/snapshot (1.632278861s) from=127.0.0.1:50336
2019/11/27 02:30:08 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotRestoreCommand-agent670745526/raft/snapshots/2-11-1574821808832.tmp
2019/11/27 02:30:09 [INFO]  raft: Copied 3506 bytes to local snapshot
2019/11/27 02:30:09 [INFO]  raft: Restored user snapshot (index 11)
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.479855 [DEBUG] http: Request PUT /v1/snapshot (716.536253ms) from=127.0.0.1:50338
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.481737 [INFO] agent: Requesting shutdown
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.481823 [INFO] consul: shutting down server
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.481870 [WARN] serf: Shutdown without a Leave
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.621622 [WARN] serf: Shutdown without a Leave
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.676950 [INFO] manager: shutting down
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.677368 [INFO] agent: consul server down
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.677415 [INFO] agent: shutdown complete
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.677465 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (tcp)
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.677594 [INFO] agent: Stopping DNS server 127.0.0.1:46001 (udp)
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.677754 [INFO] agent: Stopping HTTP server 127.0.0.1:46002 (tcp)
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.678397 [INFO] agent: Waiting for endpoints to shut down
TestSnapshotRestoreCommand - 2019/11/27 02:30:09.678480 [INFO] agent: Endpoints down
--- PASS: TestSnapshotRestoreCommand (5.02s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot/restore	5.196s
=== RUN   TestSnapshotSaveCommand_noTabs
=== PAUSE TestSnapshotSaveCommand_noTabs
=== RUN   TestSnapshotSaveCommand_Validation
=== PAUSE TestSnapshotSaveCommand_Validation
=== RUN   TestSnapshotSaveCommand
=== PAUSE TestSnapshotSaveCommand
=== CONT  TestSnapshotSaveCommand_noTabs
=== CONT  TestSnapshotSaveCommand
=== CONT  TestSnapshotSaveCommand_Validation
--- PASS: TestSnapshotSaveCommand_Validation (0.00s)
--- PASS: TestSnapshotSaveCommand_noTabs (0.02s)
WARNING: bootstrap = true: do not enable unless necessary
TestSnapshotSaveCommand - 2019/11/27 02:30:10.638377 [WARN] agent: Node name "Node 34cbd78b-462a-dd0b-8df6-e4bb072694fb" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestSnapshotSaveCommand - 2019/11/27 02:30:10.639698 [DEBUG] tlsutil: Update with version 1
TestSnapshotSaveCommand - 2019/11/27 02:30:10.639939 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestSnapshotSaveCommand - 2019/11/27 02:30:10.640243 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestSnapshotSaveCommand - 2019/11/27 02:30:10.640526 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:30:11 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:34cbd78b-462a-dd0b-8df6-e4bb072694fb Address:127.0.0.1:20506}]
2019/11/27 02:30:11 [INFO]  raft: Node at 127.0.0.1:20506 [Follower] entering Follower state (Leader: "")
TestSnapshotSaveCommand - 2019/11/27 02:30:11.526308 [INFO] serf: EventMemberJoin: Node 34cbd78b-462a-dd0b-8df6-e4bb072694fb.dc1 127.0.0.1
TestSnapshotSaveCommand - 2019/11/27 02:30:11.530219 [INFO] serf: EventMemberJoin: Node 34cbd78b-462a-dd0b-8df6-e4bb072694fb 127.0.0.1
TestSnapshotSaveCommand - 2019/11/27 02:30:11.535187 [INFO] consul: Handled member-join event for server "Node 34cbd78b-462a-dd0b-8df6-e4bb072694fb.dc1" in area "wan"
TestSnapshotSaveCommand - 2019/11/27 02:30:11.535740 [INFO] consul: Adding LAN server Node 34cbd78b-462a-dd0b-8df6-e4bb072694fb (Addr: tcp/127.0.0.1:20506) (DC: dc1)
TestSnapshotSaveCommand - 2019/11/27 02:30:11.536325 [INFO] agent: Started DNS server 127.0.0.1:20501 (tcp)
TestSnapshotSaveCommand - 2019/11/27 02:30:11.537314 [INFO] agent: Started DNS server 127.0.0.1:20501 (udp)
TestSnapshotSaveCommand - 2019/11/27 02:30:11.539631 [INFO] agent: Started HTTP server on 127.0.0.1:20502 (tcp)
TestSnapshotSaveCommand - 2019/11/27 02:30:11.539737 [INFO] agent: started state syncer
2019/11/27 02:30:11 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:11 [INFO]  raft: Node at 127.0.0.1:20506 [Candidate] entering Candidate state in term 2
2019/11/27 02:30:11 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:11 [INFO]  raft: Node at 127.0.0.1:20506 [Leader] entering Leader state
TestSnapshotSaveCommand - 2019/11/27 02:30:11.977368 [INFO] consul: cluster leadership acquired
TestSnapshotSaveCommand - 2019/11/27 02:30:11.978224 [INFO] consul: New leader elected: Node 34cbd78b-462a-dd0b-8df6-e4bb072694fb
TestSnapshotSaveCommand - 2019/11/27 02:30:12.266511 [INFO] agent: Synced node info
TestSnapshotSaveCommand - 2019/11/27 02:30:13.088546 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestSnapshotSaveCommand - 2019/11/27 02:30:13.088997 [DEBUG] consul: Skipping self join check for "Node 34cbd78b-462a-dd0b-8df6-e4bb072694fb" since the cluster is too small
TestSnapshotSaveCommand - 2019/11/27 02:30:13.089956 [INFO] consul: member 'Node 34cbd78b-462a-dd0b-8df6-e4bb072694fb' joined, marking health alive
2019/11/27 02:30:13 [INFO] consul.fsm: snapshot created in 202.007µs
2019/11/27 02:30:13 [INFO]  raft: Starting snapshot up to 9
2019/11/27 02:30:13 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotSaveCommand-agent643208867/raft/snapshots/2-9-1574821813221.tmp
2019/11/27 02:30:13 [INFO]  raft: Snapshot to 9 complete
TestSnapshotSaveCommand - 2019/11/27 02:30:13.981492 [DEBUG] http: Request GET /v1/snapshot (1.950941411s) from=127.0.0.1:49878
TestSnapshotSaveCommand - 2019/11/27 02:30:14.136157 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestSnapshotSaveCommand - 2019/11/27 02:30:14.136241 [DEBUG] agent: Node info in sync
TestSnapshotSaveCommand - 2019/11/27 02:30:14.136318 [DEBUG] agent: Node info in sync
2019/11/27 02:30:14 [INFO] snapshot: Creating new snapshot at /tmp/TestSnapshotSaveCommand-agent643208867/raft/snapshots/2-11-1574821814143.tmp
2019/11/27 02:30:14 [INFO]  raft: Copied 3506 bytes to local snapshot
2019/11/27 02:30:14 [INFO]  raft: Restored user snapshot (index 11)
TestSnapshotSaveCommand - 2019/11/27 02:30:14.834606 [DEBUG] http: Request PUT /v1/snapshot (839.150234ms) from=127.0.0.1:49882
TestSnapshotSaveCommand - 2019/11/27 02:30:14.836514 [INFO] agent: Requesting shutdown
TestSnapshotSaveCommand - 2019/11/27 02:30:14.836593 [INFO] consul: shutting down server
TestSnapshotSaveCommand - 2019/11/27 02:30:14.836641 [WARN] serf: Shutdown without a Leave
TestSnapshotSaveCommand - 2019/11/27 02:30:14.943077 [WARN] serf: Shutdown without a Leave
TestSnapshotSaveCommand - 2019/11/27 02:30:14.998711 [INFO] manager: shutting down
TestSnapshotSaveCommand - 2019/11/27 02:30:14.999431 [INFO] agent: consul server down
TestSnapshotSaveCommand - 2019/11/27 02:30:14.999492 [INFO] agent: shutdown complete
TestSnapshotSaveCommand - 2019/11/27 02:30:14.999549 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (tcp)
TestSnapshotSaveCommand - 2019/11/27 02:30:14.999705 [INFO] agent: Stopping DNS server 127.0.0.1:20501 (udp)
TestSnapshotSaveCommand - 2019/11/27 02:30:14.999852 [INFO] agent: Stopping HTTP server 127.0.0.1:20502 (tcp)
TestSnapshotSaveCommand - 2019/11/27 02:30:15.000409 [INFO] agent: Waiting for endpoints to shut down
TestSnapshotSaveCommand - 2019/11/27 02:30:15.000522 [INFO] agent: Endpoints down
--- PASS: TestSnapshotSaveCommand (4.52s)
PASS
ok  	github.com/hashicorp/consul/command/snapshot/save	4.807s
=== RUN   TestValidateCommand_noTabs
=== PAUSE TestValidateCommand_noTabs
=== RUN   TestValidateCommand_FailOnEmptyFile
=== PAUSE TestValidateCommand_FailOnEmptyFile
=== RUN   TestValidateCommand_SucceedOnMinimalConfigFile
=== PAUSE TestValidateCommand_SucceedOnMinimalConfigFile
=== RUN   TestValidateCommand_SucceedWithMinimalJSONConfigFormat
=== PAUSE TestValidateCommand_SucceedWithMinimalJSONConfigFormat
=== RUN   TestValidateCommand_SucceedWithMinimalHCLConfigFormat
=== PAUSE TestValidateCommand_SucceedWithMinimalHCLConfigFormat
=== RUN   TestValidateCommand_SucceedWithJSONAsHCL
=== PAUSE TestValidateCommand_SucceedWithJSONAsHCL
=== RUN   TestValidateCommand_SucceedOnMinimalConfigDir
=== PAUSE TestValidateCommand_SucceedOnMinimalConfigDir
=== RUN   TestValidateCommand_FailForInvalidJSONConfigFormat
=== PAUSE TestValidateCommand_FailForInvalidJSONConfigFormat
=== RUN   TestValidateCommand_Quiet
=== PAUSE TestValidateCommand_Quiet
=== CONT  TestValidateCommand_noTabs
=== CONT  TestValidateCommand_SucceedWithJSONAsHCL
--- PASS: TestValidateCommand_noTabs (0.00s)
=== CONT  TestValidateCommand_SucceedWithMinimalHCLConfigFormat
=== CONT  TestValidateCommand_SucceedWithMinimalJSONConfigFormat
=== CONT  TestValidateCommand_SucceedOnMinimalConfigFile
--- PASS: TestValidateCommand_SucceedWithJSONAsHCL (0.09s)
=== CONT  TestValidateCommand_Quiet
--- PASS: TestValidateCommand_SucceedOnMinimalConfigFile (0.08s)
=== CONT  TestValidateCommand_FailOnEmptyFile
--- PASS: TestValidateCommand_SucceedWithMinimalHCLConfigFormat (0.12s)
--- PASS: TestValidateCommand_SucceedWithMinimalJSONConfigFormat (0.13s)
=== CONT  TestValidateCommand_SucceedOnMinimalConfigDir
=== CONT  TestValidateCommand_FailForInvalidJSONConfigFormat
--- PASS: TestValidateCommand_Quiet (0.10s)
--- PASS: TestValidateCommand_FailForInvalidJSONConfigFormat (0.05s)
--- PASS: TestValidateCommand_SucceedOnMinimalConfigDir (0.09s)
--- PASS: TestValidateCommand_FailOnEmptyFile (0.13s)
PASS
ok  	github.com/hashicorp/consul/command/validate	0.297s
=== RUN   TestVersionCommand_noTabs
=== PAUSE TestVersionCommand_noTabs
=== CONT  TestVersionCommand_noTabs
--- PASS: TestVersionCommand_noTabs (0.00s)
PASS
ok  	github.com/hashicorp/consul/command/version	0.078s
=== RUN   TestWatchCommand_noTabs
=== PAUSE TestWatchCommand_noTabs
=== RUN   TestWatchCommand
=== PAUSE TestWatchCommand
=== RUN   TestWatchCommandNoConnect
=== PAUSE TestWatchCommandNoConnect
=== RUN   TestWatchCommandNoAgentService
=== PAUSE TestWatchCommandNoAgentService
=== CONT  TestWatchCommandNoAgentService
=== CONT  TestWatchCommandNoConnect
=== CONT  TestWatchCommand
=== CONT  TestWatchCommand_noTabs
--- PASS: TestWatchCommand_noTabs (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommand - 2019/11/27 02:30:46.604231 [WARN] agent: Node name "Node 68b51a04-9478-d065-2126-88b5b4476e5e" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommand - 2019/11/27 02:30:46.604955 [DEBUG] tlsutil: Update with version 1
TestWatchCommand - 2019/11/27 02:30:46.605192 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestWatchCommand - 2019/11/27 02:30:46.605422 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestWatchCommand - 2019/11/27 02:30:46.605516 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommandNoAgentService - 2019/11/27 02:30:46.636590 [WARN] agent: Node name "Node d9f772f3-11e8-f435-5cac-98ff1fee7284" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommandNoAgentService - 2019/11/27 02:30:46.637244 [DEBUG] tlsutil: Update with version 1
TestWatchCommandNoAgentService - 2019/11/27 02:30:46.637309 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestWatchCommandNoAgentService - 2019/11/27 02:30:46.637642 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestWatchCommandNoAgentService - 2019/11/27 02:30:46.637732 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestWatchCommandNoConnect - 2019/11/27 02:30:46.656119 [WARN] agent: Node name "Node 3d6e0012-c497-1077-ed6a-8231ebe33e34" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestWatchCommandNoConnect - 2019/11/27 02:30:46.656741 [DEBUG] tlsutil: Update with version 1
TestWatchCommandNoConnect - 2019/11/27 02:30:46.656812 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestWatchCommandNoConnect - 2019/11/27 02:30:46.657022 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestWatchCommandNoConnect - 2019/11/27 02:30:46.660187 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:30:47 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:3d6e0012-c497-1077-ed6a-8231ebe33e34 Address:127.0.0.1:44506}]
2019/11/27 02:30:47 [INFO]  raft: Node at 127.0.0.1:44506 [Follower] entering Follower state (Leader: "")
TestWatchCommandNoConnect - 2019/11/27 02:30:47.924367 [INFO] serf: EventMemberJoin: Node 3d6e0012-c497-1077-ed6a-8231ebe33e34.dc1 127.0.0.1
TestWatchCommandNoConnect - 2019/11/27 02:30:47.932497 [INFO] serf: EventMemberJoin: Node 3d6e0012-c497-1077-ed6a-8231ebe33e34 127.0.0.1
TestWatchCommandNoConnect - 2019/11/27 02:30:47.934285 [INFO] consul: Adding LAN server Node 3d6e0012-c497-1077-ed6a-8231ebe33e34 (Addr: tcp/127.0.0.1:44506) (DC: dc1)
TestWatchCommandNoConnect - 2019/11/27 02:30:47.935016 [INFO] consul: Handled member-join event for server "Node 3d6e0012-c497-1077-ed6a-8231ebe33e34.dc1" in area "wan"
TestWatchCommandNoConnect - 2019/11/27 02:30:47.937852 [INFO] agent: Started DNS server 127.0.0.1:44501 (tcp)
TestWatchCommandNoConnect - 2019/11/27 02:30:47.938538 [INFO] agent: Started DNS server 127.0.0.1:44501 (udp)
TestWatchCommandNoConnect - 2019/11/27 02:30:47.941475 [INFO] agent: Started HTTP server on 127.0.0.1:44502 (tcp)
TestWatchCommandNoConnect - 2019/11/27 02:30:47.941588 [INFO] agent: started state syncer
2019/11/27 02:30:47 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:47 [INFO]  raft: Node at 127.0.0.1:44506 [Candidate] entering Candidate state in term 2
2019/11/27 02:30:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:68b51a04-9478-d065-2126-88b5b4476e5e Address:127.0.0.1:44518}]
2019/11/27 02:30:48 [INFO]  raft: Node at 127.0.0.1:44518 [Follower] entering Follower state (Leader: "")
2019/11/27 02:30:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:d9f772f3-11e8-f435-5cac-98ff1fee7284 Address:127.0.0.1:44512}]
2019/11/27 02:30:48 [INFO]  raft: Node at 127.0.0.1:44512 [Follower] entering Follower state (Leader: "")
TestWatchCommandNoAgentService - 2019/11/27 02:30:48.189936 [INFO] serf: EventMemberJoin: Node d9f772f3-11e8-f435-5cac-98ff1fee7284.dc1 127.0.0.1
TestWatchCommand - 2019/11/27 02:30:48.189936 [INFO] serf: EventMemberJoin: Node 68b51a04-9478-d065-2126-88b5b4476e5e.dc1 127.0.0.1
TestWatchCommand - 2019/11/27 02:30:48.193422 [INFO] serf: EventMemberJoin: Node 68b51a04-9478-d065-2126-88b5b4476e5e 127.0.0.1
TestWatchCommandNoAgentService - 2019/11/27 02:30:48.194381 [INFO] serf: EventMemberJoin: Node d9f772f3-11e8-f435-5cac-98ff1fee7284 127.0.0.1
TestWatchCommand - 2019/11/27 02:30:48.195116 [INFO] consul: Handled member-join event for server "Node 68b51a04-9478-d065-2126-88b5b4476e5e.dc1" in area "wan"
TestWatchCommand - 2019/11/27 02:30:48.195398 [INFO] consul: Adding LAN server Node 68b51a04-9478-d065-2126-88b5b4476e5e (Addr: tcp/127.0.0.1:44518) (DC: dc1)
TestWatchCommandNoAgentService - 2019/11/27 02:30:48.195687 [INFO] consul: Adding LAN server Node d9f772f3-11e8-f435-5cac-98ff1fee7284 (Addr: tcp/127.0.0.1:44512) (DC: dc1)
TestWatchCommandNoAgentService - 2019/11/27 02:30:48.196010 [INFO] consul: Handled member-join event for server "Node d9f772f3-11e8-f435-5cac-98ff1fee7284.dc1" in area "wan"
2019/11/27 02:30:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:48 [INFO]  raft: Node at 127.0.0.1:44512 [Candidate] entering Candidate state in term 2
TestWatchCommand - 2019/11/27 02:30:48.232946 [INFO] agent: Started DNS server 127.0.0.1:44513 (udp)
TestWatchCommand - 2019/11/27 02:30:48.233378 [INFO] agent: Started DNS server 127.0.0.1:44513 (tcp)
TestWatchCommandNoAgentService - 2019/11/27 02:30:48.235126 [INFO] agent: Started DNS server 127.0.0.1:44507 (tcp)
TestWatchCommandNoAgentService - 2019/11/27 02:30:48.235668 [INFO] agent: Started DNS server 127.0.0.1:44507 (udp)
TestWatchCommand - 2019/11/27 02:30:48.235296 [INFO] agent: Started HTTP server on 127.0.0.1:44514 (tcp)
TestWatchCommand - 2019/11/27 02:30:48.236271 [INFO] agent: started state syncer
TestWatchCommandNoAgentService - 2019/11/27 02:30:48.241268 [INFO] agent: Started HTTP server on 127.0.0.1:44508 (tcp)
TestWatchCommandNoAgentService - 2019/11/27 02:30:48.241475 [INFO] agent: started state syncer
2019/11/27 02:30:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:48 [INFO]  raft: Node at 127.0.0.1:44518 [Candidate] entering Candidate state in term 2
2019/11/27 02:30:48 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:48 [INFO]  raft: Node at 127.0.0.1:44506 [Leader] entering Leader state
TestWatchCommandNoConnect - 2019/11/27 02:30:48.542274 [INFO] consul: cluster leadership acquired
TestWatchCommandNoConnect - 2019/11/27 02:30:48.542941 [INFO] consul: New leader elected: Node 3d6e0012-c497-1077-ed6a-8231ebe33e34
2019/11/27 02:30:49 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:49 [INFO]  raft: Node at 127.0.0.1:44518 [Leader] entering Leader state
2019/11/27 02:30:49 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:49 [INFO]  raft: Node at 127.0.0.1:44512 [Leader] entering Leader state
TestWatchCommand - 2019/11/27 02:30:49.400031 [INFO] consul: cluster leadership acquired
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.400548 [INFO] consul: cluster leadership acquired
TestWatchCommand - 2019/11/27 02:30:49.400631 [INFO] consul: New leader elected: Node 68b51a04-9478-d065-2126-88b5b4476e5e
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.400988 [INFO] consul: New leader elected: Node d9f772f3-11e8-f435-5cac-98ff1fee7284
TestWatchCommandNoConnect - 2019/11/27 02:30:49.487273 [INFO] agent: Synced node info
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.569172 [INFO] agent: Requesting shutdown
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.569315 [INFO] consul: shutting down server
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.569369 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.569602 [ERR] agent: failed to sync remote state: No cluster leader
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.653968 [WARN] serf: Shutdown without a Leave
TestWatchCommand - 2019/11/27 02:30:49.731943 [INFO] agent: Synced node info
TestWatchCommand - 2019/11/27 02:30:49.732262 [DEBUG] agent: Node info in sync
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.734815 [INFO] manager: shutting down
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.909067 [INFO] agent: consul server down
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.909142 [INFO] agent: shutdown complete
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.909197 [INFO] agent: Stopping DNS server 127.0.0.1:44507 (tcp)
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.909341 [INFO] agent: Stopping DNS server 127.0.0.1:44507 (udp)
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.909489 [INFO] agent: Stopping HTTP server 127.0.0.1:44508 (tcp)
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.909684 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.909753 [INFO] agent: Endpoints down
--- PASS: TestWatchCommandNoAgentService (3.42s)
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.909902 [ERR] autopilot: failed to initialize config: leadership lost while committing log
TestWatchCommandNoAgentService - 2019/11/27 02:30:49.910166 [ERR] consul: failed to establish leadership: raft is already shutdown
TestWatchCommandNoConnect - 2019/11/27 02:30:50.146105 [DEBUG] agent: Node info in sync
TestWatchCommandNoConnect - 2019/11/27 02:30:50.146224 [DEBUG] agent: Node info in sync
TestWatchCommandNoConnect - 2019/11/27 02:30:50.597101 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestWatchCommandNoConnect - 2019/11/27 02:30:50.597527 [DEBUG] consul: Skipping self join check for "Node 3d6e0012-c497-1077-ed6a-8231ebe33e34" since the cluster is too small
TestWatchCommandNoConnect - 2019/11/27 02:30:50.597701 [INFO] consul: member 'Node 3d6e0012-c497-1077-ed6a-8231ebe33e34' joined, marking health alive
TestWatchCommand - 2019/11/27 02:30:50.874987 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestWatchCommand - 2019/11/27 02:30:50.875460 [DEBUG] consul: Skipping self join check for "Node 68b51a04-9478-d065-2126-88b5b4476e5e" since the cluster is too small
TestWatchCommand - 2019/11/27 02:30:50.875652 [INFO] consul: member 'Node 68b51a04-9478-d065-2126-88b5b4476e5e' joined, marking health alive
TestWatchCommandNoConnect - 2019/11/27 02:30:50.879952 [INFO] agent: Requesting shutdown
TestWatchCommandNoConnect - 2019/11/27 02:30:50.880077 [INFO] consul: shutting down server
TestWatchCommandNoConnect - 2019/11/27 02:30:50.880130 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoConnect - 2019/11/27 02:30:50.963175 [WARN] serf: Shutdown without a Leave
TestWatchCommandNoConnect - 2019/11/27 02:30:51.040883 [INFO] manager: shutting down
TestWatchCommandNoConnect - 2019/11/27 02:30:51.041296 [INFO] agent: consul server down
TestWatchCommandNoConnect - 2019/11/27 02:30:51.041346 [INFO] agent: shutdown complete
TestWatchCommandNoConnect - 2019/11/27 02:30:51.041402 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (tcp)
TestWatchCommandNoConnect - 2019/11/27 02:30:51.041549 [INFO] agent: Stopping DNS server 127.0.0.1:44501 (udp)
TestWatchCommandNoConnect - 2019/11/27 02:30:51.041892 [INFO] agent: Stopping HTTP server 127.0.0.1:44502 (tcp)
TestWatchCommandNoConnect - 2019/11/27 02:30:51.042114 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommandNoConnect - 2019/11/27 02:30:51.042185 [INFO] agent: Endpoints down
--- PASS: TestWatchCommandNoConnect (4.55s)
TestWatchCommand - 2019/11/27 02:30:51.084088 [DEBUG] http: Request GET /v1/agent/self (20.326381ms) from=127.0.0.1:40098
TestWatchCommand - 2019/11/27 02:30:51.103272 [DEBUG] http: Request GET /v1/catalog/nodes (1.385716ms) from=127.0.0.1:40100
TestWatchCommand - 2019/11/27 02:30:51.105867 [INFO] agent: Requesting shutdown
TestWatchCommand - 2019/11/27 02:30:51.105959 [INFO] consul: shutting down server
TestWatchCommand - 2019/11/27 02:30:51.106014 [WARN] serf: Shutdown without a Leave
TestWatchCommand - 2019/11/27 02:30:51.163098 [WARN] serf: Shutdown without a Leave
TestWatchCommand - 2019/11/27 02:30:51.263172 [INFO] manager: shutting down
TestWatchCommand - 2019/11/27 02:30:51.263648 [INFO] agent: consul server down
TestWatchCommand - 2019/11/27 02:30:51.263697 [INFO] agent: shutdown complete
TestWatchCommand - 2019/11/27 02:30:51.263794 [INFO] agent: Stopping DNS server 127.0.0.1:44513 (tcp)
TestWatchCommand - 2019/11/27 02:30:51.263992 [INFO] agent: Stopping DNS server 127.0.0.1:44513 (udp)
TestWatchCommand - 2019/11/27 02:30:51.264172 [INFO] agent: Stopping HTTP server 127.0.0.1:44514 (tcp)
TestWatchCommand - 2019/11/27 02:30:51.264851 [INFO] agent: Waiting for endpoints to shut down
TestWatchCommand - 2019/11/27 02:30:51.265065 [INFO] agent: Endpoints down
--- PASS: TestWatchCommand (4.77s)
PASS
ok  	github.com/hashicorp/consul/command/watch	4.942s
=== RUN   TestStaticResolver_Resolve
=== RUN   TestStaticResolver_Resolve/simples
--- PASS: TestStaticResolver_Resolve (0.00s)
    --- PASS: TestStaticResolver_Resolve/simples (0.00s)
=== RUN   TestConsulResolver_Resolve
WARNING: bootstrap = true: do not enable unless necessary
test-consul - 2019/11/27 02:30:49.548613 [WARN] agent: Node name "Node c4a5f54e-272f-096d-b2ea-0769cc3c5a8b" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test-consul - 2019/11/27 02:30:49.549612 [DEBUG] tlsutil: Update with version 1
test-consul - 2019/11/27 02:30:49.549692 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
test-consul - 2019/11/27 02:30:49.549942 [DEBUG] tlsutil: IncomingRPCConfig with version 1
test-consul - 2019/11/27 02:30:49.550102 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:30:50 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c4a5f54e-272f-096d-b2ea-0769cc3c5a8b Address:127.0.0.1:41506}]
2019/11/27 02:30:50 [INFO]  raft: Node at 127.0.0.1:41506 [Follower] entering Follower state (Leader: "")
test-consul - 2019/11/27 02:30:50.504230 [INFO] serf: EventMemberJoin: Node c4a5f54e-272f-096d-b2ea-0769cc3c5a8b.dc1 127.0.0.1
test-consul - 2019/11/27 02:30:50.507826 [INFO] serf: EventMemberJoin: Node c4a5f54e-272f-096d-b2ea-0769cc3c5a8b 127.0.0.1
test-consul - 2019/11/27 02:30:50.508893 [INFO] consul: Adding LAN server Node c4a5f54e-272f-096d-b2ea-0769cc3c5a8b (Addr: tcp/127.0.0.1:41506) (DC: dc1)
test-consul - 2019/11/27 02:30:50.509160 [INFO] consul: Handled member-join event for server "Node c4a5f54e-272f-096d-b2ea-0769cc3c5a8b.dc1" in area "wan"
test-consul - 2019/11/27 02:30:50.509988 [INFO] agent: Started DNS server 127.0.0.1:41501 (tcp)
test-consul - 2019/11/27 02:30:50.510060 [INFO] agent: Started DNS server 127.0.0.1:41501 (udp)
test-consul - 2019/11/27 02:30:50.512765 [INFO] agent: Started HTTP server on 127.0.0.1:41502 (tcp)
test-consul - 2019/11/27 02:30:50.512948 [INFO] agent: started state syncer
2019/11/27 02:30:50 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:50 [INFO]  raft: Node at 127.0.0.1:41506 [Candidate] entering Candidate state in term 2
2019/11/27 02:30:51 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:51 [INFO]  raft: Node at 127.0.0.1:41506 [Leader] entering Leader state
test-consul - 2019/11/27 02:30:51.108690 [INFO] consul: cluster leadership acquired
test-consul - 2019/11/27 02:30:51.109354 [INFO] consul: New leader elected: Node c4a5f54e-272f-096d-b2ea-0769cc3c5a8b
test-consul - 2019/11/27 02:30:51.475148 [INFO] agent: Synced node info
test-consul - 2019/11/27 02:30:51.831155 [INFO] agent: Synced service "web"
test-consul - 2019/11/27 02:30:51.831271 [DEBUG] agent: Node info in sync
test-consul - 2019/11/27 02:30:51.831368 [DEBUG] http: Request PUT /v1/agent/service/register (525.947504ms) from=127.0.0.1:37432
test-consul - 2019/11/27 02:30:51.964404 [DEBUG] agent: Service "web" in sync
test-consul - 2019/11/27 02:30:52.220042 [INFO] agent: Synced service "web-proxy"
test-consul - 2019/11/27 02:30:52.220137 [DEBUG] agent: Node info in sync
test-consul - 2019/11/27 02:30:52.220223 [DEBUG] http: Request PUT /v1/agent/service/register (386.679604ms) from=127.0.0.1:37432
test-consul - 2019/11/27 02:30:52.420711 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/11/27 02:30:52.619963 [INFO] agent: Synced service "web-proxy-2"
test-consul - 2019/11/27 02:30:52.620043 [DEBUG] agent: Service "web" in sync
test-consul - 2019/11/27 02:30:52.620079 [DEBUG] agent: Node info in sync
test-consul - 2019/11/27 02:30:52.620156 [DEBUG] http: Request PUT /v1/agent/service/register (397.771994ms) from=127.0.0.1:37432
test-consul - 2019/11/27 02:30:52.764670 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/11/27 02:30:52.764754 [DEBUG] agent: Service "web-proxy-2" in sync
test-consul - 2019/11/27 02:30:53.142100 [INFO] agent: Synced service "db"
test-consul - 2019/11/27 02:30:53.142185 [DEBUG] agent: Service "web" in sync
test-consul - 2019/11/27 02:30:53.142222 [DEBUG] agent: Node info in sync
test-consul - 2019/11/27 02:30:53.142309 [DEBUG] http: Request PUT /v1/agent/service/register (520.280971ms) from=127.0.0.1:37432
test-consul - 2019/11/27 02:30:53.142452 [INFO] connect: initialized primary datacenter CA with provider "consul"
test-consul - 2019/11/27 02:30:53.142654 [DEBUG] agent: Service "web" in sync
test-consul - 2019/11/27 02:30:53.142709 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/11/27 02:30:53.142744 [DEBUG] agent: Service "web-proxy-2" in sync
test-consul - 2019/11/27 02:30:53.142876 [DEBUG] consul: Skipping self join check for "Node c4a5f54e-272f-096d-b2ea-0769cc3c5a8b" since the cluster is too small
test-consul - 2019/11/27 02:30:53.143041 [INFO] consul: member 'Node c4a5f54e-272f-096d-b2ea-0769cc3c5a8b' joined, marking health alive
test-consul - 2019/11/27 02:30:53.934202 [INFO] agent: Synced service "db"
test-consul - 2019/11/27 02:30:53.934266 [DEBUG] agent: Node info in sync
test-consul - 2019/11/27 02:30:53.934344 [DEBUG] agent: Service "web" in sync
test-consul - 2019/11/27 02:30:53.934385 [DEBUG] agent: Service "web-proxy" in sync
test-consul - 2019/11/27 02:30:53.934417 [DEBUG] agent: Service "web-proxy-2" in sync
test-consul - 2019/11/27 02:30:53.934464 [DEBUG] agent: Service "db" in sync
test-consul - 2019/11/27 02:30:53.934508 [DEBUG] agent: Node info in sync
test-consul - 2019/11/27 02:30:53.938833 [DEBUG] http: Request POST /v1/query (794.273276ms) from=127.0.0.1:37432
=== RUN   TestConsulResolver_Resolve/basic_service_discovery
test-consul - 2019/11/27 02:30:53.950272 [DEBUG] http: Request GET /v1/health/connect/web?connect=true&passing=1&stale= (4.411822ms) from=127.0.0.1:37432
=== RUN   TestConsulResolver_Resolve/basic_service_with_native_service
test-consul - 2019/11/27 02:30:53.973920 [DEBUG] http: Request GET /v1/health/connect/db?connect=true&passing=1&stale= (1.277045ms) from=127.0.0.1:37432
=== RUN   TestConsulResolver_Resolve/Bad_Type_errors
=== RUN   TestConsulResolver_Resolve/Non-existent_service_errors
test-consul - 2019/11/27 02:30:53.979508 [DEBUG] http: Request GET /v1/health/connect/foo?connect=true&passing=1&stale= (1.215376ms) from=127.0.0.1:37432
=== RUN   TestConsulResolver_Resolve/timeout_errors
=== RUN   TestConsulResolver_Resolve/prepared_query_by_id
test-consul - 2019/11/27 02:30:53.984038 [DEBUG] http: Request GET /v1/query/b27136f9-104f-4e3c-99fb-99405cbcca1d/execute?connect=true&stale= (1.919067ms) from=127.0.0.1:37432
=== RUN   TestConsulResolver_Resolve/prepared_query_by_name
test-consul - 2019/11/27 02:30:53.990113 [DEBUG] http: Request GET /v1/query/test-query/execute?connect=true&stale= (1.31938ms) from=127.0.0.1:37432
test-consul - 2019/11/27 02:30:53.993815 [INFO] agent: Requesting shutdown
test-consul - 2019/11/27 02:30:53.993921 [INFO] consul: shutting down server
test-consul - 2019/11/27 02:30:53.993967 [WARN] serf: Shutdown without a Leave
test-consul - 2019/11/27 02:30:54.041210 [WARN] serf: Shutdown without a Leave
test-consul - 2019/11/27 02:30:54.096390 [INFO] manager: shutting down
test-consul - 2019/11/27 02:30:54.097045 [INFO] agent: consul server down
test-consul - 2019/11/27 02:30:54.097106 [INFO] agent: shutdown complete
test-consul - 2019/11/27 02:30:54.097165 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (tcp)
test-consul - 2019/11/27 02:30:54.097303 [INFO] agent: Stopping DNS server 127.0.0.1:41501 (udp)
test-consul - 2019/11/27 02:30:54.097446 [INFO] agent: Stopping HTTP server 127.0.0.1:41502 (tcp)
test-consul - 2019/11/27 02:30:54.097910 [INFO] agent: Waiting for endpoints to shut down
test-consul - 2019/11/27 02:30:54.098002 [INFO] agent: Endpoints down
--- PASS: TestConsulResolver_Resolve (4.63s)
    --- PASS: TestConsulResolver_Resolve/basic_service_discovery (0.03s)
    --- PASS: TestConsulResolver_Resolve/basic_service_with_native_service (0.01s)
    --- PASS: TestConsulResolver_Resolve/Bad_Type_errors (0.00s)
    --- PASS: TestConsulResolver_Resolve/Non-existent_service_errors (0.00s)
    --- PASS: TestConsulResolver_Resolve/timeout_errors (0.00s)
    --- PASS: TestConsulResolver_Resolve/prepared_query_by_id (0.01s)
    --- PASS: TestConsulResolver_Resolve/prepared_query_by_name (0.01s)
=== RUN   TestConsulResolverFromAddrFunc
=== RUN   TestConsulResolverFromAddrFunc/service
=== RUN   TestConsulResolverFromAddrFunc/query
=== RUN   TestConsulResolverFromAddrFunc/service_with_dc
=== RUN   TestConsulResolverFromAddrFunc/query_with_dc
=== RUN   TestConsulResolverFromAddrFunc/invalid_host:port
=== RUN   TestConsulResolverFromAddrFunc/custom_domain
=== RUN   TestConsulResolverFromAddrFunc/unsupported_query_type
=== RUN   TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter
=== RUN   TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter#01
=== RUN   TestConsulResolverFromAddrFunc/unsupported_tag_filter
=== RUN   TestConsulResolverFromAddrFunc/unsupported_tag_filter_with_DC
--- PASS: TestConsulResolverFromAddrFunc (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/service (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/query (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/service_with_dc (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/query_with_dc (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/invalid_host:port (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/custom_domain (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_query_type (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_query_type_and_datacenter#01 (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_tag_filter (0.00s)
    --- PASS: TestConsulResolverFromAddrFunc/unsupported_tag_filter_with_DC (0.00s)
=== RUN   TestService_Name
--- PASS: TestService_Name (0.05s)
=== RUN   TestService_Dial
--- SKIP: TestService_Dial (0.00s)
    service_test.go:36: DM-skipped
=== RUN   TestService_ServerTLSConfig
--- SKIP: TestService_ServerTLSConfig (0.00s)
    service_test.go:129: DM-skipped
=== RUN   TestService_HTTPClient
2019/11/27 02:30:54 starting test connect HTTPS server on 127.0.0.1:41507
2019/11/27 02:30:54 test connect service listening on 127.0.0.1:41507
2019/11/27 02:30:54 [DEBUG] resolved service instance: 127.0.0.1:41507 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/backend)
2019/11/27 02:30:54 [DEBUG] successfully connected to 127.0.0.1:41507 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/backend)
--- PASS: TestService_HTTPClient (0.22s)
=== RUN   TestService_HasDefaultHTTPResolverFromAddr
--- PASS: TestService_HasDefaultHTTPResolverFromAddr (0.00s)
=== RUN   Test_verifyServerCertMatchesURI
2019/11/27 02:30:54 [ERR] consul.watch: Watch (type: connect_roots) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/roots: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 5s
2019/11/27 02:30:54 [ERR] consul.watch: Watch (type: connect_leaf) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/leaf/foo: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 5s
=== RUN   Test_verifyServerCertMatchesURI/simple_match
=== RUN   Test_verifyServerCertMatchesURI/different_trust-domain_allowed
=== RUN   Test_verifyServerCertMatchesURI/mismatch
=== RUN   Test_verifyServerCertMatchesURI/no_certs
=== RUN   Test_verifyServerCertMatchesURI/nil_certs
--- PASS: Test_verifyServerCertMatchesURI (0.14s)
    --- PASS: Test_verifyServerCertMatchesURI/simple_match (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/different_trust-domain_allowed (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/mismatch (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/no_certs (0.00s)
    --- PASS: Test_verifyServerCertMatchesURI/nil_certs (0.00s)
=== RUN   TestClientSideVerifier
=== RUN   TestClientSideVerifier/ok_service_ca1
=== RUN   TestClientSideVerifier/untrusted_CA
=== RUN   TestClientSideVerifier/cross_signed_intermediate
=== RUN   TestClientSideVerifier/cross_signed_without_intermediate
--- PASS: TestClientSideVerifier (0.22s)
    --- PASS: TestClientSideVerifier/ok_service_ca1 (0.02s)
    --- PASS: TestClientSideVerifier/untrusted_CA (0.01s)
    --- PASS: TestClientSideVerifier/cross_signed_intermediate (0.04s)
    --- PASS: TestClientSideVerifier/cross_signed_without_intermediate (0.00s)
=== RUN   TestServerSideVerifier
WARNING: bootstrap = true: do not enable unless necessary
test-consul - 2019/11/27 02:30:54.915327 [WARN] agent: Node name "Node 1efa7c0c-99f7-0175-d37f-daff659b19f3" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
test-consul - 2019/11/27 02:30:54.915887 [DEBUG] tlsutil: Update with version 1
test-consul - 2019/11/27 02:30:54.915953 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
test-consul - 2019/11/27 02:30:54.916209 [DEBUG] tlsutil: IncomingRPCConfig with version 1
test-consul - 2019/11/27 02:30:54.916353 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:30:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:1efa7c0c-99f7-0175-d37f-daff659b19f3 Address:127.0.0.1:41513}]
2019/11/27 02:30:56 [INFO]  raft: Node at 127.0.0.1:41513 [Follower] entering Follower state (Leader: "")
test-consul - 2019/11/27 02:30:56.189470 [INFO] serf: EventMemberJoin: Node 1efa7c0c-99f7-0175-d37f-daff659b19f3.dc1 127.0.0.1
test-consul - 2019/11/27 02:30:56.193758 [INFO] serf: EventMemberJoin: Node 1efa7c0c-99f7-0175-d37f-daff659b19f3 127.0.0.1
test-consul - 2019/11/27 02:30:56.194942 [INFO] consul: Handled member-join event for server "Node 1efa7c0c-99f7-0175-d37f-daff659b19f3.dc1" in area "wan"
test-consul - 2019/11/27 02:30:56.195019 [INFO] consul: Adding LAN server Node 1efa7c0c-99f7-0175-d37f-daff659b19f3 (Addr: tcp/127.0.0.1:41513) (DC: dc1)
test-consul - 2019/11/27 02:30:56.196332 [INFO] agent: Started DNS server 127.0.0.1:41508 (tcp)
test-consul - 2019/11/27 02:30:56.196403 [INFO] agent: Started DNS server 127.0.0.1:41508 (udp)
test-consul - 2019/11/27 02:30:56.198617 [INFO] agent: Started HTTP server on 127.0.0.1:41509 (tcp)
test-consul - 2019/11/27 02:30:56.198738 [INFO] agent: started state syncer
2019/11/27 02:30:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:56 [INFO]  raft: Node at 127.0.0.1:41513 [Candidate] entering Candidate state in term 2
2019/11/27 02:30:56 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:56 [INFO]  raft: Node at 127.0.0.1:41513 [Leader] entering Leader state
test-consul - 2019/11/27 02:30:56.820131 [INFO] consul: cluster leadership acquired
test-consul - 2019/11/27 02:30:56.820630 [INFO] consul: New leader elected: Node 1efa7c0c-99f7-0175-d37f-daff659b19f3
test-consul - 2019/11/27 02:30:57.141610 [INFO] agent: Synced node info
test-consul - 2019/11/27 02:30:58.342413 [DEBUG] agent: Node info in sync
test-consul - 2019/11/27 02:30:58.342540 [DEBUG] agent: Node info in sync
test-consul - 2019/11/27 02:30:58.441378 [INFO] connect: initialized primary datacenter CA with provider "consul"
test-consul - 2019/11/27 02:30:58.441891 [DEBUG] consul: Skipping self join check for "Node 1efa7c0c-99f7-0175-d37f-daff659b19f3" since the cluster is too small
test-consul - 2019/11/27 02:30:58.442055 [INFO] consul: member 'Node 1efa7c0c-99f7-0175-d37f-daff659b19f3' joined, marking health alive
test-consul - 2019/11/27 02:30:59.065518 [DEBUG] http: Request POST /v1/connect/intentions (191.179058ms) from=127.0.0.1:59906
test-consul - 2019/11/27 02:30:59.381581 [DEBUG] http: Request POST /v1/connect/intentions (313.259685ms) from=127.0.0.1:59906
2019/11/27 02:30:59 [ERR] consul.watch: Watch (type: connect_roots) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/roots: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 20s
2019/11/27 02:30:59 [ERR] consul.watch: Watch (type: connect_leaf) errored: Get http://127.0.0.1:8500/v1/agent/connect/ca/leaf/foo: dial tcp 127.0.0.1:8500: connect: connection refused, retry in 20s
=== RUN   TestServerSideVerifier/ok_service_ca1,_allow
test-consul - 2019/11/27 02:30:59.586002 [DEBUG] http: Request POST /v1/agent/connect/authorize (2.349416ms) from=127.0.0.1:59906
=== RUN   TestServerSideVerifier/untrusted_CA
2019/11/27 02:30:59 connect: failed TLS verification: x509: certificate signed by unknown authority
=== RUN   TestServerSideVerifier/cross_signed_intermediate,_allow
test-consul - 2019/11/27 02:30:59.629102 [DEBUG] http: Request POST /v1/agent/connect/authorize (1.767728ms) from=127.0.0.1:59906
=== RUN   TestServerSideVerifier/cross_signed_without_intermediate
2019/11/27 02:30:59 connect: failed TLS verification: x509: certificate signed by unknown authority
=== RUN   TestServerSideVerifier/ok_service_ca1,_deny
test-consul - 2019/11/27 02:30:59.653836 [DEBUG] http: Request POST /v1/agent/connect/authorize (1.033703ms) from=127.0.0.1:59906
2019/11/27 02:30:59 connect: authz call denied: Matched intention: DENY default/* => default/db (ID: 3b07c4b4-683c-2760-f84a-9eba44aa46f6, Precedence: 8)
=== RUN   TestServerSideVerifier/cross_signed_intermediate,_deny
test-consul - 2019/11/27 02:30:59.687170 [DEBUG] http: Request POST /v1/agent/connect/authorize (1.202709ms) from=127.0.0.1:59906
2019/11/27 02:30:59 connect: authz call denied: Matched intention: DENY default/* => default/db (ID: 3b07c4b4-683c-2760-f84a-9eba44aa46f6, Precedence: 8)
test-consul - 2019/11/27 02:30:59.689150 [INFO] agent: Requesting shutdown
test-consul - 2019/11/27 02:30:59.689219 [INFO] consul: shutting down server
test-consul - 2019/11/27 02:30:59.689266 [WARN] serf: Shutdown without a Leave
test-consul - 2019/11/27 02:30:59.751487 [WARN] serf: Shutdown without a Leave
test-consul - 2019/11/27 02:30:59.807083 [INFO] manager: shutting down
test-consul - 2019/11/27 02:30:59.808131 [INFO] agent: consul server down
test-consul - 2019/11/27 02:30:59.808184 [INFO] agent: shutdown complete
test-consul - 2019/11/27 02:30:59.808234 [INFO] agent: Stopping DNS server 127.0.0.1:41508 (tcp)
test-consul - 2019/11/27 02:30:59.808356 [INFO] agent: Stopping DNS server 127.0.0.1:41508 (udp)
test-consul - 2019/11/27 02:30:59.808492 [INFO] agent: Stopping HTTP server 127.0.0.1:41509 (tcp)
test-consul - 2019/11/27 02:30:59.809195 [INFO] agent: Waiting for endpoints to shut down
test-consul - 2019/11/27 02:30:59.809243 [INFO] agent: Endpoints down
--- PASS: TestServerSideVerifier (5.07s)
    --- PASS: TestServerSideVerifier/ok_service_ca1,_allow (0.02s)
    --- PASS: TestServerSideVerifier/untrusted_CA (0.00s)
    --- PASS: TestServerSideVerifier/cross_signed_intermediate,_allow (0.04s)
    --- PASS: TestServerSideVerifier/cross_signed_without_intermediate (0.00s)
    --- PASS: TestServerSideVerifier/ok_service_ca1,_deny (0.02s)
    --- PASS: TestServerSideVerifier/cross_signed_intermediate,_deny (0.03s)
=== RUN   TestDynamicTLSConfig
--- PASS: TestDynamicTLSConfig (0.11s)
=== RUN   TestDynamicTLSConfig_Ready
--- PASS: TestDynamicTLSConfig_Ready (0.09s)
PASS
ok  	github.com/hashicorp/consul/connect	10.703s
?   	github.com/hashicorp/consul/connect/certgen	[no test files]
=== RUN   TestUpstreamResolverFuncFromClient
=== PAUSE TestUpstreamResolverFuncFromClient
=== RUN   TestAgentConfigWatcherManagedProxy
=== PAUSE TestAgentConfigWatcherManagedProxy
=== RUN   TestAgentConfigWatcherSidecarProxy
=== PAUSE TestAgentConfigWatcherSidecarProxy
=== RUN   TestConn
--- SKIP: TestConn (0.00s)
    conn_test.go:67: DM-skipped
=== RUN   TestConnSrcClosing
=== PAUSE TestConnSrcClosing
=== RUN   TestConnDstClosing
=== PAUSE TestConnDstClosing
=== RUN   TestPublicListener
2019/11/27 02:30:54 test tcp server listening on localhost:53502
2019/11/27 02:30:54 [DEBUG] resolved service instance: localhost:53501 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db)
2019/11/27 02:30:54 [DEBUG] successfully connected to localhost:53501 (spiffe://11111111-2222-3333-4444-555555555555.consul/ns/default/dc/dc1/svc/db)
2019/11/27 02:30:54 connect: nil client
2019/11/27 02:30:54 test tcp echo server 127.0.0.1:53502 stopped
--- PASS: TestPublicListener (0.21s)
=== RUN   TestUpstreamListener
--- SKIP: TestUpstreamListener (0.00s)
    listener_test.go:161: DM-skipped
=== RUN   TestProxy_public
--- SKIP: TestProxy_public (0.00s)
    proxy_test.go:22: DM-skipped
=== CONT  TestUpstreamResolverFuncFromClient
=== RUN   TestUpstreamResolverFuncFromClient/service
=== CONT  TestConnDstClosing
=== RUN   TestUpstreamResolverFuncFromClient/prepared_query
=== RUN   TestUpstreamResolverFuncFromClient/unknown_behaves_like_service
=== CONT  TestAgentConfigWatcherSidecarProxy
--- PASS: TestUpstreamResolverFuncFromClient (0.00s)
    --- PASS: TestUpstreamResolverFuncFromClient/service (0.00s)
    --- PASS: TestUpstreamResolverFuncFromClient/prepared_query (0.00s)
    --- PASS: TestUpstreamResolverFuncFromClient/unknown_behaves_like_service (0.00s)
=== CONT  TestAgentConfigWatcherManagedProxy
=== CONT  TestConnSrcClosing
--- PASS: TestConnDstClosing (0.04s)
--- PASS: TestConnSrcClosing (0.05s)
WARNING: bootstrap = true: do not enable unless necessary
agent_smith - 2019/11/27 02:30:54.868453 [WARN] agent: Node name "Node c8121bb6-83ec-1f43-88f2-f24fcb7c9241" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
agent_smith - 2019/11/27 02:30:54.870504 [DEBUG] tlsutil: Update with version 1
agent_smith - 2019/11/27 02:30:54.871083 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
agent_smith - 2019/11/27 02:30:54.871827 [DEBUG] tlsutil: IncomingRPCConfig with version 1
agent_smith - 2019/11/27 02:30:54.872322 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
agent_smith - 2019/11/27 02:30:54.876596 [WARN] agent: Node name "Node 6fcb63a8-4b41-7d09-7010-232d77c19026" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
agent_smith - 2019/11/27 02:30:54.877163 [DEBUG] tlsutil: Update with version 1
agent_smith - 2019/11/27 02:30:54.877240 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
agent_smith - 2019/11/27 02:30:54.877526 [DEBUG] tlsutil: IncomingRPCConfig with version 1
agent_smith - 2019/11/27 02:30:54.877638 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:30:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:c8121bb6-83ec-1f43-88f2-f24fcb7c9241 Address:127.0.0.1:53508}]
2019/11/27 02:30:56 [INFO]  raft: Node at 127.0.0.1:53508 [Follower] entering Follower state (Leader: "")
agent_smith - 2019/11/27 02:30:56.078607 [INFO] serf: EventMemberJoin: Node c8121bb6-83ec-1f43-88f2-f24fcb7c9241.dc1 127.0.0.1
agent_smith - 2019/11/27 02:30:56.082547 [INFO] serf: EventMemberJoin: Node c8121bb6-83ec-1f43-88f2-f24fcb7c9241 127.0.0.1
agent_smith - 2019/11/27 02:30:56.083760 [INFO] consul: Adding LAN server Node c8121bb6-83ec-1f43-88f2-f24fcb7c9241 (Addr: tcp/127.0.0.1:53508) (DC: dc1)
agent_smith - 2019/11/27 02:30:56.083870 [INFO] consul: Handled member-join event for server "Node c8121bb6-83ec-1f43-88f2-f24fcb7c9241.dc1" in area "wan"
agent_smith - 2019/11/27 02:30:56.084679 [INFO] agent: Started DNS server 127.0.0.1:53503 (tcp)
agent_smith - 2019/11/27 02:30:56.084772 [INFO] agent: Started DNS server 127.0.0.1:53503 (udp)
agent_smith - 2019/11/27 02:30:56.086666 [INFO] agent: Started HTTP server on 127.0.0.1:53504 (tcp)
agent_smith - 2019/11/27 02:30:56.086855 [INFO] agent: started state syncer
2019/11/27 02:30:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:56 [INFO]  raft: Node at 127.0.0.1:53508 [Candidate] entering Candidate state in term 2
2019/11/27 02:30:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6fcb63a8-4b41-7d09-7010-232d77c19026 Address:127.0.0.1:53514}]
2019/11/27 02:30:56 [INFO]  raft: Node at 127.0.0.1:53514 [Follower] entering Follower state (Leader: "")
agent_smith - 2019/11/27 02:30:56.192438 [INFO] serf: EventMemberJoin: Node 6fcb63a8-4b41-7d09-7010-232d77c19026.dc1 127.0.0.1
agent_smith - 2019/11/27 02:30:56.197899 [INFO] serf: EventMemberJoin: Node 6fcb63a8-4b41-7d09-7010-232d77c19026 127.0.0.1
agent_smith - 2019/11/27 02:30:56.199574 [INFO] consul: Adding LAN server Node 6fcb63a8-4b41-7d09-7010-232d77c19026 (Addr: tcp/127.0.0.1:53514) (DC: dc1)
agent_smith - 2019/11/27 02:30:56.200169 [INFO] consul: Handled member-join event for server "Node 6fcb63a8-4b41-7d09-7010-232d77c19026.dc1" in area "wan"
agent_smith - 2019/11/27 02:30:56.201319 [INFO] agent: Started DNS server 127.0.0.1:53509 (tcp)
agent_smith - 2019/11/27 02:30:56.201549 [INFO] agent: Started DNS server 127.0.0.1:53509 (udp)
agent_smith - 2019/11/27 02:30:56.203646 [INFO] agent: Started HTTP server on 127.0.0.1:53510 (tcp)
agent_smith - 2019/11/27 02:30:56.203747 [INFO] agent: started state syncer
2019/11/27 02:30:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:30:56 [INFO]  raft: Node at 127.0.0.1:53514 [Candidate] entering Candidate state in term 2
2019/11/27 02:30:56 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:56 [INFO]  raft: Node at 127.0.0.1:53508 [Leader] entering Leader state
agent_smith - 2019/11/27 02:30:56.720700 [INFO] consul: cluster leadership acquired
agent_smith - 2019/11/27 02:30:56.721437 [INFO] consul: New leader elected: Node c8121bb6-83ec-1f43-88f2-f24fcb7c9241
2019/11/27 02:30:56 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:30:56 [INFO]  raft: Node at 127.0.0.1:53514 [Leader] entering Leader state
agent_smith - 2019/11/27 02:30:56.820078 [INFO] consul: cluster leadership acquired
agent_smith - 2019/11/27 02:30:56.820745 [INFO] consul: New leader elected: Node 6fcb63a8-4b41-7d09-7010-232d77c19026
agent_smith - 2019/11/27 02:30:57.142318 [INFO] agent: Synced node info
agent_smith - 2019/11/27 02:30:57.142547 [DEBUG] agent: Node info in sync
agent_smith - 2019/11/27 02:30:57.220763 [INFO] agent: Synced node info
agent_smith - 2019/11/27 02:30:57.988844 [INFO] agent: Synced service "web"
agent_smith - 2019/11/27 02:30:58.098530 [INFO] agent: Synced service "web"
agent_smith - 2019/11/27 02:30:58.205529 [INFO] agent: Synced service "web-proxy"
agent_smith - 2019/11/27 02:30:58.205636 [DEBUG] agent: Check "service:web-proxy" in sync
agent_smith - 2019/11/27 02:30:58.205732 [DEBUG] agent: Node info in sync
agent_smith - 2019/11/27 02:30:58.205821 [DEBUG] http: Request PUT /v1/agent/service/register (1.227618848s) from=127.0.0.1:54968
agent_smith - 2019/11/27 02:30:58.218655 [DEBUG] http: Request GET /v1/agent/service/web-proxy (5.991544ms) from=127.0.0.1:54968
agent_smith - 2019/11/27 02:30:58.288545 [INFO] agent: Synced service "web-sidecar-proxy"
agent_smith - 2019/11/27 02:30:58.288812 [DEBUG] agent: Check "service:web-sidecar-proxy:1" in sync
agent_smith - 2019/11/27 02:30:58.288969 [DEBUG] agent: Check "service:web-sidecar-proxy:2" in sync
agent_smith - 2019/11/27 02:30:58.289097 [DEBUG] agent: Node info in sync
agent_smith - 2019/11/27 02:30:58.289256 [DEBUG] http: Request PUT /v1/agent/service/register (1.480171065s) from=127.0.0.1:48806
agent_smith - 2019/11/27 02:30:58.292974 [DEBUG] http: Request GET /v1/agent/service/web-sidecar-proxy (1.498052ms) from=127.0.0.1:48806
agent_smith - 2019/11/27 02:30:58.614782 [INFO] connect: initialized primary datacenter CA with provider "consul"
agent_smith - 2019/11/27 02:30:58.615294 [DEBUG] consul: Skipping self join check for "Node c8121bb6-83ec-1f43-88f2-f24fcb7c9241" since the cluster is too small
agent_smith - 2019/11/27 02:30:58.615491 [INFO] consul: member 'Node c8121bb6-83ec-1f43-88f2-f24fcb7c9241' joined, marking health alive
agent_smith - 2019/11/27 02:30:58.619904 [DEBUG] http: Request GET /v1/agent/service/web-proxy?hash=f2bd3d844bf5efb (388.693005ms) from=127.0.0.1:54968
agent_smith - 2019/11/27 02:30:58.623227 [INFO] agent: Requesting shutdown
agent_smith - 2019/11/27 02:30:58.863988 [INFO] connect: initialized primary datacenter CA with provider "consul"
agent_smith - 2019/11/27 02:30:58.864437 [DEBUG] consul: Skipping self join check for "Node 6fcb63a8-4b41-7d09-7010-232d77c19026" since the cluster is too small
agent_smith - 2019/11/27 02:30:58.864650 [INFO] consul: member 'Node 6fcb63a8-4b41-7d09-7010-232d77c19026' joined, marking health alive
agent_smith - 2019/11/27 02:30:58.865227 [INFO] consul: shutting down server
agent_smith - 2019/11/27 02:30:58.865303 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/11/27 02:30:58.973794 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/11/27 02:30:59.062761 [INFO] manager: shutting down
agent_smith - 2019/11/27 02:30:59.063068 [WARN] agent: Syncing service "web" failed. raft is already shutdown
agent_smith - 2019/11/27 02:30:59.063158 [ERR] agent: failed to sync changes: raft is already shutdown
agent_smith - 2019/11/27 02:30:59.063239 [DEBUG] http: Request PUT /v1/agent/service/register (810.424506ms) from=127.0.0.1:54970
agent_smith - 2019/11/27 02:30:59.064541 [INFO] agent: consul server down
agent_smith - 2019/11/27 02:30:59.064619 [INFO] agent: shutdown complete
agent_smith - 2019/11/27 02:30:59.064687 [INFO] agent: Stopping DNS server 127.0.0.1:53509 (tcp)
agent_smith - 2019/11/27 02:30:59.064886 [INFO] agent: Stopping DNS server 127.0.0.1:53509 (udp)
agent_smith - 2019/11/27 02:30:59.065113 [INFO] agent: Stopping HTTP server 127.0.0.1:53510 (tcp)
agent_smith - 2019/11/27 02:30:59.066161 [ERR] consul: failed to reconcile member: {Node 6fcb63a8-4b41-7d09-7010-232d77c19026 127.0.0.1 53512 map[acls:0 bootstrap:1 build:1.4.4: dc:dc1 id:6fcb63a8-4b41-7d09-7010-232d77c19026 port:53514 raft_vsn:3 role:consul segment: vsn:2 vsn_max:3 vsn_min:2 wan_join_port:53513] alive 1 5 2 2 5 4}: leadership lost while committing log
agent_smith - 2019/11/27 02:30:59.141567 [INFO] agent: Synced service "web"
agent_smith - 2019/11/27 02:30:59.434220 [INFO] agent: Synced service "web-sidecar-proxy"
agent_smith - 2019/11/27 02:30:59.434310 [DEBUG] agent: Check "service:web-sidecar-proxy:2" in sync
agent_smith - 2019/11/27 02:30:59.434362 [DEBUG] agent: Check "service:web-sidecar-proxy:1" in sync
agent_smith - 2019/11/27 02:30:59.434392 [DEBUG] agent: Node info in sync
agent_smith - 2019/11/27 02:30:59.434458 [DEBUG] http: Request PUT /v1/agent/service/register (1.116266263s) from=127.0.0.1:48812
agent_smith - 2019/11/27 02:30:59.444143 [DEBUG] http: Request GET /v1/agent/service/web-sidecar-proxy?hash=4a87c9bd1a9bd791 (1.147823373s) from=127.0.0.1:48806
agent_smith - 2019/11/27 02:30:59.454697 [INFO] agent: Requesting shutdown
agent_smith - 2019/11/27 02:30:59.454868 [INFO] consul: shutting down server
agent_smith - 2019/11/27 02:30:59.454934 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/11/27 02:30:59.495850 [WARN] serf: Shutdown without a Leave
agent_smith - 2019/11/27 02:30:59.551546 [INFO] manager: shutting down
agent_smith - 2019/11/27 02:30:59.552243 [INFO] agent: consul server down
agent_smith - 2019/11/27 02:30:59.552309 [INFO] agent: shutdown complete
agent_smith - 2019/11/27 02:30:59.552373 [INFO] agent: Stopping DNS server 127.0.0.1:53503 (tcp)
agent_smith - 2019/11/27 02:30:59.552504 [INFO] agent: Stopping DNS server 127.0.0.1:53503 (udp)
agent_smith - 2019/11/27 02:30:59.552657 [INFO] agent: Stopping HTTP server 127.0.0.1:53504 (tcp)
agent_smith - 2019/11/27 02:31:00.065521 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:53510 (tcp)
agent_smith - 2019/11/27 02:31:00.065609 [INFO] agent: Waiting for endpoints to shut down
agent_smith - 2019/11/27 02:31:00.065655 [INFO] agent: Endpoints down
--- PASS: TestAgentConfigWatcherManagedProxy (5.31s)
agent_smith - 2019/11/27 02:31:00.553087 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:53504 (tcp)
agent_smith - 2019/11/27 02:31:00.553181 [INFO] agent: Waiting for endpoints to shut down
agent_smith - 2019/11/27 02:31:00.553221 [INFO] agent: Endpoints down
--- PASS: TestAgentConfigWatcherSidecarProxy (5.80s)
PASS
ok  	github.com/hashicorp/consul/connect/proxy	6.244s
=== RUN   TestIsPrivateIP
=== RUN   TestIsPrivateIP/10.0.0.1
=== RUN   TestIsPrivateIP/100.64.0.1
=== RUN   TestIsPrivateIP/172.16.0.1
=== RUN   TestIsPrivateIP/192.168.0.1
=== RUN   TestIsPrivateIP/192.0.0.1
=== RUN   TestIsPrivateIP/192.0.2.1
=== RUN   TestIsPrivateIP/127.0.0.1
=== RUN   TestIsPrivateIP/169.254.0.1
=== RUN   TestIsPrivateIP/1.2.3.4
=== RUN   TestIsPrivateIP/::1
=== RUN   TestIsPrivateIP/fe80::1
=== RUN   TestIsPrivateIP/fc00::1
=== RUN   TestIsPrivateIP/fec0::1
=== RUN   TestIsPrivateIP/2001:db8::1
=== RUN   TestIsPrivateIP/2004:db6::1
--- PASS: TestIsPrivateIP (0.01s)
    --- PASS: TestIsPrivateIP/10.0.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/100.64.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/172.16.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/192.168.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/192.0.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/192.0.2.1 (0.00s)
    --- PASS: TestIsPrivateIP/127.0.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/169.254.0.1 (0.00s)
    --- PASS: TestIsPrivateIP/1.2.3.4 (0.00s)
    --- PASS: TestIsPrivateIP/::1 (0.00s)
    --- PASS: TestIsPrivateIP/fe80::1 (0.00s)
    --- PASS: TestIsPrivateIP/fc00::1 (0.00s)
    --- PASS: TestIsPrivateIP/fec0::1 (0.00s)
    --- PASS: TestIsPrivateIP/2001:db8::1 (0.00s)
    --- PASS: TestIsPrivateIP/2004:db6::1 (0.00s)
PASS
ok  	github.com/hashicorp/consul/ipaddr	0.114s
=== RUN   TestDurationMinusBuffer
--- PASS: TestDurationMinusBuffer (0.00s)
=== RUN   TestDurationMinusBufferDomain
--- PASS: TestDurationMinusBufferDomain (0.00s)
=== RUN   TestRandomStagger
--- PASS: TestRandomStagger (0.00s)
=== RUN   TestRateScaledInterval
--- PASS: TestRateScaledInterval (0.00s)
=== RUN   TestRTT_ComputeDistance
=== RUN   TestRTT_ComputeDistance/10_ms
=== RUN   TestRTT_ComputeDistance/0_ms
=== RUN   TestRTT_ComputeDistance/2_ms
=== RUN   TestRTT_ComputeDistance/2_ms_reversed
=== RUN   TestRTT_ComputeDistance/a_nil
=== RUN   TestRTT_ComputeDistance/b_nil
=== RUN   TestRTT_ComputeDistance/both_nil
--- PASS: TestRTT_ComputeDistance (0.00s)
    --- PASS: TestRTT_ComputeDistance/10_ms (0.00s)
    --- PASS: TestRTT_ComputeDistance/0_ms (0.00s)
    --- PASS: TestRTT_ComputeDistance/2_ms (0.00s)
    --- PASS: TestRTT_ComputeDistance/2_ms_reversed (0.00s)
    --- PASS: TestRTT_ComputeDistance/a_nil (0.00s)
    --- PASS: TestRTT_ComputeDistance/b_nil (0.00s)
    --- PASS: TestRTT_ComputeDistance/both_nil (0.00s)
=== RUN   TestRTT_Intersect
=== RUN   TestRTT_Intersect/nil_maps
=== RUN   TestRTT_Intersect/two_servers
=== RUN   TestRTT_Intersect/two_clients
=== RUN   TestRTT_Intersect/server1_and_client_alpha
=== RUN   TestRTT_Intersect/server1_and_client_beta_1
=== RUN   TestRTT_Intersect/server1_and_client_alpha_reversed
=== RUN   TestRTT_Intersect/server1_and_client_beta_1_reversed
=== RUN   TestRTT_Intersect/nothing_in_common
=== RUN   TestRTT_Intersect/nothing_in_common_reversed
--- PASS: TestRTT_Intersect (0.01s)
    --- PASS: TestRTT_Intersect/nil_maps (0.00s)
    --- PASS: TestRTT_Intersect/two_servers (0.00s)
    --- PASS: TestRTT_Intersect/two_clients (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_alpha (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_beta_1 (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_alpha_reversed (0.00s)
    --- PASS: TestRTT_Intersect/server1_and_client_beta_1_reversed (0.00s)
    --- PASS: TestRTT_Intersect/nothing_in_common (0.00s)
    --- PASS: TestRTT_Intersect/nothing_in_common_reversed (0.00s)
=== RUN   TestStrContains
--- PASS: TestStrContains (0.00s)
=== RUN   TestTelemetryConfig_MergeDefaults
=== RUN   TestTelemetryConfig_MergeDefaults/basic_merge
=== RUN   TestTelemetryConfig_MergeDefaults/exhaustive
--- PASS: TestTelemetryConfig_MergeDefaults (0.00s)
    --- PASS: TestTelemetryConfig_MergeDefaults/basic_merge (0.00s)
    --- PASS: TestTelemetryConfig_MergeDefaults/exhaustive (0.00s)
=== RUN   TestUserAgent
--- PASS: TestUserAgent (0.00s)
=== RUN   TestMathAbsInt
--- PASS: TestMathAbsInt (0.00s)
=== RUN   TestMathMaxInt
--- PASS: TestMathMaxInt (0.00s)
=== RUN   TestMathMinInt
--- PASS: TestMathMinInt (0.00s)
PASS
ok  	github.com/hashicorp/consul/lib	0.082s
=== RUN   TestWriteAtomic
--- PASS: TestWriteAtomic (0.13s)
PASS
ok  	github.com/hashicorp/consul/lib/file	0.189s
?   	github.com/hashicorp/consul/lib/freeport	[no test files]
=== RUN   TestDynamic
=== PAUSE TestDynamic
=== RUN   TestDynamicPanic
=== PAUSE TestDynamicPanic
=== RUN   TestDynamicAcquire
=== PAUSE TestDynamicAcquire
=== CONT  TestDynamic
=== CONT  TestDynamicPanic
--- PASS: TestDynamicPanic (0.00s)
=== CONT  TestDynamicAcquire
--- PASS: TestDynamicAcquire (0.05s)
--- PASS: TestDynamic (1.79s)
PASS
ok  	github.com/hashicorp/consul/lib/semaphore	1.819s
=== RUN   TestGatedWriter_impl
--- PASS: TestGatedWriter_impl (0.00s)
=== RUN   TestGatedWriter
--- PASS: TestGatedWriter (0.00s)
=== RUN   TestGRPCLogger
--- PASS: TestGRPCLogger (0.00s)
=== RUN   TestGRPCLogger_V
=== RUN   TestGRPCLogger_V/ERR,-1
=== RUN   TestGRPCLogger_V/ERR,0
=== RUN   TestGRPCLogger_V/ERR,1
=== RUN   TestGRPCLogger_V/ERR,2
=== RUN   TestGRPCLogger_V/ERR,3
=== RUN   TestGRPCLogger_V/WARN,-1
=== RUN   TestGRPCLogger_V/WARN,0
=== RUN   TestGRPCLogger_V/WARN,1
=== RUN   TestGRPCLogger_V/WARN,2
=== RUN   TestGRPCLogger_V/WARN,3
=== RUN   TestGRPCLogger_V/INFO,-1
=== RUN   TestGRPCLogger_V/INFO,0
=== RUN   TestGRPCLogger_V/INFO,1
=== RUN   TestGRPCLogger_V/INFO,2
=== RUN   TestGRPCLogger_V/INFO,3
=== RUN   TestGRPCLogger_V/DEBUG,-1
=== RUN   TestGRPCLogger_V/DEBUG,0
=== RUN   TestGRPCLogger_V/DEBUG,1
=== RUN   TestGRPCLogger_V/DEBUG,2
=== RUN   TestGRPCLogger_V/DEBUG,3
=== RUN   TestGRPCLogger_V/TRACE,-1
=== RUN   TestGRPCLogger_V/TRACE,0
=== RUN   TestGRPCLogger_V/TRACE,1
=== RUN   TestGRPCLogger_V/TRACE,2
=== RUN   TestGRPCLogger_V/TRACE,3
--- PASS: TestGRPCLogger_V (0.01s)
    --- PASS: TestGRPCLogger_V/ERR,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,0 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,1 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,2 (0.00s)
    --- PASS: TestGRPCLogger_V/ERR,3 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,0 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,1 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,2 (0.00s)
    --- PASS: TestGRPCLogger_V/WARN,3 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,0 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,1 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,2 (0.00s)
    --- PASS: TestGRPCLogger_V/INFO,3 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,0 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,1 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,2 (0.00s)
    --- PASS: TestGRPCLogger_V/DEBUG,3 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,-1 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,0 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,1 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,2 (0.00s)
    --- PASS: TestGRPCLogger_V/TRACE,3 (0.00s)
=== RUN   TestLogWriter
--- PASS: TestLogWriter (0.00s)
=== RUN   TestLogFile_timeRotation
=== PAUSE TestLogFile_timeRotation
=== RUN   TestLogFile_openNew
=== PAUSE TestLogFile_openNew
=== RUN   TestLogFile_byteRotation
=== PAUSE TestLogFile_byteRotation
=== CONT  TestLogFile_timeRotation
=== CONT  TestLogFile_byteRotation
=== CONT  TestLogFile_openNew
--- PASS: TestLogFile_openNew (0.00s)
--- PASS: TestLogFile_byteRotation (0.00s)
--- PASS: TestLogFile_timeRotation (2.00s)
PASS
ok  	github.com/hashicorp/consul/logger	2.046s
?   	github.com/hashicorp/consul/sentinel	[no test files]
?   	github.com/hashicorp/consul/service_os	[no test files]
=== RUN   TestArchive
--- PASS: TestArchive (0.00s)
=== RUN   TestArchive_GoodData
--- PASS: TestArchive_GoodData (0.02s)
=== RUN   TestArchive_BadData
--- PASS: TestArchive_BadData (0.06s)
=== RUN   TestArchive_hashList
--- PASS: TestArchive_hashList (0.00s)
=== RUN   TestSnapshot
2019-11-27T02:30:59.970Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-9dc37228-8d6d-98af-20f6-764d14a7181f Address:9dc37228-8d6d-98af-20f6-764d14a7181f}]
2019-11-27T02:30:59.971Z [INFO]  raft: Node at 9dc37228-8d6d-98af-20f6-764d14a7181f [Follower] entering Follower state (Leader: "")
2019-11-27T02:31:01.450Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-11-27T02:31:01.450Z [INFO]  raft: Node at 9dc37228-8d6d-98af-20f6-764d14a7181f [Candidate] entering Candidate state in term 2
2019-11-27T02:31:01.450Z [DEBUG] raft: Votes needed: 1
2019-11-27T02:31:01.450Z [DEBUG] raft: Vote granted from server-9dc37228-8d6d-98af-20f6-764d14a7181f in term 2. Tally: 1
2019-11-27T02:31:01.450Z [INFO]  raft: Election won. Tally: 1
2019-11-27T02:31:01.450Z [INFO]  raft: Node at 9dc37228-8d6d-98af-20f6-764d14a7181f [Leader] entering Leader state
2019-11-27T02:31:17.565Z [INFO]  raft: Starting snapshot up to 65538
2019/11/27 02:31:17 [INFO] snapshot: Creating new snapshot at /tmp/consul-test/TestSnapshot-snapshot139544392/before/snapshots/2-65538-1574821877565.tmp
2019-11-27T02:31:18.840Z [INFO]  raft: Compacting logs from 1 to 55298
2019-11-27T02:31:18.878Z [INFO]  raft: Snapshot to 65538 complete
2019-11-27T02:31:33.186Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-f8f9778f-9418-7de4-0889-047aa9645035 Address:f8f9778f-9418-7de4-0889-047aa9645035}]
2019-11-27T02:31:33.186Z [INFO]  raft: Node at f8f9778f-9418-7de4-0889-047aa9645035 [Follower] entering Follower state (Leader: "")
2019-11-27T02:31:35.111Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-11-27T02:31:35.111Z [INFO]  raft: Node at f8f9778f-9418-7de4-0889-047aa9645035 [Candidate] entering Candidate state in term 2
2019-11-27T02:31:35.111Z [DEBUG] raft: Votes needed: 1
2019-11-27T02:31:35.111Z [DEBUG] raft: Vote granted from server-f8f9778f-9418-7de4-0889-047aa9645035 in term 2. Tally: 1
2019-11-27T02:31:35.112Z [INFO]  raft: Election won. Tally: 1
2019-11-27T02:31:35.112Z [INFO]  raft: Node at f8f9778f-9418-7de4-0889-047aa9645035 [Leader] entering Leader state
2019/11/27 02:31:37 [INFO] snapshot: Creating new snapshot at /tmp/consul-test/TestSnapshot-snapshot139544392/after/snapshots/2-65539-1574821897515.tmp
2019-11-27T02:31:38.494Z [INFO]  raft: Copied 16973829 bytes to local snapshot
2019-11-27T02:31:39.319Z [INFO]  raft: Restored user snapshot (index 65539)
--- PASS: TestSnapshot (39.52s)
=== RUN   TestSnapshot_Nil
--- PASS: TestSnapshot_Nil (0.00s)
=== RUN   TestSnapshot_BadVerify
--- PASS: TestSnapshot_BadVerify (0.00s)
=== RUN   TestSnapshot_BadRestore
2019-11-27T02:31:39.488Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-e58d4127-dcd4-737a-cb39-39f336b043de Address:e58d4127-dcd4-737a-cb39-39f336b043de}]
2019-11-27T02:31:39.488Z [INFO]  raft: Node at e58d4127-dcd4-737a-cb39-39f336b043de [Follower] entering Follower state (Leader: "")
2019-11-27T02:31:40.977Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-11-27T02:31:40.977Z [INFO]  raft: Node at e58d4127-dcd4-737a-cb39-39f336b043de [Candidate] entering Candidate state in term 2
2019-11-27T02:31:40.977Z [DEBUG] raft: Votes needed: 1
2019-11-27T02:31:40.977Z [DEBUG] raft: Vote granted from server-e58d4127-dcd4-737a-cb39-39f336b043de in term 2. Tally: 1
2019-11-27T02:31:40.977Z [INFO]  raft: Election won. Tally: 1
2019-11-27T02:31:40.978Z [INFO]  raft: Node at e58d4127-dcd4-737a-cb39-39f336b043de [Leader] entering Leader state
2019-11-27T02:31:44.139Z [INFO]  raft: Starting snapshot up to 16386
2019/11/27 02:31:44 [INFO] snapshot: Creating new snapshot at /tmp/consul-test/TestSnapshot_BadRestore-snapshot634485201/before/snapshots/2-16386-1574821904139.tmp
2019-11-27T02:31:44.682Z [INFO]  raft: Compacting logs from 1 to 6146
2019-11-27T02:31:44.685Z [INFO]  raft: Snapshot to 16386 complete
2019-11-27T02:31:47.555Z [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:server-9829720b-abd3-8853-dd34-867bca31e0cf Address:9829720b-abd3-8853-dd34-867bca31e0cf}]
2019-11-27T02:31:47.556Z [INFO]  raft: Node at 9829720b-abd3-8853-dd34-867bca31e0cf [Follower] entering Follower state (Leader: "")
2019-11-27T02:31:48.876Z [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019-11-27T02:31:48.876Z [INFO]  raft: Node at 9829720b-abd3-8853-dd34-867bca31e0cf [Candidate] entering Candidate state in term 2
2019-11-27T02:31:48.876Z [DEBUG] raft: Votes needed: 1
2019-11-27T02:31:48.877Z [DEBUG] raft: Vote granted from server-9829720b-abd3-8853-dd34-867bca31e0cf in term 2. Tally: 1
2019-11-27T02:31:48.877Z [INFO]  raft: Election won. Tally: 1
2019-11-27T02:31:48.877Z [INFO]  raft: Node at 9829720b-abd3-8853-dd34-867bca31e0cf [Leader] entering Leader state
[ERR] snapshot: Failed to close snapshot decompressor: unexpected EOF
--- PASS: TestSnapshot_BadRestore (9.45s)
PASS
ok  	github.com/hashicorp/consul/snapshot	49.152s
?   	github.com/hashicorp/consul/testrpc	[no test files]
?   	github.com/hashicorp/consul/testutil	[no test files]
=== RUN   TestRetryer
--- SKIP: TestRetryer (0.00s)
    retry_test.go:12: DM-skipped
PASS
ok  	github.com/hashicorp/consul/testutil/retry	0.043s
=== RUN   TestConfigurator_outgoingWrapper_OK
--- PASS: TestConfigurator_outgoingWrapper_OK (0.16s)
=== RUN   TestConfigurator_outgoingWrapper_noverify_OK
--- PASS: TestConfigurator_outgoingWrapper_noverify_OK (0.18s)
=== RUN   TestConfigurator_outgoingWrapper_BadDC
--- PASS: TestConfigurator_outgoingWrapper_BadDC (0.19s)
=== RUN   TestConfigurator_outgoingWrapper_BadCert
--- PASS: TestConfigurator_outgoingWrapper_BadCert (0.17s)
=== RUN   TestConfigurator_wrapTLS_OK
--- PASS: TestConfigurator_wrapTLS_OK (0.16s)
=== RUN   TestConfigurator_wrapTLS_BadCert
--- PASS: TestConfigurator_wrapTLS_BadCert (0.22s)
=== RUN   TestConfig_ParseCiphers
--- PASS: TestConfig_ParseCiphers (0.00s)
=== RUN   TestConfigurator_loadKeyPair
--- PASS: TestConfigurator_loadKeyPair (0.01s)
=== RUN   TestConfig_SpecifyDC
--- PASS: TestConfig_SpecifyDC (0.00s)
=== RUN   TestConfigurator_NewConfigurator
--- PASS: TestConfigurator_NewConfigurator (0.00s)
=== RUN   TestConfigurator_ErrorPropagation
--- PASS: TestConfigurator_ErrorPropagation (0.08s)
=== RUN   TestConfigurator_CommonTLSConfigServerNameNodeName
--- PASS: TestConfigurator_CommonTLSConfigServerNameNodeName (0.00s)
=== RUN   TestConfigurator_loadCAs
--- PASS: TestConfigurator_loadCAs (0.02s)
=== RUN   TestConfigurator_CommonTLSConfigInsecureSkipVerify
--- PASS: TestConfigurator_CommonTLSConfigInsecureSkipVerify (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigPreferServerCipherSuites
--- PASS: TestConfigurator_CommonTLSConfigPreferServerCipherSuites (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigCipherSuites
--- PASS: TestConfigurator_CommonTLSConfigCipherSuites (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigGetClientCertificate
--- PASS: TestConfigurator_CommonTLSConfigGetClientCertificate (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigCAs
--- PASS: TestConfigurator_CommonTLSConfigCAs (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigTLSMinVersion
--- PASS: TestConfigurator_CommonTLSConfigTLSMinVersion (0.00s)
=== RUN   TestConfigurator_CommonTLSConfigVerifyIncoming
--- PASS: TestConfigurator_CommonTLSConfigVerifyIncoming (0.00s)
=== RUN   TestConfigurator_OutgoingRPCTLSDisabled
--- PASS: TestConfigurator_OutgoingRPCTLSDisabled (0.03s)
=== RUN   TestConfigurator_SomeValuesFromConfig
--- PASS: TestConfigurator_SomeValuesFromConfig (0.00s)
=== RUN   TestConfigurator_VerifyIncomingRPC
--- PASS: TestConfigurator_VerifyIncomingRPC (0.00s)
=== RUN   TestConfigurator_VerifyIncomingHTTPS
--- PASS: TestConfigurator_VerifyIncomingHTTPS (0.00s)
=== RUN   TestConfigurator_EnableAgentTLSForChecks
--- PASS: TestConfigurator_EnableAgentTLSForChecks (0.00s)
=== RUN   TestConfigurator_IncomingRPCConfig
--- PASS: TestConfigurator_IncomingRPCConfig (0.03s)
=== RUN   TestConfigurator_IncomingHTTPSConfig
--- PASS: TestConfigurator_IncomingHTTPSConfig (0.00s)
=== RUN   TestConfigurator_OutgoingTLSConfigForChecks
--- PASS: TestConfigurator_OutgoingTLSConfigForChecks (0.00s)
=== RUN   TestConfigurator_OutgoingRPCConfig
--- PASS: TestConfigurator_OutgoingRPCConfig (0.00s)
=== RUN   TestConfigurator_OutgoingRPCWrapper
--- PASS: TestConfigurator_OutgoingRPCWrapper (0.00s)
    config_test.go:704: TODO: actually call wrap here eventually
=== RUN   TestConfigurator_UpdateChecks
--- PASS: TestConfigurator_UpdateChecks (0.00s)
=== RUN   TestConfigurator_UpdateSetsStuff
--- PASS: TestConfigurator_UpdateSetsStuff (0.01s)
=== RUN   TestConfigurator_ServerNameOrNodeName
--- PASS: TestConfigurator_ServerNameOrNodeName (0.00s)
PASS
ok  	github.com/hashicorp/consul/tlsutil	1.351s
?   	github.com/hashicorp/consul/types	[no test files]
?   	github.com/hashicorp/consul/version	[no test files]
=== RUN   TestRun_Stop
=== PAUSE TestRun_Stop
=== RUN   TestRun_Stop_Hybrid
=== PAUSE TestRun_Stop_Hybrid
=== RUN   TestParseBasic
=== PAUSE TestParseBasic
=== RUN   TestParse_exempt
=== PAUSE TestParse_exempt
=== RUN   TestKeyWatch
=== PAUSE TestKeyWatch
=== RUN   TestKeyWatch_With_PrefixDelete
=== PAUSE TestKeyWatch_With_PrefixDelete
=== RUN   TestKeyPrefixWatch
=== PAUSE TestKeyPrefixWatch
=== RUN   TestServicesWatch
=== PAUSE TestServicesWatch
=== RUN   TestNodesWatch
=== PAUSE TestNodesWatch
=== RUN   TestServiceWatch
=== PAUSE TestServiceWatch
=== RUN   TestChecksWatch_State
=== PAUSE TestChecksWatch_State
=== RUN   TestChecksWatch_Service
=== PAUSE TestChecksWatch_Service
=== RUN   TestEventWatch
=== PAUSE TestEventWatch
=== RUN   TestConnectRootsWatch
=== PAUSE TestConnectRootsWatch
=== RUN   TestConnectLeafWatch
=== PAUSE TestConnectLeafWatch
=== RUN   TestConnectProxyConfigWatch
=== PAUSE TestConnectProxyConfigWatch
=== RUN   TestAgentServiceWatch
=== PAUSE TestAgentServiceWatch
=== CONT  TestRun_Stop
=== CONT  TestConnectLeafWatch
=== CONT  TestAgentServiceWatch
=== CONT  TestEventWatch
--- PASS: TestRun_Stop (0.00s)
=== CONT  TestConnectRootsWatch
WARNING: bootstrap = true: do not enable unless necessary
TestConnectRootsWatch - 2019/11/27 02:31:47.186781 [WARN] agent: Node name "Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectRootsWatch - 2019/11/27 02:31:47.187827 [DEBUG] tlsutil: Update with version 1
TestConnectRootsWatch - 2019/11/27 02:31:47.187904 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConnectRootsWatch - 2019/11/27 02:31:47.190446 [DEBUG] tlsutil: IncomingRPCConfig with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestConnectRootsWatch - 2019/11/27 02:31:47.196965 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventWatch - 2019/11/27 02:31:47.207192 [WARN] agent: Node name "Node 794ec55a-0749-80c8-dd62-882132e80cb0" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestEventWatch - 2019/11/27 02:31:47.207931 [DEBUG] tlsutil: Update with version 1
TestEventWatch - 2019/11/27 02:31:47.208367 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestEventWatch - 2019/11/27 02:31:47.212358 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestEventWatch - 2019/11/27 02:31:47.213156 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestConnectLeafWatch - 2019/11/27 02:31:47.216179 [WARN] agent: Node name "Node 83032929-68f3-259e-bf4d-3fd08b42a7fa" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectLeafWatch - 2019/11/27 02:31:47.216646 [DEBUG] tlsutil: Update with version 1
TestConnectLeafWatch - 2019/11/27 02:31:47.216811 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConnectLeafWatch - 2019/11/27 02:31:47.217114 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestConnectLeafWatch - 2019/11/27 02:31:47.217232 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestAgentServiceWatch - 2019/11/27 02:31:47.221600 [WARN] agent: Node name "Node e949a5fb-b82c-7136-edfc-5039a3b79159" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestAgentServiceWatch - 2019/11/27 02:31:47.222122 [DEBUG] tlsutil: Update with version 1
TestAgentServiceWatch - 2019/11/27 02:31:47.222267 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentServiceWatch - 2019/11/27 02:31:47.222466 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestAgentServiceWatch - 2019/11/27 02:31:47.222567 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:31:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:5a5e65e3-ea9f-b919-3cae-c21d486b247f Address:127.0.0.1:34012}]
2019/11/27 02:31:48 [INFO]  raft: Node at 127.0.0.1:34012 [Follower] entering Follower state (Leader: "")
TestConnectRootsWatch - 2019/11/27 02:31:48.142305 [INFO] serf: EventMemberJoin: Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f.dc1 127.0.0.1
TestConnectRootsWatch - 2019/11/27 02:31:48.146438 [INFO] serf: EventMemberJoin: Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f 127.0.0.1
TestConnectRootsWatch - 2019/11/27 02:31:48.147595 [INFO] consul: Handled member-join event for server "Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f.dc1" in area "wan"
TestConnectRootsWatch - 2019/11/27 02:31:48.148128 [INFO] consul: Adding LAN server Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f (Addr: tcp/127.0.0.1:34012) (DC: dc1)
TestConnectRootsWatch - 2019/11/27 02:31:48.148554 [INFO] agent: Started DNS server 127.0.0.1:34007 (udp)
TestConnectRootsWatch - 2019/11/27 02:31:48.148676 [INFO] agent: Started DNS server 127.0.0.1:34007 (tcp)
TestConnectRootsWatch - 2019/11/27 02:31:48.150593 [INFO] agent: Started HTTP server on 127.0.0.1:34008 (tcp)
TestConnectRootsWatch - 2019/11/27 02:31:48.150677 [INFO] agent: started state syncer
2019/11/27 02:31:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:48 [INFO]  raft: Node at 127.0.0.1:34012 [Candidate] entering Candidate state in term 2
2019/11/27 02:31:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:794ec55a-0749-80c8-dd62-882132e80cb0 Address:127.0.0.1:34018}]
2019/11/27 02:31:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:83032929-68f3-259e-bf4d-3fd08b42a7fa Address:127.0.0.1:34006}]
2019/11/27 02:31:48 [INFO]  raft: Node at 127.0.0.1:34018 [Follower] entering Follower state (Leader: "")
2019/11/27 02:31:48 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:e949a5fb-b82c-7136-edfc-5039a3b79159 Address:127.0.0.1:34024}]
2019/11/27 02:31:48 [INFO]  raft: Node at 127.0.0.1:34006 [Follower] entering Follower state (Leader: "")
TestConnectLeafWatch - 2019/11/27 02:31:48.367238 [INFO] serf: EventMemberJoin: Node 83032929-68f3-259e-bf4d-3fd08b42a7fa.dc1 127.0.0.1
TestEventWatch - 2019/11/27 02:31:48.368883 [INFO] serf: EventMemberJoin: Node 794ec55a-0749-80c8-dd62-882132e80cb0.dc1 127.0.0.1
TestAgentServiceWatch - 2019/11/27 02:31:48.369624 [INFO] serf: EventMemberJoin: Node e949a5fb-b82c-7136-edfc-5039a3b79159.dc1 127.0.0.1
2019/11/27 02:31:48 [INFO]  raft: Node at 127.0.0.1:34024 [Follower] entering Follower state (Leader: "")
TestConnectLeafWatch - 2019/11/27 02:31:48.395562 [INFO] serf: EventMemberJoin: Node 83032929-68f3-259e-bf4d-3fd08b42a7fa 127.0.0.1
TestConnectLeafWatch - 2019/11/27 02:31:48.400397 [INFO] agent: Started DNS server 127.0.0.1:34001 (udp)
TestConnectLeafWatch - 2019/11/27 02:31:48.400914 [INFO] consul: Adding LAN server Node 83032929-68f3-259e-bf4d-3fd08b42a7fa (Addr: tcp/127.0.0.1:34006) (DC: dc1)
TestAgentServiceWatch - 2019/11/27 02:31:48.401229 [INFO] serf: EventMemberJoin: Node e949a5fb-b82c-7136-edfc-5039a3b79159 127.0.0.1
TestConnectLeafWatch - 2019/11/27 02:31:48.401484 [INFO] consul: Handled member-join event for server "Node 83032929-68f3-259e-bf4d-3fd08b42a7fa.dc1" in area "wan"
2019/11/27 02:31:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:48 [INFO]  raft: Node at 127.0.0.1:34018 [Candidate] entering Candidate state in term 2
TestConnectLeafWatch - 2019/11/27 02:31:48.401504 [INFO] agent: Started DNS server 127.0.0.1:34001 (tcp)
TestAgentServiceWatch - 2019/11/27 02:31:48.402544 [INFO] agent: Started DNS server 127.0.0.1:34019 (udp)
TestAgentServiceWatch - 2019/11/27 02:31:48.402800 [INFO] consul: Handled member-join event for server "Node e949a5fb-b82c-7136-edfc-5039a3b79159.dc1" in area "wan"
TestAgentServiceWatch - 2019/11/27 02:31:48.403295 [INFO] agent: Started DNS server 127.0.0.1:34019 (tcp)
TestConnectLeafWatch - 2019/11/27 02:31:48.404983 [INFO] agent: Started HTTP server on 127.0.0.1:34002 (tcp)
TestConnectLeafWatch - 2019/11/27 02:31:48.405204 [INFO] agent: started state syncer
TestAgentServiceWatch - 2019/11/27 02:31:48.405073 [INFO] agent: Started HTTP server on 127.0.0.1:34020 (tcp)
TestAgentServiceWatch - 2019/11/27 02:31:48.405415 [INFO] consul: Adding LAN server Node e949a5fb-b82c-7136-edfc-5039a3b79159 (Addr: tcp/127.0.0.1:34024) (DC: dc1)
TestAgentServiceWatch - 2019/11/27 02:31:48.406009 [INFO] agent: started state syncer
TestEventWatch - 2019/11/27 02:31:48.406888 [INFO] serf: EventMemberJoin: Node 794ec55a-0749-80c8-dd62-882132e80cb0 127.0.0.1
TestEventWatch - 2019/11/27 02:31:48.408017 [INFO] consul: Adding LAN server Node 794ec55a-0749-80c8-dd62-882132e80cb0 (Addr: tcp/127.0.0.1:34018) (DC: dc1)
2019/11/27 02:31:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
TestEventWatch - 2019/11/27 02:31:48.408029 [INFO] consul: Handled member-join event for server "Node 794ec55a-0749-80c8-dd62-882132e80cb0.dc1" in area "wan"
2019/11/27 02:31:48 [INFO]  raft: Node at 127.0.0.1:34006 [Candidate] entering Candidate state in term 2
TestEventWatch - 2019/11/27 02:31:48.410108 [INFO] agent: Started DNS server 127.0.0.1:34013 (tcp)
TestEventWatch - 2019/11/27 02:31:48.410302 [INFO] agent: Started DNS server 127.0.0.1:34013 (udp)
TestEventWatch - 2019/11/27 02:31:48.412462 [INFO] agent: Started HTTP server on 127.0.0.1:34014 (tcp)
TestEventWatch - 2019/11/27 02:31:48.412567 [INFO] agent: started state syncer
2019/11/27 02:31:48 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:48 [INFO]  raft: Node at 127.0.0.1:34024 [Candidate] entering Candidate state in term 2
2019/11/27 02:31:48 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:48 [INFO]  raft: Node at 127.0.0.1:34012 [Leader] entering Leader state
TestConnectRootsWatch - 2019/11/27 02:31:48.851946 [INFO] consul: cluster leadership acquired
TestConnectRootsWatch - 2019/11/27 02:31:48.852646 [INFO] consul: New leader elected: Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f
2019/11/27 02:31:49 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:49 [INFO]  raft: Node at 127.0.0.1:34024 [Leader] entering Leader state
2019/11/27 02:31:49 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:49 [INFO]  raft: Node at 127.0.0.1:34018 [Leader] entering Leader state
2019/11/27 02:31:49 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:49 [INFO]  raft: Node at 127.0.0.1:34006 [Leader] entering Leader state
TestAgentServiceWatch - 2019/11/27 02:31:49.116273 [INFO] consul: cluster leadership acquired
TestConnectLeafWatch - 2019/11/27 02:31:49.116413 [INFO] consul: cluster leadership acquired
TestAgentServiceWatch - 2019/11/27 02:31:49.116750 [INFO] consul: New leader elected: Node e949a5fb-b82c-7136-edfc-5039a3b79159
TestConnectLeafWatch - 2019/11/27 02:31:49.116818 [INFO] consul: New leader elected: Node 83032929-68f3-259e-bf4d-3fd08b42a7fa
TestEventWatch - 2019/11/27 02:31:49.117118 [INFO] consul: cluster leadership acquired
TestEventWatch - 2019/11/27 02:31:49.117535 [INFO] consul: New leader elected: Node 794ec55a-0749-80c8-dd62-882132e80cb0
TestConnectRootsWatch - 2019/11/27 02:31:49.194058 [INFO] agent: Synced node info
TestConnectLeafWatch - 2019/11/27 02:31:49.427105 [INFO] agent: Synced node info
TestConnectLeafWatch - 2019/11/27 02:31:49.427254 [DEBUG] agent: Node info in sync
TestConnectRootsWatch - 2019/11/27 02:31:49.508969 [DEBUG] agent: Node info in sync
TestConnectRootsWatch - 2019/11/27 02:31:49.509070 [DEBUG] agent: Node info in sync
TestAgentServiceWatch - 2019/11/27 02:31:49.571390 [INFO] agent: Synced node info
TestAgentServiceWatch - 2019/11/27 02:31:49.571547 [DEBUG] agent: Node info in sync
TestEventWatch - 2019/11/27 02:31:49.571389 [INFO] agent: Synced node info
TestConnectRootsWatch - 2019/11/27 02:31:50.128246 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectRootsWatch - 2019/11/27 02:31:50.128631 [DEBUG] consul: Skipping self join check for "Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f" since the cluster is too small
TestConnectRootsWatch - 2019/11/27 02:31:50.128763 [INFO] consul: member 'Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f' joined, marking health alive
TestConnectRootsWatch - 2019/11/27 02:31:50.421980 [DEBUG] http: Request GET /v1/agent/connect/ca/roots (4.373828ms) from=127.0.0.1:59724
TestConnectLeafWatch - 2019/11/27 02:31:50.560902 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectLeafWatch - 2019/11/27 02:31:50.561315 [DEBUG] consul: Skipping self join check for "Node 83032929-68f3-259e-bf4d-3fd08b42a7fa" since the cluster is too small
TestConnectLeafWatch - 2019/11/27 02:31:50.561457 [INFO] consul: member 'Node 83032929-68f3-259e-bf4d-3fd08b42a7fa' joined, marking health alive
TestAgentServiceWatch - 2019/11/27 02:31:50.837865 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestEventWatch - 2019/11/27 02:31:50.838132 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestAgentServiceWatch - 2019/11/27 02:31:50.838294 [DEBUG] consul: Skipping self join check for "Node e949a5fb-b82c-7136-edfc-5039a3b79159" since the cluster is too small
TestAgentServiceWatch - 2019/11/27 02:31:50.838629 [INFO] consul: member 'Node e949a5fb-b82c-7136-edfc-5039a3b79159' joined, marking health alive
TestEventWatch - 2019/11/27 02:31:50.842064 [DEBUG] consul: Skipping self join check for "Node 794ec55a-0749-80c8-dd62-882132e80cb0" since the cluster is too small
TestEventWatch - 2019/11/27 02:31:50.842372 [INFO] consul: member 'Node 794ec55a-0749-80c8-dd62-882132e80cb0' joined, marking health alive
TestConnectLeafWatch - 2019/11/27 02:31:51.038197 [INFO] agent: Synced service "web"
TestConnectLeafWatch - 2019/11/27 02:31:51.038283 [DEBUG] agent: Node info in sync
TestConnectLeafWatch - 2019/11/27 02:31:51.038353 [DEBUG] http: Request PUT /v1/agent/service/register (311.459121ms) from=127.0.0.1:47398
TestEventWatch - 2019/11/27 02:31:51.044736 [DEBUG] http: Request GET /v1/event/list?name=foo (1.250712ms) from=127.0.0.1:55510
TestEventWatch - 2019/11/27 02:31:51.068360 [DEBUG] http: Request PUT /v1/event/fire/foo (2.206081ms) from=127.0.0.1:55514
TestEventWatch - 2019/11/27 02:31:51.069294 [DEBUG] consul: User event: foo
TestEventWatch - 2019/11/27 02:31:51.069530 [DEBUG] agent: new event: foo (2b2b8091-2db3-5761-a681-3d828fed7ceb)
TestEventWatch - 2019/11/27 02:31:51.070123 [DEBUG] http: Request GET /v1/event/list?index=1&name=foo (23.627868ms) from=127.0.0.1:55510
TestEventWatch - 2019/11/27 02:31:51.072472 [INFO] agent: Requesting shutdown
TestEventWatch - 2019/11/27 02:31:51.072561 [INFO] consul: shutting down server
TestEventWatch - 2019/11/27 02:31:51.072607 [WARN] serf: Shutdown without a Leave
TestEventWatch - 2019/11/27 02:31:51.192818 [WARN] serf: Shutdown without a Leave
TestConnectRootsWatch - 2019/11/27 02:31:51.206279 [DEBUG] http: Request GET /v1/agent/connect/ca/roots?index=9 (777.14858ms) from=127.0.0.1:59724
TestEventWatch - 2019/11/27 02:31:51.270632 [INFO] manager: shutting down
TestEventWatch - 2019/11/27 02:31:51.271120 [INFO] agent: consul server down
TestEventWatch - 2019/11/27 02:31:51.271180 [INFO] agent: shutdown complete
TestEventWatch - 2019/11/27 02:31:51.271238 [INFO] agent: Stopping DNS server 127.0.0.1:34013 (tcp)
TestEventWatch - 2019/11/27 02:31:51.271413 [INFO] agent: Stopping DNS server 127.0.0.1:34013 (udp)
TestEventWatch - 2019/11/27 02:31:51.271584 [INFO] agent: Stopping HTTP server 127.0.0.1:34014 (tcp)
TestEventWatch - 2019/11/27 02:31:51.272237 [INFO] agent: Waiting for endpoints to shut down
TestEventWatch - 2019/11/27 02:31:51.272473 [INFO] agent: Endpoints down
--- PASS: TestEventWatch (4.20s)
=== CONT  TestConnectProxyConfigWatch
WARNING: bootstrap = true: do not enable unless necessary
TestConnectProxyConfigWatch - 2019/11/27 02:31:51.327133 [WARN] agent: Node name "Node 24dc25f4-9f81-53dd-c5d8-f1d29973a94f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestConnectProxyConfigWatch - 2019/11/27 02:31:51.327525 [DEBUG] tlsutil: Update with version 1
TestConnectProxyConfigWatch - 2019/11/27 02:31:51.327589 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConnectProxyConfigWatch - 2019/11/27 02:31:51.327742 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestConnectProxyConfigWatch - 2019/11/27 02:31:51.327840 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestAgentServiceWatch - 2019/11/27 02:31:51.349706 [INFO] agent: Synced service "web"
TestAgentServiceWatch - 2019/11/27 02:31:51.349778 [DEBUG] agent: Node info in sync
TestAgentServiceWatch - 2019/11/27 02:31:51.349856 [DEBUG] http: Request PUT /v1/agent/service/register (288.222931ms) from=127.0.0.1:58920
TestAgentServiceWatch - 2019/11/27 02:31:51.354501 [DEBUG] http: Request GET /v1/agent/service/web (2.082076ms) from=127.0.0.1:58924
TestConnectRootsWatch - 2019/11/27 02:31:51.571077 [INFO] connect: CA rotated to new root under provider "consul"
TestConnectRootsWatch - 2019/11/27 02:31:51.571402 [INFO] agent: Requesting shutdown
TestConnectRootsWatch - 2019/11/27 02:31:51.571558 [INFO] consul: shutting down server
TestConnectRootsWatch - 2019/11/27 02:31:51.571770 [WARN] serf: Shutdown without a Leave
TestConnectRootsWatch - 2019/11/27 02:31:51.576410 [WARN] consul: error getting server health from "Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f": rpc error making call: EOF
TestConnectRootsWatch - 2019/11/27 02:31:51.648242 [WARN] serf: Shutdown without a Leave
TestAgentServiceWatch - 2019/11/27 02:31:51.649436 [INFO] agent: Synced service "web"
TestAgentServiceWatch - 2019/11/27 02:31:51.649510 [DEBUG] agent: Node info in sync
TestAgentServiceWatch - 2019/11/27 02:31:51.649587 [DEBUG] http: Request PUT /v1/agent/service/register (278.046558ms) from=127.0.0.1:58920
TestAgentServiceWatch - 2019/11/27 02:31:51.650312 [INFO] agent: Requesting shutdown
TestAgentServiceWatch - 2019/11/27 02:31:51.650398 [INFO] consul: shutting down server
TestAgentServiceWatch - 2019/11/27 02:31:51.650445 [WARN] serf: Shutdown without a Leave
TestAgentServiceWatch - 2019/11/27 02:31:51.726164 [WARN] serf: Shutdown without a Leave
TestConnectRootsWatch - 2019/11/27 02:31:51.728972 [INFO] manager: shutting down
TestConnectRootsWatch - 2019/11/27 02:31:51.729609 [INFO] agent: consul server down
TestConnectRootsWatch - 2019/11/27 02:31:51.731918 [INFO] agent: shutdown complete
TestConnectRootsWatch - 2019/11/27 02:31:51.732089 [INFO] agent: Stopping DNS server 127.0.0.1:34007 (tcp)
TestConnectRootsWatch - 2019/11/27 02:31:51.732365 [INFO] agent: Stopping DNS server 127.0.0.1:34007 (udp)
TestConnectRootsWatch - 2019/11/27 02:31:51.732608 [INFO] agent: Stopping HTTP server 127.0.0.1:34008 (tcp)
TestConnectRootsWatch - 2019/11/27 02:31:51.732885 [INFO] agent: Waiting for endpoints to shut down
TestConnectRootsWatch - 2019/11/27 02:31:51.733044 [INFO] agent: Endpoints down
--- PASS: TestConnectRootsWatch (4.65s)
=== CONT  TestChecksWatch_Service
TestConnectLeafWatch - 2019/11/27 02:31:51.735088 [DEBUG] http: Request GET /v1/agent/connect/ca/leaf/web (694.043855ms) from=127.0.0.1:47400
TestAgentServiceWatch - 2019/11/27 02:31:51.771140 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestAgentServiceWatch - 2019/11/27 02:31:51.771223 [DEBUG] agent: Service "web" in sync
TestAgentServiceWatch - 2019/11/27 02:31:51.771257 [DEBUG] agent: Node info in sync
TestAgentServiceWatch - 2019/11/27 02:31:51.771339 [DEBUG] agent: Service "web" in sync
TestAgentServiceWatch - 2019/11/27 02:31:51.771385 [DEBUG] agent: Node info in sync
TestAgentServiceWatch - 2019/11/27 02:31:51.781759 [INFO] manager: shutting down
TestAgentServiceWatch - 2019/11/27 02:31:51.782037 [INFO] agent: consul server down
TestAgentServiceWatch - 2019/11/27 02:31:51.782088 [INFO] agent: shutdown complete
TestAgentServiceWatch - 2019/11/27 02:31:51.782143 [INFO] agent: Stopping DNS server 127.0.0.1:34019 (tcp)
TestAgentServiceWatch - 2019/11/27 02:31:51.782277 [INFO] agent: Stopping DNS server 127.0.0.1:34019 (udp)
TestAgentServiceWatch - 2019/11/27 02:31:51.782430 [INFO] agent: Stopping HTTP server 127.0.0.1:34020 (tcp)
TestAgentServiceWatch - 2019/11/27 02:31:51.782857 [INFO] agent: Waiting for endpoints to shut down
TestAgentServiceWatch - 2019/11/27 02:31:51.783024 [INFO] agent: Endpoints down
--- PASS: TestAgentServiceWatch (4.71s)
=== CONT  TestKeyWatch_With_PrefixDelete
WARNING: bootstrap = true: do not enable unless necessary
TestChecksWatch_Service - 2019/11/27 02:31:51.788101 [WARN] agent: Node name "Node 629d50bc-37ee-b3fe-e6b7-fe7e74d2e75a" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestChecksWatch_Service - 2019/11/27 02:31:51.788417 [DEBUG] tlsutil: Update with version 1
TestChecksWatch_Service - 2019/11/27 02:31:51.788479 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestChecksWatch_Service - 2019/11/27 02:31:51.788638 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestChecksWatch_Service - 2019/11/27 02:31:51.788758 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
WARNING: bootstrap = true: do not enable unless necessary
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:51.836016 [WARN] agent: Node name "Node 523faf59-fe53-abfc-a7b8-65f18cc65e7f" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:51.836362 [DEBUG] tlsutil: Update with version 1
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:51.836423 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:51.836576 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:51.836669 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConnectLeafWatch - 2019/11/27 02:31:52.229959 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestConnectLeafWatch - 2019/11/27 02:31:52.230035 [DEBUG] agent: Service "web" in sync
TestConnectLeafWatch - 2019/11/27 02:31:52.230069 [DEBUG] agent: Node info in sync
TestConnectLeafWatch - 2019/11/27 02:31:52.230141 [DEBUG] agent: Service "web" in sync
TestConnectLeafWatch - 2019/11/27 02:31:52.230181 [DEBUG] agent: Node info in sync
2019/11/27 02:31:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:24dc25f4-9f81-53dd-c5d8-f1d29973a94f Address:127.0.0.1:34030}]
2019/11/27 02:31:52 [INFO]  raft: Node at 127.0.0.1:34030 [Follower] entering Follower state (Leader: "")
TestConnectProxyConfigWatch - 2019/11/27 02:31:52.319722 [INFO] serf: EventMemberJoin: Node 24dc25f4-9f81-53dd-c5d8-f1d29973a94f.dc1 127.0.0.1
TestConnectProxyConfigWatch - 2019/11/27 02:31:52.323658 [INFO] serf: EventMemberJoin: Node 24dc25f4-9f81-53dd-c5d8-f1d29973a94f 127.0.0.1
TestConnectProxyConfigWatch - 2019/11/27 02:31:52.324291 [INFO] consul: Adding LAN server Node 24dc25f4-9f81-53dd-c5d8-f1d29973a94f (Addr: tcp/127.0.0.1:34030) (DC: dc1)
TestConnectProxyConfigWatch - 2019/11/27 02:31:52.324507 [INFO] consul: Handled member-join event for server "Node 24dc25f4-9f81-53dd-c5d8-f1d29973a94f.dc1" in area "wan"
TestConnectProxyConfigWatch - 2019/11/27 02:31:52.325216 [INFO] agent: Started DNS server 127.0.0.1:34025 (tcp)
TestConnectProxyConfigWatch - 2019/11/27 02:31:52.325438 [INFO] agent: Started DNS server 127.0.0.1:34025 (udp)
TestConnectProxyConfigWatch - 2019/11/27 02:31:52.327464 [INFO] agent: Started HTTP server on 127.0.0.1:34026 (tcp)
TestConnectProxyConfigWatch - 2019/11/27 02:31:52.327550 [INFO] agent: started state syncer
2019/11/27 02:31:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:52 [INFO]  raft: Node at 127.0.0.1:34030 [Candidate] entering Candidate state in term 2
TestConnectRootsWatch - 2019/11/27 02:31:52.572264 [WARN] consul: error getting server health from "Node 5a5e65e3-ea9f-b919-3cae-c21d486b247f": context deadline exceeded
2019/11/27 02:31:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:629d50bc-37ee-b3fe-e6b7-fe7e74d2e75a Address:127.0.0.1:34036}]
TestConnectLeafWatch - 2019/11/27 02:31:52.793651 [INFO] connect: CA rotated to new root under provider "consul"
2019/11/27 02:31:52 [INFO]  raft: Node at 127.0.0.1:34036 [Follower] entering Follower state (Leader: "")
TestChecksWatch_Service - 2019/11/27 02:31:52.797444 [INFO] serf: EventMemberJoin: Node 629d50bc-37ee-b3fe-e6b7-fe7e74d2e75a.dc1 127.0.0.1
TestChecksWatch_Service - 2019/11/27 02:31:52.801010 [INFO] serf: EventMemberJoin: Node 629d50bc-37ee-b3fe-e6b7-fe7e74d2e75a 127.0.0.1
TestChecksWatch_Service - 2019/11/27 02:31:52.801885 [INFO] consul: Handled member-join event for server "Node 629d50bc-37ee-b3fe-e6b7-fe7e74d2e75a.dc1" in area "wan"
TestChecksWatch_Service - 2019/11/27 02:31:52.801921 [INFO] consul: Adding LAN server Node 629d50bc-37ee-b3fe-e6b7-fe7e74d2e75a (Addr: tcp/127.0.0.1:34036) (DC: dc1)
TestChecksWatch_Service - 2019/11/27 02:31:52.802842 [INFO] agent: Started DNS server 127.0.0.1:34031 (tcp)
TestChecksWatch_Service - 2019/11/27 02:31:52.803151 [INFO] agent: Started DNS server 127.0.0.1:34031 (udp)
TestChecksWatch_Service - 2019/11/27 02:31:52.804985 [INFO] agent: Started HTTP server on 127.0.0.1:34032 (tcp)
TestChecksWatch_Service - 2019/11/27 02:31:52.805052 [INFO] agent: started state syncer
2019/11/27 02:31:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:52 [INFO]  raft: Node at 127.0.0.1:34036 [Candidate] entering Candidate state in term 2
2019/11/27 02:31:52 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:523faf59-fe53-abfc-a7b8-65f18cc65e7f Address:127.0.0.1:34042}]
2019/11/27 02:31:52 [INFO]  raft: Node at 127.0.0.1:34042 [Follower] entering Follower state (Leader: "")
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:52.909523 [INFO] serf: EventMemberJoin: Node 523faf59-fe53-abfc-a7b8-65f18cc65e7f.dc1 127.0.0.1
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:52.914176 [INFO] serf: EventMemberJoin: Node 523faf59-fe53-abfc-a7b8-65f18cc65e7f 127.0.0.1
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:52.915126 [INFO] consul: Adding LAN server Node 523faf59-fe53-abfc-a7b8-65f18cc65e7f (Addr: tcp/127.0.0.1:34042) (DC: dc1)
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:52.915144 [INFO] consul: Handled member-join event for server "Node 523faf59-fe53-abfc-a7b8-65f18cc65e7f.dc1" in area "wan"
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:52.915949 [INFO] agent: Started DNS server 127.0.0.1:34037 (tcp)
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:52.916021 [INFO] agent: Started DNS server 127.0.0.1:34037 (udp)
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:52.917991 [INFO] agent: Started HTTP server on 127.0.0.1:34038 (tcp)
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:52.918078 [INFO] agent: started state syncer
2019/11/27 02:31:52 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:52 [INFO]  raft: Node at 127.0.0.1:34042 [Candidate] entering Candidate state in term 2
2019/11/27 02:31:53 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:53 [INFO]  raft: Node at 127.0.0.1:34030 [Leader] entering Leader state
TestConnectProxyConfigWatch - 2019/11/27 02:31:53.017026 [INFO] consul: cluster leadership acquired
TestConnectProxyConfigWatch - 2019/11/27 02:31:53.017450 [INFO] consul: New leader elected: Node 24dc25f4-9f81-53dd-c5d8-f1d29973a94f
TestConnectLeafWatch - 2019/11/27 02:31:53.019780 [DEBUG] http: Request GET /v1/agent/connect/ca/leaf/web?index=14 (1.280797757s) from=127.0.0.1:47400
TestConnectLeafWatch - 2019/11/27 02:31:53.023733 [INFO] agent: Requesting shutdown
TestConnectLeafWatch - 2019/11/27 02:31:53.023812 [INFO] consul: shutting down server
TestConnectLeafWatch - 2019/11/27 02:31:53.023856 [WARN] serf: Shutdown without a Leave
TestConnectLeafWatch - 2019/11/27 02:31:53.172099 [WARN] serf: Shutdown without a Leave
TestConnectLeafWatch - 2019/11/27 02:31:53.260141 [INFO] manager: shutting down
TestConnectLeafWatch - 2019/11/27 02:31:53.262305 [INFO] agent: consul server down
TestConnectLeafWatch - 2019/11/27 02:31:53.262488 [INFO] agent: shutdown complete
TestConnectLeafWatch - 2019/11/27 02:31:53.262708 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (tcp)
TestConnectLeafWatch - 2019/11/27 02:31:53.262877 [INFO] agent: Stopping DNS server 127.0.0.1:34001 (udp)
TestConnectLeafWatch - 2019/11/27 02:31:53.263038 [INFO] agent: Stopping HTTP server 127.0.0.1:34002 (tcp)
TestConnectLeafWatch - 2019/11/27 02:31:53.263580 [INFO] agent: Waiting for endpoints to shut down
TestConnectLeafWatch - 2019/11/27 02:31:53.263700 [INFO] agent: Endpoints down
--- PASS: TestConnectLeafWatch (6.19s)
=== CONT  TestNodesWatch
WARNING: bootstrap = true: do not enable unless necessary
TestNodesWatch - 2019/11/27 02:31:53.315637 [WARN] agent: Node name "Node 7f18bc70-1aa0-1162-2f93-eb7233e52769" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestNodesWatch - 2019/11/27 02:31:53.315985 [DEBUG] tlsutil: Update with version 1
TestNodesWatch - 2019/11/27 02:31:53.316041 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestNodesWatch - 2019/11/27 02:31:53.322009 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestNodesWatch - 2019/11/27 02:31:53.322181 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:31:53 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:53 [INFO]  raft: Node at 127.0.0.1:34042 [Leader] entering Leader state
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:53.494448 [INFO] consul: cluster leadership acquired
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:53.494861 [INFO] consul: New leader elected: Node 523faf59-fe53-abfc-a7b8-65f18cc65e7f
2019/11/27 02:31:53 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:53 [INFO]  raft: Node at 127.0.0.1:34036 [Leader] entering Leader state
TestChecksWatch_Service - 2019/11/27 02:31:53.582711 [INFO] consul: cluster leadership acquired
TestChecksWatch_Service - 2019/11/27 02:31:53.583120 [INFO] consul: New leader elected: Node 629d50bc-37ee-b3fe-e6b7-fe7e74d2e75a
TestConnectProxyConfigWatch - 2019/11/27 02:31:53.671293 [INFO] agent: Synced node info
TestConnectProxyConfigWatch - 2019/11/27 02:31:53.671778 [DEBUG] agent: Node info in sync
TestChecksWatch_Service - 2019/11/27 02:31:54.028738 [INFO] agent: Synced node info
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:54.305687 [INFO] agent: Synced node info
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:54.305818 [DEBUG] agent: Node info in sync
2019/11/27 02:31:54 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:7f18bc70-1aa0-1162-2f93-eb7233e52769 Address:127.0.0.1:34048}]
2019/11/27 02:31:54 [INFO]  raft: Node at 127.0.0.1:34048 [Follower] entering Follower state (Leader: "")
TestNodesWatch - 2019/11/27 02:31:54.478343 [INFO] serf: EventMemberJoin: Node 7f18bc70-1aa0-1162-2f93-eb7233e52769.dc1 127.0.0.1
TestNodesWatch - 2019/11/27 02:31:54.481241 [INFO] serf: EventMemberJoin: Node 7f18bc70-1aa0-1162-2f93-eb7233e52769 127.0.0.1
TestNodesWatch - 2019/11/27 02:31:54.481824 [INFO] consul: Adding LAN server Node 7f18bc70-1aa0-1162-2f93-eb7233e52769 (Addr: tcp/127.0.0.1:34048) (DC: dc1)
TestNodesWatch - 2019/11/27 02:31:54.481977 [INFO] consul: Handled member-join event for server "Node 7f18bc70-1aa0-1162-2f93-eb7233e52769.dc1" in area "wan"
TestNodesWatch - 2019/11/27 02:31:54.482417 [INFO] agent: Started DNS server 127.0.0.1:34043 (udp)
TestNodesWatch - 2019/11/27 02:31:54.482491 [INFO] agent: Started DNS server 127.0.0.1:34043 (tcp)
TestNodesWatch - 2019/11/27 02:31:54.484366 [INFO] agent: Started HTTP server on 127.0.0.1:34044 (tcp)
TestNodesWatch - 2019/11/27 02:31:54.484455 [INFO] agent: started state syncer
2019/11/27 02:31:54 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:54 [INFO]  raft: Node at 127.0.0.1:34048 [Candidate] entering Candidate state in term 2
TestConnectProxyConfigWatch - 2019/11/27 02:31:54.759807 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestConnectProxyConfigWatch - 2019/11/27 02:31:54.760439 [DEBUG] consul: Skipping self join check for "Node 24dc25f4-9f81-53dd-c5d8-f1d29973a94f" since the cluster is too small
TestConnectProxyConfigWatch - 2019/11/27 02:31:54.760586 [INFO] consul: member 'Node 24dc25f4-9f81-53dd-c5d8-f1d29973a94f' joined, marking health alive
2019/11/27 02:31:55 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:55 [INFO]  raft: Node at 127.0.0.1:34048 [Leader] entering Leader state
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.016930 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.017000 [DEBUG] agent: Node info in sync
TestNodesWatch - 2019/11/27 02:31:55.017123 [INFO] consul: cluster leadership acquired
TestNodesWatch - 2019/11/27 02:31:55.017513 [INFO] consul: New leader elected: Node 7f18bc70-1aa0-1162-2f93-eb7233e52769
TestChecksWatch_Service - 2019/11/27 02:31:55.237792 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestChecksWatch_Service - 2019/11/27 02:31:55.238305 [DEBUG] consul: Skipping self join check for "Node 629d50bc-37ee-b3fe-e6b7-fe7e74d2e75a" since the cluster is too small
TestChecksWatch_Service - 2019/11/27 02:31:55.238545 [INFO] consul: member 'Node 629d50bc-37ee-b3fe-e6b7-fe7e74d2e75a' joined, marking health alive
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.238887 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.239215 [DEBUG] consul: Skipping self join check for "Node 523faf59-fe53-abfc-a7b8-65f18cc65e7f" since the cluster is too small
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.239353 [INFO] consul: member 'Node 523faf59-fe53-abfc-a7b8-65f18cc65e7f' joined, marking health alive
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.317083 [INFO] agent: Synced service "web"
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.317277 [DEBUG] agent: Node info in sync
TestChecksWatch_Service - 2019/11/27 02:31:55.361834 [DEBUG] agent: Node info in sync
TestChecksWatch_Service - 2019/11/27 02:31:55.361936 [DEBUG] agent: Node info in sync
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.432431 [DEBUG] http: Request GET /v1/kv/foo/bar/baz (467.017µs) from=127.0.0.1:60116
TestChecksWatch_Service - 2019/11/27 02:31:55.434237 [DEBUG] http: Request GET /v1/health/checks/foobar (2.07041ms) from=127.0.0.1:58858
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.505252 [DEBUG] agent: Service "web" in sync
TestNodesWatch - 2019/11/27 02:31:55.660967 [INFO] agent: Synced node info
TestNodesWatch - 2019/11/27 02:31:55.661134 [DEBUG] agent: Node info in sync
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.669154 [DEBUG] http: Request PUT /v1/kv/foo/bar/baz (215.743261ms) from=127.0.0.1:60118
TestChecksWatch_Service - 2019/11/27 02:31:55.674838 [DEBUG] http: Request GET /v1/health/checks/foobar?index=10 (238.991448ms) from=127.0.0.1:58858
TestChecksWatch_Service - 2019/11/27 02:31:55.676170 [DEBUG] http: Request PUT /v1/catalog/register (223.239869ms) from=127.0.0.1:58864
TestChecksWatch_Service - 2019/11/27 02:31:55.683387 [INFO] agent: Requesting shutdown
TestChecksWatch_Service - 2019/11/27 02:31:55.683466 [INFO] consul: shutting down server
TestChecksWatch_Service - 2019/11/27 02:31:55.683515 [WARN] serf: Shutdown without a Leave
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.685390 [DEBUG] http: Request GET /v1/kv/foo/bar/baz?index=1 (250.64421ms) from=127.0.0.1:60116
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.688661 [INFO] agent: Requesting shutdown
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.688746 [INFO] consul: shutting down server
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.688799 [WARN] serf: Shutdown without a Leave
TestChecksWatch_Service - 2019/11/27 02:31:55.805548 [WARN] serf: Shutdown without a Leave
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.805851 [WARN] serf: Shutdown without a Leave
TestChecksWatch_Service - 2019/11/27 02:31:55.892662 [INFO] manager: shutting down
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.892669 [INFO] manager: shutting down
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.893083 [INFO] agent: consul server down
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.893139 [INFO] agent: shutdown complete
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.893193 [INFO] agent: Stopping DNS server 127.0.0.1:34037 (tcp)
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.893333 [INFO] agent: Stopping DNS server 127.0.0.1:34037 (udp)
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.893491 [INFO] agent: Stopping HTTP server 127.0.0.1:34038 (tcp)
TestChecksWatch_Service - 2019/11/27 02:31:55.893662 [INFO] agent: consul server down
TestChecksWatch_Service - 2019/11/27 02:31:55.893801 [INFO] agent: shutdown complete
TestChecksWatch_Service - 2019/11/27 02:31:55.893950 [INFO] agent: Stopping DNS server 127.0.0.1:34031 (tcp)
TestChecksWatch_Service - 2019/11/27 02:31:55.894199 [INFO] agent: Stopping DNS server 127.0.0.1:34031 (udp)
TestChecksWatch_Service - 2019/11/27 02:31:55.894475 [INFO] agent: Stopping HTTP server 127.0.0.1:34032 (tcp)
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.893973 [INFO] agent: Waiting for endpoints to shut down
TestKeyWatch_With_PrefixDelete - 2019/11/27 02:31:55.895012 [INFO] agent: Endpoints down
--- PASS: TestKeyWatch_With_PrefixDelete (4.11s)
=== CONT  TestServicesWatch
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.898348 [INFO] agent: Synced service "web-proxy"
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.899053 [DEBUG] agent: Check "service:web-proxy" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.899107 [DEBUG] agent: Node info in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.899286 [DEBUG] agent: Service "web" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.899336 [DEBUG] agent: Service "web-proxy" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.899383 [DEBUG] agent: Check "service:web-proxy" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.899418 [DEBUG] agent: Node info in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.899476 [DEBUG] http: Request PUT /v1/agent/service/register (860.335281ms) from=127.0.0.1:51716
TestConnectProxyConfigWatch - 2019/11/27 02:31:55.903675 [DEBUG] http: Request GET /v1/agent/connect/proxy/web-proxy (2.060743ms) from=127.0.0.1:51726
WARNING: bootstrap = true: do not enable unless necessary
TestServicesWatch - 2019/11/27 02:31:55.954012 [WARN] agent: Node name "Node 6bca61e1-ad2f-8182-3879-289e867b77ab" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestServicesWatch - 2019/11/27 02:31:55.954352 [DEBUG] tlsutil: Update with version 1
TestServicesWatch - 2019/11/27 02:31:55.954415 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestServicesWatch - 2019/11/27 02:31:55.954559 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestServicesWatch - 2019/11/27 02:31:55.954661 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.182884 [INFO] agent: Synced service "web"
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.182960 [DEBUG] agent: Service "web-proxy" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.183019 [DEBUG] agent: Check "service:web-proxy" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.183092 [DEBUG] agent: Node info in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.405777 [DEBUG] agent: Service "web" in sync
TestNodesWatch - 2019/11/27 02:31:56.537504 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestNodesWatch - 2019/11/27 02:31:56.538073 [DEBUG] consul: Skipping self join check for "Node 7f18bc70-1aa0-1162-2f93-eb7233e52769" since the cluster is too small
TestNodesWatch - 2019/11/27 02:31:56.538316 [INFO] consul: member 'Node 7f18bc70-1aa0-1162-2f93-eb7233e52769' joined, marking health alive
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.539010 [INFO] agent: Synced service "web-proxy"
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.539286 [DEBUG] agent: Check "service:web-proxy" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.539473 [DEBUG] agent: Node info in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.539731 [DEBUG] agent: Service "web" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.539888 [DEBUG] agent: Service "web-proxy" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.540045 [DEBUG] agent: Check "service:web-proxy" in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.540201 [DEBUG] agent: Node info in sync
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.540362 [DEBUG] http: Request PUT /v1/agent/service/register (619.06041ms) from=127.0.0.1:51716
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.541184 [INFO] agent: Requesting shutdown
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.541264 [INFO] consul: shutting down server
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.541306 [WARN] serf: Shutdown without a Leave
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.714639 [WARN] serf: Shutdown without a Leave
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.803543 [INFO] manager: shutting down
2019/11/27 02:31:56 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:6bca61e1-ad2f-8182-3879-289e867b77ab Address:127.0.0.1:34054}]
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.804161 [INFO] agent: consul server down
2019/11/27 02:31:56 [INFO]  raft: Node at 127.0.0.1:34054 [Follower] entering Follower state (Leader: "")
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.804216 [INFO] agent: shutdown complete
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.804268 [INFO] agent: Stopping DNS server 127.0.0.1:34025 (tcp)
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.804427 [INFO] agent: Stopping DNS server 127.0.0.1:34025 (udp)
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.804595 [INFO] agent: Stopping HTTP server 127.0.0.1:34026 (tcp)
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.805082 [INFO] agent: Waiting for endpoints to shut down
TestConnectProxyConfigWatch - 2019/11/27 02:31:56.805174 [INFO] agent: Endpoints down
--- PASS: TestConnectProxyConfigWatch (5.53s)
=== CONT  TestKeyPrefixWatch
TestServicesWatch - 2019/11/27 02:31:56.807422 [INFO] serf: EventMemberJoin: Node 6bca61e1-ad2f-8182-3879-289e867b77ab.dc1 127.0.0.1
TestServicesWatch - 2019/11/27 02:31:56.811085 [INFO] serf: EventMemberJoin: Node 6bca61e1-ad2f-8182-3879-289e867b77ab 127.0.0.1
TestServicesWatch - 2019/11/27 02:31:56.811877 [INFO] consul: Adding LAN server Node 6bca61e1-ad2f-8182-3879-289e867b77ab (Addr: tcp/127.0.0.1:34054) (DC: dc1)
TestServicesWatch - 2019/11/27 02:31:56.812337 [INFO] consul: Handled member-join event for server "Node 6bca61e1-ad2f-8182-3879-289e867b77ab.dc1" in area "wan"
TestServicesWatch - 2019/11/27 02:31:56.813156 [INFO] agent: Started DNS server 127.0.0.1:34049 (tcp)
TestServicesWatch - 2019/11/27 02:31:56.813417 [INFO] agent: Started DNS server 127.0.0.1:34049 (udp)
TestServicesWatch - 2019/11/27 02:31:56.815241 [INFO] agent: Started HTTP server on 127.0.0.1:34050 (tcp)
TestServicesWatch - 2019/11/27 02:31:56.815432 [INFO] agent: started state syncer
TestNodesWatch - 2019/11/27 02:31:56.829452 [DEBUG] http: Request GET /v1/catalog/nodes (1.637394ms) from=127.0.0.1:43010
WARNING: bootstrap = true: do not enable unless necessary
TestKeyPrefixWatch - 2019/11/27 02:31:56.859700 [WARN] agent: Node name "Node 39bf3a96-e053-9b23-d047-64fa41cb3625" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKeyPrefixWatch - 2019/11/27 02:31:56.860036 [DEBUG] tlsutil: Update with version 1
TestKeyPrefixWatch - 2019/11/27 02:31:56.860098 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKeyPrefixWatch - 2019/11/27 02:31:56.860332 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKeyPrefixWatch - 2019/11/27 02:31:56.860428 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:31:56 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:56 [INFO]  raft: Node at 127.0.0.1:34054 [Candidate] entering Candidate state in term 2
TestChecksWatch_Service - 2019/11/27 02:31:56.894912 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:34032 (tcp)
TestChecksWatch_Service - 2019/11/27 02:31:56.894970 [INFO] agent: Waiting for endpoints to shut down
TestChecksWatch_Service - 2019/11/27 02:31:56.895004 [INFO] agent: Endpoints down
--- PASS: TestChecksWatch_Service (5.16s)
=== CONT  TestParse_exempt
=== CONT  TestKeyWatch
--- PASS: TestParse_exempt (0.00s)
WARNING: bootstrap = true: do not enable unless necessary
TestKeyWatch - 2019/11/27 02:31:56.970757 [WARN] agent: Node name "Node 192e668d-d0b7-e87b-c6ff-a2dcc30ebe20" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestKeyWatch - 2019/11/27 02:31:56.971292 [DEBUG] tlsutil: Update with version 1
TestKeyWatch - 2019/11/27 02:31:56.971437 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKeyWatch - 2019/11/27 02:31:56.971674 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestKeyWatch - 2019/11/27 02:31:56.971993 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestNodesWatch - 2019/11/27 02:31:57.062014 [DEBUG] http: Request PUT /v1/catalog/register (214.252204ms) from=127.0.0.1:43012
TestNodesWatch - 2019/11/27 02:31:57.063456 [INFO] agent: Requesting shutdown
TestNodesWatch - 2019/11/27 02:31:57.063564 [INFO] consul: shutting down server
TestNodesWatch - 2019/11/27 02:31:57.063617 [WARN] serf: Shutdown without a Leave
TestNodesWatch - 2019/11/27 02:31:57.192356 [WARN] serf: Shutdown without a Leave
TestNodesWatch - 2019/11/27 02:31:57.270222 [INFO] manager: shutting down
TestNodesWatch - 2019/11/27 02:31:57.271272 [INFO] agent: consul server down
TestNodesWatch - 2019/11/27 02:31:57.271465 [INFO] agent: shutdown complete
TestNodesWatch - 2019/11/27 02:31:57.271633 [INFO] agent: Stopping DNS server 127.0.0.1:34043 (tcp)
TestNodesWatch - 2019/11/27 02:31:57.272041 [INFO] agent: Stopping DNS server 127.0.0.1:34043 (udp)
TestNodesWatch - 2019/11/27 02:31:57.272607 [INFO] agent: Stopping HTTP server 127.0.0.1:34044 (tcp)
TestNodesWatch - 2019/11/27 02:31:57.272972 [INFO] agent: Waiting for endpoints to shut down
TestNodesWatch - 2019/11/27 02:31:57.273122 [INFO] agent: Endpoints down
--- PASS: TestNodesWatch (4.01s)
=== CONT  TestParseBasic
--- PASS: TestParseBasic (0.00s)
=== CONT  TestChecksWatch_State
WARNING: bootstrap = true: do not enable unless necessary
TestChecksWatch_State - 2019/11/27 02:31:57.325439 [WARN] agent: Node name "Node 38d58c16-ae3e-ad56-d5ca-5571307f6794" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestChecksWatch_State - 2019/11/27 02:31:57.325937 [DEBUG] tlsutil: Update with version 1
TestChecksWatch_State - 2019/11/27 02:31:57.326100 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestChecksWatch_State - 2019/11/27 02:31:57.326380 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestChecksWatch_State - 2019/11/27 02:31:57.326622 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
2019/11/27 02:31:57 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:57 [INFO]  raft: Node at 127.0.0.1:34054 [Leader] entering Leader state
TestServicesWatch - 2019/11/27 02:31:57.593508 [INFO] consul: cluster leadership acquired
TestServicesWatch - 2019/11/27 02:31:57.594032 [INFO] consul: New leader elected: Node 6bca61e1-ad2f-8182-3879-289e867b77ab
2019/11/27 02:31:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:39bf3a96-e053-9b23-d047-64fa41cb3625 Address:127.0.0.1:34060}]
TestServicesWatch - 2019/11/27 02:31:57.904229 [INFO] agent: Synced node info
2019/11/27 02:31:57 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:192e668d-d0b7-e87b-c6ff-a2dcc30ebe20 Address:127.0.0.1:34066}]
2019/11/27 02:31:57 [INFO]  raft: Node at 127.0.0.1:34060 [Follower] entering Follower state (Leader: "")
2019/11/27 02:31:57 [INFO]  raft: Node at 127.0.0.1:34066 [Follower] entering Follower state (Leader: "")
TestServicesWatch - 2019/11/27 02:31:57.907403 [DEBUG] agent: Node info in sync
TestKeyPrefixWatch - 2019/11/27 02:31:57.907499 [INFO] serf: EventMemberJoin: Node 39bf3a96-e053-9b23-d047-64fa41cb3625.dc1 127.0.0.1
TestKeyWatch - 2019/11/27 02:31:57.909582 [INFO] serf: EventMemberJoin: Node 192e668d-d0b7-e87b-c6ff-a2dcc30ebe20.dc1 127.0.0.1
TestKeyPrefixWatch - 2019/11/27 02:31:57.910554 [INFO] serf: EventMemberJoin: Node 39bf3a96-e053-9b23-d047-64fa41cb3625 127.0.0.1
TestKeyPrefixWatch - 2019/11/27 02:31:57.911672 [INFO] agent: Started DNS server 127.0.0.1:34055 (udp)
TestKeyPrefixWatch - 2019/11/27 02:31:57.911676 [INFO] consul: Adding LAN server Node 39bf3a96-e053-9b23-d047-64fa41cb3625 (Addr: tcp/127.0.0.1:34060) (DC: dc1)
TestKeyPrefixWatch - 2019/11/27 02:31:57.911918 [INFO] consul: Handled member-join event for server "Node 39bf3a96-e053-9b23-d047-64fa41cb3625.dc1" in area "wan"
TestKeyPrefixWatch - 2019/11/27 02:31:57.912230 [INFO] agent: Started DNS server 127.0.0.1:34055 (tcp)
TestKeyWatch - 2019/11/27 02:31:57.913964 [INFO] serf: EventMemberJoin: Node 192e668d-d0b7-e87b-c6ff-a2dcc30ebe20 127.0.0.1
TestKeyPrefixWatch - 2019/11/27 02:31:57.914341 [INFO] agent: Started HTTP server on 127.0.0.1:34056 (tcp)
TestKeyPrefixWatch - 2019/11/27 02:31:57.914420 [INFO] agent: started state syncer
TestKeyWatch - 2019/11/27 02:31:57.914552 [INFO] consul: Adding LAN server Node 192e668d-d0b7-e87b-c6ff-a2dcc30ebe20 (Addr: tcp/127.0.0.1:34066) (DC: dc1)
TestKeyWatch - 2019/11/27 02:31:57.914958 [INFO] consul: Handled member-join event for server "Node 192e668d-d0b7-e87b-c6ff-a2dcc30ebe20.dc1" in area "wan"
TestKeyWatch - 2019/11/27 02:31:57.915120 [INFO] agent: Started DNS server 127.0.0.1:34061 (udp)
TestKeyWatch - 2019/11/27 02:31:57.915178 [INFO] agent: Started DNS server 127.0.0.1:34061 (tcp)
TestKeyWatch - 2019/11/27 02:31:57.917152 [INFO] agent: Started HTTP server on 127.0.0.1:34062 (tcp)
TestKeyWatch - 2019/11/27 02:31:57.917344 [INFO] agent: started state syncer
2019/11/27 02:31:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:57 [INFO]  raft: Node at 127.0.0.1:34060 [Candidate] entering Candidate state in term 2
2019/11/27 02:31:57 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:57 [INFO]  raft: Node at 127.0.0.1:34066 [Candidate] entering Candidate state in term 2
2019/11/27 02:31:58 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:38d58c16-ae3e-ad56-d5ca-5571307f6794 Address:127.0.0.1:34072}]
2019/11/27 02:31:58 [INFO]  raft: Node at 127.0.0.1:34072 [Follower] entering Follower state (Leader: "")
TestChecksWatch_State - 2019/11/27 02:31:58.141143 [INFO] serf: EventMemberJoin: Node 38d58c16-ae3e-ad56-d5ca-5571307f6794.dc1 127.0.0.1
TestChecksWatch_State - 2019/11/27 02:31:58.144111 [INFO] serf: EventMemberJoin: Node 38d58c16-ae3e-ad56-d5ca-5571307f6794 127.0.0.1
TestChecksWatch_State - 2019/11/27 02:31:58.144664 [INFO] consul: Adding LAN server Node 38d58c16-ae3e-ad56-d5ca-5571307f6794 (Addr: tcp/127.0.0.1:34072) (DC: dc1)
TestChecksWatch_State - 2019/11/27 02:31:58.144847 [INFO] consul: Handled member-join event for server "Node 38d58c16-ae3e-ad56-d5ca-5571307f6794.dc1" in area "wan"
TestChecksWatch_State - 2019/11/27 02:31:58.145312 [INFO] agent: Started DNS server 127.0.0.1:34067 (udp)
TestChecksWatch_State - 2019/11/27 02:31:58.145374 [INFO] agent: Started DNS server 127.0.0.1:34067 (tcp)
TestChecksWatch_State - 2019/11/27 02:31:58.147290 [INFO] agent: Started HTTP server on 127.0.0.1:34068 (tcp)
TestChecksWatch_State - 2019/11/27 02:31:58.147377 [INFO] agent: started state syncer
2019/11/27 02:31:58 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:31:58 [INFO]  raft: Node at 127.0.0.1:34072 [Candidate] entering Candidate state in term 2
2019/11/27 02:31:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:58 [INFO]  raft: Node at 127.0.0.1:34066 [Leader] entering Leader state
2019/11/27 02:31:58 [INFO]  raft: Node at 127.0.0.1:34060 [Leader] entering Leader state
TestKeyPrefixWatch - 2019/11/27 02:31:58.459842 [INFO] consul: cluster leadership acquired
TestKeyPrefixWatch - 2019/11/27 02:31:58.460207 [INFO] consul: New leader elected: Node 39bf3a96-e053-9b23-d047-64fa41cb3625
TestKeyWatch - 2019/11/27 02:31:58.462573 [INFO] consul: cluster leadership acquired
TestKeyWatch - 2019/11/27 02:31:58.462916 [INFO] consul: New leader elected: Node 192e668d-d0b7-e87b-c6ff-a2dcc30ebe20
2019/11/27 02:31:58 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:31:58 [INFO]  raft: Node at 127.0.0.1:34072 [Leader] entering Leader state
TestChecksWatch_State - 2019/11/27 02:31:58.694265 [INFO] consul: cluster leadership acquired
TestChecksWatch_State - 2019/11/27 02:31:58.694648 [INFO] consul: New leader elected: Node 38d58c16-ae3e-ad56-d5ca-5571307f6794
TestServicesWatch - 2019/11/27 02:31:58.859543 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKeyPrefixWatch - 2019/11/27 02:31:58.859740 [INFO] agent: Synced node info
TestKeyPrefixWatch - 2019/11/27 02:31:58.859860 [DEBUG] agent: Node info in sync
TestServicesWatch - 2019/11/27 02:31:58.860090 [DEBUG] consul: Skipping self join check for "Node 6bca61e1-ad2f-8182-3879-289e867b77ab" since the cluster is too small
TestServicesWatch - 2019/11/27 02:31:58.860284 [INFO] consul: member 'Node 6bca61e1-ad2f-8182-3879-289e867b77ab' joined, marking health alive
TestKeyWatch - 2019/11/27 02:31:59.017429 [INFO] agent: Synced node info
TestServicesWatch - 2019/11/27 02:31:59.048630 [DEBUG] http: Request GET /v1/catalog/services (5.056852ms) from=127.0.0.1:55926
TestKeyWatch - 2019/11/27 02:31:59.559264 [DEBUG] agent: Node info in sync
TestKeyWatch - 2019/11/27 02:31:59.559359 [DEBUG] agent: Node info in sync
TestServicesWatch - 2019/11/27 02:31:59.760149 [INFO] agent: Synced service "foo"
TestServicesWatch - 2019/11/27 02:31:59.760227 [DEBUG] agent: Node info in sync
TestServicesWatch - 2019/11/27 02:31:59.760300 [DEBUG] http: Request PUT /v1/agent/service/register (697.027929ms) from=127.0.0.1:55928
TestServicesWatch - 2019/11/27 02:31:59.760992 [INFO] agent: Requesting shutdown
TestServicesWatch - 2019/11/27 02:31:59.761068 [INFO] consul: shutting down server
TestServicesWatch - 2019/11/27 02:31:59.761125 [WARN] serf: Shutdown without a Leave
TestServicesWatch - 2019/11/27 02:31:59.762702 [DEBUG] http: Request GET /v1/catalog/services?index=10 (711.213449ms) from=127.0.0.1:55926
TestServicesWatch - 2019/11/27 02:31:59.848480 [DEBUG] agent: Skipping remote check "serfHealth" since it is managed automatically
TestServicesWatch - 2019/11/27 02:31:59.848553 [DEBUG] agent: Service "foo" in sync
TestServicesWatch - 2019/11/27 02:31:59.848587 [DEBUG] agent: Node info in sync
TestServicesWatch - 2019/11/27 02:31:59.848661 [DEBUG] agent: Service "foo" in sync
TestServicesWatch - 2019/11/27 02:31:59.848696 [DEBUG] agent: Node info in sync
TestServicesWatch - 2019/11/27 02:31:59.881193 [WARN] serf: Shutdown without a Leave
TestChecksWatch_State - 2019/11/27 02:31:59.883496 [INFO] agent: Synced node info
TestChecksWatch_State - 2019/11/27 02:31:59.883592 [DEBUG] agent: Node info in sync
TestKeyPrefixWatch - 2019/11/27 02:31:59.947419 [DEBUG] agent: Node info in sync
TestServicesWatch - 2019/11/27 02:31:59.958985 [INFO] manager: shutting down
TestServicesWatch - 2019/11/27 02:31:59.959514 [INFO] agent: consul server down
TestServicesWatch - 2019/11/27 02:31:59.959962 [INFO] agent: shutdown complete
TestServicesWatch - 2019/11/27 02:31:59.960043 [INFO] agent: Stopping DNS server 127.0.0.1:34049 (tcp)
TestServicesWatch - 2019/11/27 02:31:59.960209 [INFO] agent: Stopping DNS server 127.0.0.1:34049 (udp)
TestServicesWatch - 2019/11/27 02:31:59.960381 [INFO] agent: Stopping HTTP server 127.0.0.1:34050 (tcp)
TestServicesWatch - 2019/11/27 02:31:59.960822 [INFO] agent: Waiting for endpoints to shut down
TestServicesWatch - 2019/11/27 02:31:59.960905 [INFO] agent: Endpoints down
--- PASS: TestServicesWatch (4.07s)
=== CONT  TestRun_Stop_Hybrid
--- PASS: TestRun_Stop_Hybrid (0.00s)
=== CONT  TestServiceWatch
WARNING: bootstrap = true: do not enable unless necessary
TestServiceWatch - 2019/11/27 02:32:00.013102 [WARN] agent: Node name "Node 2fa32e65-937a-0736-2514-c4c79f0d4535" will not be discoverable via DNS due to invalid characters. Valid characters include all alpha-numerics and dashes.
TestServiceWatch - 2019/11/27 02:32:00.013441 [DEBUG] tlsutil: Update with version 1
TestServiceWatch - 2019/11/27 02:32:00.013505 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestServiceWatch - 2019/11/27 02:32:00.013679 [DEBUG] tlsutil: IncomingRPCConfig with version 1
TestServiceWatch - 2019/11/27 02:32:00.013782 [DEBUG] tlsutil: OutgoingRPCWrapper with version 1
TestKeyPrefixWatch - 2019/11/27 02:32:00.626240 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKeyPrefixWatch - 2019/11/27 02:32:00.627027 [DEBUG] consul: Skipping self join check for "Node 39bf3a96-e053-9b23-d047-64fa41cb3625" since the cluster is too small
TestKeyPrefixWatch - 2019/11/27 02:32:00.627327 [INFO] consul: member 'Node 39bf3a96-e053-9b23-d047-64fa41cb3625' joined, marking health alive
TestKeyWatch - 2019/11/27 02:32:00.737306 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestKeyWatch - 2019/11/27 02:32:00.738167 [DEBUG] consul: Skipping self join check for "Node 192e668d-d0b7-e87b-c6ff-a2dcc30ebe20" since the cluster is too small
TestKeyWatch - 2019/11/27 02:32:00.738349 [INFO] consul: member 'Node 192e668d-d0b7-e87b-c6ff-a2dcc30ebe20' joined, marking health alive
TestKeyPrefixWatch - 2019/11/27 02:32:00.898756 [DEBUG] http: Request GET /v1/kv/foo/?recurse= (419.682µs) from=127.0.0.1:54072
TestChecksWatch_State - 2019/11/27 02:32:01.081530 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestChecksWatch_State - 2019/11/27 02:32:01.082020 [DEBUG] consul: Skipping self join check for "Node 38d58c16-ae3e-ad56-d5ca-5571307f6794" since the cluster is too small
TestChecksWatch_State - 2019/11/27 02:32:01.082489 [INFO] consul: member 'Node 38d58c16-ae3e-ad56-d5ca-5571307f6794' joined, marking health alive
TestKeyWatch - 2019/11/27 02:32:01.105383 [DEBUG] http: Request GET /v1/kv/foo/bar/baz (345.679µs) from=127.0.0.1:59370
TestKeyPrefixWatch - 2019/11/27 02:32:01.606524 [DEBUG] http: Request PUT /v1/kv/foo/bar (688.080258ms) from=127.0.0.1:54074
TestKeyPrefixWatch - 2019/11/27 02:32:01.608061 [DEBUG] http: Request GET /v1/kv/foo/?index=1&recurse= (708.165329ms) from=127.0.0.1:54072
TestKeyPrefixWatch - 2019/11/27 02:32:01.610177 [INFO] agent: Requesting shutdown
TestKeyPrefixWatch - 2019/11/27 02:32:01.610248 [INFO] consul: shutting down server
TestKeyPrefixWatch - 2019/11/27 02:32:01.610358 [WARN] serf: Shutdown without a Leave
TestChecksWatch_State - 2019/11/27 02:32:01.681440 [DEBUG] agent: Node info in sync
TestKeyPrefixWatch - 2019/11/27 02:32:01.681903 [WARN] serf: Shutdown without a Leave
TestKeyWatch - 2019/11/27 02:32:01.690701 [DEBUG] http: Request PUT /v1/kv/foo/bar/baz (565.619096ms) from=127.0.0.1:59372
TestKeyWatch - 2019/11/27 02:32:01.699075 [DEBUG] http: Request GET /v1/kv/foo/bar/baz?index=1 (592.545751ms) from=127.0.0.1:59370
TestChecksWatch_State - 2019/11/27 02:32:01.700474 [DEBUG] http: Request GET /v1/health/state/warning (1.341049ms) from=127.0.0.1:36938
TestKeyWatch - 2019/11/27 02:32:01.704263 [INFO] agent: Requesting shutdown
TestKeyWatch - 2019/11/27 02:32:01.704393 [INFO] consul: shutting down server
TestKeyWatch - 2019/11/27 02:32:01.704449 [WARN] serf: Shutdown without a Leave
TestKeyPrefixWatch - 2019/11/27 02:32:01.981361 [INFO] manager: shutting down
2019/11/27 02:32:01 [INFO]  raft: Initial configuration (index=1): [{Suffrage:Voter ID:2fa32e65-937a-0736-2514-c4c79f0d4535 Address:127.0.0.1:34078}]
TestKeyPrefixWatch - 2019/11/27 02:32:01.982072 [INFO] agent: consul server down
TestKeyPrefixWatch - 2019/11/27 02:32:01.982130 [INFO] agent: shutdown complete
TestKeyPrefixWatch - 2019/11/27 02:32:01.982184 [INFO] agent: Stopping DNS server 127.0.0.1:34055 (tcp)
2019/11/27 02:32:01 [INFO]  raft: Node at 127.0.0.1:34078 [Follower] entering Follower state (Leader: "")
TestKeyPrefixWatch - 2019/11/27 02:32:01.982330 [INFO] agent: Stopping DNS server 127.0.0.1:34055 (udp)
TestKeyPrefixWatch - 2019/11/27 02:32:01.982471 [INFO] agent: Stopping HTTP server 127.0.0.1:34056 (tcp)
TestServiceWatch - 2019/11/27 02:32:01.985015 [INFO] serf: EventMemberJoin: Node 2fa32e65-937a-0736-2514-c4c79f0d4535.dc1 127.0.0.1
TestServiceWatch - 2019/11/27 02:32:01.987948 [INFO] serf: EventMemberJoin: Node 2fa32e65-937a-0736-2514-c4c79f0d4535 127.0.0.1
TestServiceWatch - 2019/11/27 02:32:01.988524 [INFO] consul: Adding LAN server Node 2fa32e65-937a-0736-2514-c4c79f0d4535 (Addr: tcp/127.0.0.1:34078) (DC: dc1)
TestServiceWatch - 2019/11/27 02:32:01.988815 [INFO] consul: Handled member-join event for server "Node 2fa32e65-937a-0736-2514-c4c79f0d4535.dc1" in area "wan"
TestServiceWatch - 2019/11/27 02:32:01.989100 [INFO] agent: Started DNS server 127.0.0.1:34073 (udp)
TestServiceWatch - 2019/11/27 02:32:01.989166 [INFO] agent: Started DNS server 127.0.0.1:34073 (tcp)
TestServiceWatch - 2019/11/27 02:32:01.990994 [INFO] agent: Started HTTP server on 127.0.0.1:34074 (tcp)
TestServiceWatch - 2019/11/27 02:32:01.991110 [INFO] agent: started state syncer
2019/11/27 02:32:02 [WARN]  raft: Heartbeat timeout from "" reached, starting election
2019/11/27 02:32:02 [INFO]  raft: Node at 127.0.0.1:34078 [Candidate] entering Candidate state in term 2
TestKeyWatch - 2019/11/27 02:32:02.325886 [WARN] serf: Shutdown without a Leave
TestChecksWatch_State - 2019/11/27 02:32:02.449916 [DEBUG] http: Request PUT /v1/catalog/register (729.758118ms) from=127.0.0.1:36940
TestChecksWatch_State - 2019/11/27 02:32:02.451949 [DEBUG] http: Request GET /v1/health/state/warning?index=10 (745.897044ms) from=127.0.0.1:36938
TestKeyWatch - 2019/11/27 02:32:02.452423 [INFO] manager: shutting down
TestKeyWatch - 2019/11/27 02:32:02.454224 [INFO] agent: consul server down
TestKeyWatch - 2019/11/27 02:32:02.454300 [INFO] agent: shutdown complete
TestKeyWatch - 2019/11/27 02:32:02.454359 [INFO] agent: Stopping DNS server 127.0.0.1:34061 (tcp)
TestKeyWatch - 2019/11/27 02:32:02.454503 [INFO] agent: Stopping DNS server 127.0.0.1:34061 (udp)
TestKeyWatch - 2019/11/27 02:32:02.454672 [INFO] agent: Stopping HTTP server 127.0.0.1:34062 (tcp)
TestChecksWatch_State - 2019/11/27 02:32:02.454900 [INFO] agent: Requesting shutdown
TestChecksWatch_State - 2019/11/27 02:32:02.454974 [INFO] consul: shutting down server
TestChecksWatch_State - 2019/11/27 02:32:02.455027 [WARN] serf: Shutdown without a Leave
TestChecksWatch_State - 2019/11/27 02:32:02.625667 [WARN] serf: Shutdown without a Leave
TestChecksWatch_State - 2019/11/27 02:32:02.858785 [INFO] manager: shutting down
TestChecksWatch_State - 2019/11/27 02:32:02.859685 [INFO] agent: consul server down
TestChecksWatch_State - 2019/11/27 02:32:02.859919 [INFO] agent: shutdown complete
TestChecksWatch_State - 2019/11/27 02:32:02.860118 [INFO] agent: Stopping DNS server 127.0.0.1:34067 (tcp)
TestChecksWatch_State - 2019/11/27 02:32:02.860390 [INFO] agent: Stopping DNS server 127.0.0.1:34067 (udp)
TestChecksWatch_State - 2019/11/27 02:32:02.860615 [INFO] agent: Stopping HTTP server 127.0.0.1:34068 (tcp)
TestChecksWatch_State - 2019/11/27 02:32:02.860888 [INFO] agent: Waiting for endpoints to shut down
TestChecksWatch_State - 2019/11/27 02:32:02.861104 [INFO] agent: Endpoints down
--- PASS: TestChecksWatch_State (5.59s)
TestKeyPrefixWatch - 2019/11/27 02:32:02.982810 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:34056 (tcp)
TestKeyPrefixWatch - 2019/11/27 02:32:02.982956 [INFO] agent: Waiting for endpoints to shut down
TestKeyPrefixWatch - 2019/11/27 02:32:02.982995 [INFO] agent: Endpoints down
--- PASS: TestKeyPrefixWatch (6.18s)
TestKeyWatch - 2019/11/27 02:32:03.455011 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:34062 (tcp)
TestKeyWatch - 2019/11/27 02:32:03.455080 [INFO] agent: Waiting for endpoints to shut down
TestKeyWatch - 2019/11/27 02:32:03.455116 [INFO] agent: Endpoints down
--- PASS: TestKeyWatch (6.56s)
2019/11/27 02:32:03 [INFO]  raft: Election won. Tally: 1
2019/11/27 02:32:03 [INFO]  raft: Node at 127.0.0.1:34078 [Leader] entering Leader state
TestServiceWatch - 2019/11/27 02:32:03.470034 [INFO] consul: cluster leadership acquired
TestServiceWatch - 2019/11/27 02:32:03.470421 [INFO] consul: New leader elected: Node 2fa32e65-937a-0736-2514-c4c79f0d4535
TestServiceWatch - 2019/11/27 02:32:04.003699 [INFO] agent: Synced node info
TestServiceWatch - 2019/11/27 02:32:04.003812 [DEBUG] agent: Node info in sync
TestServiceWatch - 2019/11/27 02:32:04.909515 [DEBUG] agent: Node info in sync
TestServiceWatch - 2019/11/27 02:32:05.059219 [INFO] connect: initialized primary datacenter CA with provider "consul"
TestServiceWatch - 2019/11/27 02:32:05.059614 [DEBUG] consul: Skipping self join check for "Node 2fa32e65-937a-0736-2514-c4c79f0d4535" since the cluster is too small
TestServiceWatch - 2019/11/27 02:32:05.059761 [INFO] consul: member 'Node 2fa32e65-937a-0736-2514-c4c79f0d4535' joined, marking health alive
TestServiceWatch - 2019/11/27 02:32:05.229461 [DEBUG] http: Request GET /v1/health/service/foo?passing=1&tag=bar (5.134855ms) from=127.0.0.1:47244
TestServiceWatch - 2019/11/27 02:32:05.426194 [INFO] agent: Synced service "foo"
TestServiceWatch - 2019/11/27 02:32:05.426262 [DEBUG] agent: Node info in sync
TestServiceWatch - 2019/11/27 02:32:05.426338 [DEBUG] http: Request PUT /v1/agent/service/register (180.810965ms) from=127.0.0.1:47246
TestServiceWatch - 2019/11/27 02:32:05.426833 [DEBUG] agent: Service "foo" in sync
TestServiceWatch - 2019/11/27 02:32:05.426885 [DEBUG] agent: Node info in sync
TestServiceWatch - 2019/11/27 02:32:05.427467 [DEBUG] http: Request GET /v1/health/service/foo?index=10&passing=1&tag=bar (195.058155ms) from=127.0.0.1:47244
TestServiceWatch - 2019/11/27 02:32:05.430217 [INFO] agent: Requesting shutdown
TestServiceWatch - 2019/11/27 02:32:05.430292 [INFO] consul: shutting down server
TestServiceWatch - 2019/11/27 02:32:05.430360 [WARN] serf: Shutdown without a Leave
TestServiceWatch - 2019/11/27 02:32:05.469624 [WARN] serf: Shutdown without a Leave
TestServiceWatch - 2019/11/27 02:32:05.525239 [INFO] manager: shutting down
TestServiceWatch - 2019/11/27 02:32:05.525579 [INFO] agent: consul server down
TestServiceWatch - 2019/11/27 02:32:05.525625 [INFO] agent: shutdown complete
TestServiceWatch - 2019/11/27 02:32:05.525676 [INFO] agent: Stopping DNS server 127.0.0.1:34073 (tcp)
TestServiceWatch - 2019/11/27 02:32:05.525805 [INFO] agent: Stopping DNS server 127.0.0.1:34073 (udp)
TestServiceWatch - 2019/11/27 02:32:05.525944 [INFO] agent: Stopping HTTP server 127.0.0.1:34074 (tcp)
TestServiceWatch - 2019/11/27 02:32:06.526204 [WARN] agent: Timeout stopping HTTP server 127.0.0.1:34074 (tcp)
TestServiceWatch - 2019/11/27 02:32:06.526271 [INFO] agent: Waiting for endpoints to shut down
TestServiceWatch - 2019/11/27 02:32:06.526310 [INFO] agent: Endpoints down
--- PASS: TestServiceWatch (6.56s)
PASS
ok  	github.com/hashicorp/consul/watch	19.591s
FAIL
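
[Editor's note] The subtests that finish here (TestKeyWatch, TestKeyPrefixWatch, TestServiceWatch, TestChecksWatch_State, ...) all pass, and the package line above reports "ok github.com/hashicorp/consul/watch"; the trailing FAIL is the aggregate verdict of the multi-package go test invocation recorded below, not of this package. For orientation only, a minimal sketch of the blocking-query watch-plan API these tests drive (based on the consul 1.4.x watch package; the key, handler body, and agent address are illustrative assumptions, not values taken from this log):

    package main

    import (
    	"fmt"
    	"log"

    	"github.com/hashicorp/consul/watch"
    )

    func main() {
    	// A "key" watch plan, the shape exercised by TestKeyWatch.
    	// The parameters mirror a CLI/JSON watch definition.
    	plan, err := watch.Parse(map[string]interface{}{
    		"type": "key",
    		"key":  "foo/bar/baz", // illustrative key, not from the log
    	})
    	if err != nil {
    		log.Fatal(err)
    	}
    	// The handler fires on every blocking-query index change.
    	plan.Handler = func(idx uint64, raw interface{}) {
    		fmt.Printf("index=%d data=%#v\n", idx, raw)
    	}
    	// Run blocks until plan.Stop() is called; the argument is the agent
    	// HTTP address (the tests above start agents on ephemeral ports).
    	if err := plan.Run("127.0.0.1:8500"); err != nil {
    		log.Fatal(err)
    	}
    }
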
dh_auto_test: cd _build && go test -vet=off -v -p 4 -short -failfast -timeout 5m github.com/hashicorp/consul github.com/hashicorp/consul/acl github.com/hashicorp/consul/agent github.com/hashicorp/consul/agent/ae github.com/hashicorp/consul/agent/cache github.com/hashicorp/consul/agent/cache-types github.com/hashicorp/consul/agent/config github.com/hashicorp/consul/agent/debug github.com/hashicorp/consul/agent/exec github.com/hashicorp/consul/agent/local github.com/hashicorp/consul/agent/metadata github.com/hashicorp/consul/agent/mock github.com/hashicorp/consul/agent/pool github.com/hashicorp/consul/agent/proxycfg github.com/hashicorp/consul/agent/proxyprocess github.com/hashicorp/consul/agent/router github.com/hashicorp/consul/agent/structs github.com/hashicorp/consul/agent/systemd github.com/hashicorp/consul/agent/token github.com/hashicorp/consul/agent/xds github.com/hashicorp/consul/command github.com/hashicorp/consul/command/acl github.com/hashicorp/consul/command/acl/agenttokens github.com/hashicorp/consul/command/acl/bootstrap github.com/hashicorp/consul/command/acl/policy github.com/hashicorp/consul/command/acl/policy/create github.com/hashicorp/consul/command/acl/policy/delete github.com/hashicorp/consul/command/acl/policy/list github.com/hashicorp/consul/command/acl/policy/read github.com/hashicorp/consul/command/acl/policy/update github.com/hashicorp/consul/command/acl/rules github.com/hashicorp/consul/command/acl/token github.com/hashicorp/consul/command/acl/token/clone github.com/hashicorp/consul/command/acl/token/create github.com/hashicorp/consul/command/acl/token/delete github.com/hashicorp/consul/command/acl/token/list github.com/hashicorp/consul/command/acl/token/read github.com/hashicorp/consul/command/acl/token/update github.com/hashicorp/consul/command/agent github.com/hashicorp/consul/command/catalog github.com/hashicorp/consul/command/catalog/list/dc github.com/hashicorp/consul/command/catalog/list/nodes github.com/hashicorp/consul/command/catalog/list/services github.com/hashicorp/consul/command/connect github.com/hashicorp/consul/command/connect/ca github.com/hashicorp/consul/command/connect/ca/get github.com/hashicorp/consul/command/connect/ca/set github.com/hashicorp/consul/command/connect/envoy github.com/hashicorp/consul/command/connect/proxy github.com/hashicorp/consul/command/debug github.com/hashicorp/consul/command/event github.com/hashicorp/consul/command/exec github.com/hashicorp/consul/command/flags github.com/hashicorp/consul/command/forceleave github.com/hashicorp/consul/command/helpers github.com/hashicorp/consul/command/info github.com/hashicorp/consul/command/intention github.com/hashicorp/consul/command/intention/check github.com/hashicorp/consul/command/intention/create github.com/hashicorp/consul/command/intention/delete github.com/hashicorp/consul/command/intention/finder github.com/hashicorp/consul/command/intention/get github.com/hashicorp/consul/command/intention/match github.com/hashicorp/consul/command/join github.com/hashicorp/consul/command/keygen github.com/hashicorp/consul/command/keyring github.com/hashicorp/consul/command/kv github.com/hashicorp/consul/command/kv/del github.com/hashicorp/consul/command/kv/exp github.com/hashicorp/consul/command/kv/get github.com/hashicorp/consul/command/kv/imp github.com/hashicorp/consul/command/kv/impexp github.com/hashicorp/consul/command/kv/put github.com/hashicorp/consul/command/leave github.com/hashicorp/consul/command/lock github.com/hashicorp/consul/command/maint 
github.com/hashicorp/consul/command/members github.com/hashicorp/consul/command/monitor github.com/hashicorp/consul/command/operator github.com/hashicorp/consul/command/operator/autopilot github.com/hashicorp/consul/command/operator/autopilot/get github.com/hashicorp/consul/command/operator/autopilot/set github.com/hashicorp/consul/command/operator/raft github.com/hashicorp/consul/command/operator/raft/listpeers github.com/hashicorp/consul/command/operator/raft/removepeer github.com/hashicorp/consul/command/reload github.com/hashicorp/consul/command/rtt github.com/hashicorp/consul/command/services github.com/hashicorp/consul/command/services/deregister github.com/hashicorp/consul/command/services/register github.com/hashicorp/consul/command/snapshot github.com/hashicorp/consul/command/snapshot/inspect github.com/hashicorp/consul/command/snapshot/restore github.com/hashicorp/consul/command/snapshot/save github.com/hashicorp/consul/command/validate github.com/hashicorp/consul/command/version github.com/hashicorp/consul/command/watch github.com/hashicorp/consul/connect github.com/hashicorp/consul/connect/certgen github.com/hashicorp/consul/connect/proxy github.com/hashicorp/consul/ipaddr github.com/hashicorp/consul/lib github.com/hashicorp/consul/lib/file github.com/hashicorp/consul/lib/freeport github.com/hashicorp/consul/lib/semaphore github.com/hashicorp/consul/logger github.com/hashicorp/consul/sentinel github.com/hashicorp/consul/service_os github.com/hashicorp/consul/snapshot github.com/hashicorp/consul/testrpc github.com/hashicorp/consul/testutil github.com/hashicorp/consul/testutil/retry github.com/hashicorp/consul/tlsutil github.com/hashicorp/consul/types github.com/hashicorp/consul/version github.com/hashicorp/consul/watch returned exit code 1
make[1]: *** [debian/rules:50: override_dh_auto_test] Error 255
make[1]: Leaving directory '/<<PKGBUILDDIR>>'
make: *** [debian/rules:13: build-arch] Error 2
dpkg-buildpackage: error: debian/rules build-arch subprocess returned exit status 2
--------------------------------------------------------------------------------
Build finished at 2019-11-27T02:32:10Z

Finished
--------


+------------------------------------------------------------------------------+
| Cleanup                                                                      |
+------------------------------------------------------------------------------+

Purging /<<BUILDDIR>>
Not cleaning session: cloned chroot in use
E: Build failure (dpkg-buildpackage died)

+------------------------------------------------------------------------------+
| Summary                                                                      |
+------------------------------------------------------------------------------+

Build Architecture: armhf
Build-Space: 0
Build-Time: 1725
Distribution: bullseye-staging
Fail-Stage: build
Host Architecture: armhf
Install-Time: 1802
Job: consul_1.4.4~dfsg3-5
Machine Architecture: armhf
Package: consul
Package-Time: 3589
Source-Version: 1.4.4~dfsg3-5
Space: 0
Status: failed
Version: 1.4.4~dfsg3-5
--------------------------------------------------------------------------------
Finished at 2019-11-27T02:32:10Z
Build needed 00:00:00, 0k disc space